Sample records for autonomous relative navigation

  1. Autonomous Relative Navigation for Formation-Flying Satellites Using GPS

    NASA Technical Reports Server (NTRS)

    Gramling, Cheryl; Carpenter, J. Russell; Long, Anne; Kelbel, David; Lee, Taesul

    2000-01-01

    The Goddard Space Flight Center is currently developing advanced spacecraft systems to provide autonomous navigation and control of formation flyers. This paper discusses autonomous relative navigation performance for a formation of four eccentric, medium-altitude Earth-orbiting satellites using Global Positioning System (GPS) Standard Positioning Service (SPS) and "GPS-like" intersatellite measurements. The performance of several candidate relative navigation approaches is evaluated. These analyses indicate that an autonomous relative navigation position accuracy of 1 meter root-mean-square can be achieved by differencing high-accuracy filtered solutions if only measurements from common GPS space vehicles are used in the independently estimated solutions.
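
    The differencing approach described in this record is easy to sketch: each satellite runs its own absolute filter, the relative state is the difference of the two filtered solutions, and common errors cancel best when both filters process pseudoranges from the same GPS space vehicles. The sketch below uses hypothetical variable names and is not the flight software; it only illustrates the common-SV selection and the differencing step.

```python
import numpy as np

def common_sv_measurements(meas_a, meas_b):
    """Keep only pseudoranges from GPS space vehicles seen by both satellites.

    meas_a, meas_b: dict mapping GPS PRN -> pseudorange (m).
    """
    common = sorted(set(meas_a) & set(meas_b))
    return ({prn: meas_a[prn] for prn in common},
            {prn: meas_b[prn] for prn in common})

def relative_state(filtered_state_a, filtered_state_b):
    """Relative position/velocity by differencing independently filtered
    absolute solutions (each a 6-vector: position in m, velocity in m/s)."""
    return filtered_state_b - filtered_state_a

# Example with hypothetical filtered solutions:
x_a = np.array([7.0e6, 1.0e5, 2.0e5, 0.0, 7.5e3, 0.0])
x_b = np.array([7.0e6 + 900.0, 1.0e5 + 50.0, 2.0e5, 0.2, 7.5e3, 0.0])
print(relative_state(x_a, x_b)[:3])   # relative position, ~[900, 50, 0] m
```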

  2. Relative Navigation of Formation Flying Satellites

    NASA Technical Reports Server (NTRS)

    Long, Anne; Kelbel, David; Lee, Taesul; Leung, Dominic; Carpenter, Russell; Gramling, Cheryl; Bauer, Frank (Technical Monitor)

    2002-01-01

    The Guidance, Navigation, and Control Center (GNCC) at Goddard Space Flight Center (GSFC) has successfully developed high-accuracy autonomous satellite navigation systems using the National Aeronautics and Space Administration's (NASA's) space and ground communications systems and the Global Positioning System (GPS). In addition, an autonomous navigation system that uses celestial object sensor measurements is currently under development and has been successfully tested using real Sun and Earth horizon measurements. The GNCC has developed advanced spacecraft systems that provide autonomous navigation and control of formation flyers in near-Earth, high-Earth, and libration point orbits. To support this effort, the GNCC is assessing the relative navigation accuracy achievable for proposed formations using GPS, intersatellite crosslink, ground-to-satellite Doppler, and celestial object sensor measurements. This paper evaluates the performance of these relative navigation approaches for three proposed missions with two or more vehicles maintaining relatively tight formations. High-fidelity simulations were performed to quantify the absolute and relative navigation accuracy as a function of navigation algorithm and measurement type. Realistically simulated measurements were processed using the extended Kalman filter implemented in the GPS Enhanced Onboard Navigation System (GEONS) flight software developed by the GSFC GNCC. Solutions obtained by simultaneously estimating all satellites in the formation were compared with the results obtained using a simpler approach based on differencing independently estimated state vectors.

  3. Angles-only navigation for autonomous orbital rendezvous

    NASA Astrophysics Data System (ADS)

    Woffinden, David C.

    The thesis of this dissertation has both a practical element and a theoretical component, which together aim to answer key questions related to the use of angles-only navigation for autonomous orbital rendezvous. The first and fundamental principle of this work argues that an angles-only navigation filter can determine the relative position and orientation (pose) between two spacecraft to perform the necessary maneuvers and close proximity operations for autonomous orbital rendezvous. Second, the implementation of angles-only navigation for on-orbit applications is looked upon with skeptical eyes because of its perceived limitation in determining the relative range between two vehicles. This assumed, yet little understood, subtlety can be formally characterized with a closed-form analytical observability criterion that specifies the necessary and sufficient conditions for determining the relative position and velocity with only angular measurements. With a mathematical expression of the observability criterion, it can be used to (1) identify the orbital rendezvous trajectories and maneuvers that ensure the relative position and velocity are observable for angles-only navigation, (2) quantify the degree or level of observability, and (3) compute optimal maneuvers that maximize observability. In summary, the objective of this dissertation is to provide both a practical and theoretical foundation for the advancement of autonomous orbital rendezvous through the use of angles-only navigation.
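
    The observability question raised in this dissertation can be probed numerically: build an observability Gramian from the line-of-sight measurement Jacobians propagated along a candidate trajectory and check its rank. The sketch below assumes linear Clohessy-Wiltshire relative dynamics and an idealized unit-vector (angles-only) measurement; it illustrates the rank test, not the dissertation's analytical criterion.

```python
import numpy as np

def cw_stm(n, t):
    """Clohessy-Wiltshire state transition matrix (x radial, y along-track, z cross-track)."""
    c, s = np.cos(n * t), np.sin(n * t)
    return np.block([
        [np.array([[4 - 3*c, 0, 0],
                   [6*(s - n*t), 1, 0],
                   [0, 0, c]]),
         np.array([[s/n, 2*(1 - c)/n, 0],
                   [-2*(1 - c)/n, (4*s - 3*n*t)/n, 0],
                   [0, 0, s/n]])],
        [np.array([[3*n*s, 0, 0],
                   [-6*n*(1 - c), 0, 0],
                   [0, 0, -n*s]]),
         np.array([[c, 2*s, 0],
                   [-2*s, 4*c - 3, 0],
                   [0, 0, c]])]
    ])

def los_jacobian(rel_pos):
    """Partial of the unit line-of-sight vector with respect to the relative state."""
    rho = np.linalg.norm(rel_pos)
    u = rel_pos / rho
    H_pos = (np.eye(3) - np.outer(u, u)) / rho
    return np.hstack([H_pos, np.zeros((3, 3))])   # angles carry no direct velocity info

n = 1.13e-3                                        # mean motion of a ~6800 km LEO, rad/s
x0 = np.array([100.0, 2000.0, 50.0, 0.0, -0.2, 0.0])   # hypothetical relative state

gramian = np.zeros((6, 6))
for t in np.arange(60.0, 3600.0, 60.0):
    Phi = cw_stm(n, t)
    H = los_jacobian((Phi @ x0)[:3])
    gramian += Phi.T @ H.T @ H @ Phi

# Without maneuvers the Gramian is rank deficient: range is unobservable.
print(np.linalg.matrix_rank(gramian))   # expect 5, not 6
```

    Along a maneuver-free trajectory, scaling the initial state scales every relative position but leaves all angles unchanged, so the state itself spans the Gramian's null space and the rank stays at five; inserting a maneuver between observations is what restores full observability.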

  4. Evaluation of Relative Navigation Algorithms for Formation-Flying Satellites

    NASA Technical Reports Server (NTRS)

    Kelbel, David; Lee, Taesul; Long, Anne; Carpenter, J. Russell; Gramling, Cheryl

    2001-01-01

    Goddard Space Flight Center is currently developing advanced spacecraft systems to provide autonomous navigation and control of formation flyers. This paper discusses autonomous relative navigation performance for formations in eccentric, medium, and high-altitude Earth orbits using Global Positioning System (GPS) Standard Positioning Service (SPS) and intersatellite range measurements. The performance of several candidate relative navigation approaches is evaluated. These analyses indicate that the relative navigation accuracy is primarily a function of the frequency of acquisition and tracking of the GPS signals. A relative navigation position accuracy of 0.5 meters root-mean-square (RMS) can be achieved for formations in medium-altitude eccentric orbits that can continuously track at least one GPS signal. A relative navigation position accuracy of better than 75 meters RMS can be achieved for formations in high-altitude eccentric orbits that have sparse tracking of the GPS signals. The addition of round-trip intersatellite range measurements can significantly improve relative navigation accuracy for formations with sparse tracking of the GPS signals.
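
    The value of adding round-trip intersatellite range can be seen from its measurement model: the predicted range is the norm of the relative position, and its Jacobian is the unit vector between the satellites, which directly constrains the relative geometry even when GPS tracking is sparse. A minimal EKF-style scalar update, with hypothetical names and an assumed 2 m range noise:

```python
import numpy as np

def crosslink_range_update(x_rel, P, rho_meas, sigma=2.0):
    """One EKF scalar update with an intersatellite range measurement.

    x_rel    : 6-vector relative state (position in m, velocity in m/s)
    P        : 6x6 state covariance
    rho_meas : measured (round-trip-derived) range, m
    sigma    : assumed 1-sigma range noise, m
    """
    pos = x_rel[:3]
    rho_pred = np.linalg.norm(pos)
    H = np.hstack([pos / rho_pred, np.zeros(3)])      # d(range)/d(state)
    S = H @ P @ H + sigma**2                          # innovation variance
    K = P @ H / S                                     # Kalman gain
    x_new = x_rel + K * (rho_meas - rho_pred)
    P_new = (np.eye(6) - np.outer(K, H)) @ P
    return x_new, P_new

# Example with a hypothetical prior:
x = np.array([800.0, 40.0, -10.0, 0.1, 0.0, 0.0])
P = np.diag([50.0**2] * 3 + [0.1**2] * 3)
x, P = crosslink_range_update(x, P, rho_meas=812.5)
```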

  5. INL Autonomous Navigation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2005-03-30

    The INL Autonomous Navigation System provides instructions for autonomously navigating a robot. The system permits high-speed autonomous navigation including obstacle avoidance, waypoint navigation, and path planning in both indoor and outdoor environments.

  6. Autonomous vision-based navigation for proximity operations around binary asteroids

    NASA Astrophysics Data System (ADS)

    Gil-Fernandez, Jesus; Ortega-Hernando, Guillermo

    2018-02-01

    Future missions to small bodies demand higher level of autonomy in the Guidance, Navigation and Control system for higher scientific return and lower operational costs. Different navigation strategies have been assessed for ESA's asteroid impact mission (AIM). The main objective of AIM is the detailed characterization of binary asteroid Didymos. The trajectories for the proximity operations shall be intrinsically safe, i.e., no collision in presence of failures (e.g., spacecraft entering safe mode), perturbations (e.g., non-spherical gravity field), and errors (e.g., maneuver execution error). Hyperbolic arcs with sufficient hyperbolic excess velocity are designed to fulfil the safety, scientific, and operational requirements. The trajectory relative to the asteroid is determined using visual camera images. The ground-based trajectory prediction error at some points is comparable to the camera Field Of View (FOV). Therefore, some images do not contain the entire asteroid. Autonomous navigation can update the state of the spacecraft relative to the asteroid at higher frequency. The objective of the autonomous navigation is to improve the on-board knowledge compared to the ground prediction. The algorithms shall fit in off-the-shelf, space-qualified avionics. This note presents suitable image processing and relative-state filter algorithms for autonomous navigation in proximity operations around binary asteroids.

  7. Autonomous vision-based navigation for proximity operations around binary asteroids

    NASA Astrophysics Data System (ADS)

    Gil-Fernandez, Jesus; Ortega-Hernando, Guillermo

    2018-06-01

    Future missions to small bodies demand higher level of autonomy in the Guidance, Navigation and Control system for higher scientific return and lower operational costs. Different navigation strategies have been assessed for ESA's asteroid impact mission (AIM). The main objective of AIM is the detailed characterization of binary asteroid Didymos. The trajectories for the proximity operations shall be intrinsically safe, i.e., no collision in presence of failures (e.g., spacecraft entering safe mode), perturbations (e.g., non-spherical gravity field), and errors (e.g., maneuver execution error). Hyperbolic arcs with sufficient hyperbolic excess velocity are designed to fulfil the safety, scientific, and operational requirements. The trajectory relative to the asteroid is determined using visual camera images. The ground-based trajectory prediction error at some points is comparable to the camera Field Of View (FOV). Therefore, some images do not contain the entire asteroid. Autonomous navigation can update the state of the spacecraft relative to the asteroid at higher frequency. The objective of the autonomous navigation is to improve the on-board knowledge compared to the ground prediction. The algorithms shall fit in off-the-shelf, space-qualified avionics. This note presents suitable image processing and relative-state filter algorithms for autonomous navigation in proximity operations around binary asteroids.

  8. Autonomous navigation and obstacle avoidance for unmanned surface vehicles

    NASA Astrophysics Data System (ADS)

    Larson, Jacoby; Bruch, Michael; Ebken, John

    2006-05-01

    The US Navy and other Department of Defense (DoD) and Department of Homeland Security (DHS) organizations are increasingly interested in the use of unmanned surface vehicles (USVs) for a variety of missions and applications. In order for USVs to fill these roles, they must be capable of a relatively high degree of autonomous navigation. Space and Naval Warfare Systems Center, San Diego is developing core technologies required for robust USV operation in a real-world environment, primarily focusing on autonomous navigation, obstacle avoidance, and path planning.

  9. A SINS/SRS/GNS Autonomous Integrated Navigation System Based on Spectral Redshift Velocity Measurements.

    PubMed

    Wei, Wenhui; Gao, Zhaohui; Gao, Shesheng; Jia, Ke

    2018-04-09

    To meet autonomy and reliability requirements for the navigation system, and building on the method of measuring velocity from the spectral redshift information of natural celestial bodies, a new scheme consisting of a Strapdown Inertial Navigation System (SINS), Spectral Redshift (SRS) measurements, and a Geomagnetic Navigation System (GNS) is designed for autonomous integrated navigation. The principle of this SINS/SRS/GNS autonomous integrated navigation system is explored, and the corresponding mathematical model is established. Furthermore, a robust adaptive central difference particle filtering algorithm is proposed for this autonomous integrated navigation system. Simulation experiments were conducted, and the results show that the designed SINS/SRS/GNS autonomous integrated navigation system possesses good autonomy, strong robustness, and high reliability, thus providing a new solution for autonomous navigation technology.
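
    The spectral-redshift (SRS) measurement links the observed Doppler shift of a celestial source's spectral lines to the spacecraft velocity component along the line of sight. The sketch below shows only a first-order measurement model with illustrative numbers; the paper's relativistic corrections and its robust adaptive central difference particle filter are not reproduced here.

```python
import numpy as np

C = 299_792_458.0   # speed of light, m/s

def srs_predicted_redshift(v_sc, u_los):
    """First-order Doppler model. u_los is the unit vector from the source toward
    the spacecraft, so a positive radial velocity (receding) gives z > 0."""
    return np.dot(v_sc, u_los) / C

def redshift_to_radial_velocity(z):
    """Invert the first-order model; valid for |z| << 1."""
    return C * z

# Three sources with (hypothetical) orthogonal lines of sight give three radial
# velocity components, enough to reconstruct the full velocity vector.
U = np.eye(3)                                   # rows: unit line-of-sight vectors
z_meas = np.array([2.5e-5, -1.0e-5, 3.0e-6])    # measured redshifts
v_est = np.linalg.lstsq(U / C, z_meas, rcond=None)[0]
print(v_est)    # ~ [7495, -2998, 899] m/s
```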

  10. Visual Odometry for Autonomous Deep-Space Navigation

    NASA Technical Reports Server (NTRS)

    Robinson, Shane; Pedrotty, Sam

    2016-01-01

    Visual Odometry fills two critical needs shared by all future exploration architectures considered by NASA: Autonomous Rendezvous and Docking (AR&D), and autonomous navigation during loss of communication. To do this, a camera is combined with cutting-edge algorithms (called Visual Odometry) into a unit that provides an accurate relative pose between the camera and the object in the imagery. Recent simulation analyses have demonstrated the ability of this new technology to reliably, accurately, and quickly compute a relative pose. This project advances this technology by both preparing the system to process flight imagery and creating an activity to capture said imagery. This technology can provide a pioneering optical navigation platform capable of supporting a wide variety of future mission scenarios: deep space rendezvous, asteroid exploration, and navigation during loss of communication.

  11. Embedded Relative Navigation Sensor Fusion Algorithms for Autonomous Rendezvous and Docking Missions

    NASA Technical Reports Server (NTRS)

    DeKock, Brandon K.; Betts, Kevin M.; McDuffie, James H.; Dreas, Christine B.

    2008-01-01

    bd Systems (a subsidiary of SAIC) has developed a suite of embedded relative navigation sensor fusion algorithms to enable NASA autonomous rendezvous and docking (AR&D) missions. Translational and rotational Extended Kalman Filters (EKFs) were developed to integrate measurements based on the vehicles' orbital mechanics and high-fidelity sensor error models and to provide a solution with increased accuracy and robustness relative to any single relative navigation sensor. The filters were tested through stand-alone covariance analysis, closed-loop testing with a high-fidelity multi-body orbital simulation, and hardware-in-the-loop (HWIL) testing in the Marshall Space Flight Center (MSFC) Flight Robotics Laboratory (FRL).
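
    The fusion structure described here, a translational filter that blends several relative-navigation sensors, each with its own error model, follows the standard EKF predict/update pattern. The skeleton below is a generic illustration with hypothetical interfaces, not bd Systems' implementation.

```python
import numpy as np

class RelNavEKF:
    """Minimal translational EKF skeleton for fusing relative-navigation sensors."""

    def __init__(self, x0, P0):
        self.x = np.asarray(x0, dtype=float)
        self.P = np.asarray(P0, dtype=float)

    def predict(self, F, Q):
        """Propagate with a linearized transition matrix F and process noise Q."""
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update(self, z, h, H, R):
        """Fuse one sensor: z measured, h(x) predicted, H Jacobian, R noise covariance."""
        y = z - h(self.x)                      # innovation
        S = H @ self.P @ H.T + R               # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)    # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(self.x.size) - K @ H) @ self.P
```

    Each relative-navigation sensor (lidar range, camera bearing, differential GPS) contributes its own measurement function h, Jacobian H, and noise covariance R, so fusing an additional sensor is just another update call.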

  12. A SINS/SRS/GNS Autonomous Integrated Navigation System Based on Spectral Redshift Velocity Measurements

    PubMed Central

    Wei, Wenhui; Gao, Zhaohui; Gao, Shesheng; Jia, Ke

    2018-01-01

    To meet autonomy and reliability requirements for the navigation system, and building on the method of measuring velocity from the spectral redshift information of natural celestial bodies, a new scheme consisting of a Strapdown Inertial Navigation System (SINS), Spectral Redshift (SRS) measurements, and a Geomagnetic Navigation System (GNS) is designed for autonomous integrated navigation. The principle of this SINS/SRS/GNS autonomous integrated navigation system is explored, and the corresponding mathematical model is established. Furthermore, a robust adaptive central difference particle filtering algorithm is proposed for this autonomous integrated navigation system. Simulation experiments were conducted, and the results show that the designed SINS/SRS/GNS autonomous integrated navigation system possesses good autonomy, strong robustness, and high reliability, thus providing a new solution for autonomous navigation technology. PMID:29642549

  13. Open-Loop Performance of COBALT Precision Landing Payload on a Commercial Sub-Orbital Rocket

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina I.; Carson, John M., III; Amzajerdian, Farzin; Seubert, Carl R.; Lovelace, Ronney S.; McCarthy, Megan M.; Tse, Teming; Stelling, Richard; Collins, Steven M.

    2018-01-01

    An open-loop flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) platform was conducted onboard the Masten Xodiac suborbital rocket testbed. The COBALT platform integrates NASA Guidance, Navigation and Control (GN&C) sensing technologies for autonomous, precise soft landing, including the Navigation Doppler Lidar (NDL) velocity and range sensor and the Lander Vision System (LVS) Terrain Relative Navigation (TRN) system. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a navigation solution that is independent of GPS and suitable for future autonomous planetary landing systems. COBALT was a passive payload during the open-loop tests. COBALT's sensors were actively taking data and processing it in real time, but the Xodiac rocket flew with its own GPS-based navigation system as a risk reduction activity in the maturation of the technologies towards space flight. A future closed-loop test campaign is planned where the COBALT navigation solution will be used to fly its host vehicle.

  14. Precise laser gyroscope for autonomous inertial navigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuznetsov, A G; Molchanov, A V; Izmailov, E A

    2015-01-31

    Requirements for gyroscopes of strapdown inertial navigation systems for aircraft applications are formulated. The construction of a ring helium-neon laser designed for autonomous navigation is described. The processes that determine the laser service life and the relation between the random error of the angular velocity measurement and the surface relief features of the cavity mirrors are analysed. The results of modelling one of the promising approaches to processing the laser gyroscope signals are presented.

  15. Autonomous Navigation Using Celestial Objects

    NASA Technical Reports Server (NTRS)

    Folta, David; Gramling, Cheryl; Leung, Dominic; Belur, Sheela; Long, Anne

    1999-01-01

    In the twenty-first century, National Aeronautics and Space Administration (NASA) Enterprises envision frequent low-cost missions to explore the solar system, observe the universe, and study our planet. Satellite autonomy is a key technology required to reduce satellite operating costs. The Guidance, Navigation, and Control Center (GNCC) at the Goddard Space Flight Center (GSFC) currently sponsors several initiatives associated with the development of advanced spacecraft systems to provide autonomous navigation and control. Autonomous navigation has the potential both to increase spacecraft navigation system performance and to reduce total mission cost. By eliminating the need for routine ground-based orbit determination and special tracking services, autonomous navigation can streamline spacecraft ground systems. Autonomous navigation products can be included in the science telemetry and forwarded directly to the scientific investigators. In addition, autonomous navigation products are available onboard to enable other autonomous capabilities, such as attitude control, maneuver planning and orbit control, and communications signal acquisition. Autonomous navigation is required to support advanced mission concepts such as satellite formation flying. GNCC has successfully developed high-accuracy autonomous navigation systems for near-Earth spacecraft using NASA's space and ground communications systems and the Global Positioning System (GPS). Recently, GNCC has expanded its autonomous navigation initiative to include satellite orbits that are beyond the regime in which use of GPS is possible. Currently, GNCC is assessing the feasibility of using standard spacecraft attitude sensors and communication components to provide autonomous navigation for missions including: libration point, gravity assist, high-Earth, and interplanetary orbits. The concept being evaluated uses a combination of star, Sun, and Earth sensor measurements along with forward-link Doppler measurements from the command link carrier to autonomously estimate the spacecraft's orbit and reference oscillator's frequency. To support autonomous attitude determination and control and maneuver planning and control, the orbit determination accuracy should be on the order of kilometers in position and centimeters per second in velocity. A less accurate solution (one hundred kilometers in position) could be used for acquisition purposes for command and science downloads. This paper provides performance results for both libration point orbiting and high Earth orbiting satellites as a function of sensor measurement accuracy, measurement types, measurement frequency, initial state errors, and dynamic modeling errors.

  16. Lunar far side surface navigation using Linked Autonomous Interplanetary Satellite Orbit Navigation (LiAISON)

    NASA Astrophysics Data System (ADS)

    Hesar, Siamak G.; Parker, Jeffrey S.; Leonard, Jason M.; McGranaghan, Ryan M.; Born, George H.

    2015-12-01

    We study the application of Linked Autonomous Interplanetary Satellite Orbit Navigation (LiAISON) to track vehicles on the far side of the lunar surface. The LiAISON architecture is demonstrated to achieve accurate orbit determination solutions for various mission scenarios in the Earth-Moon system. Given the proper description of the force field, LiAISON is capable of producing absolute orbit determination solutions using relative satellite-to-satellite tracking observations alone. The lack of direct communication between Earth-based tracking stations and the far side of the Moon provides an ideal opportunity for implementing LiAISON. This paper presents a novel approach to use the LiAISON architecture to perform autonomous navigation of assets on the lunar far side surface. Relative measurements between a spacecraft placed in an EML-2 halo orbit and lunar surface asset(s) are simulated and processed. Comprehensive simulation results show that absolute states of the surface assets are observable with an achieved accuracy of the position estimate on the order of tens of meters.
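
    The core LiAISON observable is a scalar crosslink range (and range-rate) between the halo orbiter and the surface asset; because each vehicle experiences a distinct acceleration in the asymmetric Earth-Moon force field, these relative measurements alone make the absolute states observable. A minimal measurement model, with hypothetical names and an assumed constant transponder bias:

```python
import numpy as np

def crosslink_measurements(r_halo, v_halo, r_surf, v_surf, bias=0.0):
    """Range and range-rate between an EML-2 halo orbiter and a lunar surface asset.

    All vectors are expressed in a common (e.g., Moon-centered rotating) frame;
    `bias` models an uncalibrated transponder delay, in meters.
    """
    dr = r_halo - r_surf
    dv = v_halo - v_surf
    rho = np.linalg.norm(dr)
    rho_dot = np.dot(dr, dv) / rho
    return rho + bias, rho_dot
```

    In an estimator, the partials of the range with respect to the two absolute position vectors are plus and minus the unit relative-position vector, so both absolute states enter the measurement; the observability itself comes from the distinct dynamics each vehicle obeys in the three-body force field.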

  17. Guidance and Navigation Software Architecture Design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) Test Bed

    DTIC Science & Technology

    2006-12-01

    Engineer's degree thesis by Blake D. Eikenberry on the guidance and navigation software architecture design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) test bed. Approved for public release; distribution is unlimited.

  18. Development of a Commercially Viable, Modular Autonomous Robotic Systems for Converting any Vehicle to Autonomous Control

    NASA Technical Reports Server (NTRS)

    Parish, David W.; Grabbe, Robert D.; Marzwell, Neville I.

    1994-01-01

    A Modular Autonomous Robotic System (MARS), consisting of a modular autonomous vehicle control system that can be retrofitted onto any vehicle to convert it to autonomous control and that supports a modular payload for multiple applications, is being developed. The MARS design is scalable, reconfigurable, and cost effective due to the use of modern open system architecture design methodologies, including serial control bus technology to simplify system wiring and enhance scalability. The design is augmented with modular, object-oriented (C++) software implementing a hierarchy of five levels of control: teleoperated, continuous guidepath following, periodic guidepath following, absolute position autonomous navigation, and relative position autonomous navigation. The present effort is focused on producing a system that is commercially viable for routine autonomous patrolling of known, semistructured environments, such as environmental monitoring of chemical and petroleum refineries, exterior physical security and surveillance, perimeter patrolling, and intrafacility transport applications.

  19. Open-Loop Flight Testing of COBALT Navigation and Sensor Technologies for Precise Soft Landing

    NASA Technical Reports Server (NTRS)

    Carson, John M., III; Restrepo, Carolina I.; Seubert, Carl R.; Amzajerdian, Farzin; Pierrottet, Diego F.; Collins, Steven M.; O'Neal, Travis V.; Stelling, Richard

    2017-01-01

    An open-loop flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) payload was conducted onboard the Masten Xodiac suborbital rocket testbed. The payload integrates two complementary sensor technologies that together provide a spacecraft with knowledge during planetary descent and landing to precisely navigate and softly touchdown in close proximity to targeted surface locations. The two technologies are the Navigation Doppler Lidar (NDL), for high-precision velocity and range measurements, and the Lander Vision System (LVS) for map-relative state estimates. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a very precise Terrain Relative Navigation (TRN) solution that is suitable for future, autonomous planetary landing systems that require precise and soft landing capabilities. During the open-loop flight campaign, the COBALT payload acquired measurements and generated a precise navigation solution, but the Xodiac vehicle planned and executed its maneuvers based on an independent, GPS-based navigation solution. This minimized the risk to the vehicle during the integration and testing of the new navigation sensing technologies within the COBALT payload.

  20. Terrain Navigation Concepts for Autonomous Vehicles,

    DTIC Science & Technology

    1984-06-01

    Report by R. D. Leighty, U.S. Army Engineer Topographic Laboratories, Fort Belvoir, VA, June 1984. The report discusses the pacing problem for developing autonomous vehicles that can efficiently move to designated locations in the real world, and argues that a set of general autonomous functions can serve as terrain navigation requirements for a discussion of autonomous vehicles.

  1. Open-Loop Flight Testing of COBALT GN&C Technologies for Precise Soft Landing

    NASA Technical Reports Server (NTRS)

    Carson, John M., III; Amzajerdian, Farzin; Seubert, Carl R.; Restrepo, Carolina I.

    2017-01-01

    A terrestrial, open-loop (OL) flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) platform was conducted onboard the Masten Xodiac suborbital rocket testbed, with support through the NASA Advanced Exploration Systems (AES), Game Changing Development (GCD), and Flight Opportunities (FO) Programs. The COBALT platform integrates NASA Guidance, Navigation and Control (GN&C) sensing technologies for autonomous, precise soft landing, including the Navigation Doppler Lidar (NDL) velocity and range sensor and the Lander Vision System (LVS) Terrain Relative Navigation (TRN) system. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a precise navigation solution that is independent of the Global Positioning System (GPS) and suitable for future, autonomous planetary landing systems. The OL campaign tested COBALT as a passive payload, with COBALT data collection and filter execution, but with the Xodiac vehicle Guidance and Control (G&C) loops closed on a Masten GPS-based navigation solution. The OL test was performed as a risk reduction activity in preparation for an upcoming 2017 closed-loop (CL) flight campaign in which Xodiac G&C will act on the COBALT navigation solution and the GPS-based navigation will serve only as a backup monitor.

  2. ALHAT COBALT: CoOperative Blending of Autonomous Landing Technology

    NASA Technical Reports Server (NTRS)

    Carson, John M.

    2015-01-01

    The COBALT project is a flight demonstration of two NASA ALHAT (Autonomous precision Landing and Hazard Avoidance Technology) capabilities that are key for future robotic or human landing GN&C (Guidance, Navigation and Control) systems. The COBALT payload integrates the Navigation Doppler Lidar (NDL) for ultraprecise velocity and range measurements with the Lander Vision System (LVS) for Terrain Relative Navigation (TRN) position estimates. Terrestrial flight tests of the COBALT payload in an open-loop and closed-loop GN&C configuration will be conducted onboard a commercial, rocket-propulsive Vertical Test Bed (VTB) at a test range in Mojave, CA.

  3. Autonomous navigation - The ARMMS concept. [Autonomous Redundancy and Maintenance Management Subsystem]

    NASA Technical Reports Server (NTRS)

    Wood, L. J.; Jones, J. B.; Mease, K. D.; Kwok, J. H.; Goltz, G. L.; Kechichian, J. A.

    1984-01-01

    A conceptual design is outlined for the navigation subsystem of the Autonomous Redundancy and Maintenance Management Subsystem (ARMMS). The principal function of this navigation subsystem is to maintain the spacecraft over a specified equatorial longitude to within ±3 deg. In addition, the navigation subsystem must detect and correct internal faults. It comprises elements for a navigation executive and for orbit determination, trajectory, maneuver planning, and maneuver command. Each of these elements is described. The navigation subsystem is to be used in the DSCS III spacecraft.

  4. An Autonomous Navigation Algorithm for High Orbit Satellite Using Star Sensor and Ultraviolet Earth Sensor

    PubMed Central

    Baohua, Li; Wenjie, Lai; Yun, Chen; Zongming, Liu

    2013-01-01

    An autonomous navigation algorithm using a sensor that integrates a star sensor (FOV1) and an ultraviolet earth sensor (FOV2) is presented. The star images are sampled by FOV1, and the ultraviolet earth images are sampled by FOV2. The star identification and star tracking algorithms are executed on FOV1, and the optical axis direction of FOV1 in the J2000.0 coordinate system is calculated. The center vector of the earth in the FOV2 coordinate system is calculated from the coordinates of the ultraviolet earth image. The autonomous navigation data of the satellite are computed by the integrated sensor from the optical axis direction of FOV1 and the earth center vector from FOV2. The position accuracy of the autonomous navigation for the satellite is improved from 1000 meters to 300 meters, and the velocity accuracy is improved from 100 m/s to 20 m/s. At the same time, the periodic sinusoidal errors of the autonomous navigation are eliminated. The autonomous navigation for a satellite with a sensor that integrates an ultraviolet earth sensor and a star sensor is robust. PMID:24250261

  5. An autonomous navigation algorithm for high orbit satellite using star sensor and ultraviolet earth sensor.

    PubMed

    Baohua, Li; Wenjie, Lai; Yun, Chen; Zongming, Liu

    2013-01-01

    An autonomous navigation algorithm using a sensor that integrates a star sensor (FOV1) and an ultraviolet earth sensor (FOV2) is presented. The star images are sampled by FOV1, and the ultraviolet earth images are sampled by FOV2. The star identification and star tracking algorithms are executed on FOV1, and the optical axis direction of FOV1 in the J2000.0 coordinate system is calculated. The center vector of the earth in the FOV2 coordinate system is calculated from the coordinates of the ultraviolet earth image. The autonomous navigation data of the satellite are computed by the integrated sensor from the optical axis direction of FOV1 and the earth center vector from FOV2. The position accuracy of the autonomous navigation for the satellite is improved from 1000 meters to 300 meters, and the velocity accuracy is improved from 100 m/s to 20 m/s. At the same time, the periodic sinusoidal errors of the autonomous navigation are eliminated. The autonomous navigation for a satellite with a sensor that integrates an ultraviolet earth sensor and a star sensor is robust.
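
    The geometry behind this kind of star-plus-Earth-sensor fix can be illustrated with the classic Earth-disk relation: the star sensor supplies inertial attitude, the ultraviolet sensor supplies the direction to the Earth center and the apparent angular radius of the disk, and the range follows from dividing the Earth radius by the sine of that angular radius. The sketch below assumes a spherical Earth and no refraction and is not the authors' algorithm.

```python
import numpy as np

R_EARTH = 6_378_137.0   # m, equatorial radius

def position_from_earth_vector(u_earth_inertial, earth_angular_radius):
    """Spacecraft position from a star-sensor-derived attitude and a UV Earth sensor.

    u_earth_inertial     : unit vector from spacecraft to Earth center, rotated into
                           the inertial frame using the star-sensor attitude.
    earth_angular_radius : apparent angular radius of the UV Earth disk, rad.
    """
    rho = R_EARTH / np.sin(earth_angular_radius)   # distance to Earth center
    return -rho * u_earth_inertial                 # spacecraft position, Earth-centered inertial

# Example: Earth disk 30 deg across (angular radius 15 deg), seen straight "below".
print(position_from_earth_vector(np.array([0.0, 0.0, -1.0]), np.radians(15.0)))
# ~ [0, 0, 2.46e7] m, i.e. an orbit radius of roughly 24,600 km
```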

  6. Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) Project Status as of May 2010

    NASA Technical Reports Server (NTRS)

    Striepe, Scott A.; Epp, Chirold D.; Robertson, Edward A.

    2010-01-01

    This paper includes the current status of NASA's Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) Project. The ALHAT team has completed several flight tests and two major design analysis cycles. These tests and analyses examine terrain relative navigation sensors, hazard detection and avoidance sensors and algorithms, hazard relative navigation algorithms, and the guidance and navigation system that uses these ALHAT functions. The next flight test is scheduled for July 2010. The paper contains results from completed flight tests and analysis cycles. ALHAT system status and upcoming tests and analyses are also addressed, and the current ALHAT plans as of May 2010 are discussed. Application of the ALHAT system to landing on bodies other than the Moon is included.

  7. Real-Time Hazard Detection and Avoidance Demonstration for a Planetary Lander

    NASA Technical Reports Server (NTRS)

    Epp, Chirold D.; Robertson, Edward A.; Carson, John M., III

    2014-01-01

    The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project is chartered to develop and mature to a Technology Readiness Level (TRL) of six an autonomous system combining guidance, navigation and control with terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. In addition to precision landing close to a pre-mission defined landing location, the ALHAT System must be capable of autonomously identifying and avoiding surface hazards in real-time to enable a safe landing under any lighting conditions. This paper provides an overview of the recent results of the ALHAT closed loop hazard detection and avoidance flight demonstrations on the Morpheus Vertical Testbed (VTB) at the Kennedy Space Center, including results and lessons learned. This effort is also described in the context of a technology path in support of future crewed and robotic planetary exploration missions based upon the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN).

  8. Autonomous RPRV Navigation, Guidance and Control

    NASA Technical Reports Server (NTRS)

    Johnston, Donald E.; Myers, Thomas T.; Zellner, John W.

    1983-01-01

    Dryden Flight Research Center has the responsibility for flight testing of advanced remotely piloted research vehicles (RPRVs) to explore highly maneuverable aircraft technology and to test advanced structural concepts and related aeronautical technologies that can yield important research results with significant cost benefits. The primary purpose is to provide the preliminary design of an upgraded automatic approach and landing control system and flight director display to improve landing performance and reduce pilot workload. A secondary purpose is to determine the feasibility of an onboard autonomous navigation, orbit, and landing capability for safe vehicle recovery in the event of loss of telemetry uplink communication with the vehicles. The current RPRV approach and landing method, the proposed automatic and manual approach and autoland system, and an autonomous navigation, orbit, and landing system concept based on existing operational technology are described.

  9. Relative Navigation of Formation-Flying Satellites

    NASA Technical Reports Server (NTRS)

    Long, Anne; Kelbel, David; Lee, Taesul; Leung, Dominic; Carpenter, J. Russell; Gramling, Cheryl

    2002-01-01

    This paper compares autonomous relative navigation performance for formations in eccentric, medium and high-altitude Earth orbits using Global Positioning System (GPS) Standard Positioning Service (SPS), crosslink, and celestial object measurements. For close formations, the relative navigation accuracy is highly dependent on the magnitude of the uncorrelated measurement errors. A relative navigation position accuracy of better than 10 centimeters root-mean-square (RMS) can be achieved for medium-altitude formations that can continuously track at least one GPS signal. A relative navigation position accuracy of better than 15 meters RMS can be achieved for high-altitude formations that have sparse tracking of the GPS signals. The addition of crosslink measurements can significantly improve relative navigation accuracy for formations that use sparse GPS tracking or celestial object measurements for absolute navigation.

  10. Developing Autonomous Vehicles That Learn to Navigate by Mimicking Human Behavior

    DTIC Science & Technology

    2006-09-28

    Final report (September 28, 2006) by Dean B. Edwards on developing autonomous vehicles that learn to navigate by mimicking human behavior. The long-term goal is to use LAGR (Learning Applied to Ground Robots) platforms with fuzzy-logic learning behaviors so that autonomous vehicles can navigate in an unstructured environment to a specific target or location.

  11. Autonomous formation flying based on GPS — PRISMA flight results

    NASA Astrophysics Data System (ADS)

    D'Amico, Simone; Ardaens, Jean-Sebastien; De Florio, Sergio

    2013-01-01

    This paper presents flight results from the early harvest of the Spaceborne Autonomous Formation Flying Experiment (SAFE) conducted in the frame of the Swedish PRISMA technology demonstration mission. SAFE represents one of the first demonstrations in low Earth orbit of an advanced guidance, navigation and control system for dual-spacecraft formations. Innovative techniques based on differential GPS-based navigation and relative orbital elements control are validated and tuned in orbit to fulfill the typical requirements of future distributed scientific instruments for remote sensing.

  12. Autonomous Deep-Space Optical Navigation Project

    NASA Technical Reports Server (NTRS)

    D'Souza, Christopher

    2014-01-01

    This project will advance the Autonomous Deep-space navigation capability applied to the Autonomous Rendezvous and Docking (AR&D) Guidance, Navigation and Control (GNC) system by testing it on hardware, particularly in a flight processor, with a goal of limited testing in the Integrated Power, Avionics and Software (IPAS) with the ARCM (Asteroid Retrieval Crewed Mission) DRO (Distant Retrograde Orbit) Autonomous Rendezvous and Docking (AR&D) scenario. The technology to be harnessed is called 'optical flow', also known as 'visual odometry'. It is being matured in automotive and SLAM (Simultaneous Localization and Mapping) applications but has yet to be applied to spacecraft navigation. In light of the tremendous potential of this technique, we believe that NASA needs to design an optical navigation architecture that will use it. The technique is flexible enough to be applicable to navigating around planetary bodies, such as asteroids.

  13. A System for Fast Navigation of Autonomous Vehicles

    DTIC Science & Technology

    1991-09-01

    Report by Sanjiv Singh, Dai Feng, Paul Keller, Gary Shaffer, Wen Fan Shi, Dong Hun Shin, J. West, and others describing a system for fast navigation of autonomous vehicles. The report notes that it is common in the control of autonomous vehicles to establish the necessary kinematic models but to ignore an explicit representation of the vehicle dynamics.

  14. Development of Navigation Doppler Lidar for Future Landing Mission

    NASA Technical Reports Server (NTRS)

    Amzajerdian, Farzin; Hines, Glenn D.; Petway, Larry B.; Barnes, Bruce W.; Pierrottet, Diego F.; Carson, John M., III

    2016-01-01

    A coherent Navigation Doppler Lidar (NDL) sensor has been developed under the Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) project to support future NASA missions to planetary bodies. This lidar sensor provides accurate surface-relative altitude and vector velocity data during the descent phase that can be used by an autonomous Guidance, Navigation, and Control (GN&C) system to precisely navigate the vehicle from a few kilometers above the ground to a designated location and execute a controlled soft touchdown. The operation and performance of the NDL was demonstrated through closed-loop flights onboard the rocket-propelled Morpheus vehicle in 2014. In Morpheus flights, conducted at the NASA Kennedy Space Center, the NDL data was used by an autonomous GN&C system to navigate and land the vehicle precisely at the selected location surrounded by hazardous rocks and craters. Since then, development efforts for the NDL have shifted toward enhancing performance, optimizing design, and addressing spaceflight size and mass constraints and environmental and reliability requirements. The next generation NDL, with expanded operational envelope and significantly reduced size, will be demonstrated in 2017 through a new flight test campaign onboard a commercial rocket-propelled test vehicle.

  15. Flight Testing ALHAT Precision Landing Technologies Integrated Onboard the Morpheus Rocket Vehicle

    NASA Technical Reports Server (NTRS)

    Carson, John M. III; Robertson, Edward A.; Trawny, Nikolas; Amzajerdian, Farzin

    2015-01-01

    A suite of prototype sensors, software, and avionics developed within the NASA Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) project were terrestrially demonstrated onboard the NASA Morpheus rocket-propelled Vertical Testbed (VTB) in 2014. The sensors included a LIDAR-based Hazard Detection System (HDS), a Navigation Doppler LIDAR (NDL) velocimeter, and a long-range Laser Altimeter (LAlt) that enable autonomous and safe precision landing of robotic or human vehicles on solid solar system bodies under varying terrain lighting conditions. The flight test campaign with the Morpheus vehicle involved a detailed integration and functional verification process, followed by tether testing and six successful free flights, including one night flight. The ALHAT sensor measurements were integrated into a common navigation solution through a specialized ALHAT Navigation filter that was employed in closed-loop flight testing within the Morpheus Guidance, Navigation and Control (GN&C) subsystem. Flight testing on Morpheus utilized ALHAT for safe landing site identification and ranking, followed by precise surface-relative navigation to the selected landing site. The successful autonomous, closed-loop flight demonstrations of the prototype ALHAT system have laid the foundation for the infusion of safe, precision landing capabilities into future planetary exploration missions.

  16. Autonomous Navigation Improvements for High-Earth Orbiters Using GPS

    NASA Technical Reports Server (NTRS)

    Long, Anne; Kelbel, David; Lee, Taesul; Garrison, James; Carpenter, J. Russell; Bauer, F. (Technical Monitor)

    2000-01-01

    The Goddard Space Flight Center is currently developing autonomous navigation systems for satellites in high-Earth orbits where acquisition of the GPS signals is severely limited. This paper discusses autonomous navigation improvements for high-Earth orbiters and assesses projected navigation performance for these satellites using Global Positioning System (GPS) Standard Positioning Service (SPS) measurements. Navigation performance is evaluated as a function of signal acquisition threshold, measurement errors, and dynamic modeling errors using realistic GPS signal strength and user antenna models. These analyses indicate that an autonomous navigation position accuracy of better than 30 meters root-mean-square (RMS) can be achieved for high-Earth orbiting satellites using a GPS receiver with a very stable oscillator. This accuracy improves to better than 15 meters RMS if the GPS receiver's signal acquisition threshold can be reduced by 5 dB-Hertz to track weaker signals.

  17. Experiments in autonomous robotics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamel, W.R.

    1987-01-01

    The Center for Engineering Systems Advanced Research (CESAR) is performing basic research in autonomous robotics for energy-related applications in hazardous environments. The CESAR research agenda includes a strong experimental component to assure practical evaluation of new concepts and theories. An evolutionary sequence of mobile research robots has been planned to support research in robot navigation, world sensing, and object manipulation. A number of experiments have been performed in studying robot navigation and path planning with planar sonar sensing. Future experiments will address more complex tasks involving three-dimensional sensing, dexterous manipulation, and human-scale operations.

  18. On-Orbit Autonomous Assembly from Nanosatellites

    NASA Technical Reports Server (NTRS)

    Murchison, Luke S.; Martinez, Andres; Petro, Andrew

    2015-01-01

    The On-Orbit Autonomous Assembly from Nanosatellites (OAAN) project will demonstrate autonomous control algorithms for rendezvous and docking maneuvers; low-power reconfigurable magnetic docking technology; and compact, lightweight, and inexpensive precision relative navigation using carrier-phase differential GPS (CDGPS), with a three-degree-of-freedom ground demonstration. CDGPS is a relative position determination method that measures the phase of the GPS carrier wave to yield relative position data accurate to 0.4 inch (1 centimeter). CDGPS is a technology commonly found in the surveying industry. The development and demonstration of these technologies will fill a current gap in the availability of proven autonomous rendezvous and docking systems for small satellites.
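
    CDGPS relative positioning rests on differencing carrier-phase observations between the two receivers and between GPS satellites, which cancels receiver and satellite clock errors and leaves a short-baseline geometric term plus integer ambiguities. The sketch below forms the double differences only; ambiguity fixing (e.g., with a LAMBDA-style search) and the baseline solution are omitted, and all names and numbers are illustrative.

```python
L1_WAVELENGTH = 0.1903   # m, approximate GPS L1 carrier wavelength

def double_difference(phase_a, phase_b, ref_prn):
    """Double-differenced carrier phase (cycles) between receivers a and b.

    phase_a, phase_b : dict mapping GPS PRN -> carrier phase in cycles.
    ref_prn          : PRN of the reference satellite.
    """
    common = sorted((set(phase_a) & set(phase_b)) - {ref_prn})
    sd_ref = phase_a[ref_prn] - phase_b[ref_prn]          # single difference, reference SV
    return {prn: (phase_a[prn] - phase_b[prn]) - sd_ref for prn in common}

# To first order each double difference (scaled by L1_WAVELENGTH) is the projection
# of the baseline vector onto the difference of the two satellite line-of-sight unit
# vectors, plus an integer number of wavelengths that the ambiguity-resolution step
# must fix before centimeter-level relative positioning is reached.
phase_a = {5: 120000.25, 12: 131072.50, 25: 140000.75}
phase_b = {5: 120010.00, 12: 131080.00, 25: 140011.00}
print(double_difference(phase_a, phase_b, ref_prn=5))   # {12: 2.25, 25: -0.5}
```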

  19. An Inertial Dual-State State Estimator for Precision Planetary Landing with Hazard Detection and Avoidance

    NASA Technical Reports Server (NTRS)

    Bishop, Robert H.; DeMars, Kyle; Trawny, Nikolas; Crain, Tim; Hanak, Chad; Carson, John M.; Christian, John

    2016-01-01

    The navigation filter architecture successfully deployed on the Morpheus flight vehicle is presented. The filter was developed as a key element of the NASA Autonomous Landing and Hazard Avoidance Technology (ALHAT) project and over the course of 15 free flights was integrated into the Morpheus vehicle, operations, and flight control loop. Flight testing was completed by demonstrating autonomous hazard detection and avoidance; integration of altimeter, surface relative velocity (velocimeter), and hazard relative navigation (HRN) measurements into the onboard dual-state inertial estimator Kalman filter software; and landing within 2 meters of the vertical testbed GPS-based navigation solution at the safe landing site target. Morpheus followed a trajectory that included an ascent phase followed by a partial descent-to-landing, although the proposed filter architecture is applicable to more general planetary precision entry, descent, and landings. The main new contribution is the incorporation of a sophisticated hazard relative navigation sensor, originally intended to locate safe landing sites, into the navigation system and its employment as a navigation sensor. The formulation of a dual-state inertial extended Kalman filter was designed to address the precision planetary landing problem when viewed as a rendezvous problem with an intended landing site. For the required precision navigation system that is capable of navigating along a descent-to-landing trajectory to a precise landing, the impact of attitude errors on the translational state estimation is included in a fully integrated navigation structure in which translation state estimation is combined with attitude state estimation. The map tie errors are estimated as part of the process, thereby creating a dual-state filter implementation. Also, the filter is implemented using inertial states rather than states relative to the target. External measurements include an altimeter, a velocimeter, a star camera, a terrain relative navigation sensor, and a hazard relative navigation sensor providing information regarding hazards on a map generated on the fly.

  20. Compact autonomous navigation system (CANS)

    NASA Astrophysics Data System (ADS)

    Hao, Y. C.; Ying, L.; Xiong, K.; Cheng, H. Y.; Qiao, G. D.

    2017-11-01

    Autonomous navigation of satellites and constellations has a series of benefits, such as reducing operation cost and ground station workload, avoiding dependence on ground stations during crises such as war and natural disaster, increasing spacecraft autonomy, and so on. An autonomous navigation satellite is independent of ground station support. Many systems have been developed for autonomous navigation of satellites in the past 20 years. Among them, the American MANS (Microcosm Autonomous Navigation System) [1] of Microcosm Inc. and ERADS [2] [3] (Earth Reference Attitude Determination System) of Honeywell Inc. are well known. These systems anticipate a series of good features of autonomous navigation and aim at low cost, integrated structure, low power consumption, and compact layout. The ERADS is an integrated, small, 3-axis attitude sensor system with low cost and small volume. It achieves higher Earth-center measurement accuracy than a common IR sensor because the detected ultraviolet radiation zone of the atmosphere has a brightness gradient larger than that of the IR zone. But the ERADS is still a complex system because it has to overcome many problems, such as fabrication of the sapphire sphere lens, the birefringence of sapphire, the high-precision image-transfer optical fiber flattener, ultraviolet intensifier noise, and so on. The marginal spherical field of view (FOV) of the ERADS sphere lens is used for star imaging, which may bring some disadvantages: the image energy and attitude measurement accuracy may be reduced due to the tilted image acceptance end of the fiber flattener in the FOV. In addition, Japan, Germany, and Russia have developed visible Earth sensors for GEO [4] [5]. Is there a way to develop a cheaper, simpler, and more accurate autonomous navigation system that can be used on all LEO spacecraft, especially LEO small and micro satellites? To address this problem we present a new type of system, the CANS (Compact Autonomous Navigation System) [6].

  1. Robust Planning for Autonomous Navigation of Mobile Robots in Unstructured, Dynamic Environments: An LDRD Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    EISLER, G. RICHARD

    This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled ''Robust Planning for Autonomous Navigation of Mobile Robots In Unstructured, Dynamic Environments (AutoNav)''. The project goal was to develop an algorithmic-driven, multi-spectral approach to point-to-point navigation characterized by: segmented on-board trajectory planning, self-contained operation without human support for mission duration, and the development of appropriate sensors and algorithms to navigate unattended. The project was partially successful in achieving gains in sensing, path planning, navigation, and guidance. One of three experimental platforms, the Minimalist Autonomous Testbed, used a repetitive sense-and-re-plan combination to demonstrate the majority of elements necessary for autonomous navigation. However, a critical goal for overall success in arbitrary terrain, that of developing a sensor that is able to distinguish true obstacles that need to be avoided as a function of vehicle scale, still needs substantial research to bring to fruition.

  2. Autonomous assistance navigation for robotic wheelchairs in confined spaces.

    PubMed

    Cheein, Fernando Auat; Carelli, Ricardo; De la Cruz, Celso; Muller, Sandra; Bastos Filho, Teodiano F

    2010-01-01

    In this work, a visual interface for assisting the navigation of a robotic wheelchair is presented. The visual interface is developed for navigation in confined spaces such as narrow corridors or corridor ends. The interface provides two navigation modes: non-autonomous and autonomous. Non-autonomous driving of the robotic wheelchair is performed by means of a hand joystick, which directs the motion of the vehicle within the environment. Autonomous driving is performed when the user of the wheelchair has to turn (90, 90 or 180 degrees) within the environment. The turning strategy is performed by a maneuverability algorithm compatible with the kinematics of the wheelchair and by the SLAM (Simultaneous Localization and Mapping) algorithm. The SLAM algorithm provides the interface with information concerning the environment layout and the pose (position and orientation) of the wheelchair within the environment. Experimental and statistical results for the interface are also shown in this work.

  3. Fuzzy Behavior Modulation with Threshold Activation for Autonomous Vehicle Navigation

    NASA Technical Reports Server (NTRS)

    Tunstel, Edward

    2000-01-01

    This paper describes fuzzy logic techniques used in a hierarchical behavior-based architecture for robot navigation. An architectural feature for threshold activation of fuzzy-behaviors is emphasized, which is potentially useful for tuning navigation performance in real world applications. The target application is autonomous local navigation of a small planetary rover. Threshold activation of low-level navigation behaviors is the primary focus. A preliminary assessment of its impact on local navigation performance is provided based on computer simulations.
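
    Threshold activation as described here can be pictured as gating each fuzzy behavior's applicability degree before the weighted blend of its command recommendations: behaviors below the threshold contribute nothing, which keeps weakly applicable behaviors from diluting the command. The toy sketch below uses made-up applicability values and a made-up threshold and is not the paper's rule base.

```python
def blend_behaviors(behaviors, threshold=0.3):
    """Weighted fusion of fuzzy behavior recommendations with threshold activation.

    behaviors: list of (applicability in [0, 1], steering recommendation in rad).
    A behavior contributes only if its applicability exceeds the threshold.
    """
    active = [(w, cmd) for w, cmd in behaviors if w >= threshold]
    if not active:
        return 0.0                       # no behavior active: keep current heading
    total = sum(w for w, _ in active)
    return sum(w * cmd for w, cmd in active) / total

# Example: obstacle avoidance fires strongly, goal seeking weakly, wall following gated out.
print(blend_behaviors([(0.8, 0.5), (0.4, -0.2), (0.1, 0.3)]))   # ~0.27 rad
```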

  4. Autonomous navigation system based on GPS and magnetometer data

    NASA Technical Reports Server (NTRS)

    Thienel, Julie K. (Inventor); Harman, Richard R. (Inventor); Bar-Itzhack, Itzhack Y. (Inventor)

    2004-01-01

    This invention is drawn to an autonomous navigation system using Global Positioning System (GPS) and magnetometers for low Earth orbit satellites. Because a magnetometer is reliable and always provides information on spacecraft attitude, rate, and orbit, the magnetometer-GPS configuration solves the GPS initialization problem, decreasing the convergence time for the navigation estimate and improving the overall accuracy. Eventually the magnetometer-GPS configuration enables the system to avoid a costly and inherently less reliable gyro for rate estimation. Being autonomous, this invention would provide for black-box spacecraft navigation, producing attitude, orbit, and rate estimates without any ground input, with high accuracy and reliability.

  5. Navigation Aiding by a Hybrid Laser-Camera Motion Estimator for Micro Aerial Vehicles.

    PubMed

    Atman, Jamal; Popp, Manuel; Ruppelt, Jan; Trommer, Gert F

    2016-09-16

    Micro Air Vehicles (MAVs) equipped with various sensors are able to carry out autonomous flights. However, the self-localization of autonomous agents is mostly dependent on Global Navigation Satellite Systems (GNSS). In order to provide an accurate navigation solution in the absence of GNSS signals, this article presents a hybrid sensor. The hybrid sensor is a deep integration of a monocular camera and a 2D laser rangefinder so that the motion of the MAV is estimated. This realization is expected to be more flexible in terms of environments compared to laser-scan-matching approaches. The estimated ego-motion is then integrated in the MAV's navigation system. However, first, the knowledge about the pose between both sensors is obtained by proposing an improved calibration method. For both calibration and ego-motion estimation, 3D-to-2D correspondences are used and the Perspective-3-Point (P3P) problem is solved. Moreover, the covariance estimation of the relative motion is presented. The experiments show very accurate calibration and navigation results.
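
    The hybrid sensor's central step, recovering camera motion from 3D-to-2D correspondences by solving the Perspective-3-Point problem, maps onto standard PnP machinery. The sketch below synthesizes consistent data for a known test motion and recovers it with OpenCV's P3P solver; the intrinsics, points, and motion are all made up, and this is not the authors' calibration or estimator.

```python
import numpy as np
import cv2

# Hypothetical data: four 3D points from the laser rangefinder (camera frame at the
# previous epoch) and a known test motion used to synthesize their current image.
object_points = np.array([[0.5, 0.2, 3.0],
                          [-0.4, 0.1, 2.8],
                          [0.3, -0.3, 3.5],
                          [-0.2, -0.2, 3.2]], dtype=np.float64)
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])                  # assumed pinhole intrinsics
dist = np.zeros(5)                               # assume no lens distortion
rvec_true = np.array([0.02, -0.01, 0.03])        # small rotation, rad
tvec_true = np.array([0.10, -0.05, 0.20])        # small translation, m

image_points, _ = cv2.projectPoints(object_points, rvec_true, tvec_true, K, dist)

# SOLVEPNP_P3P solves from 3 points and uses the 4th to pick the right solution.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist,
                              flags=cv2.SOLVEPNP_P3P)
print(ok, rvec.ravel(), tvec.ravel())            # recovers the test motion
```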

  6. Design, Development and Testing of the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam) Guidance, Navigation and Control System

    NASA Technical Reports Server (NTRS)

    Wagenknecht, J.; Fredrickson, S.; Manning, T.; Jones, B.

    2003-01-01

    Engineers at NASA Johnson Space Center have designed, developed, and tested a nanosatellite-class free-flyer intended for future external inspection and remote viewing of human spaceflight activities. The technology demonstration system, known as the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam), has been integrated into the approximate form and function of a flight system. The primary focus has been to develop a system capable of providing external views of the International Space Station. The Mini AERCam system is spherical-shaped and less than eight inches in diameter. It has a full suite of guidance, navigation, and control hardware and software, and is equipped with two digital video cameras and a high resolution still image camera. The vehicle is designed for either remotely piloted operations or supervised autonomous operations. Tests have been performed in both a six degree-of-freedom closed-loop orbital simulation and on an air-bearing table. The Mini AERCam system can also be used as a test platform for evaluating algorithms and relative navigation for autonomous proximity operations and docking around the Space Shuttle Orbiter or the ISS.

  7. Relative navigation and attitude determination using a GPS/INS integrated system near the International Space Station

    NASA Astrophysics Data System (ADS)

    Um, Jaeyong

    2001-08-01

    The Space Integrated GPS/INS (SIGI) sensor is the primary navigation and attitude determination source for the International Space Station (ISS). The SIGI was successfully demonstrated on-orbit for the first time in the SIGI Orbital Attitude Readiness (SOAR) demonstration on the Space Shuttle Atlantis in May 2000. Numerous proximity operations near the ISS have been and will be performed over the lifetime of the Station. The development of an autonomous relative navigation system is needed to improve the safety and efficiency of vehicle operations near the ISS. A hardware simulation study was performed for GPS-based relative navigation using the state vector difference approach and the interferometric approach in the absence of multipath. The interferometric approach, where the relative states are estimated directly, showed comparable results for a 1 km baseline. One of the most pressing current technical issues is the design of an autonomous relative navigation system in the proximity of the ISS, where GPS signals are blocked and maneuvers happen frequently. An integrated GPS/INS system is investigated for the possibility of a fully autonomous relative navigation system. Another application of GPS measurements is determination of the vehicle's orientation in space. This study used the SOAR experiment data to characterize the SIGI's on-orbit performance for attitude determination. A cold start initialization algorithm was developed for integer ambiguity resolution in any initial orientation. The original integer ambiguity resolution algorithm used in the SIGI was developed for terrestrial applications, which limited its effectiveness in space. The new algorithm was tested using the SOAR data and has been incorporated in the current SIGI flight software. The attitude estimation performance was examined using two different GPS/INS integration algorithms. The GPS/INS attitude solution using the SOAR data was as accurate as 0.06 deg (RMS) in 3-axis with multipath mitigation. Other improvements to the attitude determination algorithm were the development of a faster integer ambiguity resolution method and the incorporation of line bias modeling.
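
    As a rough illustration of the state-vector-difference approach discussed above (as opposed to estimating the relative state directly), the sketch below differences two independently filtered absolute solutions and, under the simplifying assumption of uncorrelated errors, sums their covariances; all numbers are placeholders, not results from the study.

      import numpy as np

      # Independently estimated absolute states (position in m, velocity in m/s) and covariances.
      x_chaser = np.array([6.78e6, 1.2e3, -4.5e2, 10.0, 7500.0, 0.5])
      P_chaser = np.diag([25.0, 25.0, 25.0, 0.01, 0.01, 0.01])
      x_target = np.array([6.78e6, 2.1e3, -3.9e2,  9.0, 7500.5, 0.4])
      P_target = np.diag([25.0, 25.0, 25.0, 0.01, 0.01, 0.01])

      # Relative state by differencing the two filtered solutions.
      x_rel = x_chaser - x_target

      # If the filter errors were uncorrelated, the relative covariance would be the sum;
      # in practice common GPS errors correlate the two solutions, which is why restricting
      # both filters to measurements from common GPS satellites improves the difference.
      P_rel = P_chaser + P_target
      print("relative position (m):", x_rel[:3], " 1-sigma (m):", np.sqrt(np.diag(P_rel))[:3])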

  8. Autonomous Collision-Free Navigation of Microvehicles in Complex and Dynamically Changing Environments.

    PubMed

    Li, Tianlong; Chang, Xiaocong; Wu, Zhiguang; Li, Jinxing; Shao, Guangbin; Deng, Xinghong; Qiu, Jianbin; Guo, Bin; Zhang, Guangyu; He, Qiang; Li, Longqiu; Wang, Joseph

    2017-09-26

    Self-propelled micro- and nanoscale robots represent a rapidly emerging and fascinating robotics research area. However, designing autonomous and adaptive control systems for operating micro/nanorobots in complex and dynamically changing environments, which is a highly demanding feature, is still an unmet challenge. Here we describe a smart microvehicle for precise autonomous navigation in complicated environments and traffic scenarios. The fully autonomous navigation system of the smart microvehicle is composed of a microscope-coupled CCD camera, an artificial intelligence planner, and a magnetic field generator. The microscope-coupled CCD camera provides real-time localization of the chemically powered Janus microsphere vehicle and environmental detection for path planning to generate optimal collision-free routes, while the moving direction of the microrobot toward a reference position is determined by the external electromagnetic torque. Real-time object detection offers adaptive path planning in response to dynamically changing environments. We demonstrate that the autonomous navigation system can guide the vehicle movement in complex patterns, in the presence of dynamically changing obstacles, and in complex biological environments. Such a navigation system for micro/nanoscale vehicles, relying on vision-based closed-loop control and path planning, is highly promising for autonomous operation in the complex, dynamic, and unpredictable settings expected in a variety of realistic nanoscale applications.
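
    Purely as an illustration of generating collision-free routes on a gridded workspace, the sketch below runs a textbook A* search; it is a stand-in for, not a description of, the artificial-intelligence planner used in the paper, and the grid and costs are invented.

      import heapq

      def astar(grid, start, goal):
          """A* on a 4-connected occupancy grid: grid[r][c] == 1 marks an obstacle.
             Returns a list of cells from start to goal, or None if no route exists."""
          rows, cols = len(grid), len(grid[0])
          h = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])   # Manhattan heuristic
          open_set = [(h(start, goal), 0, start, None)]
          came_from, g_best = {}, {start: 0}
          while open_set:
              _, g, cell, parent = heapq.heappop(open_set)
              if cell in came_from:
                  continue
              came_from[cell] = parent
              if cell == goal:
                  path = [cell]
                  while came_from[path[-1]] is not None:
                      path.append(came_from[path[-1]])
                  return path[::-1]
              for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  nxt = (cell[0] + dr, cell[1] + dc)
                  if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                      if g + 1 < g_best.get(nxt, float("inf")):
                          g_best[nxt] = g + 1
                          heapq.heappush(open_set, (g + 1 + h(nxt, goal), g + 1, nxt, cell))
          return None

      workspace = [[0, 0, 0, 0],
                   [1, 1, 0, 1],
                   [0, 0, 0, 0],
                   [0, 1, 1, 0]]
      print(astar(workspace, (0, 0), (3, 3)))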

  9. Integrated polarization-dependent sensor for autonomous navigation

    NASA Astrophysics Data System (ADS)

    Liu, Ze; Zhang, Ran; Wang, Zhiwen; Guan, Le; Li, Bin; Chu, Jinkui

    2015-01-01

    Based on the navigation strategy of insects that exploit the polarized skylight, an integrated polarization-dependent sensor for autonomous navigation is presented. The navigation sensor features a compact structure, high precision, strong robustness, and a simple manufacturing technique. The sensor is constructed by integrating a complementary metal-oxide-semiconductor (CMOS) sensor with a multiorientation nanowire-grid polarizer. By nanoimprint lithography, the multiorientation nanowire polarizer is fabricated in one step and the alignment error is eliminated. Statistical theory is added to the interval-division algorithm to calculate the polarization angle of the incident light. Laboratory and outdoor tests of the navigation sensor were carried out, and the errors of the measured angle are ±0.02 deg and ±1.3 deg, respectively. The results show that the proposed sensor has potential for application in autonomous navigation.
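
    The underlying angle-of-polarization computation in sensors of this kind can be illustrated with the standard three-analyzer Stokes-parameter relation (intensities behind linear polarizers at 0, 45 and 90 degrees); this is a generic textbook formula, not the interval-division algorithm of the paper, and the inputs are synthetic.

      import numpy as np

      def angle_of_polarization(i0, i45, i90):
          """Angle of polarization from intensities behind analyzers at 0, 45 and 90 degrees,
             via the linear Stokes parameters S1 = I0 - I90 and S2 = 2*I45 - I0 - I90."""
          s1 = i0 - i90
          s2 = 2.0 * i45 - i0 - i90
          return 0.5 * np.degrees(np.arctan2(s2, s1))

      # Synthetic check: partially polarized light with a 30-degree polarization angle.
      aop_true, dolp, itot = 30.0, 0.6, 1.0
      def intensity(theta_deg):
          return 0.5 * itot * (1.0 + dolp * np.cos(2.0 * np.radians(theta_deg - aop_true)))

      print(angle_of_polarization(intensity(0), intensity(45), intensity(90)))   # ~30.0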

  10. New vision system and navigation algorithm for an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Tann, Hokchhay; Shakya, Bicky; Merchen, Alex C.; Williams, Benjamin C.; Khanal, Abhishek; Zhao, Jiajia; Ahlgren, David J.

    2013-12-01

    Improvements were made to the intelligence algorithms of an autonomously operating ground vehicle, Q, which competed in the 2013 Intelligent Ground Vehicle Competition (IGVC). The IGVC required the vehicle to first navigate between two white lines on a grassy obstacle course, then pass through eight GPS waypoints, and finally traverse an obstacle field. Modifications to Q included a new vision system with a more effective image-processing algorithm for white-line extraction. The path-planning algorithm was adapted to the new vision system, producing smoother, more reliable navigation. With these improvements, Q successfully completed the basic autonomous navigation challenge, finishing tenth out of more than 50 teams.

  11. Autonomous celestial navigation based on Earth ultraviolet radiance and fast gradient statistic feature extraction

    NASA Astrophysics Data System (ADS)

    Lu, Shan; Zhang, Hanmo

    2016-01-01

    To meet the requirement of autonomous orbit determination, this paper proposes a fast curve-fitting method based on Earth ultraviolet features that yields an accurate Earth vector direction for high-precision autonomous navigation. First, drawing on the stable characteristics of Earth ultraviolet radiance and using atmospheric radiative transfer modeling software, the paper simulates the Earth ultraviolet radiation model at different times and selects a suitable observation band. Then a fast, improved edge-extraction method combining the Sobel operator and local binary patterns (LBP) is applied, which both suppresses noise efficiently and extracts the Earth ultraviolet limb features accurately. The Earth centroid location in the simulated images is then estimated via least-squares fitting using part of the limb edges. Finally, taking advantage of the estimated Earth vector direction and Earth distance, an Extended Kalman Filter (EKF) is applied to realize autonomous navigation. Experimental results indicate that the proposed method achieves sub-pixel Earth centroid location estimation and substantially enhances the precision of autonomous celestial navigation.
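
    The limb-fitting step described above can be illustrated with an algebraic least-squares circle fit to extracted limb-edge pixels; this is only a sketch of the general idea with synthetic data, not the authors' specific fitting procedure.

      import numpy as np

      def fit_circle(x, y):
          """Algebraic least-squares circle fit: minimizes residuals of
             x^2 + y^2 = 2*a*x + 2*b*y + c, giving center (a, b) and radius."""
          A = np.column_stack([2.0 * x, 2.0 * y, np.ones_like(x)])
          rhs = x**2 + y**2
          (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
          return a, b, np.sqrt(c + a**2 + b**2)

      # Synthetic partial limb arc (only part of the edge is used, as in the paper).
      theta = np.linspace(0.2, 1.4, 200)              # a partial arc, in radians
      cx, cy, R = 512.0, 480.0, 300.0                 # "true" centroid and radius in pixels
      x = cx + R * np.cos(theta) + np.random.normal(0, 0.3, theta.size)
      y = cy + R * np.sin(theta) + np.random.normal(0, 0.3, theta.size)

      a, b, r = fit_circle(x, y)
      print("estimated Earth-centroid pixel:", (a, b), " radius:", r)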

  12. Free-Flight Terrestrial Rocket Lander Demonstration for NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) System

    NASA Technical Reports Server (NTRS)

    Rutishauser, David K.; Epp, Chirold; Robertson, Ed

    2012-01-01

    The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project is chartered to develop, and mature to a Technology Readiness Level (TRL) of six, an autonomous system combining guidance, navigation and control with terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. The ALHAT System must be capable of identifying and avoiding surface hazards to enable a safe and accurate landing to within tens of meters of designated and certified landing sites anywhere on a planetary surface under any lighting conditions. Since its inception in 2006, the ALHAT Project has executed four field test campaigns to characterize and mature sensors and algorithms that support real-time hazard detection and global/local precision navigation for planetary landings. The driving objective for Government Fiscal Year 2012 (GFY2012) is to successfully demonstrate autonomous, real-time, closed-loop operation of the ALHAT system in a realistic free flight scenario on Earth using the Morpheus lander developed at the Johnson Space Center (JSC). This goal represents an aggressive target consistent with a lean engineering culture of rapid prototyping and development. This culture is characterized by prioritizing early implementation to gain practical lessons learned and then building on this knowledge with subsequent prototyping design cycles of increasing complexity, culminating in the implementation of the baseline design. This paper provides an overview of the ALHAT/Morpheus flight demonstration activities in GFY2012, including accomplishments, current status, results, and lessons learned. The ALHAT/Morpheus effort is also described in the context of a technology path in support of future crewed and robotic planetary exploration missions based upon the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN).

  13. Recursive Gradient Estimation Using Splines for Navigation of Autonomous Vehicles.

    DTIC Science & Technology

    1985-07-01

    [Fragmentary DTIC record; only scattered cover-page text survives.] Final report by C. N. Shen, U.S. Army Armament Research and Development Center, Large Caliber Weapon Systems Laboratory, July 1985. The recoverable text notes that applications requiring autonomous vehicles depend on an adequate and efficient computer vision system, for which recursive gradient estimation using splines is investigated.

  14. Design and Development of the WVU Advanced Technology Satellite for Optical Navigation

    NASA Astrophysics Data System (ADS)

    Straub, Miranda

    In order to meet the demands of future space missions, it is beneficial for spacecraft to have the capability to support autonomous navigation. This is true for both crewed and uncrewed vehicles. For crewed vehicles, autonomous navigation would allow the crew to safely navigate home in the event of a communication system failure. For uncrewed missions, autonomous navigation reduces the demand on ground-based infrastructure and could allow for more flexible operation. One promising technique for achieving these goals is through optical navigation. To this end, the present work considers how camera images of the Earth's surface could enable autonomous navigation of a satellite in low Earth orbit. Specifically, this study will investigate the use of coastlines and other natural land-water boundaries for navigation. Observed coastlines can be matched to a pre-existing coastline database in order to determine the location of the spacecraft. This paper examines how such measurements may be processed in an on-board extended Kalman filter (EKF) to provide completely autonomous estimates of the spacecraft state throughout the duration of the mission. In addition, future work includes implementing this work on a CubeSat mission within the WVU Applied Space Exploration Lab (ASEL). The mission titled WVU Advanced Technology Satellite for Optical Navigation (WATSON) will provide students with an opportunity to experience the life cycle of a spacecraft from design through operation while hopefully meeting the primary and secondary goals defined for mission success. The spacecraft design process, although simplified by CubeSat standards, will be discussed in this thesis as well as the current results of laboratory testing with the CubeSat model in the ASEL.
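
    As a rough sketch of how a coastline match might be folded into an on-board filter, the snippet below performs one generic extended Kalman filter measurement update, treating the match as a direct (noisy) observation of spacecraft position; this is an illustrative simplification with invented numbers, not the WATSON filter design.

      import numpy as np

      def ekf_update(x, P, z, h, H, R):
          """One EKF measurement update: x, P = prior state and covariance,
             z = measurement, h = measurement function, H = its Jacobian, R = noise covariance."""
          y = z - h(x)                                  # innovation
          S = H @ P @ H.T + R                           # innovation covariance
          K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
          x_new = x + K @ y
          P_new = (np.eye(len(x)) - K @ H) @ P
          return x_new, P_new

      # Toy state: [position (3), velocity (3)] in an Earth-fixed frame; the coastline match
      # is treated as a direct observation of position with assumed 50 m (1-sigma) noise.
      x = np.array([7.0e6, 0.0, 0.0, 0.0, 7.5e3, 0.0])
      P = np.diag([1e6, 1e6, 1e6, 10.0, 10.0, 10.0])
      H = np.hstack([np.eye(3), np.zeros((3, 3))])
      R = np.diag([50.0**2] * 3)
      z = np.array([7.0e6 + 120.0, -80.0, 40.0])        # position implied by the coastline match

      x, P = ekf_update(x, P, z, lambda s: H @ s, H, R)
      print("updated position estimate:", x[:3])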

  15. High Speed Lunar Navigation for Crewed and Remotely Piloted Vehicles

    NASA Technical Reports Server (NTRS)

    Pedersen, L.; Allan, M.; To, V.; Utz, H.; Wojcikiewicz, W.; Chautems, C.

    2010-01-01

    Increased navigation speed is desirable for lunar rovers, whether autonomous, crewed or remotely operated, but is hampered by the low gravity, high-contrast lighting and rough terrain. We describe a lidar-based navigation system deployed on NASA's K10 autonomous rover and used to increase the terrain-hazard situational awareness of the Lunar Electric Rover crew.

  16. Image Dependent Relative Formation Navigation for Autonomous Aerial Refueling

    DTIC Science & Technology

    2011-03-01

    [Fragmentary DTIC record; only scattered excerpts survive.] The recoverable text notes that local variations of the Earth's surface make a mathematical (equipotential surface) model difficult to create and use; that in the refueling method considered, the receiver pilot flies the aircraft to within a defined refueling position; and that the navigation frame is chosen for flight at altitudes relatively close to the surface of the Earth.

  17. Mobile Robot Designed with Autonomous Navigation System

    NASA Astrophysics Data System (ADS)

    An, Feng; Chen, Qiang; Zha, Yanfang; Tao, Wenyin

    2017-10-01

    With the rapid development of robot technology, robots appear more and more in all aspects of life and social production, and people place ever greater demands on them; one such demand is that a robot be capable of autonomous navigation and able to recognize the road. Take the common household sweeping robot as an example, which can avoid obstacles, clean the floor and automatically find its charging station; another example is the AGV tracking cart, which can follow a route and reach its destination successfully. This paper introduces a robot navigation scheme based on SLAM, which builds a map of a completely unknown environment while simultaneously localizing the robot within it, thereby achieving autonomous navigation.
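
    One small ingredient of such a SLAM scheme, the occupancy-grid mapping step, can be sketched as a log-odds update along a single range beam; the constants are invented, and a full system would of course also estimate the robot pose rather than assume it known.

      import numpy as np

      L_OCC, L_FREE = 0.85, -0.4        # assumed log-odds increments
      CELL = 0.05                       # meters per cell
      grid = np.zeros((100, 100))       # log-odds occupancy grid

      def update_beam(grid, pose_xy, heading, rng):
          """Mark cells along the beam as free and the end cell as occupied (known pose)."""
          n_steps = int(rng / CELL)
          for k in range(n_steps + 1):
              d = k * CELL
              cx = int((pose_xy[0] + d * np.cos(heading)) / CELL)
              cy = int((pose_xy[1] + d * np.sin(heading)) / CELL)
              if not (0 <= cx < grid.shape[0] and 0 <= cy < grid.shape[1]):
                  return
              grid[cx, cy] += L_OCC if k == n_steps else L_FREE

      update_beam(grid, pose_xy=(2.5, 2.5), heading=np.deg2rad(30.0), rng=1.2)
      occupancy_prob = 1.0 / (1.0 + np.exp(-grid))    # convert log-odds to probability
      print("max occupancy probability:", occupancy_prob.max())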

  18. Precision Landing and Hazard Avoidance Domain

    NASA Technical Reports Server (NTRS)

    Robertson, Edward A.; Carson, John M., III

    2016-01-01

    The Precision Landing and Hazard Avoidance (PL&HA) domain addresses the development, integration, testing, and spaceflight infusion of sensing, processing, and GN&C functions critical to the success and safety of future human and robotic exploration missions. PL&HA sensors also have applications to other mission events, such as rendezvous and docking. Autonomous PL&HA builds upon the core GN&C capabilities developed to enable soft, controlled landings on the Moon, Mars, and other solar system bodies. Through the addition of a Terrain Relative Navigation (TRN) function, precision landing within tens of meters of a map-based target is possible. The addition of a 3-D terrain mapping lidar sensor improves the probability of a safe landing via autonomous, real-time Hazard Detection and Avoidance (HDA). PL&HA significantly improves the probability of mission success and enhances access to sites of scientific interest located in challenging terrain. PL&HA can also utilize external navigation aids, such as navigation satellites and surface beacons. The core capabilities are: advanced lidar sensors for high-precision ranging, velocimetry, and 3-D terrain mapping; Terrain Relative Navigation (TRN), which compares onboard reconnaissance data with real-time terrain imaging data to update the spacecraft position estimate; Hazard Detection and Avoidance (HDA), which generates a high-resolution 3-D terrain map in real time during the approach trajectory to identify safe landing targets; and inertial navigation during terminal descent, in which high-precision surface-relative sensors enable accurate inertial navigation and a tightly controlled touchdown within meters of the selected safe landing target.

  19. A Feedforward Control Approach to the Local Navigation Problem for Autonomous Vehicles

    DTIC Science & Technology

    1994-05-02

    [Fragmentary DTIC record; only scattered cover-page and abstract text survives.] AD-A282 787, CMU-RI-TR-94-17, Alonzo Kelly, The Robotics Institute, Carnegie Mellon University. The recoverable text notes that the feedforward controller requires a path to follow, or a direction to prefer, and cannot generate its own strategic goals; it therefore solves the local planning problem for autonomous vehicles, and it is intelligent because it uses range images generated from either a laser rangefinder or stereo triangulation.

  20. An Autonomous GPS-Denied Unmanned Vehicle Platform Based on Binocular Vision for Planetary Exploration

    NASA Astrophysics Data System (ADS)

    Qin, M.; Wan, X.; Shao, Y. Y.; Li, S. Y.

    2018-04-01

    Vision-based navigation has become an attractive solution for autonomous navigation in planetary exploration. This paper presents our work on designing and building an autonomous, vision-based, GPS-denied unmanned vehicle and on developing an ARFM (Adaptive Robust Feature Matching) based VO (Visual Odometry) software package for its autonomous navigation. The hardware system is mainly composed of a binocular stereo camera, a pan-and-tilt unit, a master computer, and a tracked chassis. The ARFM-based VO software system contains four modules: camera calibration, ARFM-based 3D reconstruction, position and attitude calculation, and BA (Bundle Adjustment). Two VO experiments were carried out, using outdoor images from an open dataset and indoor images captured by our vehicle; the results demonstrate that our vision-based unmanned vehicle is able to achieve autonomous localization and has potential for future planetary exploration.

  1. Navigation Aiding by a Hybrid Laser-Camera Motion Estimator for Micro Aerial Vehicles

    PubMed Central

    Atman, Jamal; Popp, Manuel; Ruppelt, Jan; Trommer, Gert F.

    2016-01-01

    Micro Air Vehicles (MAVs) equipped with various sensors are able to carry out autonomous flights. However, the self-localization of autonomous agents is mostly dependent on Global Navigation Satellite Systems (GNSS). In order to provide an accurate navigation solution in absence of GNSS signals, this article presents a hybrid sensor. The hybrid sensor is a deep integration of a monocular camera and a 2D laser rangefinder so that the motion of the MAV is estimated. This realization is expected to be more flexible in terms of environments compared to laser-scan-matching approaches. The estimated ego-motion is then integrated in the MAV’s navigation system. However, first, the knowledge about the pose between both sensors is obtained by proposing an improved calibration method. For both calibration and ego-motion estimation, 3D-to-2D correspondences are used and the Perspective-3-Point (P3P) problem is solved. Moreover, the covariance estimation of the relative motion is presented. The experiments show very accurate calibration and navigation results. PMID:27649203

  2. Target Trailing With Safe Navigation With COLREGS for Maritime Autonomous Surface Vehicles

    NASA Technical Reports Server (NTRS)

    Kuwata, Yoshiaki (Inventor); Aghazarian, Hrand (Inventor); Huntsberger, Terrance L. (Inventor); Howard, Andrew B. (Inventor); Wolf, Michael T. (Inventor); Zarzhitsky, Dimitri V. (Inventor)

    2014-01-01

    Systems and methods for operating autonomous waterborne vessels in a safe manner. The systems include hardware for identifying the locations and motions of other vessels, as well as the locations of stationary objects that represent navigation hazards. By applying to the data obtained a computational method that uses a maritime navigation algorithm based on Velocity Obstacles for avoiding hazards and obeying COLREGS, the autonomous vessel computes a safe and effective path to be followed in order to accomplish a desired navigational end result, while operating so as to avoid hazards and to maintain compliance with standard navigational procedures defined by international agreement. The systems and methods have been successfully demonstrated on water with radar and stereo cameras as the perception sensors, and integrated with a higher-level planner for trailing a maneuvering target.
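
    A minimal sketch of the velocity-obstacle test underlying this kind of hazard avoidance is shown below: it checks whether a candidate own-ship velocity leads to a future pass inside a combined safety radius, assuming both vessels hold constant velocity. It is a generic illustration, not the patented implementation.

      import numpy as np

      def violates_velocity_obstacle(p_own, v_own, p_obs, v_obs, r_safe):
          """True if, with both vessels holding constant velocity, the candidate own-ship
             velocity v_own brings it within r_safe of the obstacle vessel at some t >= 0."""
          p_rel = np.asarray(p_obs, float) - np.asarray(p_own, float)   # obstacle line of sight
          v_rel = np.asarray(v_own, float) - np.asarray(v_obs, float)   # own velocity relative to obstacle
          d = np.linalg.norm(p_rel)
          if d <= r_safe:
              return True                    # already inside the safety radius
          if np.dot(p_rel, v_rel) <= 0.0:
              return False                   # moving away: no future conflict
          # The relative-velocity ray leads to a conflict iff its angle to the line of sight
          # is smaller than the collision-cone half-angle asin(r_safe / d).
          cos_angle = np.dot(p_rel, v_rel) / (d * np.linalg.norm(v_rel))
          return np.arccos(np.clip(cos_angle, -1.0, 1.0)) < np.arcsin(r_safe / d)

      # Example: obstacle 500 m due north crossing from starboard; candidate speed 5 m/s north.
      print(violates_velocity_obstacle(p_own=(0, 0), v_own=(0, 5),
                                       p_obs=(0, 500), v_obs=(-1, 0), r_safe=50))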

  3. Navigation for the new millennium: Autonomous navigation for Deep Space 1

    NASA Technical Reports Server (NTRS)

    Reidel, J. E.; Bhaskaran, S.; Synnott, S. P.; Desai, S. D.; Bollman, W. E.; Dumont, P. J.; Halsell, C. A.; Han, D.; Kennedy, B. M.; Null, G. W.

    1997-01-01

    The autonomous optical navigation system technology for the Deep Space 1 (DS1) mission is reported on. The DS1 navigation system will be the first to use autonomous navigation in deep space. The system's tasks are to: perform interplanetary cruise orbit determination using images of distant asteroids; control and maintain the orbit of the spacecraft with an ion propulsion system and conventional thrusters; and perform late knowledge updates of target position during close flybys in order to facilitate high-quality data return from asteroid McAuliffe and comet West-Kohoutek-Ikemura. To accomplish these tasks, the following functions are required: picture planning; image processing; dynamical modeling and integration; planetary ephemeris and star catalog handling; orbit determination; data filtering and estimation; maneuver estimation; and spacecraft ephemeris updating. These systems and functions are described and preliminary performance data are presented.

  4. Autonomous Navigation of Small UAVs Based on Vehicle Dynamic Model

    NASA Astrophysics Data System (ADS)

    Khaghani, M.; Skaloud, J.

    2016-03-01

    This paper presents a novel approach to autonomous navigation for small UAVs, in which the vehicle dynamic model (VDM) serves as the main process model within the navigation filter. The proposed method significantly increases the accuracy and reliability of autonomous navigation, especially for small UAVs with low-cost IMUs on board. This is achieved with no extra sensor added to the conventional INS/GNSS setup. The improvement is of special interest in the case of GNSS outages, where inertial coasting drifts very quickly. In the proposed architecture, the solution to the VDM equations provides the estimate of position, velocity, and attitude, which is updated within the navigation filter based on available observations, such as IMU data or GNSS measurements. The VDM is also fed with the control input to the UAV, which is available within the control/autopilot system. The filter is capable of estimating wind velocity and dynamic model parameters, in addition to the navigation states and IMU sensor errors. Monte Carlo simulations reveal major improvements in navigation accuracy compared to a conventional INS/GNSS navigation system during the autonomous phase, when satellite signals are not available due, for example, to physical obstruction or electromagnetic interference. For GNSS outages of a few minutes, position and attitude accuracy improves by orders of magnitude compared to inertial coasting, meaning that in such a scenario the position-velocity-attitude (PVA) determination is sufficiently accurate to navigate the UAV to a home position without any signal that depends on the vehicle's environment.
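
    To make the idea of a dynamic-model process model concrete, the sketch below uses a drastically simplified longitudinal point-mass model, driven by the known control input, for one EKF prediction step with a numerically evaluated state-transition Jacobian; the mass, drag and noise values are invented, and the real VDM of the paper includes full aerodynamic, wind and parameter states.

      import numpy as np

      MASS, DRAG, G = 2.0, 0.05, 9.81     # invented constants for a toy airframe

      def f(x, u):
          """Toy dynamics: state x = [px, pz, vx, vz], control u = [thrust along x]."""
          px, pz, vx, vz = x
          v = np.hypot(vx, vz) + 1e-9
          ax = (u[0] - DRAG * v * vx) / MASS
          az = -G - DRAG * v * vz / MASS
          return np.array([vx, vz, ax, az])

      def rk4_step(x, u, dt):
          k1 = f(x, u); k2 = f(x + 0.5 * dt * k1, u)
          k3 = f(x + 0.5 * dt * k2, u); k4 = f(x + dt * k3, u)
          return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

      def numerical_jacobian(x, u, dt, eps=1e-6):
          F = np.zeros((len(x), len(x)))
          base = rk4_step(x, u, dt)
          for i in range(len(x)):
              dx = np.zeros(len(x)); dx[i] = eps
              F[:, i] = (rk4_step(x + dx, u, dt) - base) / eps
          return F

      # One EKF prediction step driven by the dynamic model and the known control input.
      x = np.array([0.0, 100.0, 20.0, 0.0])
      P = np.diag([1.0, 1.0, 0.5, 0.5])
      Q = 1e-3 * np.eye(4)
      u, dt = np.array([5.0]), 0.1

      F = numerical_jacobian(x, u, dt)
      x = rk4_step(x, u, dt)
      P = F @ P @ F.T + Q
      print("predicted state:", x)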

  5. Performance Characterization of a Landmark Measurement System for ARRM Terrain Relative Navigation

    NASA Technical Reports Server (NTRS)

    Shoemaker, Michael A.; Wright, Cinnamon; Liounis, Andrew J.; Getzandanner, Kenneth M.; Van Eepoel, John M.; DeWeese, Keith D.

    2016-01-01

    This paper describes the landmark measurement system being developed for terrain relative navigation on NASA's Asteroid Redirect Robotic Mission (ARRM), and the results of a performance characterization study given realistic navigational and model errors. The system is called Retina, and is derived from the stereo-photoclinometry methods widely used on other small-body missions. The system is simulated using synthetic imagery of the asteroid surface, and discussion is given on various algorithmic design choices. Unlike other missions, ARRM's Retina is the first planned autonomous use of these methods during the close-proximity and descent phase of the mission.

  6. Performance Characterization of a Landmark Measurement System for ARRM Terrain Relative Navigation

    NASA Technical Reports Server (NTRS)

    Shoemaker, Michael; Wright, Cinnamon; Liounis, Andrew; Getzandanner, Kenneth; Van Eepoel, John; Deweese, Keith

    2016-01-01

    This paper describes the landmark measurement system being developed for terrain relative navigation on NASA's Asteroid Redirect Robotic Mission (ARRM), and the results of a performance characterization study given realistic navigational and model errors. The system is called Retina, and is derived from the stereo-photoclinometry methods widely used on other small-body missions. The system is simulated using synthetic imagery of the asteroid surface, and discussion is given on various algorithmic design choices. Unlike other missions, ARRM's Retina is the first planned autonomous use of these methods during the close-proximity and descent phase of the mission.

  7. Linked Autonomous Interplanetary Satellite Orbit Navigation

    NASA Technical Reports Server (NTRS)

    Parker, Jeffrey S.; Anderson, Rodney L.; Born, George H.; Leonard, Jason M.; McGranaghan, Ryan M.; Fujimoto, Kohei

    2013-01-01

    A navigation technology known as LiAISON (Linked Autonomous Interplanetary Satellite Orbit Navigation) has been shown to produce very impressive navigation results for scenarios involving two or more cooperative satellites near the Moon, such that at least one satellite must be in an orbit significantly perturbed by the Earth, such as a lunar halo orbit. The two (or more) satellites track each other using satellite-to-satellite range and/or range-rate measurements. These relative measurements yield absolute orbit navigation when one of the satellites is in a lunar halo orbit, or the like. The geometry between a lunar halo orbiter and a GEO satellite continuously changes, which dramatically improves the information content of a satellite-to-satellite tracking signal. The geometrical variations include significant out-of-plane shifts, as well as in-plane shifts. Further, the GEO satellite is almost continuously in view of a lunar halo orbiter. High-fidelity simulations demonstrate that LiAISON technology improves the navigation of GEO orbiters by an order of magnitude, relative to standard ground tracking. If a GEO satellite is navigated using LiAISON-only tracking measurements, its position is typically known to better than 10 meters. If LiAISON measurements are combined with simple radiometric ground observations, then the satellite's position is typically known to better than 3 meters, which is substantially better than the current state of GEO navigation. There are two features of LiAISON that are novel and advantageous compared with conventional satellite navigation. First, ordinary satellite-to-satellite tracking data only provides relative navigation of each satellite. The novelty is the placement of one navigation satellite in an orbit that is significantly perturbed by both the Earth and the Moon. A navigation satellite can track other satellites elsewhere in the Earth-Moon system and acquire knowledge about both satellites' absolute positions and velocities, as well as their relative positions and velocities in space. The second novelty is that ordinarily many satellites are required in order to achieve full navigation of any given customer's position and velocity over time. With LiAISON navigation, only a single navigation satellite is needed, provided that the satellite is significantly affected by the gravity of both the Earth and the Moon. That single satellite can track another satellite elsewhere in the Earth-Moon system and obtain absolute knowledge of both satellites' states.
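
    The basic ingredient a LiAISON-style filter processes is the crosslink range and range-rate measurement; a small sketch of that measurement model and its Jacobian with respect to the two satellites' states is given below, with a purely illustrative geometry.

      import numpy as np

      def crosslink_measurements(r1, v1, r2, v2):
          """Range and range-rate between two satellites, plus the Jacobian of both
             measurements with respect to the stacked state [r1, v1, r2, v2]."""
          dr, dv = r1 - r2, v1 - v2
          rho = np.linalg.norm(dr)
          rho_dot = dr @ dv / rho
          u = dr / rho                                   # line-of-sight unit vector
          d_rhodot_dr = (dv - rho_dot * u) / rho
          H = np.zeros((2, 12))
          H[0, 0:3], H[0, 6:9] = u, -u
          H[1, 0:3], H[1, 6:9] = d_rhodot_dr, -d_rhodot_dr
          H[1, 3:6], H[1, 9:12] = u, -u
          return np.array([rho, rho_dot]), H

      # Illustrative geometry: a lunar halo orbiter and a GEO satellite (km, km/s).
      r_halo = np.array([3.2e5, 8.0e4, 6.0e4]); v_halo = np.array([-0.1, 0.3, 0.05])
      r_geo  = np.array([4.2e4, 0.0, 0.0]);     v_geo  = np.array([0.0, 3.07, 0.0])
      z, H = crosslink_measurements(r_halo, v_halo, r_geo, v_geo)
      print("range (km):", z[0], " range-rate (km/s):", z[1])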

  8. Real-time visual mosaicking and navigation on the seafloor

    NASA Astrophysics Data System (ADS)

    Richmond, Kristof

    Remote robotic exploration holds vast potential for gaining knowledge about extreme environments accessible to humans only with great difficulty. Robotic explorers have been sent to other solar system bodies, and on this planet into inaccessible areas such as caves and volcanoes. In fact, the largest unexplored land area on earth lies hidden in the airless cold and intense pressure of the ocean depths. Exploration in the oceans is further hindered by water's high absorption of electromagnetic radiation, which both inhibits remote sensing from the surface, and limits communications with the bottom. The Earth's oceans thus provide an attractive target for developing remote exploration capabilities. As a result, numerous robotic vehicles now routinely survey this environment, from remotely operated vehicles piloted over tethers from the surface to torpedo-shaped autonomous underwater vehicles surveying the mid-waters. However, these vehicles are limited in their ability to navigate relative to their environment. This limits their ability to return to sites with precision without the use of external navigation aids, and to maneuver near and interact with objects autonomously in the water and on the sea floor. The enabling of environment-relative positioning on fully autonomous underwater vehicles will greatly extend their power and utility for remote exploration in the furthest reaches of the Earth's waters---even under ice and under ground---and eventually in extraterrestrial liquid environments such as Europa's oceans. This thesis presents an operational, fielded system for visual navigation of underwater robotic vehicles in unexplored areas of the seafloor. The system does not depend on external sensing systems, using only instruments on board the vehicle. As an area is explored, a camera is used to capture images and a composite view, or visual mosaic, of the ocean bottom is created in real time. Side-to-side visual registration of images is combined with dead-reckoned navigation information in a framework allowing the creation and updating of large, locally consistent mosaics. These mosaics are used as maps in which the vehicle can navigate and localize itself with respect to points in the environment. The system achieves real-time performance in several ways. First, wherever possible, direct sensing of motion parameters is used in place of extracting them from visual data. Second, trajectories are chosen to enable a hierarchical search for side-to-side links which limits the amount of searching performed without sacrificing robustness. Finally, the map estimation is formulated as a sparse, linear information filter allowing rapid updating of large maps. The visual navigation enabled by the work in this thesis represents a new capability for remotely operated vehicles, and an enabling capability for a new generation of autonomous vehicles which explore and interact with remote, unknown and unstructured underwater environments. The real-time mosaic can be used on current tethered vehicles to create pilot aids and provide a vehicle user with situational awareness of the local environment and the position of the vehicle within it. For autonomous vehicles, the visual navigation system enables precise environment-relative positioning and mapping, without requiring external navigation systems, opening the way for ever-expanding autonomous exploration capabilities. 
The utility of this system was demonstrated in the field at sites of scientific interest using the ROVs Ventana and Tiburon operated by the Monterey Bay Aquarium Research Institute. A number of sites in and around Monterey Bay, California were mosaicked using the system, culminating in a complete imaging of the wreck site of the USS Macon , where real-time visual mosaics containing thousands of images were generated while navigating using only sensor systems on board the vehicle.

  9. Reactive navigation for autonomous guided vehicle using neuro-fuzzy techniques

    NASA Astrophysics Data System (ADS)

    Cao, Jin; Liao, Xiaoqun; Hall, Ernest L.

    1999-08-01

    A neuro-fuzzy control method for navigation of an Autonomous Guided Vehicle (AGV) robot is described. Robot navigation is defined as the guiding of a mobile robot to a desired destination or along a desired path in an environment characterized by a terrain and a set of distinct objects, such as obstacles and landmarks. The autonomous navigation ability and road-following precision are mainly influenced by the control strategy and its real-time performance. Neural network and fuzzy logic control techniques can improve real-time control performance for a mobile robot owing to their high robustness and error tolerance. For a mobile robot to navigate automatically and rapidly, an important factor is to identify and classify the robot's current perceptual environment. In this paper, a new approach to feature identification and classification of the current perceptual environment, based on a classifying neural network and a neuro-fuzzy algorithm, is presented. The significance of this work lies in the development of a new method for mobile robot navigation.

  10. Autonomous satellite navigation using starlight refraction angle measurements

    NASA Astrophysics Data System (ADS)

    Ning, Xiaolin; Wang, Longhua; Bai, Xinbei; Fang, Jiancheng

    2013-05-01

    An on-board autonomous navigation capability is required to reduce the operation costs and enhance the navigation performance of future satellites. Autonomous navigation by stellar refraction is a type of autonomous celestial navigation method that uses high-accuracy star sensors instead of Earth sensors to provide information regarding Earth's horizon. In previous studies, the refraction apparent height has typically been used for such navigation. However, the apparent height cannot be measured directly by a star sensor and can only be calculated from the refraction angle and an atmospheric refraction model. Therefore, additional errors are introduced by the uncertainty and nonlinearity of atmospheric refraction models, which result in reduced navigation accuracy and reliability. A new navigation method based on the direct measurement of the refraction angle is proposed to solve this problem. Techniques for the determination of the refraction angle are introduced, and a measurement model for the refraction angle is established. The method is tested and validated by simulations. When the starlight refraction height ranges from 20 to 50 km, a positioning accuracy of better than 100 m can be achieved for a low-Earth-orbit (LEO) satellite using the refraction angle, while the positioning accuracy of the traditional method using the apparent height is worse than 500 m under the same conditions. Furthermore, an analysis of the factors that affect navigation accuracy, including the measurement accuracy of the refraction angle, the number of visible refracted stars per orbit and the installation azimuth of the star sensor, is presented. This method is highly recommended for small satellites in particular, as no additional hardware besides two star sensors is required.

  11. GPS navigation algorithms for Autonomous Airborne Refueling of Unmanned Air Vehicles

    NASA Astrophysics Data System (ADS)

    Khanafseh, Samer Mahmoud

    Unmanned Air Vehicles (UAVs) have recently generated great interest because of their potential to perform hazardous missions without risking loss of life. If autonomous airborne refueling is possible for UAVs, mission range and endurance will be greatly enhanced. However, concerns about UAV-tanker proximity, dynamic mobility and safety demand that the relative navigation system meets stringent requirements on accuracy, integrity, and continuity. In response, this research focuses on developing high-performance GPS-based navigation architectures for Autonomous Airborne Refueling (AAR) of UAVs. The AAR mission is unique because of the potentially severe sky blockage introduced by the tanker. To address this issue, a high-fidelity dynamic sky blockage model was developed and experimentally validated. In addition, robust carrier phase differential GPS navigation algorithms were derived, including a new method for high-integrity reacquisition of carrier cycle ambiguities for recently-blocked satellites. In order to evaluate navigation performance, world-wide global availability and sensitivity covariance analyses were conducted. The new navigation algorithms were shown to be sufficient for turn-free scenarios, but improvement in performance was necessary to meet the difficult requirements for a general refueling mission with banked turns. Therefore, several innovative methods were pursued to enhance navigation performance. First, a new theoretical approach was developed to quantify the position-domain integrity risk in cycle ambiguity resolution problems. A mechanism to implement this method with partially-fixed cycle ambiguity vectors was derived, and it was used to define tight upper bounds on AAR navigation integrity risk. A second method, where a new algorithm for optimal fusion of measurements from multiple antennas was developed, was used to improve satellite coverage in poor visibility environments such as in AAR. Finally, methods for using data-link extracted measurements as an additional inter-vehicle ranging measurement were also introduced. The algorithms and methods developed in this work are generally applicable to realize high-performance GPS-based navigation in partially obstructed environments. Navigation performance for AAR was quantified through covariance analysis, and it was shown that the stringent navigation requirements for this application are achievable. Finally, a real-time implementation of the algorithms was developed and successfully validated in autopiloted flight tests.

  12. Autonomous satellite navigation with the Global Positioning System

    NASA Technical Reports Server (NTRS)

    Fuchs, A. J.; Wooden, W. H., II; Long, A. C.

    1977-01-01

    This paper discusses the potential of using the Global Positioning System (GPS) to provide autonomous navigation capability to NASA satellites in the 1980 era. Some of the driving forces motivating autonomous navigation are presented. These include such factors as advances in attitude control systems, onboard science annotation, and onboard gridding of imaging data. Simulation results which demonstrate baseline orbit determination accuracies using GPS data on Seasat, Landsat-D, and the Solar Maximum Mission are presented. Emphasis is placed on identifying error sources such as GPS time, GPS ephemeris, user timing biases, and user orbit dynamics, and in a parametric sense on evaluating their contribution to the orbit determination accuracies.

  13. Approach-Phase Precision Landing with Hazard Relative Navigation: Terrestrial Test Campaign Results of the Morpheus/ALHAT Project

    NASA Technical Reports Server (NTRS)

    Crain, Timothy P.; Bishop, Robert H.; Carson, John M., III; Trawny, Nikolas; Hanak, Chad; Sullivan, Jacob; Christian, John; DeMars, Kyle; Campbell, Tom; Getchius, Joel

    2016-01-01

    The Morpheus Project began in late 2009 as an ambitious effort code-named Project M to integrate three ongoing multi-center NASA technology developments: humanoid robotics, liquid oxygen/liquid methane (LOX/LCH4) propulsion, and Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) into a single engineering demonstration mission to be flown to the Moon by 2013. The humanoid robot effort was redirected to a deployment of Robonaut 2 on the International Space Station in February of 2011, while Morpheus continued as a terrestrial field test project integrating the existing ALHAT Project's technologies into a sub-orbital flight system using the world's first LOX/LCH4 main propulsion and reaction control system fed from the same blowdown tanks. A series of 33 tethered tests with the Morpheus 1.0 vehicle and Morpheus 1.5 vehicle were conducted from April 2011 to December 2013 before successful, sustained free flights with the primary Vertical Testbed (VTB) navigation configuration began with Free Flight 3 on December 10, 2013. Over the course of the following 12 free flights and 3 tethered flights, components of the ALHAT navigation system were integrated into the Morpheus vehicle, operations, and flight control loop. The ALHAT navigation system was integrated and run concurrently with the VTB navigation system as a reference and fail-safe option in flight (see touchdown position estimate comparisons in Fig. 1). Flight testing completed with Free Flight 15 on December 15, 2014, with completely autonomous Hazard Detection and Avoidance (HDA), integration of surface relative and Hazard Relative Navigation (HRN) measurements into the onboard dual-state inertial estimator Kalman filter software, and landing within 2 meters of the VTB GPS-based navigation solution at the safe landing site target. This paper describes the Morpheus joint VTB/ALHAT navigation architecture, the sensors utilized during the terrestrial flight campaign, issues resolved during testing, and the navigation results from the flight tests.

  14. Autonomous GPS/INS navigation experiment for Space Transfer Vehicle

    NASA Technical Reports Server (NTRS)

    Upadhyay, Triveni N.; Cotterill, Stephen; Deaton, A. W.

    1993-01-01

    An experiment to validate the concept of developing an autonomous integrated spacecraft navigation system using on board Global Positioning System (GPS) and Inertial Navigation System (INS) measurements is described. The feasibility of integrating GPS measurements with INS measurements to provide a total improvement in spacecraft navigation performance, i.e. improvement in position, velocity and attitude information, was previously demonstrated. An important aspect of this research is the automatic real time reconfiguration capability of the system designed to respond to changes in a spacecraft mission under the control of an expert system.

  15. Autonomous GPS/INS navigation experiment for Space Transfer Vehicle (STV)

    NASA Technical Reports Server (NTRS)

    Upadhyay, Triveni N.; Cotterill, Stephen; Deaton, A. Wayne

    1991-01-01

    An experiment to validate the concept of developing an autonomous integrated spacecraft navigation system using on board Global Positioning System (GPS) and Inertial Navigation System (INS) measurements is described. The feasibility of integrating GPS measurements with INS measurements to provide a total improvement in spacecraft navigation performance, i.e. improvement in position, velocity and attitude information, was previously demonstrated. An important aspect of this research is the automatic real time reconfiguration capability of the system designed to respond to changes in a spacecraft mission under the control of an expert system.

  16. Autonomous GPS/INS navigation experiment for Space Transfer Vehicle

    NASA Astrophysics Data System (ADS)

    Upadhyay, Triveni N.; Cotterill, Stephen; Deaton, A. W.

    1993-07-01

    An experiment to validate the concept of developing an autonomous integrated spacecraft navigation system using on board Global Positioning System (GPS) and Inertial Navigation System (INS) measurements is described. The feasibility of integrating GPS measurements with INS measurements to provide a total improvement in spacecraft navigation performance, i.e. improvement in position, velocity and attitude information, was previously demonstrated. An important aspect of this research is the automatic real time reconfiguration capability of the system designed to respond to changes in a spacecraft mission under the control of an expert system.

  17. LABRADOR: a learning autonomous behavior-based robot for adaptive detection and object retrieval

    NASA Astrophysics Data System (ADS)

    Yamauchi, Brian; Moseley, Mark; Brookshire, Jonathan

    2013-01-01

    As part of the TARDEC-funded CANINE (Cooperative Autonomous Navigation in a Networked Environment) Program, iRobot developed LABRADOR (Learning Autonomous Behavior-based Robot for Adaptive Detection and Object Retrieval). LABRADOR was based on the rugged, man-portable, iRobot PackBot unmanned ground vehicle (UGV) equipped with an explosives ordnance disposal (EOD) manipulator arm and a custom gripper. For LABRADOR, we developed a vision-based object learning and recognition system that combined a TLD (track-learn-detect) filter based on object shape features with a color-histogram-based object detector. Our vision system was able to learn in real-time to recognize objects presented to the robot. We also implemented a waypoint navigation system based on fused GPS, IMU (inertial measurement unit), and odometry data. We used this navigation capability to implement autonomous behaviors capable of searching a specified area using a variety of robust coverage strategies - including outward spiral, random bounce, random waypoint, and perimeter following behaviors. While the full system was not integrated in time to compete in the CANINE competition event, we developed useful perception, navigation, and behavior capabilities that may be applied to future autonomous robot systems.

  18. Autonomous Navigation for Autonomous Underwater Vehicles Based on Information Filters and Active Sensing

    PubMed Central

    He, Bo; Zhang, Hongjin; Li, Chao; Zhang, Shujing; Liang, Yan; Yan, Tianhong

    2011-01-01

    This paper addresses an autonomous navigation method for the autonomous underwater vehicle (AUV) C-Ranger applying information-filter-based simultaneous localization and mapping (SLAM), and its sea trial experiments in Tuandao Bay (Shandong Province, P.R. China). Weak links in the information matrix of an extended information filter (EIF) can be pruned to achieve an efficient approach, the sparse EIF algorithm (SEIF-SLAM). All the basic update formulae can be implemented in constant time irrespective of the size of the map; hence the computational complexity is significantly reduced. A mechanical scanning imaging sonar is chosen as the active sensing device for the underwater vehicle, and a compensation method based on feedback of the AUV pose is presented to overcome distortion of the acoustic images due to the vehicle motion. In order to verify the feasibility of the navigation methods proposed for the C-Ranger, a sea trial was conducted in Tuandao Bay. Experimental results and analysis show that the proposed navigation approach based on SEIF-SLAM improves the accuracy of the navigation compared with the conventional method; moreover, the algorithm has a low computational cost compared with EKF-SLAM. PMID:22346682
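
    The algebraic appeal of the (extended) information filter mentioned above is that a measurement update is additive in information space, which is what sparse EIF methods exploit on large maps; the linear sketch below illustrates just that step and is not the SEIF-SLAM implementation.

      import numpy as np

      def info_update(Y, y, H, R, z):
          """Linear information-filter measurement update.
             Y, y : prior information matrix and vector (Y = P^-1, y = Y @ x)
             H, R : measurement model Jacobian and noise covariance; z : measurement."""
          Rinv = np.linalg.inv(R)
          return Y + H.T @ Rinv @ H, y + H.T @ Rinv @ z

      # Toy 2-D example: prior with 10 m^2 variance per axis and one direct position fix.
      P0, x0 = np.diag([10.0, 10.0]), np.array([1.0, 2.0])
      Y, y = np.linalg.inv(P0), np.linalg.inv(P0) @ x0
      H, R, z = np.eye(2), np.diag([1.0, 1.0]), np.array([1.5, 1.8])

      Y, y = info_update(Y, y, H, R, z)
      print("posterior estimate:", np.linalg.solve(Y, y))   # recover the state when needed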

  19. Autonomous navigation for autonomous underwater vehicles based on information filters and active sensing.

    PubMed

    He, Bo; Zhang, Hongjin; Li, Chao; Zhang, Shujing; Liang, Yan; Yan, Tianhong

    2011-01-01

    This paper addresses an autonomous navigation method for the autonomous underwater vehicle (AUV) C-Ranger applying information-filter-based simultaneous localization and mapping (SLAM), and its sea trial experiments in Tuandao Bay (Shandong Province, P.R. China). Weak links in the information matrix of an extended information filter (EIF) can be pruned to achieve an efficient approach, the sparse EIF algorithm (SEIF-SLAM). All the basic update formulae can be implemented in constant time irrespective of the size of the map; hence the computational complexity is significantly reduced. A mechanical scanning imaging sonar is chosen as the active sensing device for the underwater vehicle, and a compensation method based on feedback of the AUV pose is presented to overcome distortion of the acoustic images due to the vehicle motion. In order to verify the feasibility of the navigation methods proposed for the C-Ranger, a sea trial was conducted in Tuandao Bay. Experimental results and analysis show that the proposed navigation approach based on SEIF-SLAM improves the accuracy of the navigation compared with the conventional method; moreover, the algorithm has a low computational cost compared with EKF-SLAM.

  20. Autonomous satellite navigation by stellar refraction

    NASA Technical Reports Server (NTRS)

    Gounley, R.; White, R.; Gai, E.

    1983-01-01

    This paper describes an error analysis of an autonomous navigator using refraction measurements of starlight passing through the upper atmosphere. The analysis is based on a discrete linear Kalman filter. The filter generated steady-state values of navigator performance for a variety of test cases. Results of these simulations show that in low-Earth orbit, position-error standard deviations of less than 0.100 km may be obtained using only 40 star sightings per orbit.

  1. Neuro-fuzzy controller to navigate an unmanned vehicle.

    PubMed

    Selma, Boumediene; Chouraqui, Samira

    2013-12-01

    A neuro-fuzzy control method for an Unmanned Vehicle (UV) simulation is described. The objective is to guide an autonomous vehicle to a desired destination along a desired path in an environment characterized by a terrain and a set of distinct objects, such as obstacles like donkeys, traffic lights and cars circulating along the trajectory. The autonomous navigation ability and road-following precision are mainly influenced by the control strategy and its real-time performance. A fuzzy logic controller can describe the desired system behavior very well with simple "if-then" relations, but requires the designer to derive the "if-then" rules manually by trial and error. Neural networks, on the other hand, perform function approximation of a system but cannot interpret the solution obtained nor check whether it is plausible. The two approaches are complementary: combining them, neural networks contribute learning capability while fuzzy logic brings knowledge representation (neuro-fuzzy). In this paper, an adaptive neuro-fuzzy inference system (ANFIS) controller is described and implemented to navigate the autonomous vehicle. Results show several improvements in the control system tuned by neuro-fuzzy techniques in comparison to previous methods such as an artificial neural network (ANN).

  2. Non-destructive inspection in industrial equipment using robotic mobile manipulation

    NASA Astrophysics Data System (ADS)

    Maurtua, Iñaki; Susperregi, Loreto; Ansuategui, Ander; Fernández, Ane; Ibarguren, Aitor; Molina, Jorge; Tubio, Carlos; Villasante, Cristobal; Felsch, Torsten; Pérez, Carmen; Rodriguez, Jorge R.; Ghrissi, Meftah

    2016-05-01

    The MAINBOT project has developed service-robot-based applications to autonomously execute inspection tasks in extensive industrial plants, on equipment that is arranged horizontally (using ground robots) or vertically (climbing robots). The industrial objective has been to provide a means to help measure several physical parameters at multiple points with autonomous robots able to navigate and climb structures while handling non-destructive testing sensors. MAINBOT has validated the solutions in two solar thermal plants (cylindrical-parabolic collectors and central tower), which are very demanding from a mobile manipulation point of view, mainly due to their extent (e.g., a 50 MW solar thermal plant with 400 hectares, 400,000 mirrors, 180 km of absorber tubes, and a 140 m high tower), the variability of conditions (outdoor, day-night), safety requirements, etc. Once the technology was validated in simulation, the system was deployed in real setups and different validation tests were carried out. In this paper, two of the achievements related to the ground mobile inspection system are presented: (1) autonomous navigation, localization and planning algorithms to manage navigation over huge extensions, and (2) non-destructive inspection operations: thermography-based detection algorithms to provide automatic inspection abilities to the robots.

  3. Single-Frequency GPS Relative Navigation in a High Ionosphere Orbital Environment

    NASA Technical Reports Server (NTRS)

    Conrad, Patrick R.; Naasz, Bo J.

    2007-01-01

    The Global Positioning System (GPS) provides a convenient source of space vehicle relative navigation measurements, especially for low Earth orbit formation flying and autonomous rendezvous mission concepts. For single-frequency GPS receivers, ionospheric path delay can be a significant error source if not properly mitigated. In particular, ionospheric effects are known to cause a significant radial position error bias and to add dramatically to relative state estimation error if the onboard navigation software does not force the use of measurements from common or shared GPS space vehicles. Results from GPS navigation simulations are presented for a pair of space vehicles flying in formation and using GPS pseudorange measurements to perform absolute and relative orbit determination. With careful measurement selection techniques, relative state estimation accuracy of better than 20 cm with standard GPS pseudorange processing, and better than 10 cm with single-differenced pseudorange processing, is shown.
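
    The single-differencing referred to above can be sketched in a few lines: differencing pseudoranges from two receivers over common GPS satellites cancels the satellite clock error exactly and, over short baselines, largely cancels the ionospheric delay as well. The PRNs and values below are invented.

      # Single-difference pseudoranges between two receivers over common GPS satellites.
      rx_a = {"G05": 21456789.3, "G12": 23012345.8, "G17": 22567890.1, "G25": 24012345.6}
      rx_b = {"G05": 21456801.7, "G12": 23012360.2, "G17": 22567902.5, "G29": 23987654.3}

      common = sorted(set(rx_a) & set(rx_b))
      single_diff = {prn: rx_a[prn] - rx_b[prn] for prn in common}

      # The satellite clock error cancels exactly; for short baselines the ionospheric and
      # tropospheric delays are nearly common and mostly cancel too. The receiver clock
      # difference remains and must still be estimated by the relative navigation filter.
      for prn, sd in single_diff.items():
          print(prn, "single difference (m): %.1f" % sd)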

  4. Learning for autonomous navigation

    NASA Technical Reports Server (NTRS)

    Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric

    2005-01-01

    Autonomous off-road navigation of robotic ground vehicles has important applications on Earth and in space exploration. Progress in this domain has been retarded by the limited lookahead range of 3-D sensors and by the difficulty of preprogramming systems to understand the traversability of the wide variety of terrain they can encounter.

  5. Multi-Spacecraft Autonomous Positioning System

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan

    2015-01-01

    As the number of spacecraft in simultaneous operation continues to grow, there is an increased dependency on ground-based navigation support. The current baseline system for deep space navigation utilizes Earth-based radiometric tracking, requiring long-duration observations to perform orbit determination and generate a state update. The age, complexity, and high utilization of the ground assets pose a risk to spacecraft navigation performance. In order to perform complex operations at large distances from Earth, such as extraterrestrial landing and proximity operations, autonomous systems are required. With increasingly complex mission operations, the need for frequent and Earth-independent navigation capabilities is further reinforced. The Multi-spacecraft Autonomous Positioning System (MAPS) takes advantage of the growing inter-spacecraft communication network and infrastructure to allow for Earth-autonomous state measurements that enable network-based space navigation. A notional concept of operations is given in figure 1. This network is already being implemented and routinely used in Martian communications through the use of the Mars Reconnaissance Orbiter and Mars Odyssey spacecraft as relays for surface assets. The growth of this communications architecture is continued through MAVEN and potential future commercial Mars telecom orbiters. This growing network provides an initial Mars-local capability for inter-spacecraft communication and navigation. The navigation updates are enabled by cross-communication between assets in the network, coupled with onboard navigation estimation routines that integrate packet travel time to generate ranging measurements. Inter-spacecraft communication allows for frequent state broadcasts and time updates from trusted references. The architecture is a software-based solution, enabling its implementation on a wide variety of current assets, with the operational constraints and measurement accuracy determined by the onboard systems.
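
    As a back-of-the-envelope illustration of turning time-tagged packet exchanges into ranging measurements (not the MAPS flight implementation), one-way timing leaves the inter-spacecraft clock offset in the measurement, while two-way timing cancels it; the numbers below are invented.

      C = 299_792_458.0   # speed of light, m/s

      def one_way_range(t_transmit, t_receive, clock_offset_estimate=0.0):
          """Range implied by a time-tagged packet; any unmodeled clock offset between the
             two spacecraft maps directly into range error, so it must be estimated."""
          return C * (t_receive - t_transmit - clock_offset_estimate)

      def two_way_range(t_round_trip, t_turnaround):
          """Two-way ranging cancels the interrogator's clock offset: only the known
             turnaround (processing) delay on the responder must be removed."""
          return C * (t_round_trip - t_turnaround) / 2.0

      # Illustrative numbers: roughly a 3000 km one-way path and a 2 ms responder turnaround.
      print(one_way_range(t_transmit=100.000000, t_receive=100.010007))
      print(two_way_range(t_round_trip=0.022014, t_turnaround=0.002))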

  6. Satellite Imagery Assisted Road-Based Visual Navigation System

    NASA Astrophysics Data System (ADS)

    Volkova, A.; Gibbens, P. W.

    2016-06-01

    There is a growing demand for unmanned aerial systems as autonomous surveillance, exploration and remote sensing solutions. Among the key concerns for robust operation of these systems is the need to reliably navigate the environment without reliance on a global navigation satellite system (GNSS). This is of particular concern in Defence circles, but is also a major safety issue for commercial operations. In these circumstances, the aircraft needs to navigate relying only on information from on-board passive sensors such as digital cameras. The autonomous feature-based visual system presented in this work offers a novel integral approach to the modelling and registration of visual features that responds to the specific needs of the navigation system. It detects visual features in Google Earth* imagery to build a feature database. The same algorithm then detects features in the video stream of an on-board camera. On one level this serves to localise the vehicle relative to the environment using Simultaneous Localisation and Mapping (SLAM). On a second level it correlates the features with the database to localise the vehicle with respect to the inertial frame. The performance of the presented visual navigation system was compared using satellite imagery from different years. Based on the comparison results, an analysis of the effects of seasonal, structural and qualitative changes in the imagery source on the performance of the navigation algorithm is presented. *The algorithm is independent of the source of satellite imagery, and another provider can be used.

  7. Autonomous Navigation for Deep Space Missions

    NASA Technical Reports Server (NTRS)

    Bhaskaran, Shyam

    2012-01-01

    Navigation (determining where the spacecraft is at any given time and controlling its path to achieve desired targets) is traditionally performed using ground-in-the-loop techniques: (1) Data include two-way radiometric measurements (Doppler, range), interferometric measurements (Delta-Differential One-way Range), and optical measurements (images of natural bodies taken by an onboard camera); (2) Data are received on the ground and processed to determine the orbit, and commands are sent to execute maneuvers that control the orbit. A self-contained, onboard, autonomous navigation system can: (1) Eliminate delays due to round-trip light time; (2) Eliminate the human factors in ground-based processing; (3) Reduce the turnaround time for a navigation update to minutes or even seconds; (4) React to late-breaking data. At JPL, we have developed the framework and computational elements of an autonomous navigation system, called AutoNav. It was originally developed as one of the technologies for the Deep Space 1 mission, launched in 1998, and has subsequently been used on three other spacecraft for four different missions. The primary use has been on comet missions, tracking comets during flybys and impacting one comet.

  8. Study of the Navigation Method for a Snake Robot Based on the Kinematics Model with MEMS IMU.

    PubMed

    Zhao, Xu; Dou, Lihua; Su, Zhong; Liu, Ning

    2018-03-16

    A snake robot is a type of highly redundant mobile robot that differs significantly from tracked, wheeled and legged robots. To address the issue of a snake robot performing self-localization in its application environment without orientation assistance, an autonomous navigation method is proposed based on the snake robot's motion characteristic constraints. The method achieves autonomous navigation of the snake robot without external nodes or assistance, using only its own Micro-Electromechanical-Systems (MEMS) Inertial Measurement Unit (IMU). First, it studies the snake robot's motion characteristics, builds the kinematics model, and then analyses the motion constraint characteristics and motion error propagation properties. Second, it explores the snake robot's navigation layout, proposes a constraint criterion and fixed relationships, and applies zero-state constraints based on the motion features and control modes of the snake robot. Finally, it realizes autonomous navigation positioning based on an Extended Kalman Filter (EKF) position estimation method under the constraints of the robot's motion characteristics. Tests with the self-developed snake robot verify the proposed method; the position error is less than 5% of the Total Traveled Distance (TTD). In a short-distance environment, this method meets the requirements for a snake robot to perform autonomous navigation and positioning in typical applications, and it can be extended to other similar multi-link robots.
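
    A minimal sketch of how a kinematic constraint can enter an EKF as a pseudo-measurement is given below; the state layout, noise values and the specific constraint (zero lateral body velocity) are illustrative assumptions rather than the paper's exact formulation.

```python
# Toy position/velocity EKF with a motion-constraint pseudo-measurement.
# Values and the constraint itself are illustrative only.
import numpy as np

class ConstraintEKF:
    def __init__(self):
        self.x = np.zeros(4)            # [px, py, vx, vy] in the navigation frame
        self.P = np.eye(4) * 0.1

    def predict(self, accel_nav, dt):
        """Propagate with IMU-derived acceleration expressed in the nav frame."""
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt
        self.x = F @ self.x
        self.x[2:] += np.asarray(accel_nav) * dt
        Q = np.diag([1e-4, 1e-4, 1e-2, 1e-2]) * dt
        self.P = F @ self.P @ F.T + Q

    def constraint_update(self, heading):
        """Pseudo-measurement: velocity perpendicular to the body axis is zero."""
        H = np.zeros((1, 4))
        H[0, 2:] = [-np.sin(heading), np.cos(heading)]   # lateral velocity component
        R = np.array([[1e-3]])
        y = 0.0 - H @ self.x                              # residual against the zero constraint
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ H) @ self.P
```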

  9. Visual identification and similarity measures used for on-line motion planning of autonomous robots in unknown environments

    NASA Astrophysics Data System (ADS)

    Martínez, Fredy; Martínez, Fernando; Jacinto, Edwar

    2017-02-01

    In this paper we propose an on-line motion planning strategy for autonomous robots in dynamic and locally observable environments. In this approach, we first visually identify geometric shapes in the environment by filtering images. Then, an ART-2 network is used to establish the similarity between patterns. The proposed algorithm allows a robot to establish its relative location in the environment and to define its navigation path based on images of the environment and their similarity to reference images. This is an efficient and minimalist method that uses the similarity of landmark view patterns to navigate to the desired destination. Laboratory tests on real prototypes demonstrate the performance of the algorithm.

  10. DIDO Optimization of a Lunar Landing Trajectory with Respect to Autonomous Landing Hazard Avoidance Technology

    DTIC Science & Technology

    2009-09-01

    The trajectory phases discussed include Terrain Relative Navigation (TRN), Hazard Relative Navigation (HRN), and Hazard Detection and Avoidance (HDA), along with the TRN and HDA sensors used during these phases. During the HDA phase, the expected landing site is examined and evaluated, and a new site may be selected.

  11. Simulating the Liaison Navigation Concept in a Geo + Earth-Moon Halo Constellation

    NASA Technical Reports Server (NTRS)

    Fujimoto, K.; Leonard, J. M.; McGranaghan, R. M.; Parker, J. S.; Anderson, R. L.; Born, G. H.

    2012-01-01

    Linked Autonomous Interplanetary Satellite Orbit Navigation, or LiAISON, is a novel satellite navigation technique in which relative radiometric measurements between two or more spacecraft in a constellation are processed to obtain the absolute state of all spacecraft. The method leverages the asymmetry of the gravity field in which the constellation operates. This paper takes a step forward in developing a high-fidelity navigation simulation for the LiAISON concept in an Earth-Moon constellation. In particular, we aim to process two-way Doppler measurements between a satellite in geostationary orbit (GEO) and another in a halo orbit about the Earth-Moon L1 point.

  12. Mamdani Fuzzy System for Indoor Autonomous Mobile Robot

    NASA Astrophysics Data System (ADS)

    Khan, M. K. A. Ahamed; Rashid, Razif; Elamvazuthi, I.

    2011-06-01

    Several control algorithms for autonomous mobile robot navigation have been proposed in the literature. Recently, the employment of non-analytical methods of computing such as fuzzy logic, evolutionary computation, and neural networks has demonstrated the utility and potential of these paradigms for intelligent control of mobile robot navigation. In this paper, a Mamdani fuzzy system for an autonomous mobile robot is developed. The paper begins with a discussion of the conventional controller, followed by a detailed description of the fuzzy logic controller.
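
    To make the Mamdani inference steps concrete (fuzzification, min implication, max aggregation, centroid defuzzification), a minimal hand-rolled controller is sketched below; the membership functions and rules are invented for illustration and are not those used in the paper.

```python
# Minimal Mamdani fuzzy controller: obstacle distance in, forward speed out.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with corners a <= b <= c."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

def speed_command(obstacle_dist_m):
    # Fuzzify the input: how "near" and how "far" is the obstacle?
    near = tri(obstacle_dist_m, 0.0, 0.0, 1.0)
    far = tri(obstacle_dist_m, 0.5, 2.0, 2.0)

    # Output universe of discourse: forward speed in m/s.
    v = np.linspace(0.0, 1.0, 201)
    slow = tri(v, 0.0, 0.0, 0.4)
    fast = tri(v, 0.3, 1.0, 1.0)

    # Rules: IF near THEN slow; IF far THEN fast (min implication),
    # then max aggregation of the clipped output sets.
    aggregated = np.maximum(np.minimum(near, slow), np.minimum(far, fast))

    # Centroid defuzzification.
    return float(np.sum(v * aggregated) / (np.sum(aggregated) + 1e-9))

print(speed_command(0.3))   # close obstacle -> low speed
print(speed_command(1.8))   # clear path     -> higher speed
```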

  13. Draper Laboratory small autonomous aerial vehicle

    NASA Astrophysics Data System (ADS)

    DeBitetto, Paul A.; Johnson, Eric N.; Bosse, Michael C.; Trott, Christian A.

    1997-06-01

    The Charles Stark Draper Laboratory, Inc. and students from the Massachusetts Institute of Technology and Boston University cooperated to develop an autonomous aerial vehicle that won the 1996 International Aerial Robotics Competition. This paper describes the approach, system architecture and subsystem designs for the entry. The entry represents a combination of many technology areas: navigation, guidance, control, vision processing, human factors, packaging, power, real-time software, and others. The aerial vehicle, an autonomous helicopter, performs navigation and control functions using multiple sensors: differential GPS, an inertial measurement unit, a sonar altimeter, and a flux compass. The aerial vehicle transmits video imagery to the ground. A ground-based vision processor converts the image data into target position and classification estimates. The system was designed, built, and flown in less than one year and has provided many lessons about autonomous vehicle systems, several of which are discussed. In an appendix, our current research in augmenting the navigation system with vision-based estimates is presented.

  14. Preliminary navigation accuracy analysis for the TDRSS Onboard Navigation System (TONS) experiment on EP/EUVE

    NASA Technical Reports Server (NTRS)

    Gramling, C. J.; Long, A. C.; Lee, T.; Ottenstein, N. A.; Samii, M. V.

    1991-01-01

    A Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) is currently being developed by NASA to provide a high-accuracy autonomous navigation capability for users of TDRSS and its successor, the Advanced TDRSS (ATDRSS). The fully autonomous user onboard navigation system will support orbit determination, time determination, and frequency determination, based on observation of a continuously available, unscheduled navigation beacon signal. A TONS experiment will be performed in conjunction with the Explorer Platform (EP) Extreme Ultraviolet Explorer (EUVE) mission to flight qualify TONS Block 1. An overview of TONS is presented along with a preliminary analysis of the navigation accuracy anticipated for the TONS experiment. Descriptions of the TONS experiment and the associated navigation objectives, as well as a description of the onboard navigation algorithms, are provided. The accuracy of the selected algorithms is evaluated based on the processing of realistically simulated TDRSS one-way forward-link Doppler measurements. The analysis process is discussed and the associated navigation accuracy results are presented.
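
    The geometric core of a one-way Doppler observable is the range-rate between transmitter and receiver. The sketch below computes only that term; the actual TONS measurement model also includes oscillator offsets and other effects not shown, and the states in the example are illustrative, not mission values.

```python
# Simplified one-way Doppler observable: the received frequency shift is
# proportional to the range-rate between transmitter and receiver.
import numpy as np

C = 299_792_458.0  # m/s

def one_way_doppler_shift(r_user, v_user, r_tdrs, v_tdrs, f0):
    """Predicted Doppler shift (Hz) of a carrier at frequency f0 (Hz)."""
    los = r_user - r_tdrs
    range_rate = np.dot(v_user - v_tdrs, los) / np.linalg.norm(los)
    return -f0 * range_rate / C

# Illustrative LEO user and GEO relay states; prints a shift of a few tens of kHz.
r_user = np.array([7000e3, 0.0, 0.0]);  v_user = np.array([0.0, 7.5e3, 0.0])
r_tdrs = np.array([0.0, 42164e3, 0.0]); v_tdrs = np.array([-3.07e3, 0.0, 0.0])
print(one_way_doppler_shift(r_user, v_user, r_tdrs, v_tdrs, f0=2.1e9))
```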

  15. Automated Rendezvous and Docking Sensor Testing at the Flight Robotics Laboratory

    NASA Technical Reports Server (NTRS)

    Mitchell, J.; Johnston, A.; Howard, R.; Williamson, M.; Brewster, L.; Strack, D.; Cryan, S.

    2007-01-01

    The Exploration Systems Architecture defines missions that require rendezvous, proximity operations, and docking (RPOD) of two spacecraft both in Low Earth Orbit (LEO) and in Low Lunar Orbit (LLO). Uncrewed spacecraft must perform automated and/or autonomous rendezvous, proximity operations and docking operations (commonly known as Automated Rendezvous and Docking, AR&D). The crewed versions may also perform AR&D, possibly with a different level of automation and/or autonomy, and must also provide the crew with relative navigation information for manual piloting. The capabilities of the RPOD sensors are critical to the success of the Exploration Program. NASA has the responsibility to determine whether the Crew Exploration Vehicle (CEV) contractor-proposed relative navigation sensor suite will meet the CEV requirements. The relatively low technology readiness of relative navigation sensors for AR&D has been carried as one of the CEV Project's top risks. The AR&D Sensor Technology Project seeks to reduce this risk by increasing technology maturation of selected relative navigation sensor technologies through testing and simulation, and to allow the CEV Project to assess the relative navigation sensors.

  16. Automated Rendezvous and Docking Sensor Testing at the Flight Robotics Laboratory

    NASA Technical Reports Server (NTRS)

    Howard, Richard T.; Williamson, Marlin L.; Johnston, Albert S.; Brewster, Linda L.; Mitchell, Jennifer D.; Cryan, Scott P.; Strack, David; Key, Kevin

    2007-01-01

    The Exploration Systems Architecture defines missions that require rendezvous, proximity operations, and docking (RPOD) of two spacecraft both in Low Earth Orbit (LEO) and in Low Lunar Orbit (LLO). Uncrewed spacecraft must perform automated and/or autonomous rendezvous, proximity operations and docking operations (commonly known as Automated Rendezvous and Docking, AR&D). The crewed versions of the spacecraft may also perform AR&D, possibly with a different level of automation and/or autonomy, and must also provide the crew with relative navigation information for manual piloting. The capabilities of the RPOD sensors are critical to the success of the Exploration Program. NASA has the responsibility to determine whether the Crew Exploration Vehicle (CEV) contractor-proposed relative navigation sensor suite will meet the CEV requirements. The relatively low technology readiness of relative navigation sensors for AR&D has been carried as one of the CEV Project's top risks. The AR&D Sensor Technology Project seeks to reduce this risk by increasing technology maturation of selected relative navigation sensor technologies through testing and simulation, and to allow the CEV Project to assess the relative navigation sensors.

  17. Spatial abstraction for autonomous robot navigation.

    PubMed

    Epstein, Susan L; Aroor, Anoop; Evanusa, Matthew; Sklar, Elizabeth I; Parsons, Simon

    2015-09-01

    Optimal navigation for a simulated robot relies on a detailed map and explicit path planning, an approach problematic for real-world robots that are subject to noise and error. This paper reports on autonomous robots that rely on local spatial perception, learning, and commonsense rationales instead. Despite realistic actuator error, learned spatial abstractions form a model that supports effective travel.

  18. SLAM algorithm applied to robotics assistance for navigation in unknown environments.

    PubMed

    Cheein, Fernando A Auat; Lopez, Natalia; Soria, Carlos M; di Sciascio, Fernando A; Pereira, Fernando Lobo; Carelli, Ricardo

    2010-02-17

    The combination of robotic tools with assistive technology defines a scarcely explored area of applications and advantages for people with disabilities or the elderly in their daily tasks. Autonomous motorized wheelchair navigation inside an environment, behaviour-based control of orthopaedic arms, and learning a user's preferences from a friendly interface are some examples of this new field. In this paper, a Simultaneous Localization and Mapping (SLAM) algorithm is implemented to allow environmental learning by a mobile robot while its navigation is governed by electromyographic signals. The entire system is part autonomous and part user-decision dependent (semi-autonomous). The environmental learning executed by the SLAM algorithm and the low-level behaviour-based reactions of the mobile robot are autonomous robotic tasks, whereas the mobile robot navigation inside an environment is commanded by a Muscle-Computer Interface (MCI). In this paper, a sequential Extended Kalman Filter (EKF) feature-based SLAM algorithm is implemented. The features correspond to lines and corners, concave and convex, of the environment. From the SLAM architecture, a global metric map of the environment is derived. The electromyographic signals that command the robot's movements can be adapted to the patient's disabilities. For mobile robot navigation purposes, five commands were obtained from the MCI: turn to the left, turn to the right, stop, start and exit. A kinematic controller to control the mobile robot was implemented. A low-level behaviour strategy was also implemented to avoid the robot's collisions with the environment and with moving agents. The entire system was tested on a population of seven volunteers: three elderly, two below-elbow amputees and two young normally limbed patients. The experiments were performed within a closed low-dynamic environment. Subjects took an average time of 35 minutes to navigate the environment and to learn how to use the MCI. The SLAM results have shown a consistent reconstruction of the environment. The obtained map was stored inside the Muscle-Computer Interface. The integration of a highly demanding processing algorithm (SLAM) with an MCI and the real-time communication between both have shown to be consistent and successful. The metric map generated by the mobile robot would allow possible future autonomous navigation without direct control of the user, whose function could be relegated to choosing robot destinations. Also, the mobile robot shares the same kinematic model as a motorized wheelchair. This advantage can be exploited for autonomous wheelchair navigation.
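
    For readers unfamiliar with feature-based EKF SLAM, the sketch below shows the correction step for a single corner (point) landmark observed as range and bearing; the state ordering and noise handling are illustrative, and the paper's treatment of line features is not reproduced here.

```python
# EKF-SLAM correction for one point landmark observed as (range, bearing).
import numpy as np

def ekf_slam_corner_update(x, P, z, R, lm_index):
    """x = [xr, yr, theta, ..., mx_i, my_i, ...]; z = [range, bearing]."""
    xr, yr, th = x[0], x[1], x[2]
    j = 3 + 2 * lm_index
    dx, dy = x[j] - xr, x[j + 1] - yr
    q = dx * dx + dy * dy
    r = np.sqrt(q)

    z_hat = np.array([r, np.arctan2(dy, dx) - th])     # predicted measurement
    H = np.zeros((2, len(x)))
    H[:, 0:3] = np.array([[-dx / r, -dy / r, 0.0],
                          [ dy / q, -dx / q, -1.0]])
    H[:, j:j + 2] = np.array([[ dx / r, dy / r],
                              [-dy / q, dx / q]])

    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi        # wrap the bearing residual
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```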

  19. NASA Office of Aeronautics and Space Technology Summer Workshop. Volume 3: Navigation, guidance and control panel

    NASA Technical Reports Server (NTRS)

    1975-01-01

    User technology requirements are identified in relation to needed technology advancement for future space missions in the areas of navigation, guidance, and control. Emphasis is placed on: reduction of mission support cost by 50% through autonomous operation, a ten-fold increase in mission output through improved pointing and control, and a hundred-fold increase in human productivity in space through large-scale teleoperator applications.

  20. Study of the Navigation Method for a Snake Robot Based on the Kinematics Model with MEMS IMU

    PubMed Central

    Dou, Lihua; Su, Zhong; Liu, Ning

    2018-01-01

    A snake robot is a type of highly redundant mobile robot that differs significantly from tracked, wheeled and legged robots. To address the issue of a snake robot performing self-localization in its application environment without orientation assistance, an autonomous navigation method is proposed based on the snake robot's motion characteristic constraints. The method achieves autonomous navigation of the snake robot without external nodes or assistance, using only its own Micro-Electromechanical-Systems (MEMS) Inertial Measurement Unit (IMU). First, it studies the snake robot's motion characteristics, builds the kinematics model, and then analyses the motion constraint characteristics and motion error propagation properties. Second, it explores the snake robot's navigation layout, proposes a constraint criterion and fixed relationships, and applies zero-state constraints based on the motion features and control modes of the snake robot. Finally, it realizes autonomous navigation positioning based on an Extended Kalman Filter (EKF) position estimation method under the constraints of the robot's motion characteristics. Tests with the self-developed snake robot verify the proposed method; the position error is less than 5% of the Total Traveled Distance (TTD). In a short-distance environment, this method meets the requirements for a snake robot to perform autonomous navigation and positioning in typical applications, and it can be extended to other similar multi-link robots. PMID:29547515

  1. New Vectorial Propulsion System and Trajectory Control Designs for Improved AUV Mission Autonomy.

    PubMed

    Masmitja, Ivan; Gonzalez, Julian; Galarza, Cesar; Gomariz, Spartacus; Aguzzi, Jacopo; Del Rio, Joaquin

    2018-04-17

    Autonomous Underwater Vehicles (AUVs) are proving to be a promising platform design for multidisciplinary autonomous operability with a wide range of applications in marine ecology and geoscience. Here, two novel contributions towards increasing the autonomous navigation capability of a new AUV prototype (the Guanay II), a mix between a propelled vehicle and a glider, are presented. Firstly, a vectorial propulsion system has been designed to provide full vehicle maneuverability in both the horizontal and vertical planes. Furthermore, two controllers, based on fuzzy controls, have been designed to provide the vehicle with autonomous navigation capabilities. Due to the decoupled system property, the controllers for the horizontal plane have been designed separately from those for the vertical plane. This class of non-linear controllers has been used to interpret linguistic laws over different zones of functionality. The method provided good performance, acting as an interpolation between different rules or linear controllers. Both improvements have been validated through simulations and field tests, displaying good performance results. Finally, the conclusion of this work is that the Guanay II AUV has a solid controller to perform autonomous navigation and carry out vertical immersions.

  2. An Effective Terrain Aided Navigation for Low-Cost Autonomous Underwater Vehicles.

    PubMed

    Zhou, Ling; Cheng, Xianghong; Zhu, Yixian; Dai, Chenxi; Fu, Jinbo

    2017-03-25

    Terrain-aided navigation is a potentially powerful solution for obtaining submerged position fixes for autonomous underwater vehicles. The application of terrain-aided navigation with high-accuracy inertial navigation systems has demonstrated meter-level navigation accuracy in sea trials. However, available sensors may be limited depending on the type of mission. Such limitations, especially with low-grade navigation sensors, not only degrade the accuracy of traditional navigation systems, but further impair the ability to successfully employ terrain-aided navigation. To address this problem, a tightly coupled navigation approach is presented that estimates the critical sensor errors by incorporating raw sensor data directly into an augmented navigation system. Furthermore, three-dimensional distance errors are calculated, providing measurement updates through the particle filter for absolute and bounded position error. The development of the terrain-aided navigation system is elaborated for a vehicle equipped with a non-inertial-grade strapdown inertial navigation system, a 4-beam Doppler Velocity Log range sensor and a sonar altimeter. Using experimental data for navigation performance evaluation in areas with different terrain characteristics, the results further show that the proposed method can be successfully applied to low-cost AUVs and significantly improves navigation performance.
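
    The particle-filter update at the heart of terrain-aided navigation compares each particle's predicted terrain depth with the measured value and reweights the particles accordingly. The sketch below assumes a synthetic map function and placeholder noise level; it is an illustration of the general technique, not the authors' implementation.

```python
# Particle-filter measurement update for terrain-aided navigation.
import numpy as np

def terrain_map_depth(x, y):
    # Placeholder bathymetry: a smooth synthetic surface standing in for the
    # real terrain database.
    return 50.0 + 5.0 * np.sin(0.01 * x) + 3.0 * np.cos(0.02 * y)

def pf_terrain_update(particles, weights, measured_depth, sigma=1.0):
    """particles: (N, 2) horizontal positions; returns resampled particles and weights."""
    predicted = terrain_map_depth(particles[:, 0], particles[:, 1])
    likelihood = np.exp(-0.5 * ((measured_depth - predicted) / sigma) ** 2)
    weights = weights * likelihood
    weights /= np.sum(weights) + 1e-300

    # Systematic resampling keeps the particle set focused on terrain that
    # matches the altimeter measurement.
    n = len(weights)
    positions = (np.arange(n) + np.random.uniform()) / n
    indexes = np.searchsorted(np.cumsum(weights), positions)
    indexes = np.minimum(indexes, n - 1)
    return particles[indexes], np.full(n, 1.0 / n)
```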

  3. An Effective Terrain Aided Navigation for Low-Cost Autonomous Underwater Vehicles

    PubMed Central

    Zhou, Ling; Cheng, Xianghong; Zhu, Yixian; Dai, Chenxi; Fu, Jinbo

    2017-01-01

    Terrain-aided navigation is a potentially powerful solution for obtaining submerged position fixes for autonomous underwater vehicles. The application of terrain-aided navigation with high-accuracy inertial navigation systems has demonstrated meter-level navigation accuracy in sea trials. However, available sensors may be limited depending on the type of mission. Such limitations, especially with low-grade navigation sensors, not only degrade the accuracy of traditional navigation systems, but further impair the ability to successfully employ terrain-aided navigation. To address this problem, a tightly coupled navigation approach is presented that estimates the critical sensor errors by incorporating raw sensor data directly into an augmented navigation system. Furthermore, three-dimensional distance errors are calculated, providing measurement updates through the particle filter for absolute and bounded position error. The development of the terrain-aided navigation system is elaborated for a vehicle equipped with a non-inertial-grade strapdown inertial navigation system, a 4-beam Doppler Velocity Log range sensor and a sonar altimeter. Using experimental data for navigation performance evaluation in areas with different terrain characteristics, the results further show that the proposed method can be successfully applied to low-cost AUVs and significantly improves navigation performance. PMID:28346346

  4. Autonomous vehicle navigation utilizing fuzzy controls concepts for a next generation wheelchair.

    PubMed

    Hansen, J D; Barrett, S F; Wright, C H G; Wilcox, M

    2008-01-01

    Three different positioning techniques were investigated to create an autonomous vehicle that could accurately navigate towards a goal: Global Positioning System (GPS), compass dead reckoning, and Ackerman steering. Each technique utilized a fuzzy logic controller that maneuvered a four-wheel car towards a target. The reliability and the accuracy of the navigation methods were investigated by modeling the algorithms in software and implementing them in hardware. To implement the techniques in hardware, positioning sensors were interfaced to a remote control car and a microprocessor. The microprocessor utilized the sensor measurements to orient the car with respect to the target. Next, a fuzzy logic control algorithm adjusted the front wheel steering angle to minimize the difference between the heading and bearing. After minimizing the heading error, the car maintained a straight steering angle along its path to the final destination. The results of this research can be used to develop applications that require precise navigation. The design techniques can also be implemented on alternate platforms such as a wheelchair to assist with autonomous navigation.
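
    The geometry the controller works with is the difference between the vehicle's heading and the bearing to the target. The sketch below computes that error from a GPS fix and turns it into a steering command, with a simple proportional rule standing in for the paper's fuzzy logic controller; the coordinates and gains are illustrative placeholders.

```python
# Bearing-to-target geometry and a stand-in steering rule.
import math

def bearing_to_target(lat, lon, lat_t, lon_t):
    """Great-circle initial bearing in degrees from (lat, lon) to the target."""
    phi1, phi2 = math.radians(lat), math.radians(lat_t)
    dlon = math.radians(lon_t - lon)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def steering_command(heading_deg, bearing_deg, gain=0.5, max_angle=30.0):
    """Proportional steering on the wrapped heading error (degrees)."""
    error = (bearing_deg - heading_deg + 180.0) % 360.0 - 180.0
    return max(-max_angle, min(max_angle, gain * error))

# Example: vehicle heading due north, target to the north-east -> steer toward it.
b = bearing_to_target(41.312, -105.587, 41.315, -105.580)
print(b, steering_command(0.0, b))
```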

  5. 76 FR 21772 - Navigation Safety Advisory Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-18

    ..., routing measures, marine information, diving safety, and aids to navigation systems. Agenda The NAVSAC... discussion of autonomous unmanned vessels and discuss their implications for the Inland Navigation Rules. A... DEPARTMENT OF HOMELAND SECURITY Coast Guard [Docket No. USCG-2011-0204] Navigation Safety Advisory...

  6. Flight Mechanics/Estimation Theory Symposium. [with application to autonomous navigation and attitude/orbit determination

    NASA Technical Reports Server (NTRS)

    Fuchs, A. J. (Editor)

    1979-01-01

    Onboard and real time image processing to enhance geometric correction of the data is discussed with application to autonomous navigation and attitude and orbit determination. Specific topics covered include: (1) LANDSAT landmark data; (2) star sensing and pattern recognition; (3) filtering algorithms for Global Positioning System; and (4) determining orbital elements for geostationary satellites.

  7. Developments in Acoustic Navigation and Communication for High-Latitude Ocean Research

    NASA Astrophysics Data System (ADS)

    Gobat, J.; Lee, C.

    2006-12-01

    Developments in autonomous platforms (profiling floats, drifters, long-range gliders and propeller-driven vehicles) offer the possibility of unprecedented access to logistically difficult polar regions that challenge conventional techniques. Currently, however, navigation and telemetry for these platforms rely on satellite positioning and communications poorly suited for high-latitude applications where ice cover restricts access to the sea surface. A similar infrastructure offering basin-wide acoustic geolocation and telemetry would allow the community to employ autonomous platforms to address previously intractable problems in Arctic oceanography. Two recent efforts toward the development of such an infrastructure are reported here. As part of an observational array monitoring fluxes through Davis Strait, development of real-time RAFOS acoustic navigation for gliders has been ongoing since autumn 2004. To date, test deployments have been conducted in a 260 Hz field in the Pacific and 780 Hz fields off Norway and in Davis Strait. Real-time navigation accuracy of ~1 km is achievable. Autonomously navigating gliders will operate under ice cover beginning in autumn 2006. In addition to glider navigation development, the Davis Strait array moorings carry fixed RAFOS recorders to study propagation over a range of distances under seasonally varying ice cover. Results from the under-ice propagation and glider navigation experiments are presented. Motivated by the need to coordinate these types of development efforts, an international group of acousticians, autonomous platform developers, high-latitude oceanographers and marine mammal researchers gathered in Seattle, U.S.A. from 27 February to 1 March 2006 for an NSF Office of Polar Programs sponsored Acoustic Navigation and Communication for High-latitude Ocean Research (ANCHOR) workshop. Workshop participants focused on summarizing the current state of knowledge concerning Arctic acoustics, navigation and communications, developing an overarching system specification to guide community-wide engineering efforts and establishing an active community and steering group to guide long-term engineering efforts and ensure interoperability. This presentation will summarize ANCHOR workshop findings.

  8. The use of x-ray pulsar-based navigation method for interplanetary flight

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Guo, Xingcan; Yang, Yong

    2009-07-01

    As interplanetary missions become increasingly complex, the existing mature interplanetary navigation method, based mainly on radiometric tracking by the Deep Space Network, cannot meet the rising demand for autonomous real-time navigation. This paper studies the application to interplanetary flight of a navigation technology under rapid development, X-ray pulsar-based navigation for spacecraft (XPNAV), and evaluates its performance with a computer simulation. XPNAV is an excellent autonomous real-time navigation method and can provide comprehensive navigation information, including position, velocity, attitude, attitude rate and time. In the paper the fundamental principles and time transformation of XPNAV are analyzed, and then the Delta-correction XPNAV, which blends the vehicle's trajectory dynamics with the pulse time-of-arrival differences at the nominal and estimated spacecraft locations within an Unscented Kalman Filter (UKF), is discussed against a background mission of Mars Pathfinder during its heliocentric transfer orbit. XPNAV has an intractable problem of integer pulse phase cycle ambiguities, similar to GPS carrier-phase navigation. This article proposes a non-ambiguity assumption approach, based on an analysis of the search-space array method, to resolve pulse phase cycle ambiguities between the nominal position and the estimated position of the spacecraft. The simulation results show that the search-space array method is computationally intensive and requires long processing times when the position errors are large, whereas the non-ambiguity assumption method can solve the ambiguity problem quickly and reliably. It is expected that an autonomous real-time integrated navigation system blending XPNAV with DSN tracking, celestial navigation, inertial navigation and other techniques will be the development direction for interplanetary flight navigation systems in the future.
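
    The first-order geometry behind the Delta-correction idea is that a pulse time-of-arrival difference measures the position offset along the pulsar's line of sight, delta_t approximately equal to (n_hat . delta_r) / c. The sketch below evaluates only this relation with illustrative numbers; observations of at least three pulsars with linearly independent directions are needed for a full three-dimensional fix.

```python
# Position offset along a pulsar line of sight from a TOA difference.
import numpy as np

C = 299_792_458.0  # m/s

def los_offset_from_toa(delta_toa, pulsar_unit_vector):
    """Position offset vector (m) along the pulsar line of sight for a TOA difference (s)."""
    return C * delta_toa * np.asarray(pulsar_unit_vector)

# A 10 microsecond TOA difference corresponds to roughly 3 km along the line of sight.
n_hat = np.array([1.0, 0.0, 0.0])
print(los_offset_from_toa(10e-6, n_hat))
```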

  9. INS integrated motion analysis for autonomous vehicle navigation

    NASA Technical Reports Server (NTRS)

    Roberts, Barry; Bazakos, Mike

    1991-01-01

    The use of inertial navigation system (INS) measurements to enhance the quality and robustness of motion analysis techniques used for obstacle detection is discussed with particular reference to autonomous vehicle navigation. The approach to obstacle detection used here employs motion analysis of imagery generated by a passive sensor. Motion analysis of imagery obtained during vehicle travel is used to generate range measurements to points within the field of view of the sensor, which can then be used to provide obstacle detection. Results obtained with an INS integrated motion analysis approach are reviewed.

  10. COBALT: A GN&C Payload for Testing ALHAT Capabilities in Closed-Loop Terrestrial Rocket Flights

    NASA Technical Reports Server (NTRS)

    Carson, John M., III; Amzajerdian, Farzin; Hines, Glenn D.; O'Neal, Travis V.; Robertson, Edward A.; Seubert, Carl; Trawny, Nikolas

    2016-01-01

    The COBALT (CoOperative Blending of Autonomous Landing Technology) payload is being developed within NASA as a risk reduction activity to mature, integrate and test ALHAT (Autonomous precision Landing and Hazard Avoidance Technology) systems targeted for infusion into near-term robotic and future human space flight missions. The initial COBALT payload instantiation is integrating the third-generation ALHAT Navigation Doppler Lidar (NDL) sensor, for ultra-high-precision velocity plus range measurements, with the passive-optical Lander Vision System (LVS) that provides Terrain Relative Navigation (TRN) global-position estimates. The COBALT payload will be integrated onboard a rocket-propulsive terrestrial testbed and will provide precise navigation estimates and guidance planning during two flight test campaigns in 2017 (one open-loop and one closed-loop). The NDL is targeting performance capabilities desired for future Mars and Moon Entry, Descent and Landing (EDL). The LVS is already baselined for TRN on the Mars 2020 robotic lander mission. The COBALT platform will provide NASA with a new risk-reduction capability to test integrated EDL Guidance, Navigation and Control (GN&C) components in closed-loop flight demonstrations prior to the actual mission EDL.

  11. An Efficient Model-Based Image Understanding Method for an Autonomous Vehicle.

    DTIC Science & Technology

    1997-09-01

    The problem discussed in this dissertation is the development of an efficient method for the visual navigation of autonomous vehicles. The new method is implemented as a component of the image-understanding system in the autonomous mobile robot Yamabico-11.

  12. Sandia National Laboratories proof-of-concept robotic security vehicle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrington, J.J.; Jones, D.P.; Klarer, P.R.

    1989-01-01

    Several years ago Sandia National Laboratories developed a prototype interior robot that could navigate autonomously inside a large complex building to aid and test interior intrusion detection systems. Recently the Department of Energy Office of Safeguards and Security has supported the development of a vehicle that will perform limited security functions autonomously in a structured exterior environment. The goal of the first phase of this project was to demonstrate the feasibility of an exterior robotic vehicle for security applications by using converted interior robot technology, if applicable. An existing teleoperational test bed vehicle with remote driving controls was modified and integrated with a newly developed command driving station and navigation system hardware and software to form the Robotic Security Vehicle (RSV) system. The RSV, also called the Sandia Mobile Autonomous Navigator (SANDMAN), has been successfully used to demonstrate that teleoperated security vehicles which can perform limited autonomous functions are viable and have the potential to decrease security manpower requirements and improve system capabilities. 2 refs., 3 figs.

  13. Solar Thermal Utility-Scale Joint Venture Program (USJVP) Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MANCINI,THOMAS R.

    2001-04-01

    Several years ago Sandia National Laboratories developed a prototype interior robot [1] that could navigate autonomously inside a large complex building to aid and test interior intrusion detection systems. Recently the Department of Energy Office of Safeguards and Security has supported the development of a vehicle that will perform limited security functions autonomously in a structured exterior environment. The goal of the first phase of this project was to demonstrate the feasibility of an exterior robotic vehicle for security applications by using converted interior robot technology, if applicable. An existing teleoperational test bed vehicle with remote driving controls was modified and integrated with a newly developed command driving station and navigation system hardware and software to form the Robotic Security Vehicle (RSV) system. The RSV, also called the Sandia Mobile Autonomous Navigator (SANDMAN), has been successfully used to demonstrate that teleoperated security vehicles which can perform limited autonomous functions are viable and have the potential to decrease security manpower requirements and improve system capabilities.

  14. Control technique for planetary rover

    NASA Technical Reports Server (NTRS)

    Nakatani, Ichiro; Kubota, Takashi; Adachi, Tadashi; Saitou, Hiroaki; Okamoto, Sinya

    1994-01-01

    Beginning next century, several schemes for sending a planetary rover to the Moon or Mars are being planned. As part of the development program, autonomous navigation technology is being studied to give the rover the ability to move autonomously over long ranges of unknown planetary surface. In a previous study, we ran an autonomous navigation experiment on an outdoor test terrain using a rover test-bed that was controlled by a conventional sense-plan-act method. In some cases during the experiment, the rover moved into untraversable areas. To improve this situation, a new control technique has been developed that gives the rover the ability to react to the outputs of its proximity sensors, in effect a reactive behavior. We have developed a new rover test-bed system on which an autonomous navigation experiment was performed using the newly developed control technique. In this outdoor experiment, the new control technique effectively produced control commands for the rover to avoid obstacles and be guided safely to the goal point.

  15. Angles-only relative orbit determination in low earth orbit

    NASA Astrophysics Data System (ADS)

    Ardaens, Jean-Sébastien; Gaias, Gabriella

    2018-06-01

    The paper provides an overview of the angles-only relative orbit determination activities conducted to support the Autonomous Vision Approach Navigation and Target Identification (AVANTI) experiment. This in-orbit endeavor was carried out by the German Space Operations Center (DLR/GSOC) in autumn 2016 to demonstrate the capability to perform spaceborne autonomous close-proximity operations using solely line-of-sight measurements. The images collected onboard have been reprocessed by an independent on-ground facility for precise relative orbit determination, which served as ultimate instance to monitor the formation safety and to characterize the onboard navigation and control performances. During two months, several rendezvous have been executed, generating a valuable collection of images taken at distances ranging from 50 km to only 50 m. Despite challenging experimental conditions characterized by a poor visibility and strong orbit perturbations, angles-only relative positioning products could be continuously derived throughout the whole experiment timeline, promising accuracy at the meter level during the close approaches. The results presented in the paper are complemented with former angles-only experience gained with the PRISMA satellites to better highlight the specificities induced by different orbits and satellite designs.

  16. Fully autonomous navigation for the NASA cargo transfer vehicle

    NASA Technical Reports Server (NTRS)

    Wertz, James R.; Skulsky, E. David

    1991-01-01

    A great deal of attention has been paid to navigation during the close approach (less than or equal to 1 km) phase of spacecraft rendezvous. However, most spacecraft also require a navigation system which provides the necessary accuracy for placing both satellites within the range of the docking sensors. The Microcosm Autonomous Navigation System (MANS) is an on-board system which uses Earth-referenced attitude sensing hardware to provide precision orbit and attitude determination. The system is capable of functioning from LEO to GEO and beyond. Performance depends on the number of available sensors as well as mission geometry; however, extensive simulations have shown that MANS will provide 100 m to 400 m (3 sigma) position accuracy and 0.03 to 0.07 deg (3 sigma) attitude accuracy in low Earth orbit. The system is independent of any external source, including GPS. MANS is expected to have a significant impact on ground operations costs, mission definition and design, survivability, and the potential development of very low-cost, fully autonomous spacecraft.

  17. Autonomous Navigation Error Propagation Assessment for Lunar Surface Mobility Applications

    NASA Technical Reports Server (NTRS)

    Welch, Bryan W.; Connolly, Joseph W.

    2006-01-01

    The NASA Vision for Space Exploration is focused on the return of astronauts to the Moon. While navigation systems have already been proven in the Apollo missions to the moon, the current exploration campaign will involve more extensive and extended missions requiring new concepts for lunar navigation. In this document, the results of an autonomous navigation error propagation assessment are provided. The analysis is intended to be the baseline error propagation analysis for which Earth-based and Lunar-based radiometric data are added to compare these different architecture schemes, and quantify the benefits of an integrated approach, in how they can handle lunar surface mobility applications when near the Lunar South pole or on the Lunar Farside.

  18. ATON (Autonomous Terrain-based Optical Navigation) for exploration missions: recent flight test results

    NASA Astrophysics Data System (ADS)

    Theil, S.; Ammann, N.; Andert, F.; Franz, T.; Krüger, H.; Lehner, H.; Lingenauber, M.; Lüdtke, D.; Maass, B.; Paproth, C.; Wohlfeil, J.

    2018-03-01

    Since 2010 the German Aerospace Center is working on the project Autonomous Terrain-based Optical Navigation (ATON). Its objective is the development of technologies which allow autonomous navigation of spacecraft in orbit around and during landing on celestial bodies like the Moon, planets, asteroids and comets. The project developed different image processing techniques and optical navigation methods as well as sensor data fusion. The setup—which is applicable to many exploration missions—consists of an inertial measurement unit, a laser altimeter, a star tracker and one or multiple navigation cameras. In the past years, several milestones have been achieved. It started with the setup of a simulation environment including the detailed simulation of camera images. This was continued by hardware-in-the-loop tests in the Testbed for Robotic Optical Navigation (TRON) where images were generated by real cameras in a simulated downscaled lunar landing scene. Data were recorded in helicopter flight tests and post-processed in real-time to increase maturity of the algorithms and to optimize the software. Recently, two more milestones have been achieved. In late 2016, the whole navigation system setup was flying on an unmanned helicopter while processing all sensor information onboard in real time. For the latest milestone the navigation system was tested in closed-loop on the unmanned helicopter. For that purpose the ATON navigation system provided the navigation state for the guidance and control of the unmanned helicopter replacing the GPS-based standard navigation system. The paper will give an introduction to the ATON project and its concept. The methods and algorithms of ATON are briefly described. The flight test results of the latest two milestones are presented and discussed.

  19. A traffic priority language for collision-free navigation of autonomous mobile robots in dynamic environments.

    PubMed

    Bourbakis, N G

    1997-01-01

    This paper presents a generic traffic priority language, called KYKLOFORTA, used by autonomous robots for collision-free navigation in a dynamic, unknown or known navigation space. In previous work by X. Grossmman (1988), a set of traffic control rules was developed for the navigation of robots on the lines of a two-dimensional (2-D) grid, and a control center coordinated and synchronized their movements. In this work, the robots are considered autonomous: they move anywhere and in any direction inside the free space, and there is no need for central control to coordinate and synchronize them. The requirements for each robot are i) visual perception, ii) range sensors, and iii) the ability to detect other moving objects in the same free navigation space and to determine the other objects' perceived size, velocity and direction. Based on these assumptions, a traffic priority language is needed for each robot, enabling it to make decisions during navigation and avoid possible collisions with other moving objects. The traffic priority language proposed here is based on a primitive traffic priority alphabet and a set of rules which compose patterns of corridors for the application of the traffic priority rules.

  20. SLAM algorithm applied to robotics assistance for navigation in unknown environments

    PubMed Central

    2010-01-01

    Background The combination of robotic tools with assistive technology defines a scarcely explored area of applications and advantages for people with disabilities or the elderly in their daily tasks. Autonomous motorized wheelchair navigation inside an environment, behaviour-based control of orthopaedic arms, and learning a user's preferences from a friendly interface are some examples of this new field. In this paper, a Simultaneous Localization and Mapping (SLAM) algorithm is implemented to allow environmental learning by a mobile robot while its navigation is governed by electromyographic signals. The entire system is part autonomous and part user-decision dependent (semi-autonomous). The environmental learning executed by the SLAM algorithm and the low-level behaviour-based reactions of the mobile robot are autonomous robotic tasks, whereas the mobile robot navigation inside an environment is commanded by a Muscle-Computer Interface (MCI). Methods In this paper, a sequential Extended Kalman Filter (EKF) feature-based SLAM algorithm is implemented. The features correspond to lines and corners, concave and convex, of the environment. From the SLAM architecture, a global metric map of the environment is derived. The electromyographic signals that command the robot's movements can be adapted to the patient's disabilities. For mobile robot navigation purposes, five commands were obtained from the MCI: turn to the left, turn to the right, stop, start and exit. A kinematic controller to control the mobile robot was implemented. A low-level behaviour strategy was also implemented to avoid the robot's collisions with the environment and with moving agents. Results The entire system was tested on a population of seven volunteers: three elderly, two below-elbow amputees and two young normally limbed patients. The experiments were performed within a closed low-dynamic environment. Subjects took an average time of 35 minutes to navigate the environment and to learn how to use the MCI. The SLAM results have shown a consistent reconstruction of the environment. The obtained map was stored inside the Muscle-Computer Interface. Conclusions The integration of a highly demanding processing algorithm (SLAM) with an MCI and the real-time communication between both have shown to be consistent and successful. The metric map generated by the mobile robot would allow possible future autonomous navigation without direct control of the user, whose function could be relegated to choosing robot destinations. Also, the mobile robot shares the same kinematic model as a motorized wheelchair. This advantage can be exploited for autonomous wheelchair navigation. PMID:20163735

  1. Relative Navigation for Formation Flying of Spacecraft

    NASA Technical Reports Server (NTRS)

    Alonso, Roberto; Du, Ju-Young; Hughes, Declan; Junkins, John L.; Crassidis, John L.

    2001-01-01

    This paper presents a robust and efficient approach for relative navigation and attitude estimation of spacecraft flying in formation. The approach uses measurements from a new optical sensor that provides a line-of-sight vector from the master spacecraft to the secondary satellite. The overall system provides a novel, reliable, and autonomous relative navigation and attitude determination capability, employing relatively simple electronic circuits with modest digital signal processing requirements, and is fully independent of any external systems. Experimental calibration results are presented, which are used to achieve accurate line-of-sight measurements. State estimation for formation flying is achieved through an optimal observer design. Also, because the rotational and translational motions are coupled through the observation vectors, three approaches are suggested to separate the two signals solely for stability analysis. Simulation and experimental results indicate that the combined sensor/estimator approach provides accurate relative position and attitude estimates.

  2. A Vision-Based Relative Navigation Approach for Autonomous Multirotor Aircraft

    NASA Astrophysics Data System (ADS)

    Leishman, Robert C.

    Autonomous flight in unstructured, confined, and unknown GPS-denied environments is a challenging problem. Solutions could be tremendously beneficial for scenarios that require information about areas that are difficult to access and that present a great amount of risk. The goal of this research is to develop a new framework that enables improved solutions to this problem and to validate the approach with experiments using a hardware prototype. In Chapter 2 we examine the consequences and practical aspects of using an improved dynamic model for multirotor state estimation, using only IMU measurements. The improved model correctly explains the measurements available from the accelerometers on a multirotor. We provide hardware results demonstrating the improved attitude, velocity and even position estimates that can be achieved through the use of this model. We propose a new architecture to simplify some of the challenges that constrain GPS-denied aerial flight in Chapter 3. At the core, the approach combines visual graph-SLAM with a multiplicative extended Kalman filter (MEKF). More importantly, we depart from the common practice of estimating global states and instead keep the position and yaw states of the MEKF relative to the current node in the map. This relative navigation approach provides a tremendous benefit compared to maintaining estimates with respect to a single global coordinate frame. We discuss the architecture of this new system and provide important details for each component. We verify the approach with goal-directed autonomous flight-test results. The MEKF is the basis of the new relative navigation approach and is detailed in Chapter 4. We derive the relative filter and show how the states must be augmented and marginalized each time a new node is declared. The relative estimation approach is verified using hardware flight test results accompanied by comparisons to motion capture truth. Additionally, flight results with estimates in the control loop are provided. We believe that the relative, vision-based framework described in this work is an important step in furthering the capabilities of indoor aerial navigation in confined, unknown environments. Current approaches incur challenging problems by requiring globally referenced states. Utilizing a relative approach allows more flexibility as the critical, real-time processes of localization and control do not depend on computationally-demanding optimization and loop-closure processes.

  3. Autonomous Navigation Results from the Mars Exploration Rover (MER) Mission

    NASA Technical Reports Server (NTRS)

    Maimone, Mark; Johnson, Andrew; Cheng, Yang; Willson, Reg; Matthies, Larry H.

    2004-01-01

    In January 2004, the Mars Exploration Rover (MER) mission landed two rovers, Spirit and Opportunity, on the surface of Mars. Several autonomous navigation capabilities were employed in space for the first time in this mission. In the Entry, Descent, and Landing (EDL) phase, both landers used a vision system called the Descent Image Motion Estimation System (DIMES) to estimate horizontal velocity during the last 2000 meters (m) of descent, by tracking features on the ground with a down-looking camera, in order to control retro-rocket firing to reduce horizontal velocity before impact. During surface operations, the rovers navigate autonomously using stereo vision for local terrain mapping and a local, reactive planning algorithm called Grid-based Estimation of Surface Traversability Applied to Local Terrain (GESTALT) for obstacle avoidance. In areas of high slip, stereo vision-based visual odometry has been used to estimate rover motion. As of mid-June, Spirit had traversed 3405 m, of which 1253 m were done autonomously; Opportunity had traversed 1264 m, of which 224 m were autonomous. These results have contributed substantially to the success of the mission and paved the way for increased levels of autonomy in future missions.

  4. New Vectorial Propulsion System and Trajectory Control Designs for Improved AUV Mission Autonomy

    PubMed Central

    Gonzalez, Julian; Galarza, Cesar; Aguzzi, Jacopo; del Rio, Joaquin

    2018-01-01

    Autonomous Underwater Vehicles (AUVs) are proving to be a promising platform design for multidisciplinary autonomous operability with a wide range of applications in marine ecology and geoscience. Here, two novel contributions towards increasing the autonomous navigation capability of a new AUV prototype (the Guanay II), a mix between a propelled vehicle and a glider, are presented. Firstly, a vectorial propulsion system has been designed to provide full vehicle maneuverability in both the horizontal and vertical planes. Furthermore, two controllers, based on fuzzy controls, have been designed to provide the vehicle with autonomous navigation capabilities. Due to the decoupled system property, the controllers for the horizontal plane have been designed separately from those for the vertical plane. This class of non-linear controllers has been used to interpret linguistic laws over different zones of functionality. The method provided good performance, acting as an interpolation between different rules or linear controllers. Both improvements have been validated through simulations and field tests, displaying good performance results. Finally, the conclusion of this work is that the Guanay II AUV has a solid controller to perform autonomous navigation and carry out vertical immersions. PMID:29673224

  5. Sign detection for autonomous navigation

    NASA Astrophysics Data System (ADS)

    Goodsell, Thomas G.; Snorrason, Magnus S.; Cartwright, Dustin; Stube, Brian; Stevens, Mark R.; Ablavsky, Vitaly X.

    2003-09-01

    Mobile robots currently cannot detect and read arbitrary signs. This is a major hindrance to mobile robot usability, since they cannot be tasked using directions that are intuitive to humans. It also limits their ability to report their position relative to intuitive landmarks. Other researchers have demonstrated some success on traffic sign recognition, but using template based methods limits the set of recognizable signs. There is a clear need for a sign detection and recognition system that can process a much wider variety of signs: traffic signs, street signs, store-name signs, building directories, room signs, etc. We are developing a system for Sign Understanding in Support of Autonomous Navigation (SUSAN), that detects signs from various cues common to most signs: vivid colors, compact shape, and text. We have demonstrated the feasibility of our approach on a variety of signs in both indoor and outdoor locations.

  6. Autonomous navigation system and method

    DOEpatents

    Bruemmer, David J [Idaho Falls, ID; Few, Douglas A [Idaho Falls, ID

    2009-09-08

    A robot platform includes perceptors, locomotors, and a system controller, which executes instructions for autonomously navigating a robot. The instructions repeat, on each iteration through an event timing loop, the acts of defining an event horizon based on the robot's current velocity, detecting a range to obstacles around the robot, testing for an event horizon intrusion by determining if any range to the obstacles is within the event horizon, and adjusting rotational and translational velocity of the robot accordingly. If the event horizon intrusion occurs, rotational velocity is modified by a proportion of the current rotational velocity reduced by a proportion of the range to the nearest obstacle and translational velocity is modified by a proportion of the range to the nearest obstacle. If no event horizon intrusion occurs, translational velocity is set as a ratio of a speed factor relative to a maximum speed.
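
    Read as pseudocode, the event-horizon test described above maps onto a short control-loop step like the sketch below; the gains and the specific proportionality rules are illustrative choices, not values claimed in the patent.

```python
# One iteration of an event-horizon obstacle check with illustrative gains.
def navigation_step(v_trans, v_rot, ranges, max_speed,
                    horizon_time=1.5, speed_factor=0.8, k=0.5):
    nearest = min(ranges)                       # closest detected obstacle (m)
    event_horizon = horizon_time * v_trans      # horizon grows with current speed (m)

    if nearest < event_horizon:
        # Intrusion: turn away in proportion to how deep the intrusion is,
        # and slow down in proportion to the range to the nearest obstacle.
        v_rot = v_rot + k * v_rot * (1.0 - nearest / event_horizon)
        v_trans = k * nearest
    else:
        # Clear: cruise at a set fraction of the maximum speed.
        v_trans = speed_factor * max_speed
    return v_trans, v_rot

# Example: cruising at 1.0 m/s with an obstacle 0.8 m ahead triggers a slowdown.
print(navigation_step(1.0, 0.2, [2.5, 0.8, 3.1], max_speed=1.5))
```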

  7. 2001 Flight Mechanics Symposium

    NASA Technical Reports Server (NTRS)

    Lynch, John P. (Editor)

    2001-01-01

    This conference publication includes papers and abstracts presented at the Flight Mechanics Symposium held on June 19-21, 2001. Sponsored by the Guidance, Navigation and Control Center of Goddard Space Flight Center, this symposium featured technical papers on a wide range of issues related to attitude/orbit determination, prediction and control; attitude simulation; attitude sensor calibration; theoretical foundation of attitude computation; dynamics model improvements; autonomous navigation; constellation design and formation flying; estimation theory and computational techniques; Earth environment mission analysis and design; and, spacecraft re-entry mission design and operations.

  8. COBALT CoOperative Blending of Autonomous Landing Technology

    NASA Technical Reports Server (NTRS)

    Carson, John M. III; Restrepo, Carolina I.; Robertson, Edward A.; Seubert, Carl R.; Amzajerdian, Farzin

    2016-01-01

    COBALT is a terrestrial test platform for development and maturation of GN&C (Guidance, Navigation and Control) technologies for PL&HA (Precision Landing and Hazard Avoidance). The project is developing a third generation, Langley Navigation Doppler Lidar (NDL) for ultra-precise velocity and range measurements, which will be integrated and tested with the JPL Lander Vision System (LVS) for Terrain Relative Navigation (TRN) position estimates. These technologies together provide navigation that enables controlled precision landing. The COBALT hardware will be integrated in 2017 into the GN&C subsystem of the Xodiac rocket-propulsive Vertical Test Bed (VTB) developed by Masten Space Systems (MSS), and two terrestrial flight campaigns will be conducted: one open-loop (i.e., passive) and one closed-loop (i.e., active).

  9. A Long Distance Laser Altimeter for Terrain Relative Navigation and Spacecraft Landing

    NASA Technical Reports Server (NTRS)

    Pierrottet, Diego F.; Amzajerdian, Farzin; Barnes, Bruce W.

    2014-01-01

    A high-precision laser altimeter was developed under the Autonomous Landing and Hazard Avoidance (ALHAT) project at NASA Langley Research Center. The laser altimeter provides slant-path range measurements from operational ranges exceeding 30 km that will be used to support surface-relative state estimation and navigation during planetary descent and precision landing. The altimeter uses an advanced time-of-arrival receiver, which produces multiple signal-return range measurements from tens of kilometers with 5 cm precision. The transmitter is eye-safe, simplifying operations and testing on Earth. The prototype is fully autonomous, and able to withstand the thermal and mechanical stresses experienced during test flights conducted aboard helicopters, fixed-wing aircraft, and Morpheus, a terrestrial rocket-powered vehicle developed by NASA Johnson Space Center. This paper provides an overview of the sensor and presents results obtained during recent field experiments including a helicopter flight test conducted in December 2012 and Morpheus flight tests conducted during March of 2014.
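
    As a simple illustration of the time-of-arrival principle behind such an altimeter, the slant range follows from half the round-trip pulse time of flight; the quoted 5 cm precision corresponds to roughly 0.3 ns of round-trip timing precision (a sketch, not the instrument's actual processing chain):

      C = 299_792_458.0  # speed of light, m/s

      def slant_range(time_of_flight_s):
          # Range from a round-trip laser pulse time of flight.
          return 0.5 * C * time_of_flight_s

      # 5 cm of range corresponds to 2 * 0.05 / C ~ 3.3e-10 s of timing precision.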

  10. A Single RF Emitter-Based Indoor Navigation Method for Autonomous Service Robots.

    PubMed

    Sherwin, Tyrone; Easte, Mikala; Chen, Andrew Tzer-Yeu; Wang, Kevin I-Kai; Dai, Wenbin

    2018-02-14

    Location-aware services are one of the key elements of modern intelligent applications. Numerous real-world applications such as factory automation, indoor delivery, and even search and rescue scenarios require autonomous robots to have the ability to navigate in an unknown environment and reach mobile targets with minimal or no prior infrastructure deployment. This research investigates and proposes a novel approach of dynamic target localisation using a single RF emitter, which will be used as the basis of allowing autonomous robots to navigate towards and reach a target. Through the use of multiple directional antennae, Received Signal Strength (RSS) is compared to determine the most probable direction of the targeted emitter, which is combined with the distance estimates to improve the localisation performance. The accuracy of the position estimate is further improved using a particle filter to mitigate the fluctuating nature of real-time RSS data. Based on the direction information, a motion control algorithm is proposed, using Simultaneous Localisation and Mapping (SLAM) and A* path planning to enable navigation through unknown complex environments. A number of navigation scenarios were developed in the context of factory automation applications to demonstrate and evaluate the functionality and performance of the proposed system.
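
    A hedged sketch of the two measurement ideas described above: direction from comparing RSS across directional antennae, and range from a log-distance path-loss model. The reference power and path-loss exponent are illustrative assumptions, not values from the paper:

      def most_probable_direction(rss_by_bearing):
          # rss_by_bearing maps an antenna bearing (deg) to its RSS reading (dBm);
          # the strongest reading indicates the most probable direction to the emitter.
          return max(rss_by_bearing, key=rss_by_bearing.get)

      def log_distance_range(rss_dbm, rss_at_1m=-40.0, path_loss_exp=2.0):
          # Rough range estimate from RSS using a log-distance path-loss model.
          return 10 ** ((rss_at_1m - rss_dbm) / (10.0 * path_loss_exp))

    In the paper, a particle filter then smooths these fluctuating RSS-derived estimates before they are handed to the SLAM and A* planning layers.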

  11. A Single RF Emitter-Based Indoor Navigation Method for Autonomous Service Robots

    PubMed Central

    Sherwin, Tyrone; Easte, Mikala; Wang, Kevin I-Kai; Dai, Wenbin

    2018-01-01

    Location-aware services are one of the key elements of modern intelligent applications. Numerous real-world applications such as factory automation, indoor delivery, and even search and rescue scenarios require autonomous robots to have the ability to navigate in an unknown environment and reach mobile targets with minimal or no prior infrastructure deployment. This research investigates and proposes a novel approach of dynamic target localisation using a single RF emitter, which will be used as the basis of allowing autonomous robots to navigate towards and reach a target. Through the use of multiple directional antennae, Received Signal Strength (RSS) is compared to determine the most probable direction of the targeted emitter, which is combined with the distance estimates to improve the localisation performance. The accuracy of the position estimate is further improved using a particle filter to mitigate the fluctuating nature of real-time RSS data. Based on the direction information, a motion control algorithm is proposed, using Simultaneous Localisation and Mapping (SLAM) and A* path planning to enable navigation through unknown complex environments. A number of navigation scenarios were developed in the context of factory automation applications to demonstrate and evaluate the functionality and performance of the proposed system. PMID:29443906

  12. PointCom: semi-autonomous UGV control with intuitive interface

    NASA Astrophysics Data System (ADS)

    Rohde, Mitchell M.; Perlin, Victor E.; Iagnemma, Karl D.; Lupa, Robert M.; Rohde, Steven M.; Overholt, James; Fiorani, Graham

    2008-04-01

    Unmanned ground vehicles (UGVs) will play an important role in the nation's next-generation ground force. Advances in sensing, control, and computing have enabled a new generation of technologies that bridge the gap between manual UGV teleoperation and full autonomy. In this paper, we present current research on a unique command and control system for UGVs named PointCom (Point-and-Go Command). PointCom is a semi-autonomous command system for one or multiple UGVs. The system, when complete, will be easy to operate and will enable significant reduction in operator workload by utilizing an intuitive image-based control framework for UGV navigation and allowing a single operator to command multiple UGVs. The project leverages new image processing algorithms for monocular visual servoing and odometry to yield a unique, high-performance fused navigation system. Human Computer Interface (HCI) techniques from the entertainment software industry are being used to develop video-game style interfaces that require little training and build upon the navigation capabilities. By combining an advanced navigation system with an intuitive interface, a semi-autonomous control and navigation system is being created that is robust, user friendly, and less burdensome than many current generation systems.

  13. Method and System for Gamma-Ray Localization Induced Spacecraft Navigation Using Celestial Gamma-Ray Sources

    NASA Technical Reports Server (NTRS)

    Hisamoto, Chuck (Inventor); Arzoumanian, Zaven (Inventor); Sheikh, Suneel I. (Inventor)

    2015-01-01

    A method and system for spacecraft navigation using distant celestial gamma-ray bursts which offer detectable, bright, high-energy events that provide well-defined characteristics conducive to accurate time-alignment among spatially separated spacecraft. Utilizing assemblages of photons from distant gamma-ray bursts, relative range between two spacecraft can be accurately computed along the direction to each burst's source based upon the difference in arrival time of the burst emission at each spacecraft's location. Correlation methods used to time-align the high-energy burst profiles are provided. The spacecraft navigation may be carried out autonomously or in a central control mode of operation.
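
    The geometry reduces to a one-dimensional relation along the line of sight to each burst: the projected inter-spacecraft range is the speed of light times the difference in burst arrival times. A minimal sketch (sign convention illustrative):

      C = 299_792_458.0  # speed of light, m/s

      def range_along_burst_direction(dt_arrival_s):
          # Relative range between two spacecraft projected along the direction to
          # the gamma-ray burst source, from the burst arrival-time difference.
          return C * dt_arrival_s

      # Example: a 1 microsecond arrival-time difference corresponds to ~300 m of
      # separation along the line of sight to the burst.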

  14. Vision Based Navigation for Autonomous Cooperative Docking of CubeSats

    NASA Astrophysics Data System (ADS)

    Pirat, Camille; Ankersen, Finn; Walker, Roger; Gass, Volker

    2018-05-01

    A realistic rendezvous and docking navigation solution applicable to CubeSats is investigated. A scalability analysis of the ESA Automated Transfer Vehicle Guidance, Navigation & Control (GNC) performance and of the Russian docking system shows that the docking of two CubeSats would require a lateral control performance of the order of 1 cm. Line-of-sight constraints and multipath effects affecting Global Navigation Satellite System (GNSS) measurements in close proximity prevent the use of this sensor for the final approach. This consideration and the high control accuracy requirement led to the use of vision sensors for the final 10 m of the rendezvous and docking sequence. A single monocular camera on the chaser satellite and various sets of Light-Emitting Diodes (LEDs) on the target vehicle ensure the observability of the system throughout the approach trajectory. The simple and novel formulation of the measurement equations allows rotations to be unambiguously differentiated from translations between the target and chaser docking ports and supports a navigation performance better than 1 mm at docking. Furthermore, the non-linear measurement equations can be solved to provide an analytic navigation solution. This solution can be used to monitor the navigation filter solution and ensure its stability, adding an extra layer of robustness for autonomous rendezvous and docking. The navigation filter initialization is addressed in detail. The proposed method is able to differentiate LED signals from Sun reflections, as demonstrated by experimental data. The navigation filter uses comprehensive linearised coupled rotation/translation dynamics describing the chaser-to-target docking port motion. The handover between GNSS and vision sensor measurements is assessed. The performance of the navigation function along the approach trajectory is discussed.

  15. Advancing Underwater Acoustic Communication for Autonomous Distributed Networks via Sparse Channel Sensing, Coding, and Navigation Support

    DTIC Science & Technology

    2013-09-30

    underwater acoustic communication technologies for autonomous distributed underwater networks, through innovative signal processing, coding, and navigation...in real environments, an offshore testbed has been developed to conduct field experiments. The testbed consists of four nodes and has been deployed...Leadership by the Connecticut Technology Council. Dr. Zhaohui Wang joined the faculty of the Department of Electrical and Computer Engineering at

  16. Learning for Autonomous Navigation

    NASA Technical Reports Server (NTRS)

    Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric

    2005-01-01

    Robotic ground vehicles for outdoor applications have achieved some remarkable successes, notably in autonomous highway following (Dickmanns, 1987), planetary exploration (1), and off-road navigation on Earth (1). Nevertheless, major challenges remain to enable reliable, high-speed, autonomous navigation in a wide variety of complex, off-road terrain. 3-D perception of terrain geometry with imaging range sensors is the mainstay of off-road driving systems. However, the stopping distance at high speed exceeds the effective lookahead distance of existing range sensors. Prospects for extending the range of 3-D sensors are strongly limited by sensor physics, eye safety of lasers, and related issues. Range sensor limitations also allow vehicles to enter large cul-de-sacs even at low speed, leading to long detours. Moreover, sensing only terrain geometry fails to reveal mechanical properties of terrain that are critical to assessing its traversability, such as potential for slippage, sinkage, and the degree of compliance of potential obstacles. Rovers in the Mars Exploration Rover (MER) mission have become stuck in sand dunes and experienced significant downhill slippage in the vicinity of large rock hazards. Earth-based off-road robots today have very limited ability to discriminate traversable vegetation from non-traversable vegetation or rough ground. It is impossible today to preprogram a system with knowledge of these properties for all types of terrain and weather conditions that might be encountered.
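
    The lookahead argument can be made concrete with a back-of-the-envelope stopping-distance calculation; the deceleration and latency below are illustrative values, not figures from the paper:

      def stopping_distance(speed_mps, decel_mps2=3.0, latency_s=0.3):
          # Reaction distance plus braking distance.
          return speed_mps * latency_s + speed_mps ** 2 / (2.0 * decel_mps2)

      # At 10 m/s this gives about 3 m + 16.7 m ~ 20 m, which can already exceed
      # the reliable lookahead of an imaging range sensor on rough terrain.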

  17. Autonomous Wheeled Robot Platform Testbed for Navigation and Mapping Using Low-Cost Sensors

    NASA Astrophysics Data System (ADS)

    Calero, D.; Fernandez, E.; Parés, M. E.

    2017-11-01

    This paper presents the concept of an architecture for a wheeled robot system that helps researchers in the field of geomatics speed up their daily research in kinematic geodesy, indoor navigation, and indoor positioning. The presented ideas correspond to an extensible and modular hardware and software system aimed at the development of new low-cost mapping algorithms as well as at the evaluation of sensor performance. The concept, already implemented in CTTC's ARAS system (Autonomous Rover for Automatic Surveying), is generic and extensible. This means that it is possible to incorporate new navigation algorithms or sensors at no maintenance cost; only the effort related to developing such algorithms needs to be taken into account. As a consequence, change poses a much smaller problem for research activities in this specific area. The system includes several standalone sensors that may be combined in different ways to accomplish several goals; that is, it may be used to perform a variety of tasks, for instance evaluating the performance of positioning or mapping algorithms.

  18. Tracked robot controllers for climbing obstacles autonomously

    NASA Astrophysics Data System (ADS)

    Vincent, Isabelle

    2009-05-01

    Research in mobile robot navigation has demonstrated some success in navigating flat indoor environments while avoiding obstacles. However, the challenge of analyzing complex environments to climb obstacles autonomously has seen very little success due to the complexity of the task. Unmanned ground vehicles currently exhibit simple autonomous behaviours compared to the human ability to move in the world. This paper presents the control algorithms designed for a tracked mobile robot to autonomously climb obstacles by varying its track configuration. Two control algorithms are proposed to solve the autonomous locomotion problem for climbing obstacles. First, a reactive controller evaluates the appropriate geometric configuration based on terrain and vehicle geometric considerations. Then, a reinforcement learning algorithm finds alternative solutions when the reactive controller gets stuck while climbing an obstacle. The methodology combines reactivity with learning. The controllers have been demonstrated in box and stair climbing simulations. The experiments illustrate the effectiveness of the proposed approach for crossing obstacles.

  19. Nature-Inspired Acoustic Sensor Projects

    DTIC Science & Technology

    1999-08-24

    The pager motors are worn on the wrists. ... Autonomous vehicle navigation: Yago (the Yale Autonomous Go-Cart) is used... A proximity sensor determined the presence of close-by objects missed by the sonars. Yago operated autonomously by avoiding obstacles. Problems being

  20. Improved obstacle avoidance and navigation for an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Giri, Binod; Cho, Hyunsu; Williams, Benjamin C.; Tann, Hokchhay; Shakya, Bicky; Bharam, Vishal; Ahlgren, David J.

    2015-01-01

    This paper presents improvements made to the intelligence algorithms employed on Q, an autonomous ground vehicle, for the 2014 Intelligent Ground Vehicle Competition (IGVC). In 2012, the IGVC committee combined the formerly separate autonomous and navigation challenges into a single AUT-NAV challenge. In this new challenge, the vehicle is required to navigate through a grassy obstacle course and stay within the course boundaries (a lane of two white painted lines) that guide it toward a given GPS waypoint. Once the vehicle reaches this waypoint, it enters an open course where it is required to navigate to another GPS waypoint while avoiding obstacles. After reaching the final waypoint, the vehicle is required to traverse another obstacle course before completing the run. Q uses a modular parallel software architecture in which image processing, navigation, and sensor control algorithms run concurrently. A tuned navigation algorithm allows Q to smoothly maneuver through obstacle fields. For the 2014 competition, most revisions occurred in the vision system, which detects white lines and informs the navigation component. Barrel obstacles of various colors presented a new challenge for image processing: the previous color plane extraction algorithm would not suffice. To overcome this difficulty, laser range sensor data were overlaid on visual data. Q also participates in the Joint Architecture for Unmanned Systems (JAUS) challenge at IGVC. For 2014, significant updates were implemented: the JAUS component accepted a greater variety of messages and showed better compliance with the JAUS technical standard. With these improvements, Q secured second place in the JAUS competition.
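
    Overlaying laser range data on camera imagery usually amounts to projecting the range points into the image with a calibrated camera model; the pinhole-model sketch below illustrates the idea and is not Q's actual implementation (the intrinsics fx, fy, cx, cy are assumed to come from calibration, and the points are assumed already expressed in the camera frame):

      def project_to_image(points_xyz, fx, fy, cx, cy):
          # Project 3-D points (camera frame, z forward) onto the image plane so
          # that range returns can mask or confirm detections in the visual data.
          pixels = []
          for x, y, z in points_xyz:
              if z <= 0:
                  continue  # behind the camera
              u = fx * x / z + cx
              v = fy * y / z + cy
              pixels.append((u, v, z))  # keep depth for obstacle reasoning
          return pixels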

  1. Autonomous detection of indoor and outdoor signs

    NASA Astrophysics Data System (ADS)

    Holden, Steven; Snorrason, Magnus; Goodsell, Thomas; Stevens, Mark R.

    2005-05-01

    Most goal-oriented mobile robot tasks involve navigation to one or more known locations. This is generally done using GPS coordinates and landmarks outdoors, or wall-following and fiducial marks indoors. Such approaches ignore the rich source of navigation information that is already in place for human navigation in all man-made environments: signs. A mobile robot capable of detecting and reading arbitrary signs could be tasked using directions that are intuitive to humans, and it could report its location relative to intuitive landmarks (a street corner, a person's office, etc.). Such ability would not require active marking of the environment and would be functional in the absence of GPS. In this paper we present an updated version of a system we call Sign Understanding in Support of Autonomous Navigation (SUSAN). This system relies on cues common to most signs: the presence of text, vivid color, and compact shape. By not relying on templates, SUSAN can detect a wide variety of signs: traffic signs, street signs, store-name signs, building directories, room signs, etc. In this paper we focus on the text detection capability. We present results summarizing probability of detection and false alarm rate across many scenes containing signs of very different designs and in a variety of lighting conditions.

  2. Development of autonomous grasping and navigating robot

    NASA Astrophysics Data System (ADS)

    Kudoh, Hiroyuki; Fujimoto, Keisuke; Nakayama, Yasuichi

    2015-01-01

    The ability to find and grasp target items in an unknown environment is important for working robots. We developed an autonomous navigating and grasping robot. The operations are locating a requested item, moving to where the item is placed, finding the item on a shelf or table, and picking the item up from the shelf or table. To achieve these operations, we designed the robot with three functions: an autonomous navigating function that generates a map and a route in an unknown environment, an item position recognizing function, and a grasping function. We tested this robot in an unknown environment. It achieved a series of operations: moving to a destination, recognizing the positions of items on a shelf, picking up an item, placing it on a cart with its hand, and returning to the starting location. The results of this experiment show the potential of such robots to reduce the human workforce required for these tasks.

  3. Vector Pursuit Path Tracking for Autonomous Ground Vehicles

    DTIC Science & Technology

    2000-08-01

    ...other geometric path-tracking techniques. An autonomous vehicle is one that is capable of automatic navigation. It is... Joint Architecture for Unmanned Ground Vehicles (JAUGS) working group meeting held at the University of Florida.

  4. Autonomous navigation using lunar beacons

    NASA Technical Reports Server (NTRS)

    Khatib, A. R.; Ellis, J.; French, J.; Null, G.; Yunck, T.; Wu, S.

    1983-01-01

    The concept of using lunar beacon signal transmission for on-board navigation of earth satellites and near-earth spacecraft is described. The system would require powerful transmitters on the earth-side of the moon's surface and black-box receivers with antennae and microprocessors placed on board spacecraft for autonomous navigation. Spacecraft navigation requires three position and three velocity elements to establish location coordinates. Two beacons could be soft-landed on the lunar surface at the limits of allowable separation; each would transmit a wide-beam signal with cones reaching GEO altitudes, strong enough to be received by small antennae in near-earth orbit. The black-box processor would perform on-board computation with one-way Doppler/range data and dynamical models. Alternatively, GEO satellites such as the GPS or TDRSS spacecraft can be used with interferometric techniques to provide decimeter-level accuracy for aircraft navigation.

  5. Relative Motion Modeling and Autonomous Navigation Accuracy

    DTIC Science & Technology

    2016-11-15

  6. Small Body Landing Accuracy Using In-Situ Navigation

    NASA Technical Reports Server (NTRS)

    Bhaskaran, Shyam; Nandi, Sumita; Broschart, Stephen; Wallace, Mark; Olson, Corwin; Cangahuala, L. Alberto

    2011-01-01

    Spacecraft landings on small bodies (asteroids and comets) can require target accuracies too stringent to be met using ground-based navigation alone, especially if specific landing site requirements must be met for safety or to meet science goals. In-situ optical observations coupled with onboard navigation processing can meet the tighter accuracy requirements to enable such missions. Recent developments in deep space navigation capability include a self-contained autonomous navigation system (used in flight on three missions) and a landmark tracking system (used experimentally on the Japanese Hayabusa mission). The merging of these two technologies forms a methodology to perform autonomous onboard navigation around small bodies. This paper presents an overview of these systems, as well as the results from Monte Carlo studies to quantify the achievable landing accuracies using these methods. Sensitivity of the results to variations in spacecraft maneuver execution error, attitude control accuracy, and unmodeled forces is examined. Cases for two bodies, a small asteroid and a mid-size comet, are presented.

  7. An Algorithm for Autonomous Formation Obstacle Avoidance

    NASA Astrophysics Data System (ADS)

    Cruz, Yunior I.

    The level of human interaction with Unmanned Aerial Systems varies greatly, from remotely piloted aircraft to fully autonomous systems. At the latter end of the spectrum, the challenge lies in designing effective algorithms to dictate the behavior of the autonomous agents. A swarm of autonomous Unmanned Aerial Vehicles requires collision avoidance and formation flight algorithms to negotiate environmental challenges it may encounter during the execution of its mission, which may include obstacles and chokepoints. In this work, a simple algorithm is developed to allow a formation of autonomous vehicles to perform point-to-point navigation while avoiding obstacles and navigating through chokepoints. Emphasis is placed on maintaining formation structures. Rather than breaking formation and individually navigating around the obstacle or through the chokepoint, vehicles are required to assemble into appropriately sized and shaped sub-formations, bifurcate around the obstacle or negotiate the chokepoint, and reassemble into the original formation at the far side of the obstruction. The algorithm receives vehicle and environmental properties as inputs and outputs trajectories for each vehicle from start to the desired ending location. Simulation results show that the algorithm safely routes all vehicles past the obstruction while adhering to the aforementioned requirements. The formation adapts and successfully negotiates the obstacles and chokepoints in its path while maintaining proper vehicle separation.

  8. Navigation of robotic system using cricket motes

    NASA Astrophysics Data System (ADS)

    Patil, Yogendra J.; Baine, Nicholas A.; Rattan, Kuldip S.

    2011-06-01

    This paper presents a novel algorithm for self-mapping of the cricket motes that can be used for indoor navigation of autonomous robotic systems. The cricket system is a wireless sensor network that can provide an indoor localization service to its user via acoustic ranging techniques. The behavior of the ultrasonic transducer on the cricket mote is studied, and the regions where satisfactory distance measurements can be obtained are recorded. Placing the motes in these regions results in fine-grained mapping of the cricket motes. Trilateration is used to obtain a rigid coordinate system, but is insufficient if the network is to be used for navigation. A modified SLAM algorithm is applied to overcome the shortcomings of trilateration. Finally, the self-mapped cricket motes can be used for navigation of autonomous robotic systems in an indoor location.
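
    For reference, the trilateration step mentioned above can be written as a small least-squares problem once three or more mote positions and ranges are available; this is a standard linearisation (subtracting the first anchor's equation), not the paper's exact formulation:

      import numpy as np

      def trilaterate(anchors, ranges):
          # anchors: list of known 2-D mote positions; ranges: measured distances.
          anchors = np.asarray(anchors, dtype=float)
          ranges = np.asarray(ranges, dtype=float)
          x0, y0 = anchors[0]
          A, b = [], []
          for (xi, yi), ri in zip(anchors[1:], ranges[1:]):
              A.append([2.0 * (xi - x0), 2.0 * (yi - y0)])
              b.append(ranges[0] ** 2 - ri ** 2 + xi ** 2 - x0 ** 2 + yi ** 2 - y0 ** 2)
          pos, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
          return pos  # estimated (x, y) of the receiver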

  9. Navigation strategies for multiple autonomous mobile robots moving in formation

    NASA Technical Reports Server (NTRS)

    Wang, P. K. C.

    1991-01-01

    The problem of deriving navigation strategies for a fleet of autonomous mobile robots moving in formation is considered. Here, each robot is represented by a particle with a spherical effective spatial domain and a specified cone of visibility. The global motion of each robot in the world space is described by the equations of motion of the robot's center of mass. First, methods for formation generation are discussed. Then, simple navigation strategies for robots moving in formation are derived. A sufficient condition for the stability of a desired formation pattern for a fleet of robots each equipped with the navigation strategy based on nearest neighbor tracking is developed. The dynamic behavior of robot fleets consisting of three or more robots moving in formation in a plane is studied by means of computer simulation.
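
    A minimal sketch of the nearest-neighbor tracking idea mentioned above: each robot measures the vector to its nearest visible neighbor and commands a velocity that drives the separation toward a desired formation gap. The gain and gap are illustrative, and no attempt is made here to reproduce the paper's stability condition:

      import math

      def nearest_neighbor_command(own_pos, neighbors, desired_gap, gain=0.5):
          # own_pos and neighbors are 2-D positions; desired_gap is the separation to hold.
          nearest = min(neighbors,
                        key=lambda p: math.hypot(p[0] - own_pos[0], p[1] - own_pos[1]))
          dx, dy = nearest[0] - own_pos[0], nearest[1] - own_pos[1]
          dist = math.hypot(dx, dy)
          if dist == 0.0:
              return 0.0, 0.0
          error = dist - desired_gap  # positive: too far, move toward the neighbor
          return gain * error * dx / dist, gain * error * dy / dist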

  10. An Analysis of Navigation Algorithms for Smartphones Using J2ME

    NASA Astrophysics Data System (ADS)

    Santos, André C.; Tarrataca, Luís; Cardoso, João M. P.

    Embedded systems are considered one of the most potential areas for future innovations. Two embedded fields that will most certainly take a primary role in future innovations are mobile robotics and mobile computing. Mobile robots and smartphones are growing in number and functionalities, becoming a presence in our daily life. In this paper, we study the current feasibility of a smartphone to execute navigation algorithms. As a test case, we use a smartphone to control an autonomous mobile robot. We tested three navigation problems: Mapping, Localization and Path Planning. For each of these problems, an algorithm has been chosen, developed in J2ME, and tested on the field. Results show the current mobile Java capacity for executing computationally demanding algorithms and reveal the real possibility of using smartphones for autonomous navigation.

  11. Autonomous optical navigation using nanosatellite-class instruments: a Mars approach case study

    NASA Astrophysics Data System (ADS)

    Enright, John; Jovanovic, Ilija; Kazemi, Laila; Zhang, Harry; Dzamba, Tom

    2018-02-01

    This paper examines the effectiveness of small star trackers for orbital estimation. Autonomous optical navigation has been used for some time to provide local estimates of orbital parameters during close approach to celestial bodies. These techniques have been used extensively on spacecraft dating back to the Voyager missions, but often rely on long exposures and large instrument apertures. Using a hyperbolic Mars approach as a reference mission, we present an EKF-based navigation filter suitable for nanosatellite missions. Observations of Mars and its moons allow the estimator to correct initial errors in both position and velocity. Our results show that nanosatellite-class star trackers can produce good quality navigation solutions with low position (<300 m) and velocity (<0.15 m/s) errors as the spacecraft approaches periapse.
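
    The EKF structure referred to above is the standard predict/update cycle; the generic sketch below shows that cycle only, and does not reproduce the paper's state or bearing-measurement models:

      import numpy as np

      def ekf_step(x, P, f, F, h, H, z, Q, R):
          # x, P: prior state and covariance; f, F: propagation function and Jacobian;
          # h, H: measurement function and Jacobian; z: measurement; Q, R: noise covariances.
          x_pred = f(x)
          P_pred = F @ P @ F.T + Q
          y = z - h(x_pred)                      # innovation
          S = H @ P_pred @ H.T + R               # innovation covariance
          K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
          x_new = x_pred + K @ y
          P_new = (np.eye(len(x)) - K @ H) @ P_pred
          return x_new, P_new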

  12. Autonomous Locator of Thermals (ALOFT) Autonomous Soaring Algorithm

    DTIC Science & Technology

    2015-04-03

    ...estimator used on the NRL CICADA Mk 3 micro air vehicle [13]. An extended Kalman filter (EKF) was designed to estimate the airspeed sensor bias and...

  13. Enabling Autonomous Navigation for Affordable Scooters.

    PubMed

    Liu, Kaikai; Mulky, Rajathswaroop

    2018-06-05

    Despite the technical success of existing assistive technologies, for example, electric wheelchairs and scooters, they are still far from effective enough in helping those in need navigate to their destinations in a hassle-free manner. In this paper, we propose to improve the safety and autonomy of navigation by designing a cutting-edge autonomous scooter, thus allowing people with mobility challenges to ambulate independently and safely in possibly unfamiliar surroundings. We focus on indoor navigation scenarios for the autonomous scooter where the current location, maps, and nearby obstacles are unknown. To achieve semi-LiDAR functionality, we leverage gyro-based pose data to compensate for the laser motion in real time and create synthetic mapping of simple environments with regular shapes and deep hallways. Laser range finders are suitable for long ranges but offer limited resolution. Stereo vision, on the other hand, provides 3D structural data of nearby complex objects. To achieve simultaneously fine-grained resolution and long-range coverage when mapping cluttered and complex environments, we dynamically fuse the measurements from the stereo vision camera system, the synthetic laser scanner, and the LiDAR. We propose solutions to self-correct errors in data fusion and create a hybrid map to assist the scooter in achieving collision-free navigation in an indoor environment.

  14. Guidance and control for unmanned ground vehicles

    NASA Astrophysics Data System (ADS)

    Bateman, Peter J.

    1994-06-01

    Techniques for the guidance, control, and navigation of unmanned ground vehicles are described in terms of the communication bandwidth requirements for driving and control of a vehicle remote from the human operator. Modes of operation are conveniently classified as conventional teleoperation, supervisory control, and fully autonomous control. The fundamental problem of maintaining a robust non-line-of-sight communications link between the human controller and the remote vehicle is discussed, as this provides the impetus for greater autonomy in the control system and the greatest scope for innovation. While supervisory control still requires the man to be providing the primary navigational intelligence, fully autonomous operation requires that mission navigation is provided solely by on-board machine intelligence. Methods directed at achieving this performance are described using various active and passive sensing of the terrain for route navigation and obstacle detection. Emphasis is given to TV imagery and signal processing techniques for image understanding. Reference is made to the limitations of current microprocessor technology and suitable computer architectures. Some of the more recent control techniques involve the use of neural networks, fuzzy logic, and data fusion and these are discussed in the context of road following and cross country navigation. Examples of autonomous vehicle testbeds operated at various laboratories around the world are given.

  15. Relative navigation for spacecraft formation flying

    NASA Technical Reports Server (NTRS)

    Hartman, Kate R.; Gramling, Cheryl J.; Lee, Taesul; Kelbel, David A.; Long, Anne C.

    1998-01-01

    The Goddard Space Flight Center Guidance, Navigation, and Control Center (GNCC) is currently developing and implementing advanced satellite systems to provide autonomous control of formation flyers. The initial formation maintenance capability will be flight-demonstrated on the Earth-Orbiter-1 (EO-1) satellite, which is planned under the National Aeronautics and Space Administration New Millennium Program to be a coflight with the Landsat-7 (L-7) satellite. Formation flying imposes relative navigation accuracy requirements in addition to the orbit accuracy requirements for the individual satellites. In the case of EO-1 and L-7, the two satellites are in nearly coplanar orbits, with a small difference in the longitude of the ascending node to compensate for the Earth's rotation. The GNCC has performed trajectory error analysis for the relative navigation of the EO-1/L-7 formation, as well as for a more advanced tracking configuration using cross-link satellite communications. This paper discusses the orbit determination and prediction accuracy achievable for EO-1 and L-7 under various tracking and orbit determination scenarios and discusses the expected relative separation errors in their formation flying configuration.

  16. Relative Navigation for Spacecraft Formation Flying

    NASA Technical Reports Server (NTRS)

    Hartman, Kate R.; Gramling, Cheryl J.; Lee, Taesul; Kelbel, David A.; Long, Anne C.

    1998-01-01

    The Goddard Space Flight Center Guidance, Navigation, and Control Center (GNCC) is currently developing and implementing advanced satellite systems to provide autonomous control of formation flyers. The initial formation maintenance capability will be flight-demonstrated on the Earth-Orbiter-1 (EO-1) satellite, which is planned under the National Aeronautics and Space Administration New Millennium Program to be a coflight with the Landsat-7 (L-7) satellite. Formation flying imposes relative navigation accuracy requirements in addition to the orbit accuracy requirements for the individual satellites. In the case of EO-1 and L-7, the two satellites are in nearly coplanar orbits, with a small difference in the longitude of the ascending node to compensate for the Earth's rotation. The GNCC has performed trajectory error analysis for the relative navigation of the EO-1/L-7 formation, as well as for a more advanced tracking configuration using cross-link satellite communications. This paper discusses the orbit determination and prediction accuracy achievable for EO-1 and L-7 under various tracking and orbit determination scenarios and discusses the expected relative separation errors in their formation flying configuration.

  17. Neural Network-Based Landmark Recognition and Navigation with IAMRs. Understanding the Principles of Thought and Behavior.

    ERIC Educational Resources Information Center

    Doty, Keith L.

    1999-01-01

    Research on neural networks and hippocampal function demonstrating how mammals construct mental maps and develop navigation strategies is being used to create Intelligent Autonomous Mobile Robots (IAMRs). Such robots are able to recognize landmarks and navigate without "vision." (SK)

  18. First Results from a Hardware-in-the-Loop Demonstration of Closed-Loop Autonomous Formation Flying

    NASA Technical Reports Server (NTRS)

    Gill, E.; Naasz, Bo; Ebinuma, T.

    2003-01-01

    A closed-loop system for the demonstration of autonomous satellite formation flying technologies using hardware-in-the-loop has been developed. Making use of a GPS signal simulator with a dual radio frequency outlet, the system includes two GPS space receivers as well as a powerful onboard navigation processor dedicated to the GPS-based guidance, navigation, and control of a satellite formation in real time. The closed-loop system allows realistic simulations of autonomous formation flying scenarios, enabling research in the fields of tracking and orbit control strategies for a wide range of applications. The autonomous closed-loop formation acquisition and keeping strategy is based on Lyapunov's direct control method as applied to the standard set of Keplerian elements. This approach not only assures global and asymptotic stability of the control but also maintains valuable physical insight into the applied control vectors. Furthermore, the approach can account for system uncertainties and effectively avoids a computationally expensive solution of the two-point boundary value problem, which renders the concept particularly attractive for implementation in onboard processors. A guidance law has been developed which strictly separates the relative from the absolute motion, thus avoiding the numerical integration of a target trajectory in the onboard processor. Moreover, by using precise kinematic relative GPS solutions, dynamical modeling or filtering is avoided, which provides for an efficient implementation of the process on an onboard processor. A sample formation flying scenario has been created aiming at the autonomous transition of a Low Earth Orbit satellite formation from an initial along-track separation of 800 m to a target distance of 100 m. Assuming a low-thrust actuator which may be accommodated on a small satellite, a typical control accuracy of less than 5 m has been achieved, which proves the applicability of autonomous formation flying techniques to formations of satellites as close as 50 m.

  19. Synopsis of Precision Landing and Hazard Avoidance (PL&HA) Capabilities for Space Exploration

    NASA Technical Reports Server (NTRS)

    Robertson, Edward A.

    2017-01-01

    Until recently, robotic exploration missions to the Moon, Mars, and other solar system bodies relied upon controlled blind landings. Because terrestrial techniques for terrain relative navigation (TRN) had not yet evolved to support space exploration, landing dispersions were driven by the capabilities of inertial navigation systems combined with surface relative altimetry and velocimetry. Lacking tight control over the actual landing location, mission success depended on the statistical vetting of candidate landing areas within the predicted landing dispersion ellipse based on orbital reconnaissance data, combined with the ability of the spacecraft to execute a controlled landing in terms of touchdown attitude, attitude rates, and velocity. In addition, the sensors, algorithms, and processing technologies required to perform autonomous hazard detection and avoidance in real time during the landing sequence were not yet available. Over the past decade, NASA has invested substantial resources on the development, integration, and testing of autonomous precision landing and hazard avoidance (PL&HA) capabilities. In addition to substantially improving landing accuracy and safety, these autonomous PL&HA functions also offer access to targets of interest located within more rugged and hazardous terrain. Optical TRN systems are baselined on upcoming robotic landing missions to the Moon and Mars, and NASA JPL is investigating the development of a comprehensive PL&HA system for a Europa lander. These robotic missions will demonstrate and mature PL&HA technologies that are considered essential for future human exploration missions. PL&HA technologies also have applications to rendezvous and docking/berthing with other spacecraft, as well as proximity navigation, contact, and retrieval missions to smaller bodies with microgravity environments, such as asteroids.

  20. Autonomous Navigation Above the GNSS Constellations and Beyond: GPS Navigation for the Magnetospheric Multiscale Mission and SEXTANT Pulsar Navigation Demonstration

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke

    2017-01-01

    This talk will describe two first-of-their-kind technology demonstrations attached to ongoing NASA science missions, both of which aim to extend the range of autonomous spacecraft navigation far from the Earth. First, we will describe the onboard GPS navigation system for the Magnetospheric Multiscale (MMS) mission which is currently operating in elliptic orbits reaching nearly halfway to the Moon. The MMS navigation system is a key outgrowth of a larger effort at NASA Goddard Space Flight Center to advance high-altitude Global Navigation Satellite System (GNSS) navigation on multiple fronts, including developing Global Positioning System receivers and onboard navigation software, running simulation studies, and leading efforts to characterize and protect signals at high-altitude in the so-called GNSS Space-Service Volume (SSV). In the second part of the talk, we will describe the Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) mission that aims to make the first in-space demonstration of X-ray pulsar navigation (XNAV). SEXTANT is attached to the NASA astrophysics mission Neutron-star Interior Composition ExploreR (NICER) whose International Space Station mounted X-ray telescope is investigating the fundamental physics of extremes in gravity, material density, and electromagnetic fields found in neutron stars, and whose instrument provides a nearly ideal navigation sensor for XNAV.

  1. Navigation Concepts for the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Long, Anne; Leung, Dominic; Kelbel, David; Beckman, Mark; Gramling, Cheryl

    2003-01-01

    This paper evaluates the performance that can be achieved using candidate ground and onboard navigation approaches for operation of the James Webb Space Telescope, which will be in an orbit about the Sun-Earth L2 libration point. The ground navigation approach processes standard range and Doppler measurements from the Deep Space Network. The onboard navigation approach processes celestial object measurements and/or ground-to-spacecraft Doppler measurements to autonomously estimate the spacecraft's position and velocity and Doppler reference frequency. Particular attention is given to assessing the absolute position and velocity accuracy that can be achieved in the presence of the frequent spacecraft reorientations and momentum unloads planned for this mission. The ground navigation approach provides stable navigation solutions using a tracking schedule of one 30-minute contact per day. The onboard navigation approach that uses only optical-quality celestial object measurements provides stable autonomous navigation solutions. This study indicates that unmodeled changes in the solar radiation pressure cross-sectional area and modeled momentum unload velocity changes are the major error sources. These errors can be mitigated by modeling these changes, by estimating corrections to compensate for the changes, or by including acceleration measurements.

  2. The Development of a Simulator System and Hardware Test Bed for Deep Space X-Ray Navigation

    NASA Astrophysics Data System (ADS)

    Doyle, Patrick T.

    2013-03-01

    Currently, there is considerable interest in developing technologies that will allow using photon measurements from celestial x-ray sources for deep space navigation. The impetus for this is that many envisioned future space missions will require spacecraft to have autonomous navigation capabilities. For missions close to Earth, Global Navigation Satellite Systems (GNSS) such as GPS are readily available for use, but for missions far from Earth, other alternatives must be provided. While existing systems such as the Deep Space Network (DSN) can be used, latencies associated with servicing a fleet of vehicles may not be compatible with some autonomous operations requiring timely updates of their navigation solution. Because of their somewhat predictable emissions, pulsars are the ideal candidates for x-ray sources that can be used to provide key parameters for navigation. Algorithms and simulation tools that will enable designing and analyzing x-ray navigation concepts are presented. The development of a compact x-ray detector system is pivotal to the eventual deployment of such navigation systems. Therefore, results of a high altitude balloon test to evaluate the design of a compact x-ray detector system are described as well.

  3. Navigation through unknown and dynamic open spaces using topological notions

    NASA Astrophysics Data System (ADS)

    Miguel-Tomé, Sergio

    2018-04-01

    Until now, most algorithms used for navigation have had the purpose of directing a system towards one point in space. However, humans communicate tasks by specifying spatial relations among elements or places. In addition, the environments in which humans develop their activities are extremely dynamic. The only option that allows for successful navigation in dynamic and unknown environments is making real-time decisions. Therefore, robots capable of collaborating closely with human beings must be able to make decisions based on the local information registered by the sensors and to interpret and express spatial relations. Furthermore, when a person is asked to perform a task in an environment, the task is communicated as a category of goals, so the person does not need to be supervised. Thus, two problems appear when one wants to create multifunctional robots: how to navigate in dynamic and unknown environments using spatial relations, and how to accomplish this without supervision. In this article, a new architecture to address the two cited problems is presented, called the topological qualitative navigation architecture. In previous works, a qualitative heuristic called the heuristic of topological qualitative semantics (HTQS) has been developed to establish and identify spatial relations. However, that heuristic only allows for establishing one spatial relation with a specific object. In contrast, navigation requires a temporal sequence of goals with different objects. The new architecture attains continuous generation of goals and resolves them using HTQS. Thus, the new architecture achieves autonomous navigation in dynamic or unknown open environments.

  4. Development Of Autonomous Systems

    NASA Astrophysics Data System (ADS)

    Kanade, Takeo

    1989-03-01

    In the last several years at the Robotics Institute of Carnegie Mellon University, we have been working on two projects for developing autonomous systems: Navlab for the Autonomous Land Vehicle and Ambler for the Mars Rover. These two systems are for different purposes: the Navlab is a four-wheeled vehicle (van) for road and open terrain navigation, and the Ambler is a six-legged locomotor for Mars exploration. The two projects, however, share many common aspects. Both are large-scale integrated systems for navigation. In addition to the development of individual components (e.g., construction and control of the vehicle, vision and perception, and planning), integration of those component technologies into a system by means of an appropriate architecture is a major issue.

  5. Vegetation Versus Man-Made Object Detection from Imagery for Unmanned Vehicles in Off-Road Environments

    DTIC Science & Technology

    2013-05-01

    Keywords: saliency, natural scene statistics. Research into the area of autonomous navigation for unmanned ground vehicles (UGV) has accelerated in... recent years. This is partly due to the success of programs such as the DARPA Grand Challenge and the dream of driverless cars, but is also due to the... There have been several major advances in autonomous navigation for unmanned ground vehicles in controlled urban environments in

  6. Perception system and functions for autonomous navigation in a natural environment

    NASA Technical Reports Server (NTRS)

    Chatila, Raja; Devy, Michel; Lacroix, Simon; Herrb, Matthieu

    1994-01-01

    This paper presents the approach, algorithms, and processes we developed for the perception system of a cross-country autonomous robot. After a presentation of the tele-programming context we favor for intervention robots, we introduce an adaptive navigation approach well suited to the characteristics of complex natural environments. This approach led us to develop a heterogeneous perception system that manages several different terrain representations. The perception functionalities required during navigation are listed, along with the corresponding representations we consider. The main perception processes we developed are presented; they are integrated within an on-board control architecture. First results of an ambitious experiment currently underway at LAAS are then presented.

  7. High accuracy autonomous navigation using the global positioning system (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, Son H.; Hart, Roger C.; Shoan, Wendy C.; Wood, Terri; Long, Anne C.; Oza, Dipak H.; Lee, Taesul

    1997-01-01

    The application of global positioning system (GPS) technology to the improvement of the accuracy and economy of spacecraft navigation is reported. High-accuracy autonomous navigation algorithms are currently being qualified in conjunction with the GPS attitude determination flyer (GADFLY) experiment for the small satellite technology initiative Lewis spacecraft. Preflight performance assessments indicated that these algorithms are able to provide a real-time total position accuracy of better than 10 m and a velocity accuracy of better than 0.01 m/s, with selective availability at typical levels. It is expected that the position accuracy will be increased to 2 m if corrections are provided by the GPS wide area augmentation system.

  8. Mobile robots IV; Proceedings of the Meeting, Philadelphia, PA, Nov. 6, 7, 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfe, W.J.; Chun, W.H.

    1990-01-01

    The present conference on mobile robot systems discusses high-speed machine perception based on passive sensing, wide-angle optical ranging, three-dimensional path planning for flying/crawling robots, navigation of autonomous mobile intelligence in an unstructured natural environment, mechanical models for the locomotion of a four-articulated-track robot, a rule-based command language for a semiautonomous Mars rover, and a computer model of the structured light vision system for a Mars rover. Also discussed are optical flow and three-dimensional information for navigation, feature-based reasoning trail detection, a symbolic neural-net production system for obstacle avoidance and navigation, intelligent path planning for robot navigation in an unknown environment, behaviors from a hierarchical control system, stereoscopic TV systems, the REACT language for autonomous robots, and a man-amplifying exoskeleton.

  9. An Architecture for Autonomous Rovers on Future Planetary Missions

    NASA Astrophysics Data System (ADS)

    Ocon, J.; Avilés, M.; Graziano, M.

    2018-04-01

    This paper proposes an architecture for autonomous planetary rovers. This architecture combines a set of characteristics required in this type of system: a high level of abstraction, reactive event-based activity execution, and autonomous navigation.

  10. The Integration, Testing and Flight of the EO-1 GPS

    NASA Technical Reports Server (NTRS)

    Quinn, David A.; Sanneman, Paul A.; Shulman, Seth E.; Sager, Jennifer A.

    2001-01-01

    The Global Positioning System has long been hailed as the wave of the future for autonomous on-board navigation of low Earth orbiting spacecraft, despite the fact that relatively few spacecraft have actually employed it for this purpose. While several missions operated out of the Goddard Space Flight Center have flown GPS receivers on board, the New Millennium Program (NMP) Earth Orbiting-1 (EO-1) spacecraft is the first to employ GPS for active, autonomous on-board navigation. Since EO-1 was designed to employ GPS as its primary source of the navigation ephemeris, special care had to be taken during the integration phase of spacecraft construction to assure proper performance. This paper is a discussion of that process: a brief overview of how the GPS works, how it fits into the design of the EO-1 Attitude Control System (ACS), the steps taken to integrate the system into the EO-1 spacecraft, the on-orbit performance during launch and early operations of the EO-1 mission, and the performance of the on-board GPS ephemeris versus the ground-based ephemeris. Conclusions include a discussion of the lessons learned.

  11. Cobalt: Development and Maturation of GN&C Technologies for Precision Landing

    NASA Technical Reports Server (NTRS)

    Carson, John M.; Restrepo, Carolina; Seubert, Carl; Amzajerdian, Farzin

    2016-01-01

    The CoOperative Blending of Autonomous Landing Technologies (COBALT) instrument is a terrestrial test platform for development and maturation of guidance, navigation and control (GN&C) technologies for precision landing. The project is developing a third-generation Langley Research Center (LaRC) navigation doppler lidar (NDL) for ultra-precise velocity and range measurements, which will be integrated and tested with the Jet Propulsion Laboratory (JPL) lander vision system (LVS) for terrain relative navigation (TRN) position estimates. These technologies together provide precise navigation knowledge that is critical for a controlled and precise touchdown. The COBALT hardware will be integrated in 2017 into the GN&C subsystem of the Xodiac rocket-propulsive vertical test bed (VTB) developed by Masten Space Systems, and two terrestrial flight campaigns will be conducted: one open-loop (i.e., passive) and one closed-loop (i.e., active).

  12. Terrain matching image pre-process and its format transform in autonomous underwater navigation

    NASA Astrophysics Data System (ADS)

    Cao, Xuejun; Zhang, Feizhou; Yang, Dongkai; Yang, Bogang

    2007-06-01

    Underwater passive navigation technology is one of the important development directions in modern navigation. With its advantages of high autonomy, stealth at sea, anti-jamming capability and high precision, passive navigation fully meets practical navigation requirements; it has therefore become a standard navigating method for underwater vehicles and has attracted increasing attention from researchers in the field. Underwater passive navigation can provide accurate navigation information, such as position and velocity, to the main Inertial Navigation System (INS) over long periods. With the development of micro-electronics technology, AUV navigation is typically built around an INS aided by other methods, such as terrain matching navigation, which provides long-duration navigation capability, corrects the errors of the INS, and removes the need for the AUV to surface periodically. With terrain matching navigation, assisted by digital charts and ocean geographical-characteristic sensors, underwater image matching can be carried out to obtain higher positioning precision, satisfying the underwater, long-term, high-precision and all-weather requirements of a navigation system for Autonomous Underwater Vehicles. Terrain-assistant navigation (TAN) relies directly on image (map) information to assist the primary navigation system along a path appointed in advance. In TAN, a factor as important as system operation is the precision and practicability of the stored images and of the database from which the image data are produced; if the data used for feature extraction are unsuitable, the navigation precision of the system will be low. Compared with terrain matching assistant navigation, image matching navigation is a high-precision, low-cost assistant navigation approach, and its matching precision directly influences the final precision of the integrated navigation system. Image matching assistant navigation spatially registers two underwater scene images of the same scenery acquired by two different sensors in order to determine the relative displacement between them. In this way the vehicle's location can be obtained in the fiducial image, whose geographical relation is known, and the precise location information from image matching is fed back to the INS to eliminate its location error and greatly enhance the navigation precision of the vehicle. Digital image data analysis and processing for image matching in underwater passive navigation is therefore important. Regarding underwater geographic data analysis, we focus on the acquisition, processing, analysis, representation and measurement of database information. These analyses form one of the important components of underwater terrain matching and help characterize the seabed terrain configuration of the navigation areas, so that the most favourable seabed terrain districts and a dependable navigation algorithm can be selected, improving the precision and reliability of the terrain-assistant navigation system. The pre-processing and format transformation of digital images during underwater image matching are described in this paper. The terrain conditions in the navigation areas require further study to provide reliable terrain-characteristic and underwater-coverage data for navigation.
    Through sea-route selection, danger-district prediction and navigation algorithm analysis, TAN can achieve higher positioning precision and matching probability, thereby providing technological support for image matching in underwater passive navigation.
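
    To make the image-matching step concrete, the sketch below estimates the relative displacement between a sensed underwater image patch and a geo-referenced (fiducial) image by maximizing the zero-mean normalized cross-correlation over integer shifts. It is a minimal illustration only, not the algorithm described in the record; the function name and the brute-force search are assumptions, and a practical system would use multi-resolution or FFT-based correlation and robust outlier handling.

```python
import numpy as np

def ncc_displacement(reference: np.ndarray, sensed: np.ndarray):
    """Estimate the (row, col) offset of `sensed` inside `reference` by
    maximizing the zero-mean normalized cross-correlation (NCC)."""
    H, W = reference.shape
    h, w = sensed.shape
    s = sensed - sensed.mean()
    s_norm = np.linalg.norm(s)
    best_score, best_offset = -np.inf, (0, 0)
    for r in range(H - h + 1):               # exhaustive search over shifts
        for c in range(W - w + 1):
            window = reference[r:r + h, c:c + w]
            wz = window - window.mean()
            denom = np.linalg.norm(wz) * s_norm
            if denom == 0.0:
                continue
            score = float((wz * s).sum() / denom)
            if score > best_score:
                best_score, best_offset = score, (r, c)
    return best_offset, best_score
```

    The peak location gives the offset of the sensed patch inside the reference map; converting that offset into geographic coordinates yields the position fix that would be fed back to the INS.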

  13. Communication and Control for Fleets of Autonomous Underwater Vehicles

    DTIC Science & Technology

    2006-10-30

    Washington State University (WSU) on fuzzy logic control systems [2-4] and autonomous vehicles [5-10]. The ALWSE-MC program developed at NAVSEA CSS was...rotating head sonar on crawlers as an additional sensor for navigation. We have previously investigated the use of video cameras on autonomous vehicles for...simulates autonomous vehicles performing mine reconnaissance/mapping, clearance, and surveillance in a littoral region. Three simulations were performed

  14. Acoustic Communications and Navigation for Mobile Under-Ice Sensors

    DTIC Science & Technology

    2017-02-04

    Final Report, 04/02/2017. Acoustic Communications and Navigation for Mobile Under-Ice Sensors. ...development and fielding of a new acoustic communications and navigation system for use on autonomous platforms (gliders and profiling floats) under the...contact below the ice. Subject terms: Arctic Ocean, Undersea Workstations & Vehicles, Signal Processing, Navigation, Underwater Acoustics

  15. A Dynamic Navigation Model for Unmanned Aircraft Systems and an Application to Autonomous Front-On Environmental Sensing and Photography Using Low-Cost Sensor Systems.

    PubMed

    Cooper, Andrew James; Redman, Chelsea Anne; Stoneham, David Mark; Gonzalez, Luis Felipe; Etse, Victor Kwesi

    2015-08-28

    This paper presents an unmanned aircraft system (UAS) that uses a probabilistic model for autonomous front-on environmental sensing or photography of a target. The system is based on low-cost and readily-available sensor systems in dynamic environments and with the general intent of improving the capabilities of dynamic waypoint-based navigation systems for a low-cost UAS. The behavioural dynamics of target movement are incorporated into the design of a Kalman filter and Markov model-based prediction algorithm. Geometrical concepts and the Haversine formula are applied to the maximum likelihood case in order to make a prediction regarding a future state of a target, thus delivering a new waypoint for autonomous navigation. The results of the application to aerial filming with low-cost UAS are presented, achieving the desired goal of a maintained front-on perspective without significant constraint to the route or pace of target movement.
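
    As a hedged illustration of the geometric step described above, the snippet below projects a target forward along its estimated bearing using spherical-Earth great-circle formulas (the Haversine distance and the destination-point formula) and returns the predicted latitude/longitude as a candidate waypoint. The function names, the constant-velocity assumption, and the spherical-Earth model are simplifications for illustration, not a reproduction of the paper's Kalman/Markov prediction.

```python
import math

R_EARTH = 6371000.0  # mean Earth radius [m]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R_EARTH * math.asin(math.sqrt(a))

def predict_waypoint(lat, lon, bearing_deg, speed_mps, dt_s):
    """Project the target forward along its current bearing for dt_s seconds
    and return the predicted lat/lon as the next navigation waypoint."""
    d = speed_mps * dt_s / R_EARTH          # angular distance [rad]
    brg = math.radians(bearing_deg)
    p1, l1 = math.radians(lat), math.radians(lon)
    p2 = math.asin(math.sin(p1) * math.cos(d) + math.cos(p1) * math.sin(d) * math.cos(brg))
    l2 = l1 + math.atan2(math.sin(brg) * math.sin(d) * math.cos(p1),
                         math.cos(d) - math.sin(p1) * math.sin(p2))
    return math.degrees(p2), math.degrees(l2)
```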

  16. Navigation of autonomous vehicles for oil spill cleaning in dynamic and uncertain environments

    NASA Astrophysics Data System (ADS)

    Jin, Xin; Ray, Asok

    2014-04-01

    In the context of oil spill cleaning by autonomous vehicles in dynamic and uncertain environments, this paper presents a multi-resolution algorithm that seamlessly integrates the concepts of local navigation and global navigation based on the sensory information; the objective here is to enable adaptive decision making and online replanning of vehicle paths. The proposed algorithm provides a complete coverage of the search area for clean-up of the oil spills and does not suffer from the problem of having local minima, which is commonly encountered in potential-field-based methods. The efficacy of the algorithm is tested on a high-fidelity player/stage simulator for oil spill cleaning in a harbour, where the underlying oil weathering process is modelled as 2D random-walk particle tracking. A preliminary version of this paper was presented by X. Jin and A. Ray as 'Coverage Control of Autonomous Vehicles for Oil Spill Cleaning in Dynamic and Uncertain Environments', Proceedings of the American Control Conference, Washington, DC, June 2013, pp. 2600-2605.
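
    The oil-weathering surrogate mentioned above can be illustrated with a minimal 2D random-walk particle tracker: each particle drifts with an assumed mean current and diffuses with an isotropic turbulent term. The parameter names and values below are placeholders for illustration, not those used in the paper's player/stage simulation.

```python
import numpy as np

def random_walk_particles(n_particles=1000, n_steps=200, dt=1.0,
                          diffusion=0.5, drift=(0.05, 0.0), seed=0):
    """Propagate oil-spill surrogate particles with a 2D random walk:
    deterministic drift (e.g. current/wind) plus isotropic turbulent diffusion."""
    rng = np.random.default_rng(seed)
    pos = np.zeros((n_particles, 2))
    sigma = np.sqrt(2.0 * diffusion * dt)   # step std. dev. from the diffusivity
    for _ in range(n_steps):
        pos += np.asarray(drift) * dt + rng.normal(0.0, sigma, size=pos.shape)
    return pos
```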

  17. A Dynamic Navigation Model for Unmanned Aircraft Systems and an Application to Autonomous Front-On Environmental Sensing and Photography Using Low-Cost Sensor Systems

    PubMed Central

    Cooper, Andrew James; Redman, Chelsea Anne; Stoneham, David Mark; Gonzalez, Luis Felipe; Etse, Victor Kwesi

    2015-01-01

    This paper presents an unmanned aircraft system (UAS) that uses a probabilistic model for autonomous front-on environmental sensing or photography of a target. The system is based on low-cost and readily-available sensor systems in dynamic environments and with the general intent of improving the capabilities of dynamic waypoint-based navigation systems for a low-cost UAS. The behavioural dynamics of target movement are incorporated into the design of a Kalman filter and Markov model-based prediction algorithm. Geometrical concepts and the Haversine formula are applied to the maximum likelihood case in order to make a prediction regarding a future state of a target, thus delivering a new waypoint for autonomous navigation. The results of the application to aerial filming with low-cost UAS are presented, achieving the desired goal of a maintained front-on perspective without significant constraint to the route or pace of target movement. PMID:26343680

  18. Drogue detection for vision-based autonomous aerial refueling via low rank and sparse decomposition with multiple features

    NASA Astrophysics Data System (ADS)

    Gao, Shibo; Cheng, Yongmei; Song, Chunhua

    2013-09-01

    Vision-based probe-and-drogue autonomous aerial refueling is a demanding task in modern aviation for both manned and unmanned aircraft. A key issue is to accurately determine the relative orientation and position of the drogue and the probe for the relative navigation system during the approach phase, which requires locating the drogue precisely. Drogue detection is challenging because of the disorderly motion of the drogue caused by both the tanker wake vortex and atmospheric turbulence. In this paper, the problem of drogue detection is treated as a moving-object detection problem, and a drogue detection algorithm based on low-rank and sparse decomposition with local multiple features is proposed. The global and local information of the drogue is introduced into the detection model in a unified way. Experimental results on real autonomous aerial refueling videos show that the proposed drogue detection algorithm is effective.
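
    As a rough sketch of the low-rank-plus-sparse idea (not the authors' multi-feature formulation), the code below models the slowly varying background of an image sequence with a truncated SVD and treats large residuals as the sparse, moving foreground (e.g., a drogue candidate). A principled implementation would solve a robust PCA problem; the rank and threshold below are illustrative assumptions.

```python
import numpy as np

def low_rank_sparse_split(frames, rank=1, thresh=3.0):
    """Split a video volume (T, H, W) into a low-rank background and a
    sparse foreground mask via truncated SVD on the frame matrix."""
    T, H, W = frames.shape
    X = frames.reshape(T, H * W).astype(float)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    L = (U[:, :rank] * s[:rank]) @ Vt[:rank]      # low-rank background
    S = X - L                                     # residual (candidate foreground)
    mask = np.abs(S) > thresh * S.std()           # crude sparsity threshold
    return L.reshape(T, H, W), mask.reshape(T, H, W)
```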

  19. 77 FR 27202 - 36(b)(1) Arms Sales Notification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-09

    ... includes: Electronic Warfare Systems, Command, Control, Communication, Computers and Intelligence/Communication, Navigational and Identifications (C4I/CNI), Autonomic Logistics Global Support System (ALGS... Systems, Command, Control, Communication, Computers and Intelligence/Communication, Navigational and...

  20. RAIM availability for supplemental GPS navigation

    DOT National Transportation Integrated Search

    1992-06-29

    This paper examines GPS receiver autonomous integrity monitoring (RAIM) availability for supplemental navigation based on the approximate radial-error protection (ARP) method. This method applies ceiling levels for the ARP figure of merit to screen o...

  1. Guidance and Navigation for Rendezvous and Proximity Operations with a Non-Cooperative Spacecraft at Geosynchronous Orbit

    NASA Technical Reports Server (NTRS)

    Barbee, Brent William; Carpenter, J. Russell; Heatwole, Scott; Markley, F. Landis; Moreau, Michael; Naasz, Bo J.; VanEepoel, John

    2010-01-01

    The feasibility and benefits of various spacecraft servicing concepts are currently being assessed, and all require that the servicer spacecraft perform rendezvous, proximity, and capture operations with the target spacecraft to be serviced. Many high-value spacecraft, which would be logical targets for servicing from an economic point of view, are located in geosynchronous orbit, a regime in which autonomous rendezvous and capture operations are not commonplace. Furthermore, existing GEO spacecraft were not designed to be serviced. Most do not have cooperative relative navigation sensors or docking features, and some servicing applications, such as de-orbiting of a non-functional spacecraft, entail rendezvous and capture with a spacecraft that may be non-functional or un-controlled. Several of these challenges have been explored via the design of a notional mission in which a nonfunctional satellite in geosynchronous orbit is captured by a servicer spacecraft and boosted into super-synchronous orbit for safe disposal. A strategy for autonomous rendezvous, proximity operations, and capture is developed, and the Orbit Determination Toolbox (ODTBX) is used to perform a relative navigation simulation to assess the feasibility of performing the rendezvous using a combination of angles-only and range measurements. Additionally, a method for designing efficient orbital rendezvous sequences for multiple target spacecraft is utilized to examine the capabilities of a servicer spacecraft to service multiple targets during the course of a single mission.

  2. Geomagnetic Navigation of Autonomous Underwater Vehicle Based on Multi-objective Evolutionary Algorithm.

    PubMed

    Li, Hong; Liu, Mingyong; Zhang, Feihu

    2017-01-01

    This paper presents a multi-objective evolutionary algorithm for bio-inspired geomagnetic navigation of an Autonomous Underwater Vehicle (AUV). Inspired by biological navigation behavior, the solution is obtained without a priori information, simply by magnetotaxis searching. However, geomagnetic anomalies have a significant influence on the geomagnetic navigation system, since they disrupt the distribution of the geomagnetic field. An extreme-value region may easily appear in abnormal regions, which can leave the AUV lost during the navigation phase. This paper proposes an improved bio-inspired algorithm with behavior constraints that allows the AUV to escape from the abnormal region. First, the navigation problem is posed as an optimization problem. Second, an environmental monitoring operator is introduced to determine whether the algorithm has fallen into a geomagnetic anomaly region. Then, a behavior constraint operator is employed to get out of the abnormal region. Finally, the termination condition is triggered. Compared to the state of the art, the proposed approach effectively overcomes the disturbance of geomagnetic anomalies. The simulation results demonstrate the reliability and feasibility of the proposed approach in complex environments.
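
    The sketch below is a toy, single-objective caricature of magnetotaxis search with an anomaly monitor, intended only to illustrate the two operators described above (environmental monitoring, and a behavior constraint that forces escape from a suspected anomaly region); it is not the paper's multi-objective evolutionary algorithm. The field model, candidate headings, and stall test are assumptions.

```python
import numpy as np

def magnetotaxis_step(pos, field_fn, target_field, candidates, history,
                      stall_window=10, stall_tol=1e-3, rng=None):
    """One step of a magnetotaxis-style homing loop.

    field_fn(pos) -> measured geomagnetic components at a position;
    target_field  -> field components at the goal;
    candidates    -> list of candidate displacement vectors (headings).
    Moves toward the candidate that most reduces the field mismatch; a simple
    stagnation monitor (proxy for an anomaly region) triggers a random escape."""
    rng = rng or np.random.default_rng()
    cost = lambda p: float(np.linalg.norm(field_fn(p) - target_field))
    costs = [cost(pos + c) for c in candidates]
    best = int(np.argmin(costs))
    history.append(costs[best])
    stalled = (len(history) >= stall_window and
               max(history[-stall_window:]) - min(history[-stall_window:]) < stall_tol)
    if stalled:                       # behavior constraint: leave the region
        best = int(rng.integers(len(candidates)))
        history.clear()
    return pos + candidates[best], stalled
```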

  3. Geomagnetic Navigation of Autonomous Underwater Vehicle Based on Multi-objective Evolutionary Algorithm

    PubMed Central

    Li, Hong; Liu, Mingyong; Zhang, Feihu

    2017-01-01

    This paper presents a multi-objective evolutionary algorithm for bio-inspired geomagnetic navigation of an Autonomous Underwater Vehicle (AUV). Inspired by biological navigation behavior, the solution is obtained without a priori information, simply by magnetotaxis searching. However, geomagnetic anomalies have a significant influence on the geomagnetic navigation system, since they disrupt the distribution of the geomagnetic field. An extreme-value region may easily appear in abnormal regions, which can leave the AUV lost during the navigation phase. This paper proposes an improved bio-inspired algorithm with behavior constraints that allows the AUV to escape from the abnormal region. First, the navigation problem is posed as an optimization problem. Second, an environmental monitoring operator is introduced to determine whether the algorithm has fallen into a geomagnetic anomaly region. Then, a behavior constraint operator is employed to get out of the abnormal region. Finally, the termination condition is triggered. Compared to the state of the art, the proposed approach effectively overcomes the disturbance of geomagnetic anomalies. The simulation results demonstrate the reliability and feasibility of the proposed approach in complex environments. PMID:28747884

  4. Research of autonomous celestial navigation based on new measurement model of stellar refraction

    NASA Astrophysics Data System (ADS)

    Yu, Cong; Tian, Hong; Zhang, Hui; Xu, Bo

    2014-09-01

    Autonomous celestial navigation based on stellar refraction has attracted widespread attention for its high accuracy and full autonomy. In this navigation method, establishing an accurate stellar refraction measurement model is the foundation and the key to achieving high-accuracy navigation. However, existing measurement models are limited by the uncertainty of atmospheric parameters. Temperature, pressure and other factors that affect stellar refraction within the Earth's stratosphere are studied, and a model of atmospheric variation with altitude is derived from standard atmospheric data. Furthermore, a novel measurement model of stellar refraction over a continuous range of altitudes from 20 km to 50 km is produced by modifying the fixed-altitude (25 km) measurement model; a state equation including orbit perturbations is established, and a simulation is performed using an improved Extended Kalman Filter. The results show that the new model improves the navigation accuracy and has practical application value.
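
    The filtering step can be illustrated with a generic extended Kalman filter measurement update, where the measurement z would be an apparent refraction angle and h(x) the chosen refraction model; the concrete measurement model and Jacobian of the paper are not reproduced here, so the interface below is an assumption.

```python
import numpy as np

def ekf_update(x, P, z, h_fn, H_jac, R):
    """Generic extended Kalman filter measurement update.

    x, P   : prior state estimate and covariance
    z      : measurement (e.g. an apparent stellar refraction angle)
    h_fn   : nonlinear measurement model h(x)
    H_jac  : function returning the Jacobian of h at x
    R      : measurement noise covariance"""
    H = H_jac(x)
    y = z - h_fn(x)                      # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```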

  5. Development of Mission Enabling Infrastructure — Cislunar Autonomous Positioning System (CAPS)

    NASA Astrophysics Data System (ADS)

    Cheetham, B. W.

    2017-10-01

    Advanced Space, LLC is developing the Cislunar Autonomous Positioning System (CAPS) which would provide a scalable and evolvable architecture for navigation to reduce ground congestion and improve operations for missions throughout cislunar space.

  6. DEMONSTRATION OF AUTONOMOUS AIR MONITORING THROUGH ROBOTICS

    EPA Science Inventory

    This project included modifying an existing teleoperated robot to include autonomous navigation, large object avoidance, and air monitoring and demonstrating that prototype robot system in indoor and outdoor environments. An existing teleoperated "Surveyor" robot developed by ARD...

  7. Telecommunications, navigation and information management concept overview for the Space Exploration Initiative program

    NASA Technical Reports Server (NTRS)

    Bell, Jerome A.; Stephens, Elaine; Barton, Gregg

    1991-01-01

    An overview is provided of the Space Exploration Initiative (SEI) concepts for telecommunications, information systems, and navigation (TISN), and engineering and architecture issues are discussed. The SEI program data system is reviewed to identify mission TISN interfaces, and reference TISN concepts are described for nominal, degraded, and mission-critical data services. The infrastructures reviewed include telecommunications for robotics support, autonomous navigation without earth-based support, and information networks for tracking and data acquisition. Four options for TISN support architectures are examined which relate to unique SEI exploration strategies. Detailed support estimates are given for: (1) a manned stay on Mars; (2) permanent lunar and Martian settlements; (3) short-duration missions; and (4) systematic exploration of the moon and Mars.

  8. Autonomous Flight Safety System

    NASA Technical Reports Server (NTRS)

    Simpson, James

    2010-01-01

    The Autonomous Flight Safety System (AFSS) is an independent, self-contained subsystem mounted onboard a launch vehicle. AFSS has been developed by and is owned by the US Government. It autonomously makes flight termination/destruct decisions using configurable software-based rules implemented on redundant flight processors, using data from redundant GPS/IMU navigation sensors. AFSS implements rules determined by the appropriate Range Safety officials.

  9. Experiments in teleoperator and autonomous control of space robotic vehicles

    NASA Technical Reports Server (NTRS)

    Alexander, Harold L.

    1990-01-01

    A research program and strategy are described which include fundamental teleoperation issues and autonomous-control issues of sensing and navigation for satellite robots. The program consists of developing interfaces for visual operation and studying the consequences of interface designs as well as developing navigation and control technologies based on visual interaction. A space-robot-vehicle simulator is under development for use in virtual-environment teleoperation experiments and neutral-buoyancy investigations. These technologies can be utilized in a study of visual interfaces to address tradeoffs between head-tracking and manual remote cameras, panel-mounted and helmet-mounted displays, and stereoscopic and monoscopic display systems. The present program can provide significant data for the development of control experiments for autonomously controlled satellite robots.

  10. Development of a GPS/INS/MAG navigation system and waypoint navigator for a VTOL UAV

    NASA Astrophysics Data System (ADS)

    Meister, Oliver; Mönikes, Ralf; Wendel, Jan; Frietsch, Natalie; Schlaile, Christian; Trommer, Gert F.

    2007-04-01

    Unmanned aerial vehicles (UAVs) can be used for versatile surveillance and reconnaissance missions. If a UAV is capable of flying automatically on a predefined path, the range of possible applications is widened significantly. This paper addresses the development of an integrated GPS/INS/MAG navigation system and a waypoint navigator for a small vertical take-off and landing (VTOL) unmanned four-rotor helicopter with a take-off weight below 1 kg. The core of the navigation system consists of low-cost inertial sensors which are continuously aided with GPS, magnetometer compass, and barometric height information. Because the yaw angle becomes unobservable during hovering flight, the integration with a magnetic compass is mandatory. This integration must be robust with respect to errors caused by terrestrial magnetic field deviation and interference from surrounding electronic devices as well as ferrous metals. The described integration concept with a Kalman filter overcomes the problem that erroneous magnetic measurements lead to an attitude error in the roll and pitch axes. The algorithm provides long-term stable navigation information even during GPS outages, which is mandatory for the flight control of the UAV. In the second part of the paper the guidance algorithms are discussed in detail. These algorithms allow the UAV to operate in a semi-autonomous position-hold mode as well as a completely autonomous waypoint mode. In the position-hold mode the helicopter maintains its position regardless of wind disturbances, which eases the pilot's job during hold-and-stare missions. The autonomous waypoint navigator enables flight outside the range of vision and beyond the range of the radio link. Flight test results of the implemented modes of operation are shown.
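
    As a hedged illustration of the magnetometer aiding discussed above, the snippet below computes a tilt-compensated heading from body-frame magnetometer components and the current roll/pitch estimate, which is the kind of quantity typically fed to the yaw channel of such a Kalman filter. The axis and sign conventions (a common NED aerospace convention) and the function name are assumptions; they must be adapted to the actual sensor mounting and calibration.

```python
import numpy as np

def tilt_compensated_heading(mag_xyz, roll, pitch, declination=0.0):
    """Heading (yaw) from a body-frame magnetometer, compensated for roll and
    pitch (radians); returns heading in [0, 2*pi), east-positive from north."""
    mx, my, mz = mag_xyz
    # rotate the magnetic vector into the local horizontal plane
    xh = (mx * np.cos(pitch)
          + my * np.sin(roll) * np.sin(pitch)
          + mz * np.cos(roll) * np.sin(pitch))
    yh = my * np.cos(roll) - mz * np.sin(roll)
    heading = np.arctan2(-yh, xh) + declination   # add local magnetic declination
    return (heading + 2.0 * np.pi) % (2.0 * np.pi)
```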

  11. Cheap or Robust? The practical realization of self-driving wheelchair technology.

    PubMed

    Burhanpurkar, Maya; Labbe, Mathieu; Guan, Charlie; Michaud, Francois; Kelly, Jonathan

    2017-07-01

    To date, self-driving experimental wheelchair technologies have been either inexpensive or robust, but not both. Yet, in order to achieve real-world acceptance, both qualities are fundamentally essential. We present a unique approach to achieve inexpensive and robust autonomous and semi-autonomous assistive navigation for existing fielded wheelchairs, of which there are approximately 5 million units in Canada and the United States alone. Our prototype wheelchair platform is capable of localization and mapping, as well as robust obstacle avoidance, using only a commodity RGB-D sensor and wheel odometry. As a specific example of the navigation capabilities, we focus on the single most common navigation problem: the traversal of narrow doorways in arbitrary environments. The software we have developed is generalizable to corridor following, desk docking, and other navigation tasks that are either extremely difficult or impossible for people with upper-body mobility impairments.

  12. Local navigation and fuzzy control realization for autonomous guided vehicle

    NASA Astrophysics Data System (ADS)

    El-Konyaly, El-Sayed H.; Saraya, Sabry F.; Shehata, Raef S.

    1996-10-01

    This paper addresses the problem of local navigation for an autonomous guided vehicle (AGV) in a structured environment that contains static and dynamic obstacles. Information about the environment is obtained via a CCD camera. The problem is formulated as a dynamic feedback control problem in which speed and steering decisions are made on the fly while the AGV is moving. A decision element (DE) that uses local information is proposed. The DE guides the vehicle in the environment by producing appropriate navigation decisions. Dynamic models of a three-wheeled vehicle for driving and steering mechanisms are derived. The interaction between them is performed via the local feedback DE. A controller, based on fuzzy logic, is designed to drive the vehicle safely in an intelligent and human-like manner. The effectiveness of the navigation and control strategies in driving the AGV is illustrated and evaluated.
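
    A minimal, hypothetical Mamdani-style fragment of such a fuzzy steering rule base is sketched below: triangular memberships on obstacle bearing feed three rules whose weighted consequents are combined into a normalized steering command. The membership ranges and rule consequents are invented for illustration and are not taken from the paper.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b on support [a, c]."""
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def fuzzy_steering(obstacle_bearing_deg):
    """Toy rule base: steer away from the side the obstacle is on.
    Output is a normalized steering command in [-1 (hard left), +1 (hard right)];
    returns ~0 (go straight) when no rule fires."""
    left   = tri(obstacle_bearing_deg, -90.0, -45.0,  0.0)   # obstacle on the left
    center = tri(obstacle_bearing_deg, -20.0,   0.0, 20.0)   # obstacle ahead
    right  = tri(obstacle_bearing_deg,   0.0,  45.0, 90.0)   # obstacle on the right
    weights = np.array([left, center, right])
    actions = np.array([+1.0, +0.7, -1.0])   # rule consequents (singletons)
    return float((weights @ actions) / (weights.sum() + 1e-9))
```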

  13. Development and Evaluation of Positioning Systems for Autonomous Vehicle Navigation

    DTIC Science & Technology

    2001-12-01

    generation of autonomous vehicles to utilize NTV technology is built on a commercially-available vehicle built by ASV. The All-Purpose Remote Transport...larger scale, AFRL and CIMAR are involved in the development of a standard approach in the design and specification of autonomous vehicles being...1996. Shi92 Shin, D.H., Sanjiv, S., and Lee, J.J., “Explicit Path Tracking by Autonomous Vehicles ,” Robotica, 10, (1992), 69-87. Ste95

  14. Visual Requirements for Human Drivers and Autonomous Vehicles

    DOT National Transportation Integrated Search

    2016-03-01

    Identification of published literature between 1995 and 2013, focusing on determining the quantity and quality of visual information needed under both driving modes (i.e., human and autonomous) to navigate the road safely, especially as it pertains t...

  15. A reactive system for open terrain navigation: Performance and limitations

    NASA Technical Reports Server (NTRS)

    Langer, D.; Rosenblatt, J.; Hebert, M.

    1994-01-01

    We describe a core system for autonomous navigation in outdoor natural terrain. The system consists of three parts: a perception module which processes range images to identify untraversable regions of the terrain, a local map management module which maintains a representation of the environment in the vicinity of the vehicle, and a planning module which issues commands to the vehicle controller. Our approach relies on the concept of 'early traversability evaluation' and on the use of reactive planning for generating commands to drive the vehicle. We argue that this approach leads to a robust and efficient navigation system. We illustrate our approach with an experiment in which a vehicle travelled autonomously for one kilometer through unmapped cross-country terrain.
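
    The 'early traversability evaluation' idea can be sketched as a simple per-cell test on an elevation grid derived from range images: a cell is marked untraversable if the local slope or the step height to its neighbours exceeds vehicle-dependent thresholds. The thresholds, cell size, and grid representation below are assumptions for illustration, not the system's actual parameters.

```python
import numpy as np

def traversability_map(height_grid, cell_size=0.25, max_slope_deg=20.0, max_step=0.3):
    """Classify each cell of an elevation grid (metres) as traversable (True)
    or not, using local slope and step-height thresholds."""
    gz_y, gz_x = np.gradient(height_grid, cell_size)
    slope = np.degrees(np.arctan(np.hypot(gz_x, gz_y)))
    # step height: maximum absolute difference to the 8-neighbourhood
    pad = np.pad(height_grid, 1, mode='edge')
    H, W = height_grid.shape
    neigh = np.stack([pad[dy:dy + H, dx:dx + W]
                      for dy in range(3) for dx in range(3)
                      if not (dy == 1 and dx == 1)])
    step = np.abs(neigh - height_grid).max(axis=0)
    return (slope <= max_slope_deg) & (step <= max_step)
```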

  16. H∞ robust fault-tolerant controller design for an autonomous underwater vehicle's navigation control system

    NASA Astrophysics Data System (ADS)

    Cheng, Xiang-Qin; Qu, Jing-Yuan; Yan, Zhe-Ping; Bian, Xin-Qian

    2010-03-01

    In order to improve the security and reliability for autonomous underwater vehicle (AUV) navigation, an H∞ robust fault-tolerant controller was designed after analyzing variations in state-feedback gain. Operating conditions and the design method were then analyzed so that the control problem could be expressed as a mathematical optimization problem. This permitted the use of linear matrix inequalities (LMI) to solve for the H∞ controller for the system. When considering different actuator failures, these conditions were then also mathematically expressed, allowing the H∞ robust controller to solve for these events and thus be fault-tolerant. Finally, simulation results showed that the H∞ robust fault-tolerant controller could provide precise AUV navigation control with strong robustness.

  17. Technology initiatives for the autonomous guidance, navigation, and control of single and multiple satellites

    NASA Astrophysics Data System (ADS)

    Croft, John; Deily, John; Hartman, Kathy; Weidow, David

    1998-01-01

    In the twenty-first century, NASA envisions frequent low-cost missions to explore the solar system, observe the universe, and study our planet. To realize NASA's goal, the Guidance, Navigation, and Control Center (GNCC) at the Goddard Space Flight Center sponsors technology programs that enhance spacecraft performance, streamline processes and ultimately enable cheaper science. Our technology programs encompass control system architectures, sensor and actuator components, electronic systems, design and development of algorithms, embedded systems and space vehicle autonomy. Through collaboration with government, universities, non-profit organizations, and industry, the GNCC incrementally develops key technologies that conquer NASA's challenges. This paper presents an overview of several innovative technology initiatives for the autonomous guidance, navigation, and control (GN&C) of satellites.

  18. Reduction of User Interaction by Autonomy

    NASA Technical Reports Server (NTRS)

    Morfopoulos, Arin; McHenry, Michael; Matthies, Larry

    2006-01-01

    This paper describes experiments that quantify the improvement that autonomous behaviors enable in the amount of user interaction required to navigate a robot in urban environments. Many papers have discussed various ways to measure the absolute level of autonomy of a system; we measured the relative improvement of autonomous behaviors over teleoperation across multiple traverses of the same course. We performed four runs each on an 'easy' course and a 'hard' course, where half the runs were teleoperated and half used more autonomous behaviors. Statistics show 40-70% reductions in the amount of time the user interacts with the control station; however, with the behaviors tested, user attention remained on the control station even when the user was not interacting. Reducing the need for attention will require better obstacle detection and avoidance and better absolute position estimation.

  19. Terminal Homing for Autonomous Underwater Vehicle Docking

    DTIC Science & Technology

    2016-06-01

    underwater domain, accurate navigation. Above the water, light and electromagnetic signals travel well through air and space, mediums that allow for a... The use of docking stations for autonomous underwater vehicles (AUV) provides the ability to keep a vehicle on...

  20. Advancing Underwater Acoustic Communication for Autonomous Distributed Networks via Sparse Channel Sensing, Coding, and Navigation Support

    DTIC Science & Technology

    2014-09-30

    underwater acoustic communication technologies for autonomous distributed underwater networks, through innovative signal processing, coding, and... Advancing Underwater Acoustic Communication for Autonomous Distributed Networks via Sparse Channel Sensing, Coding, and... coding: 3) OFDM modulated dynamic coded cooperation in underwater acoustic channels; Localization, Networking, and Testbed: 4) On-demand

  1. Science Benefits of Onboard Spacecraft Navigation

    NASA Technical Reports Server (NTRS)

    Cangahuala, Al; Bhaskaran, Shyam; Owen, Bill

    2012-01-01

    Primitive bodies (asteroids and comets), which have remained relatively unaltered since their formation, are important targets for scientific missions that seek to understand the evolution of the solar system. Often the first step is to fly by these bodies with robotic spacecraft. The key to maximizing data returns from these flybys is to determine the spacecraft trajectory relative to the target body (in short, to navigate the spacecraft) with sufficient accuracy so that the target is guaranteed to be in the instruments' field of view. The most powerful navigation data in these scenarios are images taken by the spacecraft of the target against a known star field (onboard astrometry). Traditionally, the relative trajectory of the spacecraft must be estimated hours to days in advance using images collected by the spacecraft. This is because of (1) the long round-trip light times between the spacecraft and the Earth and (2) the time needed to downlink and process navigation data on the ground, make decisions based on the result, and build and uplink instrument pointing sequences from the results. The light time and processing time compromise navigation accuracy considerably, because there is not enough time to use more accurate data collected closer to the target; such data are more accurate because the angular capability of the onboard astrometry is essentially constant as the distance to the target decreases, resulting in better "plane-of-sky" knowledge of the target. Excellent examples of these timing limitations are high-speed comet encounters. Comets are difficult to observe up close; their orbits often limit scientists to brief, rapid flybys, and their coma further restricts viewers from seeing the nucleus in any detail, unless they can view the nucleus at close range. Comet nucleus details are typically discernible for much shorter durations than the round-trip light time to Earth, so robotic spacecraft must be able to perform onboard navigation. This onboard navigation can be accomplished through a self-contained system that, by eliminating light-time restrictions, dramatically improves the relative trajectory knowledge and control and subsequently increases the amount of quality data collected. Flybys are one-time events, so the system's underlying algorithms and software must be extremely robust. The autonomous software must also be able to cope with the unknown size, shape, and orientation of the previously unseen comet nucleus. Furthermore, algorithms must be reliable in the presence of imperfections and/or damage to onboard cameras accrued after many years of deep-space operations. The AutoNav operational flight software packages, developed by scientists at the Jet Propulsion Laboratory (JPL) under contract with NASA, meet all these requirements. They have been directly responsible for the successful encounters on all of NASA's close-up comet-imaging missions (see Figure 1). AutoNav is the only system to date that has autonomously tracked comet nuclei during encounters and performed autonomous interplanetary navigation. AutoNav has enabled five cometary flyby missions (Table 1) residing on four NASA spacecraft provided by three different spacecraft builders. Using this software, missions were able to process a combined total of nearly 1000 images previously unseen by humans.
    By eliminating the need to navigate spacecraft from Earth, the accuracy gained by AutoNav during flybys compared to ground-based navigation is about 1 order of magnitude in targeting and 2 orders of magnitude in time of flight. These benefits ensure that pointing errors do not compromise data gathered during flybys. In addition, these benefits can be applied to flybys of other solar system objects, flybys at much slower relative velocities, mosaic imaging campaigns, and other proximity activities (e.g., orbiting, hovering, and descent/ascent).

  2. A Simulation Study of a Speed Control System for Autonomous On-Road Operation of Automotive Vehicles.

    DTIC Science & Technology

    1987-06-01

    The study of human driving of automotive vehicles is an important aid to the development of viable autonomous vehicle navigation...of human driving which could provide some different insights into possible approaches to autonomous vehicle control. At the start of this work, it was...advanced work in the behavioral aspects of human driving. Research of this nature can have a significant impact on the development of autonomous vehicles

  3. Integrated INS/GPS Navigation from a Popular Perspective

    NASA Technical Reports Server (NTRS)

    Omerbashich, Mensur

    2002-01-01

    Inertial navigation, blended with other navigation aids, the Global Positioning System (GPS) in particular, has gained significance due to enhanced navigation and inertial reference performance and dissimilarity for fault tolerance and anti-jamming. Relatively new concepts based upon using Differential GPS (DGPS) blended with Inertial (and visual) Navigation Sensors (INS) offer the possibility of low-cost, autonomous aircraft landing. The FAA has decided to implement the system in a sophisticated form as a new standard navigation tool during this decade. There have been a number of new inertial sensor concepts in the recent past that emphasize increased accuracy of INS/GPS versus INS and reliability of navigation, as well as lower size and weight, and higher power, fault tolerance, and long life. The principles of GPS are not discussed; rather the attention is directed towards general concepts and comparative advantages. A short introduction to the problems faced in kinematics is presented. The intention is to relate the basic principles of kinematics to probably the most used navigation method of the future: INS/GPS. An example of the airborne INS is presented, with emphasis on how it works. A discussion of the error types and sources in navigation, and of the role of filters in optimal estimation of the errors, then follows. The main question this paper tries to answer is 'What are the benefits of the integration of INS and GPS, and how is this navigation concept of the future achieved in reality?' The main goal is to communicate the idea of what stands behind a modern navigation method.

  4. GNC architecture for autonomous robotic capture of a non-cooperative target: Preliminary concept design

    NASA Astrophysics Data System (ADS)

    Jankovic, Marko; Paul, Jan; Kirchner, Frank

    2016-04-01

    Recent studies of the space debris population in low Earth orbit (LEO) have concluded that certain regions have already reached a critical density of objects. This will eventually lead to a cascading process called the Kessler syndrome. The time may have come to seriously consider active debris removal (ADR) missions as the only viable way of preserving the space environment for future generations. Among all objects in the current environment, the SL-8 (Kosmos 3M second stage) rocket bodies (R/Bs) are some of the most suitable targets for future robotic ADR missions. However, to date, an autonomous relative navigation to and capture of a non-cooperative target has never been performed. Therefore, there is a need for more advanced, autonomous and modular systems that can cope with uncontrolled, tumbling objects. The guidance, navigation and control (GNC) system is one of the most critical of these. The main objective of this paper is to present a preliminary concept of a modular GNC architecture that should enable a safe and fuel-efficient capture of a known but uncooperative target, such as a Kosmos 3M R/B. In particular, the concept was developed with the most critical part of an ADR mission in mind, i.e. close-range proximity operations, and with state-of-the-art algorithms in the field of autonomous rendezvous and docking. Finally, a brief description is given of the hardware-in-the-loop (HIL) testing facility foreseen for the practical evaluation of the developed architecture.

  5. Application of parallelized software architecture to an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Shakya, Rahul; Wright, Adam; Shin, Young Ho; Momin, Orko; Petkovsek, Steven; Wortman, Paul; Gautam, Prasanna; Norton, Adam

    2011-01-01

    This paper presents improvements made to Q, an autonomous ground vehicle designed to participate in the Intelligent Ground Vehicle Competition (IGVC). For the 2010 IGVC, Q was upgraded with a new parallelized software architecture and a new vision processor. Improvements were made to the power system, reducing the number of batteries required for operation from six to one. In previous years, a single state machine was used to execute the bulk of processing activities including sensor interfacing, data processing, path planning, navigation algorithms and motor control. This inefficient approach led to poor software performance and made the software difficult to maintain or modify. For IGVC 2010, the team implemented a modular parallel architecture using the National Instruments (NI) LabVIEW programming language. The new architecture divides all the necessary tasks (motor control, navigation, sensor data collection, etc.) into well-organized components that execute in parallel, providing considerable flexibility and facilitating efficient use of processing power. Computer vision is used to detect white lines on the ground and determine their location relative to the robot. With the new vision processor and some optimization of the image processing algorithm used the previous year, two frames can be acquired and processed in 70 ms. With all these improvements, Q placed 2nd in the autonomous challenge.

  6. Relative receiver autonomous integrity monitoring for future GNSS-based aircraft navigation

    NASA Astrophysics Data System (ADS)

    Gratton, Livio Rafael

    The Global Positioning System (GPS) has enabled reliable, safe, and practical aircraft positioning for en-route and non-precision phases of flight for more than a decade. Intense research is currently devoted to extending the use of Global Navigation Satellite Systems (GNSS), including GPS, to precision approach and landing operations. In this context, this work is focused on the development, analysis, and verification of the concept of Relative Receiver Autonomous Integrity Monitoring (RRAIM) and its potential applications to precision approach navigation. RRAIM fault detection algorithms are developed, and associated mathematical bounds on position error are derived. These are investigated as possible solutions to some current key challenges in precision approach navigation, discussed below. Augmentation systems serving continent-size areas (like the Wide Area Augmentation System or WAAS) allow certain precision approach operations within the covered region. More and better satellites, with dual frequency capabilities, are expected to be in orbit in the mid-term future, which will potentially allow WAAS-like capabilities worldwide with a sparse ground station network. Two main challenges in achieving this goal are (1) ensuring that navigation fault detection functions are fast enough to alert worldwide users of hazardously misleading information, and (2) minimizing situations in which navigation is unavailable because the user's local satellite geometry is insufficient for safe position estimation. Local augmentation systems (implemented at individual airports, like the Local Area Augmentation System or LAAS) have the potential to allow precision approach and landing operations by providing precise corrections to user-satellite range measurements. An exception to these capabilities arises during ionospheric storms (caused by solar activity), when hazardous situations can exist with residual range errors several orders of magnitude higher than nominal. Until dual frequency civil GPS signals are available, the ability to provide integrity during ionospheric storms without excessive loss of availability is a major challenge. For all users, with or without augmentation, some situations cause short duration losses of satellites in view. Two examples are aircraft banking during turns and ionospheric scintillation. The loss of range signals can translate into gaps in good satellite geometry, and the resulting challenge is to ensure navigation continuity by bridging these gaps, while simultaneously maintaining high integrity. It is shown that the RRAIM methods developed in this research can be applied to mitigate each of these obstacles to safe and reliable precision aircraft navigation.
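
    For background, a conventional snapshot RAIM fault test (the starting point that RRAIM extends with carrier-phase relative measurements) can be written as a chi-square test on the least-squares pseudorange residuals, as sketched below. The noise sigma, false-alarm probability, and matrix shapes are illustrative assumptions, and SciPy is assumed to be available for the chi-square quantile.

```python
import numpy as np
from scipy.stats import chi2

def raim_residual_test(G, prefit_residuals, sigma=5.0, p_false_alarm=1e-5):
    """Snapshot RAIM-style fault detection on least-squares pseudorange residuals.

    G                : (n, 4) linearized geometry matrix (unit LOS vectors + clock column)
    prefit_residuals : (n,) measured-minus-predicted pseudoranges [m]
    Returns (fault_detected, test_statistic, threshold); needs n >= 5 satellites."""
    n = G.shape[0]
    # residuals after the least-squares fit (projection onto the parity space)
    S = np.eye(n) - G @ np.linalg.inv(G.T @ G) @ G.T
    w = S @ prefit_residuals
    test_stat = float(w @ w) / sigma**2
    threshold = chi2.ppf(1.0 - p_false_alarm, df=n - 4)
    return test_stat > threshold, test_stat, threshold
```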

  7. Libration Point Navigation Concepts Supporting the Vision for Space Exploration

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Folta, David C.; Moreau, Michael C.; Quinn, David A.

    2004-01-01

    This work examines the autonomous navigation accuracy achievable for a lunar exploration trajectory from a translunar libration point lunar navigation relay satellite, augmented by signals from the Global Positioning System (GPS). We also provide a brief analysis comparing the libration point relay to lunar orbit relay architectures, and discuss some issues of GPS usage for cis-lunar trajectories.

  8. Ground Simulation of an Autonomous Satellite Rendezvous and Tracking System Using Dual Robotic Systems

    NASA Technical Reports Server (NTRS)

    Trube, Matthew J.; Hyslop, Andrew M.; Carignan, Craig R.; Easley, Joseph W.

    2012-01-01

    A hardware-in-the-loop ground system was developed for simulating a robotic servicer spacecraft tracking a target satellite at short range. A relative navigation sensor package, "Argon", is mounted on the end-effector of a Fanuc 430 manipulator, which functions as the base platform of the robotic spacecraft servicer. Machine vision algorithms estimate the pose of the target spacecraft, mounted on a Rotopod R-2000 platform, and relay the solution to a simulation of the servicer spacecraft running in "Freespace", which performs guidance, navigation and control functions, integrates dynamics, and issues motion commands to the Fanuc platform controller so that it tracks the simulated servicer spacecraft. Results will be reviewed for several satellite motion scenarios at different ranges. Key words: robotics, satellite, servicing, guidance, navigation, tracking, control, docking.

  9. Navigation of military and space unmanned ground vehicles in unstructured terrains

    NASA Technical Reports Server (NTRS)

    Lescoe, Paul; Lavery, David; Bedard, Roger

    1991-01-01

    Development of unmanned vehicles for local navigation in terrains unstructured by humans is reviewed. Modes of navigation include teleoperation or remote control, computer assisted remote driving (CARD), and semiautonomous navigation (SAN). A first implementation of a CARD system was successfully tested using the Robotic Technology Test Vehicle developed by Jet Propulsion Laboratory. Stereo pictures were transmitted to a remotely located human operator, who performed the sensing, perception, and planning functions of navigation. A computer provided range and angle measurements and the path plan was transmitted to the vehicle which autonomously executed the path. This implementation is to be enhanced by providing passive stereo vision and a reflex control system for autonomously stopping the vehicle if blocked by an obstacle. SAN achievements include implementation of a navigation testbed on a six wheel, three-body articulated rover vehicle, development of SAN algorithms and code, integration of SAN software onto the vehicle, and a successful feasibility demonstration that represents a step forward towards the technology required for long-range exploration of the lunar or Martian surface. The vehicle includes a passive stereo vision system with real-time area-based stereo image correlation, a terrain matcher, a path planner, and a path execution planner.

  10. Autonomous Vision Navigation for Spacecraft in Lunar Orbit

    NASA Astrophysics Data System (ADS)

    Bader, Nolan A.

    NASA aims to achieve unprecedented navigational reliability for the first manned lunar mission of the Orion spacecraft in 2023. A technique for accomplishing this is to integrate autonomous feature tracking as an added means of improving position and velocity estimation. In this thesis, a template matching algorithm and optical sensor are tested onboard three simulated lunar trajectories using linear covariance techniques under various conditions. A preliminary characterization of the camera gives insight into its ability to determine azimuth and elevation angles to points on the surface of the Moon. A navigation performance analysis shows that an optical camera sensor can aid in decreasing position and velocity errors, particularly in a loss of communication scenario. Furthermore, it is found that camera quality and computational capability are driving factors affecting the performance of such a system.

  11. ARK: Autonomous mobile robot in an industrial environment

    NASA Technical Reports Server (NTRS)

    Nickerson, S. B.; Jasiobedzki, P.; Jenkin, M.; Jepson, A.; Milios, E.; Down, B.; Service, J. R. R.; Terzopoulos, D.; Tsotsos, J.; Wilkes, D.

    1994-01-01

    This paper describes research on the ARK (Autonomous Mobile Robot in a Known Environment) project. The technical objective of the project is to build a robot that can navigate in a complex industrial environment using maps with permanent structures. The environment is not altered in any way by adding easily identifiable beacons, and the robot relies on naturally occurring objects to use as visual landmarks for navigation. The robot is equipped with various sensors that can detect unmapped obstacles, landmarks and objects. In this paper we describe the robot's industrial environment, its architecture, a novel combined range and vision sensor, and our recent results in controlling the robot, in the real-time detection of objects using their color, and in the processing of the robot's range and vision sensor data for navigation.

  12. Ribbon networks for modeling navigable paths of autonomous agents in virtual environments.

    PubMed

    Willemsen, Peter; Kearney, Joseph K; Wang, Hongling

    2006-01-01

    This paper presents the Environment Description Framework (EDF) for modeling complex networks of intersecting roads and pathways in virtual environments. EDF represents information about the layout of streets and sidewalks, the rules that govern behavior on roads and walkways, and the locations of agents with respect to navigable structures. The framework serves as the substrate on which behavior programs for autonomous vehicles and pedestrians are built. Pathways are modeled as ribbons in space. The ribbon structure provides a natural coordinate frame for defining the local geometry of navigable surfaces. EDF includes a powerful runtime interface supported by robust and efficient code for locating objects on the ribbon network, for mapping between Cartesian and ribbon coordinates, and for determining behavioral constraints imposed by the environment.
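
    A minimal sketch of the Cartesian-to-ribbon mapping described above is given below: the ribbon centreline is approximated by a 2D polyline, and a point is mapped to (s, t), its arc length along the centreline and its signed lateral offset. The EDF runtime interface is certainly richer (curved ribbons, vertical profile, efficient spatial indexing); this brute-force projection and its function name are assumptions for illustration only.

```python
import numpy as np

def cartesian_to_ribbon(point, centerline):
    """Map a 2D Cartesian point to ribbon coordinates (s, t): arc length along
    a polyline centreline and signed lateral offset (positive to the left)."""
    p = np.asarray(point, dtype=float)
    pts = np.asarray(centerline, dtype=float)        # (n, 2) polyline vertices
    seg = pts[1:] - pts[:-1]
    seg_len = np.linalg.norm(seg, axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seg_len)])
    best = (np.inf, 0.0, 0.0)                        # (distance, s, t)
    for i, (a, d, L) in enumerate(zip(pts[:-1], seg, seg_len)):
        if L == 0.0:
            continue
        u = np.clip(np.dot(p - a, d) / (L * L), 0.0, 1.0)  # parameter along segment
        q = a + u * d                                       # closest point on segment
        r = p - q
        t = (d[0] * r[1] - d[1] * r[0]) / L                 # signed lateral offset
        dist = np.linalg.norm(r)
        if dist < best[0]:
            best = (dist, cum[i] + u * L, float(t))
    return best[1], best[2]
```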

  13. Underwater terrain-aided navigation system based on combination matching algorithm.

    PubMed

    Li, Peijuan; Sheng, Guoliang; Zhang, Xiaofei; Wu, Jingqiu; Xu, Baochun; Liu, Xing; Zhang, Yao

    2018-07-01

    Considering that a terrain-aided navigation (TAN) system based on the iterated closest contour point (ICCP) algorithm diverges easily when the error of the indicative track of the strapdown inertial navigation system (SINS) is large, a Kalman filter is adopted in the traditional ICCP algorithm: the difference between the matching result and the SINS output is used as the measurement of the Kalman filter, the cumulative error of the SINS is corrected in time by filter feedback correction, and the indicative track used in ICCP is improved. The mathematical model of the autonomous underwater vehicle (AUV) integrated navigation system and the observation model of TAN are built. A proper number of matching points is designated by comparing the simulation results for matching time and matching precision. Simulation experiments are carried out according to the ICCP algorithm and the mathematical model. It can be concluded from the simulation experiments that the navigation accuracy and stability are improved with the proposed combinational algorithm when a proper number of matching points is used. It is shown that the integrated navigation system is effective in prohibiting the divergence of the indicative track and can meet the underwater, long-term, high-precision requirements of a navigation system for autonomous underwater vehicles.

  14. Pulsar Timing and Its Application for Navigation and Gravitational Wave Detection

    NASA Astrophysics Data System (ADS)

    Becker, Werner; Kramer, Michael; Sesana, Alberto

    2018-02-01

    Pulsars are natural cosmic clocks. On long timescales they rival the precision of terrestrial atomic clocks. Using a technique called pulsar timing, the exact measurement of pulse arrival times allows a number of applications, ranging from testing theories of gravity to detecting gravitational waves. Also an external reference system suitable for autonomous space navigation can be defined by pulsars, using them as natural navigation beacons, not unlike the use of GPS satellites for navigation on Earth. By comparing pulse arrival times measured on-board a spacecraft with predicted pulse arrivals at a reference location (e.g. the solar system barycenter), the spacecraft position can be determined autonomously and with high accuracy everywhere in the solar system and beyond. We describe the unique properties of pulsars that suggest that such a navigation system will certainly have its application in future astronautics. We also describe the on-going experiments to use the clock-like nature of pulsars to "construct" a galactic-sized gravitational wave detector for low-frequency (f_GW ~ 10^{-9} to 10^{-7} Hz) gravitational waves. We present the current status and provide an outlook for the future.
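
    The navigation principle can be sketched in a few lines: to first order, each pulse time-of-arrival residual (measured on board minus that predicted at the reference location) equals the projection of the spacecraft's position offset onto the pulsar's line of sight, divided by the speed of light, so three or more pulsars yield the offset by linear least squares. The interface below is an assumption for illustration; a real system must also handle clock error, relativistic corrections, and pulse-phase ambiguities.

```python
import numpy as np

C = 299_792_458.0  # speed of light [m/s]

def position_correction_from_pulsars(unit_vectors, toa_residuals_s):
    """Estimate the 3D position offset of a spacecraft relative to a reference
    point (e.g. the solar-system barycentre) from pulse time-of-arrival residuals.

    Each residual dt_i ~= (n_i . dr) / c, where n_i is the unit vector towards
    pulsar i; with three or more pulsars, dr follows from linear least squares."""
    N = np.asarray(unit_vectors, dtype=float)     # (k, 3) pulsar directions
    y = C * np.asarray(toa_residuals_s)           # (k,) residuals scaled to metres
    dr, *_ = np.linalg.lstsq(N, y, rcond=None)
    return dr                                     # estimated position offset [m]
```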

  15. Navigation Architecture For A Space Mobile Network

    NASA Technical Reports Server (NTRS)

    Valdez, Jennifer E.; Ashman, Benjamin; Gramling, Cheryl; Heckler, Gregory W.; Carpenter, Russell

    2016-01-01

    The Tracking and Data Relay Satellite System (TDRSS) Augmentation Service for Satellites (TASS) is a proposed beacon service to provide a global, space-based GPS augmentation service based on the NASA Global Differential GPS (GDGPS) System. The TASS signal will be tied to the GPS time system and usable as an additional ranging and Doppler radiometric source. Additionally, it will provide data vital to autonomous navigation in the near Earth regime, including space weather information, TDRS ephemerides, Earth Orientation Parameters (EOP), and forward commanding capability. TASS benefits include enhancing situational awareness, enabling increased autonomy, and providing near real-time command access for user platforms. As NASA Headquarters Space Communication and Navigation Office (SCaN) begins to move away from a centralized network architecture and towards a Space Mobile Network (SMN) that allows for user initiated services, autonomous navigation will be a key part of such a system. This paper explores how a TASS beacon service enables the Space Mobile Networking paradigm, what a typical user platform would require, and provides an in-depth analysis of several navigation scenarios and operations concepts.

  16. Autonomous unmanned air vehicles (UAV) techniques

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Lee, Ting N.

    2007-04-01

    UAVs (Unmanned Air Vehicles) have great potential in civilian applications such as oil pipeline surveillance, precision farming, forest fire fighting, search and rescue, and border patrol. The related UAV industries can generate billions of dollars each year. However, a major roadblock to adopting UAVs is compliance with FAA (Federal Aviation Administration) and ATC (Air Traffic Control) regulations. In this paper, we review the latest technologies and research on UAV navigation and obstacle avoidance. We propose a system design of Jittering Mosaic Image Processing (JMIP) with stereo vision and optical flow to fulfill the functionalities of autonomous UAVs.

  17. Experiment D009: Simple navigation

    NASA Technical Reports Server (NTRS)

    Silva, R. M.; Jorris, T. R.; Vallerie, E. M., III

    1971-01-01

    Space position-fixing techniques have been investigated by collecting data on the observable phenomena of space flight that could be used to solve the problem of autonomous navigation by the use of optical data and manual computations to calculate the position of a spacecraft. After completion of the developmental and test phases, the product of the experiment would be a manual-optical technique of orbital space navigation that could be used as a backup to onboard and ground-based spacecraft-navigation systems.

  18. First Results from a Hardware-in-the-Loop Demonstration of Closed-Loop Autonomous Formation Flying

    NASA Technical Reports Server (NTRS)

    Gill, E.; Naasz, Bo; Ebinuma, T.

    2003-01-01

    A closed-loop system for the demonstration of formation flying technologies has been developed at NASA's Goddard Space Flight Center. Making use of a GPS signal simulator with a dual radio frequency outlet, the system includes two GPS space receivers as well as a powerful onboard navigation processor dedicated to the GPS-based guidance, navigation, and control of a satellite formation in real time. The closed-loop system allows realistic simulations of autonomous formation flying scenarios, enabling research in the fields of tracking and orbit control strategies for a wide range of applications. A sample scenario has been set up in which the autonomous transition of a satellite formation from an initial along-track separation of 800 m to a final distance of 100 m has been demonstrated. As a result, a typical control accuracy of about 5 m has been achieved, which proves the applicability of autonomous formation flying techniques to formations of satellites as close as 50 m.

  19. An onboard navigation system which fulfills Mars aerocapture guidance requirements

    NASA Technical Reports Server (NTRS)

    Brand, Timothy J.; Fuhry, Douglas P.; Shepperd, Stanley W.

    1989-01-01

    The development of a candidate autonomous onboard Mars approach navigation scheme capable of supporting aerocapture into Mars orbit is discussed. An aerocapture guidance and navigation system which can run independently of the pre-aerocapture navigation was used to define a preliminary set of accuracy requirements at entry interface. These requirements are used to evaluate the proposed pre-aerocapture navigation scheme. This scheme uses optical sightings on Deimos with a star tracker and an inertial measurement unit for instrumentation as a source of navigation information. Preliminary results suggest that the approach will adequately support aerocapture into Mars orbit.

  20. Robot navigation research using the HERMIES mobile robot

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnett, D.L.

    1989-01-01

    In recent years robot navigation has attracted much attention from researchers around the world. Not only are theoretical studies being simulated on sophisticated computers, but many mobile robots are now used as test vehicles for these theoretical studies. Various algorithms have been perfected for navigation in a known static environment, but navigation in an unknown and dynamic environment poses a much more challenging problem for researchers. Many different methodologies have been developed for autonomous robot navigation, but each methodology is usually restricted to a particular type of environment. One important research focus of the Center for Engineering Systems Advanced Research (CESAR) at Oak Ridge National Laboratory is autonomous navigation in unknown and dynamic environments using the series of HERMIES mobile robots. The research uses an expert system for high-level planning interfaced with C-coded routines for implementing the plans and for quick processing of data requested by the expert system. With this approach, the navigation is not restricted to one methodology, since the expert system can activate a rule module for the methodology best suited to the current situation. Rule modules can be added to the rule base as they are developed and tested. Modules are being developed or enhanced for navigating from a map, searching for a target, exploring, artificial potential-field navigation, navigation using edge detection, etc. This paper reports on the various rule modules and methods of navigation in use, or under development, at CESAR, using the HERMIES-IIB robot as a testbed. 13 refs., 5 figs., 1 tab.

  1. Autonomous navigation and control of a Mars rover

    NASA Technical Reports Server (NTRS)

    Miller, D. P.; Atkinson, D. J.; Wilcox, B. H.; Mishkin, A. H.

    1990-01-01

    A Mars rover will need to be able to navigate autonomously kilometers at a time. This paper outlines the sensing, perception, planning, and execution monitoring systems that are currently being designed for the rover. The sensing is based around stereo vision. The interpretation of the images use a registration of the depth map with a global height map provided by an orbiting spacecraft. Safe, low energy paths are then planned through the map, and expectations of what the rover's articulation sensors should sense are generated. These expectations are then used to ensure that the planned path is correctly being executed.

  2. The JPL roadmap for Deep Space navigation

    NASA Technical Reports Server (NTRS)

    Martin-Mur, Tomas J.; Abraham, Douglas S.; Berry, David; Bhaskaran, Shyam; Cesarone, Robert J.; Wood, Lincoln

    2006-01-01

    This paper reviews the tentative set of deep space missions that will be supported by NASA's Deep Space Mission System in the next twenty-five years, and extracts the driving set of navigation capabilities that these missions will require. There will be many challenges including the support of new mission navigation approaches such as formation flying and rendezvous in deep space, low-energy and low-thrust orbit transfers, precise landing and ascent vehicles, and autonomous navigation. Innovative strategies and approaches will be needed to develop and field advanced navigation capabilities.

  3. Helicopter Field Testing of NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) System fully Integrated with the Morpheus Vertical Test Bed Avionics

    NASA Technical Reports Server (NTRS)

    Epp, Chirold D.; Robertson, Edward A.; Ruthishauser, David K.

    2013-01-01

    The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project was chartered to develop and mature to a Technology Readiness Level (TRL) of six an autonomous system combining guidance, navigation and control with real-time terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. The ALHAT System must be capable of identifying and avoiding surface hazards to enable a safe and accurate landing to within tens of meters of designated and certified landing sites anywhere on a planetary surface under any lighting conditions. This is accomplished with the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN). The NASA plan for the ALHAT technology is to perform the TRL6 closed loop demonstration on the Morpheus Vertical Test Bed (VTB). The first Morpheus vehicle was lost in August of 2012 during free-flight testing at Kennedy Space Center (KSC), so the decision was made to perform a helicopter test of the integrated ALHAT System with the Morpheus avionics over the ALHAT planetary hazard field at KSC. The KSC helicopter tests included flight profiles approximating planetary approaches, with the entire ALHAT system interfaced with all appropriate Morpheus subsystems and operated in real-time. During these helicopter flights, the ALHAT system imaged the simulated lunar terrain constructed in FY2012 to support ALHAT/Morpheus testing at KSC. To the best of our knowledge, this represents the highest fidelity testing of a system of this kind to date. During this helicopter testing, two new Morpheus landers were under construction at the Johnson Space Center to support the objective of an integrated ALHAT/Morpheus free-flight demonstration. This paper provides an overview of this helicopter flight test activity, including results and lessons learned, and also provides an overview of recent integrated testing of ALHAT on the second Morpheus vehicle.

  4. Helicopter Field Testing of NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) System fully integrated with the Morpheus Vertical Test Bed Avionics

    NASA Technical Reports Server (NTRS)

    Rutishauser, David; Epp, Chirold; Robertson, Edward

    2013-01-01

    The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project was chartered to develop and mature to a Technology Readiness Level (TRL) of six an autonomous system combining guidance, navigation and control with real-time terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. The ALHAT System must be capable of identifying and avoiding surface hazards to enable a safe and accurate landing to within tens of meters of designated and certified landing sites anywhere on a planetary surface under any lighting conditions. This is accomplished with the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN). The NASA plan for the ALHAT technology is to perform the TRL6 closed loop demonstration on the Morpheus Vertical Test Bed (VTB). The first Morpheus vehicle was lost in August of 2012 during free-flight testing at Kennedy Space Center (KSC), so the decision was made to perform a helicopter test of the integrated ALHAT System with the Morpheus avionics over the ALHAT planetary hazard field at KSC. The KSC helicopter tests included flight profiles approximating planetary approaches, with the entire ALHAT system interfaced with all appropriate Morpheus subsystems and operated in real-time. During these helicopter flights, the ALHAT system imaged the simulated lunar terrain constructed in FY2012 to support ALHAT/Morpheus testing at KSC. To the best of our knowledge, this represents the highest fidelity testing of a system of this kind to date. During this helicopter testing, two new Morpheus landers were under construction at the Johnson Space Center to support the objective of an integrated ALHAT/Morpheus free-flight demonstration. This paper provides an overview of this helicopter flight test activity, including results and lessons learned, and also provides an overview of recent integrated testing of ALHAT on the second Morpheus vehicle.

  5. Novel Intersection Type Recognition for Autonomous Vehicles Using a Multi-Layer Laser Scanner.

    PubMed

    An, Jhonghyun; Choi, Baehoon; Sim, Kwee-Bo; Kim, Euntai

    2016-07-20

    There are several types of intersections such as merge-roads, diverge-roads, plus-shape intersections and two types of T-shape junctions in urban roads. When an autonomous vehicle encounters new intersections, it is crucial to recognize the types of intersections for safe navigation. In this paper, a novel intersection type recognition method is proposed for an autonomous vehicle using a multi-layer laser scanner. The proposed method consists of two steps: (1) static local coordinate occupancy grid map (SLOGM) building and (2) intersection classification. In the first step, the SLOGM is built relative to the local coordinate using the dynamic binary Bayes filter. In the second step, the SLOGM is used as an attribute for the classification. The proposed method is applied to a real-world environment and its validity is demonstrated through experimentation.

  6. Novel Intersection Type Recognition for Autonomous Vehicles Using a Multi-Layer Laser Scanner

    PubMed Central

    An, Jhonghyun; Choi, Baehoon; Sim, Kwee-Bo; Kim, Euntai

    2016-01-01

    There are several types of intersections such as merge-roads, diverge-roads, plus-shape intersections and two types of T-shape junctions in urban roads. When an autonomous vehicle encounters new intersections, it is crucial to recognize the types of intersections for safe navigation. In this paper, a novel intersection type recognition method is proposed for an autonomous vehicle using a multi-layer laser scanner. The proposed method consists of two steps: (1) static local coordinate occupancy grid map (SLOGM) building and (2) intersection classification. In the first step, the SLOGM is built relative to the local coordinate using the dynamic binary Bayes filter. In the second step, the SLOGM is used as an attribute for the classification. The proposed method is applied to a real-world environment and its validity is demonstrated through experimentation. PMID:27447640
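
    The first step described above, building a static local occupancy grid with a dynamic binary Bayes filter, is usually implemented as a per-cell log-odds update. The sketch below shows only that generic update; the grid size, the inverse-sensor-model probabilities, and the class and function names are illustrative assumptions, not details taken from the paper.

    ```python
    import numpy as np

    def logit(p):
        """Convert a probability to log-odds."""
        return np.log(p / (1.0 - p))

    class OccupancyGrid:
        """Minimal log-odds occupancy grid, roughly the kind of update a
        binary Bayes filter performs when building a local grid map."""

        def __init__(self, shape=(200, 200), p_prior=0.5):
            self.l_prior = logit(p_prior)
            self.logodds = np.full(shape, self.l_prior)

        def update_cell(self, ij, p_meas):
            """Fuse one inverse-sensor-model probability into a cell."""
            i, j = ij
            self.logodds[i, j] += logit(p_meas) - self.l_prior

        def probabilities(self):
            """Recover occupancy probabilities from the log-odds grid."""
            return 1.0 - 1.0 / (1.0 + np.exp(self.logodds))

    # Example: a laser return raises the hit cell and lowers a cell along the ray.
    grid = OccupancyGrid()
    grid.update_cell((50, 80), p_meas=0.9)   # cell containing the return
    grid.update_cell((50, 79), p_meas=0.3)   # free space in front of it
    ```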

  7. Autonomous Mars ascent and orbit rendezvous for earth return missions

    NASA Technical Reports Server (NTRS)

    Edwards, H. C.; Balmanno, W. F.; Cruz, Manuel I.; Ilgen, Marc R.

    1991-01-01

    The details of the assessment of autonomous Mars ascent and orbit rendezvous for Earth return missions are presented. Analyses addressing navigation system assessments, trajectory planning, targeting approaches, flight control guidance strategies, and performance sensitivities are included. Tradeoffs in the analysis and design process are discussed.

  8. Vision-Aided Autonomous Landing and Ingress of Micro Aerial Vehicles

    NASA Technical Reports Server (NTRS)

    Brockers, Roland; Ma, Jeremy C.; Matthies, Larry H.; Bouffard, Patrick

    2012-01-01

    Micro aerial vehicles have limited sensor suites and computational power. For reconnaissance tasks and to conserve energy, these systems need the ability to autonomously land at vantage points or enter buildings (ingress). But for autonomous navigation, information is needed to identify and guide the vehicle to the target. Vision algorithms can provide egomotion estimation and target detection using input from cameras that are easy to include in miniature systems.

  9. Towards Autonomous Modular UAV Missions: The Detection, Geo-Location and Landing Paradigm

    PubMed Central

    Kyristsis, Sarantis; Antonopoulos, Angelos; Chanialakis, Theofilos; Stefanakis, Emmanouel; Linardos, Christos; Tripolitsiotis, Achilles; Partsinevelos, Panagiotis

    2016-01-01

    Nowadays, various unmanned aerial vehicle (UAV) applications become increasingly demanding since they require real-time, autonomous and intelligent functions. Towards this end, in the present study, a fully autonomous UAV scenario is implemented, including the tasks of area scanning, target recognition, geo-location, monitoring, following and finally landing on a high speed moving platform. The underlying methodology includes AprilTag target identification through Graphics Processing Unit (GPU) parallelized processing, image processing and several optimized locations and approach algorithms employing gimbal movement, Global Navigation Satellite System (GNSS) readings and UAV navigation. For the experimentation, a commercial and a custom made quad-copter prototype were used, portraying a high and a low-computational embedded platform alternative. Among the successful targeting and follow procedures, it is shown that the landing approach can be successfully performed even under high platform speeds. PMID:27827883

  10. Towards Autonomous Modular UAV Missions: The Detection, Geo-Location and Landing Paradigm.

    PubMed

    Kyristsis, Sarantis; Antonopoulos, Angelos; Chanialakis, Theofilos; Stefanakis, Emmanouel; Linardos, Christos; Tripolitsiotis, Achilles; Partsinevelos, Panagiotis

    2016-11-03

    Nowadays, various unmanned aerial vehicle (UAV) applications become increasingly demanding since they require real-time, autonomous and intelligent functions. Towards this end, in the present study, a fully autonomous UAV scenario is implemented, including the tasks of area scanning, target recognition, geo-location, monitoring, following and finally landing on a high speed moving platform. The underlying methodology includes AprilTag target identification through Graphics Processing Unit (GPU) parallelized processing, image processing and several optimized locations and approach algorithms employing gimbal movement, Global Navigation Satellite System (GNSS) readings and UAV navigation. For the experimentation, a commercial and a custom made quad-copter prototype were used, portraying a high and a low-computational embedded platform alternative. Among the successful targeting and follow procedures, it is shown that the landing approach can be successfully performed even under high platform speeds.

  11. Autonomous Underwater Navigation and Optical Mapping in Unknown Natural Environments.

    PubMed

    Hernández, Juan David; Istenič, Klemen; Gracias, Nuno; Palomeras, Narcís; Campos, Ricard; Vidal, Eduard; García, Rafael; Carreras, Marc

    2016-07-26

    We present an approach for navigating in unknown environments while, simultaneously, gathering information for inspecting underwater structures using an autonomous underwater vehicle (AUV). To accomplish this, we first use our pipeline for mapping and planning collision-free paths online, which endows an AUV with the capability to autonomously acquire optical data in close proximity. With that information, we then propose a reconstruction pipeline to create a photo-realistic textured 3D model of the inspected area. These 3D models are also of particular interest to other fields of study in marine sciences, since they can serve as base maps for environmental monitoring, thus allowing change detection of biological communities and their environment over time. Finally, we evaluate our approach using the Sparus II, a torpedo-shaped AUV, conducting inspection missions in a challenging, real-world and natural scenario.

  12. Autonomous Underwater Navigation and Optical Mapping in Unknown Natural Environments

    PubMed Central

    Hernández, Juan David; Istenič, Klemen; Gracias, Nuno; Palomeras, Narcís; Campos, Ricard; Vidal, Eduard; García, Rafael; Carreras, Marc

    2016-01-01

    We present an approach for navigating in unknown environments while, simultaneously, gathering information for inspecting underwater structures using an autonomous underwater vehicle (AUV). To accomplish this, we first use our pipeline for mapping and planning collision-free paths online, which endows an AUV with the capability to autonomously acquire optical data in close proximity. With that information, we then propose a reconstruction pipeline to create a photo-realistic textured 3D model of the inspected area. These 3D models are also of particular interest to other fields of study in marine sciences, since they can serve as base maps for environmental monitoring, thus allowing change detection of biological communities and their environment over time. Finally, we evaluate our approach using the Sparus II, a torpedo-shaped AUV, conducting inspection missions in a challenging, real-world and natural scenario. PMID:27472337

  13. Multidisciplinary unmanned technology teammate (MUTT)

    NASA Astrophysics Data System (ADS)

    Uzunovic, Nenad; Schneider, Anne; Lacaze, Alberto; Murphy, Karl; Del Giorno, Mark

    2013-01-01

    The U.S. Army Tank Automotive Research, Development and Engineering Center (TARDEC) held an autonomous robot competition called CANINE in June 2012. The goal of the competition was to develop innovative and natural control methods for robots. This paper describes the winning technology, including the vision system, the operator interaction, and the autonomous mobility. The rules stated only gestures or voice commands could be used for control. The robots would learn a new object at the start of each phase, find the object after it was thrown into a field, and return the object to the operator. Each of the six phases became more difficult, including clutter of the same color or shape as the object, moving and stationary obstacles, and finding the operator who moved from the starting location to a new location. The Robotic Research Team integrated techniques in computer vision, speech recognition, object manipulation, and autonomous navigation. A multi-filter computer vision solution reliably detected the objects while rejecting objects of similar color or shape, even while the robot was in motion. A speech-based interface with short commands provided close to natural communication of complicated commands from the operator to the robot. An innovative gripper design allowed for efficient object pickup. A robust autonomous mobility and navigation solution for ground robotic platforms provided fast and reliable obstacle avoidance and course navigation. The research approach focused on winning the competition while remaining cognizant and relevant to real world applications.

  14. Evaluation of Hardware and Software for a Small Autonomous Underwater Vehicle Navigation System (SANS)

    DTIC Science & Technology

    1994-09-01

    Hyslop, G.L., Schieber, G.E., Schwartz, M.K., "Automated Mission Planning for the Standoff Land Attack Missile (SLAM)", Proceedings of the ... 1993, pp. 277-290. [PARK80] Parkinson, B.W., "Overview", Global Positioning System, Vol. 1, The Institute of Navigation, Washington, D.C., 1980, pp. ... "Navigation Message", Global Positioning System, Vol. 1, The Institute of Navigation, Washington, D.C., 1980, pp. 55-73. [WOOD85] Wooden, W. H. ...

  15. Autonomous Navigation of the SSTI/Lewis Spacecraft Using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Hart, R. C.; Long, A. C.; Lee, T.

    1997-01-01

    The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) is pursuing the application of Global Positioning System (GPS) technology to improve the accuracy and economy of spacecraft navigation. High-accuracy autonomous navigation algorithms are being flight qualified in conjunction with GSFC's GPS Attitude Determination Flyer (GADFLY) experiment on the Small Satellite Technology Initiative (SSTI) Lewis spacecraft, which is scheduled for launch in 1997. Preflight performance assessments indicate that these algorithms can provide a real-time total position accuracy of better than 10 meters (1 sigma) and velocity accuracy of better than 0.01 meter per second (1 sigma), with selective availability at typical levels. This accuracy is projected to improve to the 2-meter level if corrections to be provided by the GPS Wide Area Augmentation System (WAAS) are included.

  16. Laser Range and Bearing Finder for Autonomous Missions

    NASA Technical Reports Server (NTRS)

    Granade, Stephen R.

    2004-01-01

    NASA has recently re-confirmed their interest in autonomous systems as an enabling technology for future missions. In order for autonomous missions to be possible, highly-capable relative sensor systems are needed to determine an object's distance, direction, and orientation. This is true whether the mission is autonomous in-space assembly, rendezvous and docking, or rover surface navigation. Advanced Optical Systems, Inc. has developed a wide-angle laser range and bearing finder (RBF) for autonomous space missions. The laser RBF has a number of features that make it well-suited for autonomous missions. It has an operating range of 10 m to 5 km, with a 5 deg field of view. Its wide field of view removes the need for scanning systems such as gimbals, eliminating moving parts and making the sensor simpler and space qualification easier. Its range accuracy is 1% or better. It is designed to operate either as a stand-alone sensor or in tandem with a sensor that returns range, bearing, and orientation at close ranges, such as NASA's Advanced Video Guidance Sensor. We have assembled the initial prototype and are currently testing it. We will discuss the laser RBF's design and specifications. Keywords: laser range and bearing finder, autonomous rendezvous and docking, space sensors, on-orbit sensors, advanced video guidance sensor

  17. Global Positioning System (GPS) Receiver Autonomous Integrity Monitoring (RAIM) web service to support Area Navigation (RNAV) flight planning

    DOT National Transportation Integrated Search

    2008-01-28

    The Volpe Center designed, implemented, and deployed a Global Positioning System (GPS) Receiver Autonomous Integrity Monitoring (RAIM) prediction system in the mid 1990s to support both Air Force and Federal Aviation Administration (FAA) use of TSO C...

  18. Recent CESAR (Center for Engineering Systems Advanced Research) research activities in sensor based reasoning for autonomous machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pin, F.G.; de Saussure, G.; Spelt, P.F.

    1988-01-01

    This paper describes recent research activities at the Center for Engineering Systems Advanced Research (CESAR) in the area of sensor-based reasoning, with emphasis given to their application and implementation on our HERMIES-IIB autonomous mobile vehicle. These activities, including navigation and exploration in a-priori unknown and dynamic environments, goal recognition, vision-guided manipulation, and sensor-driven machine learning, are discussed within the framework of a scenario in which an autonomous robot is asked to navigate through an unknown dynamic environment, explore, find and dock at the panel, read and understand the status of the panel's meters and dials, learn the functioning of a process control panel, and successfully manipulate the control devices of the panel to solve a maintenance emergency problem. A demonstration of the successful implementation of the algorithms on our HERMIES-IIB autonomous robot for resolution of this scenario is presented. Conclusions are drawn concerning the applicability of the methodologies to more general classes of problems, and implications for future work on sensor-driven reasoning for autonomous robots are discussed. 8 refs., 3 figs.

  19. Diver Relative UUV Navigation for Joint Human-Robot Operations

    DTIC Science & Technology

    2013-09-01

    Keywords: ... Underwater Vehicle; Tethered; Hovering; Autonomous Underwater Vehicle; Joint human-robot operations; dynamic, uncertain environments. Figure 3: The SeaBotix vLBV300 tethered AUV platform (left), and the planar vectored thruster ...

  20. Geometry-Based Observability Metric

    NASA Technical Reports Server (NTRS)

    Eaton, Colin; Naasz, Bo

    2012-01-01

    The Satellite Servicing Capabilities Office (SSCO) is currently developing and testing Goddard's Natural Feature Image Recognition (GNFIR) software for autonomous rendezvous and docking missions. GNFIR has flight heritage and is still being developed and tailored for future missions with non-cooperative targets: (1) the DEXTRE Pointing Package System on the International Space Station, and (2) the Relative Navigation System (RNS) on the Space Shuttle for the fourth Hubble Servicing Mission.

  1. COBALT: Development of a Platform to Flight Test Lander GN&C Technologies on Suborbital Rockets

    NASA Technical Reports Server (NTRS)

    Carson, John M., III; Seubert, Carl R.; Amzajerdian, Farzin; Bergh, Chuck; Kourchians, Ara; Restrepo, Carolina I.; Villapando, Carlos Y.; O'Neal, Travis V.; Robertson, Edward A.; Pierrottet, Diego

    2017-01-01

    The NASA COBALT Project (CoOperative Blending of Autonomous Landing Technologies) is developing and integrating new precision-landing Guidance, Navigation and Control (GN&C) technologies, along with developing a terrestrial flight-test platform for Technology Readiness Level (TRL) maturation. The current technologies include a third-generation Navigation Doppler Lidar (NDL) sensor for ultra-precise velocity and line-of-sight (LOS) range measurements, and the Lander Vision System (LVS) that provides passive-optical Terrain Relative Navigation (TRN) estimates of map-relative position. The COBALT platform is self-contained and includes the NDL and LVS sensors, blending filter, a custom compute element, power unit, and communication system. The platform incorporates a structural frame that has been designed to integrate with the payload frame onboard the new Masten Xodiac vertical take-off, vertical landing (VTVL) terrestrial rocket vehicle. Ground integration and testing is underway, and terrestrial flight testing onboard Xodiac is planned for 2017 with two flight campaigns: one open-loop and one closed-loop.

  2. Structured Kernel Subspace Learning for Autonomous Robot Navigation.

    PubMed

    Kim, Eunwoo; Choi, Sungjoon; Oh, Songhwai

    2018-02-14

    This paper considers two important problems for autonomous robot navigation in a dynamic environment, where the goal is to predict pedestrian motion and control a robot with the prediction for safe navigation. While there are several methods for predicting the motion of a pedestrian and controlling a robot to avoid incoming pedestrians, it is still difficult to safely navigate in a dynamic environment due to challenges, such as the varying quality and complexity of training data with unwanted noises. This paper addresses these challenges simultaneously by proposing a robust kernel subspace learning algorithm based on the recent advances in nuclear-norm and l1-norm minimization. We model the motion of a pedestrian and the robot controller using Gaussian processes. The proposed method efficiently approximates a kernel matrix used in Gaussian process regression by learning a low-rank structured matrix (with symmetric positive semi-definiteness) to find an orthogonal basis, which eliminates the effects of erroneous and inconsistent data. Based on structured kernel subspace learning, we propose a robust motion model and motion controller for safe navigation in dynamic environments. We evaluate the proposed robust kernel learning in various tasks, including regression, motion prediction, and motion control problems, and demonstrate that the proposed learning-based systems are robust against outliers and outperform existing regression and navigation methods.
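
    As a rough illustration of the low-rank, symmetric positive semi-definite structure the abstract refers to, the sketch below projects a noisy kernel (Gram) matrix onto its leading eigenpairs before it would be used in Gaussian process regression. This is a generic truncated-eigendecomposition stand-in, not the authors' nuclear-norm/l1-norm algorithm; every name and parameter here is an assumption.

    ```python
    import numpy as np

    def low_rank_psd_approx(K, rank):
        """Project a kernel matrix onto the nearest rank-r symmetric
        positive semi-definite matrix via truncated eigendecomposition."""
        K = 0.5 * (K + K.T)                    # symmetrize
        w, V = np.linalg.eigh(K)
        w = np.clip(w, 0.0, None)              # enforce positive semi-definiteness
        idx = np.argsort(w)[::-1][:rank]       # keep the largest eigenvalues
        return (V[:, idx] * w[idx]) @ V[:, idx].T

    # Denoise a kernel built from noisy pairwise data before GP regression.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(50, 2))
    K = np.exp(-0.5 * np.sum((X[:, None] - X[None]) ** 2, axis=-1))
    K_noisy = K + 0.05 * rng.normal(size=(50, 50))
    K_clean = low_rank_psd_approx(K_noisy, rank=10)
    ```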

  3. X-Ray Detection and Processing Models for Spacecraft Navigation and Timing

    NASA Technical Reports Server (NTRS)

    Sheikh, Suneel; Hanson, John

    2013-01-01

    The current primary method of deep-space navigation is the NASA Deep Space Network (DSN). High-performance navigation is achieved using Delta Differential One-Way Range techniques that utilize simultaneous observations from multiple DSN sites, and incorporate observations of quasars near the line-of-sight to a spacecraft in order to improve the range and angle measurement accuracies. Over the past four decades, x-ray astronomers have identified a number of x-ray pulsars with pulsed emissions having stabilities comparable to atomic clocks. The x-ray pulsar-based navigation and time determination (XNAV) system uses phase measurements from these sources to establish autonomously the position of the detector, and thus the spacecraft, relative to a known reference frame, much as the Global Positioning System (GPS) uses phase measurements from radio signals from several satellites to establish the position of the user relative to an Earth-centered fixed frame of reference. While a GPS receiver uses an antenna to detect the radio signals, XNAV uses a detector array to capture the individual x-ray photons from the x-ray pulsars. The navigation solution relies on detailed x-ray source models, signal processing, navigation and timing algorithms, and analytical tools that form the basis of an autonomous XNAV system. Through previous XNAV development efforts, some techniques have been established to utilize a pulsar pulse time-of-arrival (TOA) measurement to correct a position estimate. One well-studied approach, based upon Kalman filter methods, optimally adjusts a dynamic orbit propagation solution based upon the offset between measured and predicted pulse TOA. In this delta position estimator scheme, previously estimated values of spacecraft position and velocity are utilized from an onboard orbit propagator. Using these estimated values, the detected arrival times at the spacecraft of pulses from a pulsar are compared to the predicted arrival times defined by the pulsar's pulse timing model. A discrepancy provides an estimate of the spacecraft position offset, since an error in position will relate to the measured time offset of a pulse along the line of sight to the pulsar. XNAV researchers have been developing additional enhanced approaches to process the photon TOAs to arrive at an estimate of spacecraft position, including those using maximum-likelihood estimation, digital phase-locked loops, and "single photon processing" schemes that utilize all available time data associated with each photon. Using pulsars from separate, non-coplanar locations provides range and range-rate measurements in each pulsar's direction. Combining these different pulsar measurements solves for offsets in position and velocity in three dimensions, and provides accurate overall navigation for deep space vehicles.
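
    The delta-position idea sketched above, in which a measured-minus-predicted pulse TOA maps to a position offset along the line of sight to each pulsar and several non-coplanar pulsars resolve the full 3-D offset, can be written as a small least-squares problem. The snippet below assumes idealized, already-processed TOA residuals and uses made-up numbers; it is only a geometric illustration, not the Kalman-filter formulation used in the XNAV work.

    ```python
    import numpy as np

    C = 299_792_458.0  # speed of light, m/s

    def delta_position(unit_vectors, toa_offsets_s):
        """Least-squares position offset from pulse time-of-arrival residuals.

        unit_vectors  : (K, 3) line-of-sight unit vectors to K pulsars
        toa_offsets_s : (K,)   measured-minus-predicted pulse TOAs in seconds

        Each residual dt_k is approximately n_k . dr / c, so stacking K >= 3
        non-coplanar pulsars lets us solve for the 3-D offset dr (in meters).
        """
        A = np.asarray(unit_vectors, dtype=float)
        b = C * np.asarray(toa_offsets_s, dtype=float)
        dr, *_ = np.linalg.lstsq(A, b, rcond=None)
        return dr

    # Illustrative numbers only: three orthogonal pulsar directions and
    # microsecond-level TOA residuals map to offsets of hundreds of meters.
    n = np.eye(3)
    dt = np.array([3.3e-6, -1.0e-6, 0.5e-6])
    print(delta_position(n, dt))
    ```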

  4. Flight Testing of Terrain-Relative Navigation and Large-Divert Guidance on a VTVL Rocket

    NASA Technical Reports Server (NTRS)

    Trawny, Nikolas; Benito, Joel; Tweddle, Brent; Bergh, Charles F.; Khanoyan, Garen; Vaughan, Geoffrey M.; Zheng, Jason X.; Villalpando, Carlos Y.; Cheng, Yang; Scharf, Daniel P.

    2015-01-01

    Since 2011, the Autonomous Descent and Ascent Powered-Flight Testbed (ADAPT) has been used to demonstrate advanced descent and landing technologies onboard the Masten Space Systems (MSS) Xombie vertical-takeoff, vertical-landing suborbital rocket. The current instantiation of ADAPT is a stand-alone payload comprising sensing and avionics for terrain-relative navigation and fuel-optimal onboard planning of large divert trajectories, thus providing complete pin-point landing capabilities needed for planetary landers. To this end, ADAPT combines two technologies developed at JPL, the Lander Vision System (LVS), and the Guidance for Fuel Optimal Large Diverts (G-FOLD) software. This paper describes the integration and testing of LVS and G-FOLD in the ADAPT payload, culminating in two successful free flight demonstrations on the Xombie vehicle conducted in December 2014.

  5. Autonomous precision landing using terrain-following navigation

    NASA Technical Reports Server (NTRS)

    Vaughan, R. M.; Gaskell, R. W.; Halamek, P.; Klumpp, A. R.; Synnott, S. P.

    1991-01-01

    Terrain-following navigation studies that have been done over the past two years in the navigation system section at JPL are described. A descent-to-Mars scenario based on Mars Rover and Sample Return mission profiles is described, and navigation and image processing issues pertaining to descent phases where landmark pictures can be obtained are examined. A covariance analysis is performed to verify that landmark measurements from a terrain-following navigation system can satisfy precision landing requirements. Image processing problems involving known landmarks in actual pictures are considered. Mission design alternatives that can alleviate some of these problems are suggested.

  6. A Self-Tuning Kalman Filter for Autonomous Navigation Using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, Son H.

    1999-01-01

    Most navigation systems currently operated by NASA are ground-based, and require extensive support to produce accurate results. Recently developed systems that use Kalman filter and GPS (Global Positioning Systems) data for orbit determination greatly reduce dependency on ground support, and have potential to provide significant economies for NASA spacecraft navigation. These systems, however, still rely on manual tuning from analysts. A sophisticated neuro-fuzzy component fully integrated with the flight navigation system can perform the self-tuning capability for the Kalman filter and help the navigation system recover from estimation errors in real time.

  7. A Self-Tuning Kalman Filter for Autonomous Navigation using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, S. H.

    1999-01-01

    Most navigation systems currently operated by NASA are ground-based, and require extensive support to produce accurate results. Recently developed systems that use Kalman filter and GPS data for orbit determination greatly reduce dependency on ground support, and have potential to provide significant economies for NASA spacecraft navigation. These systems, however, still rely on manual tuning from analysts. A sophisticated neuro-fuzzy component fully integrated with the flight navigation system can perform the self-tuning capability for the Kalman filter and help the navigation system recover from estimation errors in real time.
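
    The two records above describe the tuner itself (a neuro-fuzzy component) only at a high level, so the sketch below substitutes a much simpler innovation-based heuristic purely to show what self-tuning acts on: the process-noise covariance of an otherwise standard Kalman filter. The matrices, gate, and scale factor are all assumptions, not values from the GSFC system.

    ```python
    import numpy as np

    def adaptive_kf_step(x, P, z, F, H, Q, R, nis_gate=9.0, q_scale=2.0):
        """One Kalman filter step with a crude self-tuning heuristic:
        if the normalized innovation squared (NIS) exceeds a gate, inflate
        the process noise Q so the filter can recover from estimation
        errors. (A simple stand-in for the neuro-fuzzy tuner.)"""
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Innovation and its covariance
        y = z - H @ x
        S = H @ P @ H.T + R
        nis = float(y @ np.linalg.inv(S) @ y)
        if nis > nis_gate:            # filter looks inconsistent: open it up
            Q = q_scale * Q
        # Update
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(len(x)) - K @ H) @ P
        return x, P, Q

    # One step of a 1-D constant-velocity filter with a position measurement.
    dt = 1.0
    F = np.array([[1.0, dt], [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])
    x, P = np.zeros(2), np.eye(2)
    Q, R = 0.01 * np.eye(2), np.array([[1.0]])
    x, P, Q = adaptive_kf_step(x, P, np.array([12.0]), F, H, Q, R)
    ```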

  8. An Overview of Flight Test Results for a Formation Flight Autopilot

    NASA Technical Reports Server (NTRS)

    Hanson, Curtis E.; Ryan, Jack; Allen, Michael J.; Jacobson, Steven R.

    2002-01-01

    The first flight test phase of the NASA Dryden Flight Research Center Autonomous Formation Flight project has successfully demonstrated precision autonomous station-keeping of an F/A-18 research airplane with a second F/A-18 airplane. Blended inertial navigation system (INS) and global positioning system (GPS) measurements have been communicated across an air-to-air telemetry link and used to compute relative-position estimates. A precision research formation autopilot onboard the trailing airplane controls lateral and vertical spacing while the leading airplane operates under production autopilot control. Four research autopilot gain sets have been designed and flight-tested, and each exceeds the project design requirement of steady-state tracking accuracy within 1 standard deviation of 10 ft. Performance also has been demonstrated using single- and multiple-axis inputs such as step commands and frequency sweeps. This report briefly describes the experimental formation flight systems employed and discusses the navigation, guidance, and control algorithms that have been flight-tested. An overview of the flight test results of the formation autopilot during steady-state tracking and maneuvering flight is presented.
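
    The quantity the research autopilot regulates is a relative position obtained by differencing the two aircraft's blended INS/GPS solutions. The sketch below shows one plausible way to express that difference in the trailing aircraft's heading frame; the frame conventions and names are assumptions, not the project's actual implementation.

    ```python
    import numpy as np

    def relative_position_body(lead_pos_ned, trail_pos_ned, trail_heading_rad):
        """Difference two blended INS/GPS positions (lead and trail aircraft,
        both in a local north-east-down frame) and rotate the horizontal
        components into the trailing aircraft's heading frame."""
        delta = np.asarray(lead_pos_ned, float) - np.asarray(trail_pos_ned, float)
        c, s = np.cos(trail_heading_rad), np.sin(trail_heading_rad)
        forward = c * delta[0] + s * delta[1]    # along the trail aircraft's heading
        lateral = -s * delta[0] + c * delta[1]   # positive to the right
        vertical = delta[2]                      # down is positive in NED
        return forward, lateral, vertical

    print(relative_position_body([110.0, 55.0, -3000.0],
                                 [0.0, 0.0, -3010.0],
                                 np.deg2rad(30.0)))
    ```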

  9. Autonomous Navigation Apparatus With Neural Network for a Mobile Vehicle

    NASA Technical Reports Server (NTRS)

    Quraishi, Naveed (Inventor)

    1996-01-01

    An autonomous navigation system for a mobile vehicle arranged to move within an environment includes a plurality of sensors arranged on the vehicle and at least one neural network including an input layer coupled to the sensors, a hidden layer coupled to the input layer, and an output layer coupled to the hidden layer. The neural network produces output signals representing respective positions of the vehicle, such as the X coordinate, the Y coordinate, and the angular orientation of the vehicle. A plurality of patch locations within the environment are used to train the neural networks to produce the correct outputs in response to the distances sensed.
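
    The description above spells out the network shape: sensor inputs feeding an input layer, one hidden layer, and outputs for the X coordinate, Y coordinate, and angular orientation. The toy sketch below mirrors that shape with random, untrained weights; in practice the weights would be trained on poses recorded at the patent's 'patch' locations. All sizes and names are assumptions.

    ```python
    import numpy as np

    class NavigationMLP:
        """Tiny feed-forward network: sensor inputs -> hidden layer ->
        (x, y, heading) outputs, mirroring the structure described above."""

        def __init__(self, n_sensors, n_hidden=16, seed=0):
            rng = np.random.default_rng(seed)
            self.W1 = rng.normal(scale=0.1, size=(n_hidden, n_sensors))
            self.b1 = np.zeros(n_hidden)
            self.W2 = rng.normal(scale=0.1, size=(3, n_hidden))
            self.b2 = np.zeros(3)

        def predict(self, sensor_readings):
            h = np.tanh(self.W1 @ sensor_readings + self.b1)
            x, y, heading = self.W2 @ h + self.b2
            return x, y, heading

    net = NavigationMLP(n_sensors=8)
    print(net.predict(np.ones(8)))  # untrained output, for illustration only
    ```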

  10. Daytime Water Detection by Fusing Multiple Cues for Autonomous Off-Road Navigation

    NASA Technical Reports Server (NTRS)

    Rankin, A. L.; Matthies, L. H.; Huertas, A.

    2004-01-01

    Detecting water hazards is a significant challenge to unmanned ground vehicle autonomous off-road navigation. This paper focuses on detecting the presence of water during the daytime using color cameras. A multi-cue approach is taken. Evidence of the presence of water is generated from color, texture, and the detection of reflections in stereo range data. A rule base for fusing water cues was developed by evaluating detection results from an extensive archive of data collection imagery containing water. This software has been implemented into a run-time passive perception subsystem and tested thus far under Linux on a Pentium based processor.
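
    To make the multi-cue idea concrete, the toy function below combines a color cue, a texture cue, and a stereo-reflection cue into a single water-confidence value. The thresholds and weights are placeholders chosen for illustration; the actual rule base was derived empirically from the archived imagery and is not reproduced here.

    ```python
    def water_likelihood(color_score, texture_score, reflection_score):
        """Toy fusion of three daytime water cues (all scores in [0, 1])."""
        score = 0.0
        if color_score > 0.6 and texture_score < 0.3:
            score += 0.5                     # smooth, sky-colored region
        score += 0.4 * reflection_score      # reflections in stereo range data
        return min(score, 1.0)

    print(water_likelihood(color_score=0.8, texture_score=0.1, reflection_score=0.9))
    ```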

  11. Obstacle Avoidance On Roadways Using Range Data

    NASA Astrophysics Data System (ADS)

    Dunlay, R. Terry; Morgenthaler, David G.

    1987-02-01

    This report describes range data based obstacle avoidance techniques developed for use on an autonomous road-following robot vehicle. The purpose of these techniques is to detect and locate obstacles present in a road environment for navigation of a robot vehicle equipped with an active laser-based range sensor. Techniques are presented for obstacle detection, obstacle location, and coordinate transformations needed in the construction of Scene Models (symbolic structures representing the 3-D obstacle boundaries used by the vehicle's Navigator for path planning). These techniques have been successfully tested on an outdoor robotic vehicle, the Autonomous Land Vehicle (ALV), at speeds up to 3.5 km/hour.

  12. Dynamic multisensor fusion for mobile robot navigation in an indoor environment

    NASA Astrophysics Data System (ADS)

    Jin, Taeseok; Lee, Jang-Myung; Luk, Bing L.; Tso, Shiu K.

    2001-10-01

    This study is a preliminary step toward developing a multi-purpose, robust autonomous carrier mobile robot that can transport trolleys or heavy goods and serve as a robotic nursing assistant in hospital wards. The aim of this paper is to present the use of multi-sensor data fusion, combining sonar, a CCD camera, and IR sensors, for a map-building mobile robot, and to describe an experimental mobile robot designed to operate autonomously within both indoor and outdoor environments. Smart sensory systems are crucial for successful autonomous systems. We explain the robot system architecture designed and implemented in this study and give only a short review of existing techniques, since several recent books and review papers cover that ground; instead we focus on the main results relevant to the intelligent service robot project at the Centre of Intelligent Design, Automation & Manufacturing (CIDAM), and conclude by discussing possible future extensions of the project. We first treat the general principles of the navigation and guidance architecture, and then the detailed functions of environment recognition and map updating, obstacle detection, and motion assessment, together with first results from simulation runs.

  13. Scene Segmentation For Autonomous Robotic Navigation Using Sequential Laser Projected Structured Light

    NASA Astrophysics Data System (ADS)

    Brown, C. David; Ih, Charles S.; Arce, Gonzalo R.; Fertell, David A.

    1987-01-01

    Vision systems for mobile robots or autonomous vehicles navigating in an unknown terrain environment must provide a rapid and accurate method of segmenting the scene ahead into regions of pathway and background. A major distinguishing feature between the pathway and background is the three dimensional texture of these two regions. Typical methods of textural image segmentation are very computationally intensive, often lack the required robustness, and are incapable of sensing the three dimensional texture of various regions of the scene. A method is presented where scanned laser projected lines of structured light, viewed by a stereoscopically located single video camera, resulted in an image in which the three dimensional characteristics of the scene were represented by the discontinuity of the projected lines. This image was conducive to processing with simple regional operators to classify regions as pathway or background. Design of some operators and application methods, and demonstration on sample images are presented. This method provides rapid and robust scene segmentation capability that has been implemented on a microcomputer in near real time, and should result in higher speed and more reliable robotic or autonomous navigation in unstructured environments.

  14. Cerebellum Augmented Rover Development

    NASA Technical Reports Server (NTRS)

    King, Matthew

    2005-01-01

    Bio-Inspired Technologies and Systems (BITS) are a very natural result of thinking about Nature's way of solving problems. Knowledge of animal behaviors can be used in developing robotic behaviors intended for planetary exploration. This is the expertise of the JPL BITS Group and has served as a philosophical model for NMSU RioRobolab. Navigation is a vital function for any autonomous system. Systems must have the ability to determine a safe path between their current location and some target location. The MER mission, as well as other JPL rover missions, uses a method known as dead-reckoning to determine position information. Dead-reckoning uses wheel encoders to sense the wheel's rotation. In a sandy environment such as Mars, this method is highly inaccurate because the wheels will slip in the sand. Improving positioning error will allow the speed of an autonomously navigating rover to be greatly increased. Therefore, local navigation based upon landmark tracking is desirable in planetary exploration. The BITS Group is developing navigation technology based upon landmark tracking. Integration of the current rover architecture with a cerebellar neural network tracking algorithm will demonstrate that this approach to navigation is feasible and should be implemented in future rover and spacecraft missions.

  15. Spoofing Detection Using GNSS/INS/Odometer Coupling for Vehicular Navigation

    PubMed Central

    Broumandan, Ali; Lachapelle, Gérard

    2018-01-01

    Location information is among the most vital information required to achieve intelligent and context-aware capability for various applications such as driverless cars. However, related security and privacy threats are a major holdback. With increasing focus on using Global Navigation Satellite Systems (GNSS) for autonomous navigation and related applications, it is important to provide robust navigation solutions, yet signal spoofing for illegal or covert transportation and misleading receiver timing is increasing and now frequent. Hence, detection and mitigation of spoofing attacks has become an important topic. Several contributions on spoofing detection have been made, focusing on different layers of a GNSS receiver. This paper focuses on spoofing detection utilizing self-contained sensors, namely inertial measurement units (IMUs) and vehicle odometer outputs. A spoofing detection approach based on a consistency check between GNSS and IMU/odometer mechanization is proposed. To detect a spoofing attack, the method analyses GNSS and IMU/odometer measurements independently during a pre-selected observation window and cross checks the solutions provided by GNSS and inertial navigation solution (INS)/odometer mechanization. The performance of the proposed method is verified in real vehicular environments. Mean spoofing detection time and detection performance in terms of receiver operation characteristics (ROC) in sub-urban and dense urban environments are evaluated. PMID:29695064

  16. Spoofing Detection Using GNSS/INS/Odometer Coupling for Vehicular Navigation.

    PubMed

    Broumandan, Ali; Lachapelle, Gérard

    2018-04-24

    Location information is among the most vital information required to achieve intelligent and context-aware capability for various applications such as driverless cars. However, related security and privacy threats are a major holdback. With increasing focus on using Global Navigation Satellite Systems (GNSS) for autonomous navigation and related applications, it is important to provide robust navigation solutions, yet signal spoofing for illegal or covert transportation and misleading receiver timing is increasing and now frequent. Hence, detection and mitigation of spoofing attacks has become an important topic. Several contributions on spoofing detection have been made, focusing on different layers of a GNSS receiver. This paper focuses on spoofing detection utilizing self-contained sensors, namely inertial measurement units (IMUs) and vehicle odometer outputs. A spoofing detection approach based on a consistency check between GNSS and IMU/odometer mechanization is proposed. To detect a spoofing attack, the method analyses GNSS and IMU/odometer measurements independently during a pre-selected observation window and cross checks the solutions provided by GNSS and inertial navigation solution (INS)/odometer mechanization. The performance of the proposed method is verified in real vehicular environments. Mean spoofing detection time and detection performance in terms of receiver operation characteristics (ROC) in sub-urban and dense urban environments are evaluated.
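
    The core of the detector described above is a consistency check between GNSS fixes and a self-contained INS/odometer mechanization over a pre-selected window. The sketch below shows one simple way such a check could look; the threshold and the choice to compare drift-compensated trajectory shapes are assumptions, not the authors' exact test statistic.

    ```python
    import numpy as np

    def spoofing_detected(gnss_positions, ins_odo_positions, threshold_m=15.0):
        """Window-based consistency check between GNSS fixes and an
        INS/odometer dead-reckoned track (both (N, 2) horizontal positions
        over the same observation window)."""
        gnss = np.asarray(gnss_positions, dtype=float)
        dead_reckoned = np.asarray(ins_odo_positions, dtype=float)
        # Compare trajectory shapes: the self-contained sensors drift, but a
        # remote spoofer cannot bend their relative trajectory.
        gnss_rel = gnss - gnss[0]
        dr_rel = dead_reckoned - dead_reckoned[0]
        divergence = np.linalg.norm(gnss_rel - dr_rel, axis=1)
        return bool(divergence.max() > threshold_m)

    # Straight-line dead reckoning versus a GNSS track pulled 40 m off course.
    t = np.linspace(0.0, 1.0, 50)[:, None]
    dr_track = np.hstack([100.0 * t, np.zeros_like(t)])
    gnss_track = np.hstack([100.0 * t, 40.0 * t])
    print(spoofing_detected(gnss_track, dr_track))  # True
    ```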

  17. A low-cost test-bed for real-time landmark tracking

    NASA Astrophysics Data System (ADS)

    Csaszar, Ambrus; Hanan, Jay C.; Moreels, Pierre; Assad, Christopher

    2007-04-01

    A low-cost vehicle test-bed system was developed to iteratively test, refine, and demonstrate navigation algorithms before attempting to transfer the algorithms to more advanced rover prototypes. The platform used here was a modified radio-controlled (RC) car. A microcontroller board and onboard laptop computer allow for either autonomous or remote operation via a computer workstation. The sensors onboard the vehicle represent the types currently used on NASA-JPL rover prototypes. For dead-reckoning navigation, optical wheel encoders, a single-axis gyroscope, and a 2-axis accelerometer were used. An ultrasound ranger is available to calculate distance as a substitute for the stereo vision systems presently used on rovers. The prototype also carries a small laptop computer with a USB camera and wireless transmitter to send real-time video to an off-board computer. A real-time user interface was implemented that combines an automatic image feature selector, tracking parameter controls, a streaming video viewer, and user-generated or autonomous driving commands. Using the test-bed, real-time landmark tracking was demonstrated by autonomously driving the vehicle through the JPL Mars yard. The algorithms tracked rocks as waypoints, generating coordinates used to calculate relative motion and to visually servo to science targets. A limitation of the current system is its serial computation: each additional landmark is tracked in sequence. Since each landmark is tracked independently, however, transferring the algorithm to appropriate parallel hardware would allow targets to be added without significantly diminishing system speed.
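
    For reference, the dead-reckoning this sensor suite supports (wheel encoders plus a single-axis gyro) reduces to a short pose update. The sketch below is a generic formulation with a made-up encoder scale factor, not the test-bed's actual code.

    ```python
    import math

    def dead_reckon(pose, encoder_ticks, gyro_rate, dt, ticks_per_meter=5000.0):
        """Propagate (x, y, heading) one step from wheel-encoder counts and a
        single-axis gyro yaw rate. The scale factor is an assumed calibration."""
        x, y, heading = pose
        distance = encoder_ticks / ticks_per_meter
        heading += gyro_rate * dt                 # integrate yaw rate
        x += distance * math.cos(heading)
        y += distance * math.sin(heading)
        return x, y, heading

    pose = (0.0, 0.0, 0.0)
    pose = dead_reckon(pose, encoder_ticks=250, gyro_rate=0.1, dt=0.05)
    print(pose)
    ```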

  18. The study of stereo vision technique for the autonomous vehicle

    NASA Astrophysics Data System (ADS)

    Li, Pei; Wang, Xi; Wang, Jiang-feng

    2015-08-01

    Stereo vision using two or more cameras can recover 3D information about the field of view. This technology can effectively help the autonomous navigation system of an unmanned vehicle to judge the pavement conditions within the field of view and to measure the obstacles on the road. In this paper, the use of stereo vision for obstacle measurement and avoidance on an autonomous vehicle is studied, and the key techniques are analyzed and discussed. The system hardware is built and the software is debugged, and the measurement performance is then illustrated with measured data. Experiments show that stereo vision can effectively reconstruct the 3D scene within the field of view and provide the basis for judging pavement conditions. Compared with the navigation radar used on unmanned vehicles for measurement, the stereo vision system has advantages in cost, working distance, and other respects, and it has good application prospects.

  19. HERMIES-3: A step toward autonomous mobility, manipulation, and perception

    NASA Technical Reports Server (NTRS)

    Weisbin, C. R.; Burks, B. L.; Einstein, J. R.; Feezell, R. R.; Manges, W. W.; Thompson, D. H.

    1989-01-01

    HERMIES-III is an autonomous robot comprised of a seven degree-of-freedom (DOF) manipulator designed for human scale tasks, a laser range finder, a sonar array, an omni-directional wheel-driven chassis, multiple cameras, and a dual computer system containing a 16-node hypercube expandable to 128 nodes. The current experimental program involves performance of human-scale tasks (e.g., valve manipulation, use of tools), integration of a dexterous manipulator and platform motion in geometrically complex environments, and effective use of multiple cooperating robots (HERMIES-IIB and HERMIES-III). The environment in which the robots operate has been designed to include multiple valves, pipes, meters, obstacles on the floor, valves occluded from view, and multiple paths of differing navigation complexity. The ongoing research program supports the development of autonomous capability for HERMIES-IIB and III to perform complex navigation and manipulation under time constraints, while dealing with imprecise sensory information.

  20. Novel microbial diversity retrieved by autonomous robotic exploration of the world's deepest vertical phreatic sinkhole.

    PubMed

    Sahl, Jason W; Fairfield, Nathaniel; Harris, J Kirk; Wettergreen, David; Stone, William C; Spear, John R

    2010-03-01

    The deep phreatic thermal explorer (DEPTHX) is an autonomous underwater vehicle designed to navigate an unexplored environment, generate high-resolution three-dimensional (3-D) maps, collect biological samples based on an autonomous sampling decision, and return to its origin. In the spring of 2007, DEPTHX was deployed in Zacatón, a deep (approximately 318 m), limestone, phreatic sinkhole (cenote) in northeastern Mexico. As DEPTHX descended, it generated a 3-D map based on the processing of range data from 54 onboard sonars. The vehicle collected water column samples and wall biomat samples throughout the depth profile of the cenote. Post-expedition sample analysis via comparative analysis of 16S rRNA gene sequences revealed a wealth of microbial diversity. Traditional Sanger gene sequencing combined with a barcoded-amplicon pyrosequencing approach revealed novel, phylum-level lineages from the domains Bacteria and Archaea; in addition, several novel subphylum lineages were also identified. Overall, DEPTHX successfully navigated and mapped Zacatón, and collected biological samples based on an autonomous decision, which revealed novel microbial diversity in a previously unexplored environment.

  1. Landmark-based robust navigation for tactical UGV control in GPS-denied communication-degraded environments

    NASA Astrophysics Data System (ADS)

    Endo, Yoichiro; Balloch, Jonathan C.; Grushin, Alexander; Lee, Mun Wai; Handelman, David

    2016-05-01

    Control of current tactical unmanned ground vehicles (UGVs) is typically accomplished through two alternative modes of operation, namely, low-level manual control using joysticks and high-level planning-based autonomous control. Each mode has its own merits as well as inherent mission-critical disadvantages. Low-level joystick control is vulnerable to communication delay and degradation, and high-level navigation often depends on uninterrupted GPS signals and/or energy-emissive (non-stealth) range sensors such as LIDAR for localization and mapping. To address these problems, we have developed a mid-level control technique where the operator semi-autonomously drives the robot relative to visible landmarks that are commonly recognizable by both humans and machines such as closed contours and structured lines. Our novel solution relies solely on optical and non-optical passive sensors and can be operated under GPS-denied, communication-degraded environments. To control the robot using these landmarks, we developed an interactive graphical user interface (GUI) that allows the operator to select landmarks in the robot's view and direct the robot relative to one or more of the landmarks. The integrated UGV control system was evaluated based on its ability to robustly navigate through indoor environments. The system was successfully field tested with QinetiQ North America's TALON UGV and Tactical Robot Controller (TRC), a ruggedized operator control unit (OCU). We found that the proposed system is indeed robust against communication delay and degradation, and provides the operator with steady and reliable control of the UGV in realistic tactical scenarios.

  2. Method and system for providing autonomous control of a platform

    NASA Technical Reports Server (NTRS)

    Seelinger, Michael J. (Inventor); Yoder, John-David (Inventor)

    2012-01-01

    The present application provides a system for enabling instrument placement from distances on the order of five meters, for example, and increases accuracy of the instrument placement relative to visually-specified targets. The system provides precision control of a mobile base of a rover and onboard manipulators (e.g., robotic arms) relative to a visually-specified target using one or more sets of cameras. The system automatically compensates for wheel slippage and kinematic inaccuracy ensuring accurate placement (on the order of 2 mm, for example) of the instrument relative to the target. The system provides the ability for autonomous instrument placement by controlling both the base of the rover and the onboard manipulator using a single set of cameras. To extend the distance from which the placement can be completed to nearly five meters, target information may be transferred from navigation cameras (used for long-range) to front hazard cameras (used for positioning the manipulator).

  3. Systems and Methods for Automated Vessel Navigation Using Sea State Prediction

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance L. (Inventor); Howard, Andrew B. (Inventor); Reinhart, Rene Felix (Inventor); Aghazarian, Hrand (Inventor); Rankin, Arturo (Inventor)

    2017-01-01

    Systems and methods for sea state prediction and autonomous navigation in accordance with embodiments of the invention are disclosed. One embodiment of the invention includes a method of predicting a future sea state including generating a sequence of at least two 3D images of a sea surface using at least two image sensors, detecting peaks and troughs in the 3D images using a processor, identifying at least one wavefront in each 3D image based upon the detected peaks and troughs using the processor, characterizing at least one propagating wave based upon the propagation of wavefronts detected in the sequence of 3D images using the processor, and predicting a future sea state using at least one propagating wave characterizing the propagation of wavefronts in the sequence of 3D images using the processor. Another embodiment includes a method of autonomous vessel navigation based upon a predicted sea state and target location.

  4. Systems and Methods for Automated Vessel Navigation Using Sea State Prediction

    NASA Technical Reports Server (NTRS)

    Aghazarian, Hrand (Inventor); Reinhart, Rene Felix (Inventor); Huntsberger, Terrance L. (Inventor); Rankin, Arturo (Inventor); Howard, Andrew B. (Inventor)

    2015-01-01

    Systems and methods for sea state prediction and autonomous navigation in accordance with embodiments of the invention are disclosed. One embodiment of the invention includes a method of predicting a future sea state including generating a sequence of at least two 3D images of a sea surface using at least two image sensors, detecting peaks and troughs in the 3D images using a processor, identifying at least one wavefront in each 3D image based upon the detected peaks and troughs using the processor, characterizing at least one propagating wave based upon the propagation of wavefronts detected in the sequence of 3D images using the processor, and predicting a future sea state using at least one propagating wave characterizing the propagation of wavefronts in the sequence of 3D images using the processor. Another embodiment includes a method of autonomous vessel navigation based upon a predicted sea state and target location.

  5. Simple autonomous Mars walker

    NASA Technical Reports Server (NTRS)

    Larimer, Stanley J.; Lisec, Thomas R.; Spiessbach, Andrew J.

    1989-01-01

    Under a contract with NASA's Jet Propulsion Laboratory, Martin Marietta has developed several alternative rover concepts for unmanned exploration of the planet Mars. One of those concepts, the 'Walking Beam', is the subject of this paper. This concept was developed with the goal of achieving many of the capabilities of more sophisticated articulated-leg walkers with a much simpler, more robust, less computationally demanding, and more power-efficient design. It consists of two large-base tripods nested one within the other which alternately translate with respect to each other along a 5-meter beam to propel the vehicle. The semiautonomous navigation system relies on terrain geometry sensors and tactile feedback from each foot to autonomously select a path which avoids hazards along a route designated from Earth. Both mobility and navigation features of this concept are discussed, including a top-level description of the vehicle's physical characteristics, deployment strategy, mobility elements, sensor suite, theory of operation, navigation and control processes, and estimated performance.

  6. A Long Way From Home

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This pair of pieced-together images was taken by the Mars Exploration Rover Spirit's left navigation camera looking aft on March 6, 2004. It reveals the long and rocky path of nearly 240 meters (787 feet) that Spirit had traveled since safely arriving at Gusev Crater on Jan. 3, 2004.

    The lander can still be seen in the distance, but will never be 'home' again for the journeying rover. This image is also a tribute to the effectiveness of the autonomous navigation system that the rovers use during parts of their martian drives. Instead of driving directly through the 'hollow' seen in the middle right of the image, the autonomous navigation system guided Spirit around the high ridge bordering the hollow.

    In the two days after these images were taken, Spirit traveled roughly 60 meters (197 feet) farther toward its destination at the crater nicknamed 'Bonneville'.

  7. Three-dimensional motor schema based navigation

    NASA Technical Reports Server (NTRS)

    Arkin, Ronald C.

    1989-01-01

    Reactive schema-based navigation is possible in space domains by extending the methods developed for ground-based navigation found within the Autonomous Robot Architecture (AuRA). Reformulation of two-dimensional motor schemas for three-dimensional applications is a straightforward process. The manifold advantages of schema-based control persist, including modular development, amenability to distributed processing, and responsiveness to environmental sensing. Simulation results show the feasibility of this methodology for space docking operations in a cluttered work area.
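
    The sketch below illustrates the general motor-schema idea in three dimensions: each schema emits a velocity vector and the vectors are summed. It is a minimal sketch inspired by the abstract, not the AuRA implementation; the gain, sphere-of-influence, and margin values are assumptions.

        # Minimal sketch of 3D motor-schema style reactive control (illustrative gains).
        import numpy as np

        def move_to_goal(pos, goal, gain=1.0):
            d = goal - pos
            n = np.linalg.norm(d)
            return gain * d / n if n > 1e-9 else np.zeros(3)

        def avoid_obstacle(pos, obstacle, sphere=2.0, margin=5.0, gain=1.0):
            d = pos - obstacle
            r = np.linalg.norm(d)
            if r >= margin:                      # outside the sphere of influence
                return np.zeros(3)
            if r <= sphere:                      # inside the hard safety sphere
                return gain * 1e3 * d / max(r, 1e-9)
            return gain * (margin - r) / (margin - sphere) * d / r

        # The combined command is simply the vector sum of the active schemas.
        pos, goal = np.zeros(3), np.array([10.0, 0.0, 5.0])
        obstacles = [np.array([2.0, 0.5, 1.0])]
        cmd = move_to_goal(pos, goal) + sum(avoid_obstacle(pos, o) for o in obstacles)
        print(cmd)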

  8. Autonomous navigation system. [gyroscopic pendulum for air navigation

    NASA Technical Reports Server (NTRS)

    Merhav, S. J. (Inventor)

    1981-01-01

    An inertial navigation system utilizing a servo-controlled two degree of freedom pendulum to obtain specific force components in the locally level coordinate system is described. The pendulum includes a leveling gyroscope and an azimuth gyroscope supported on a two gimbal system. The specific force components in the locally level coordinate system are converted to components in the geographical coordinate system by means of a single Euler transformation. The standard navigation equations are solved to determine longitudinal and lateral velocities. Finally, vehicle position is determined by a further integration.
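
    A minimal dead-reckoning sketch of the processing chain described above is shown below: the locally level specific force is rotated into geographic axes by a heading (Euler) rotation, gravity is accounted for, and two integrations give velocity and position. The gravity value, time step, and flat-Earth simplification are assumptions for illustration, not the patented mechanization.

        # Minimal flat-Earth dead-reckoning sketch.
        import numpy as np

        def euler_z(yaw):
            """Single Euler rotation about the vertical axis (heading)."""
            c, s = np.cos(yaw), np.sin(yaw)
            return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

        def propagate(pos, vel, f_level, yaw, dt, g=9.80665):
            f_geo = euler_z(yaw) @ f_level           # specific force in N-E-Down axes
            accel = f_geo + np.array([0.0, 0.0, g])  # account for gravity (Down positive)
            vel = vel + accel * dt                   # "standard navigation equations"
            pos = pos + vel * dt                     # further integration gives position
            return pos, vel

        pos, vel = np.zeros(3), np.zeros(3)
        f = np.array([0.1, 0.0, -9.80665])           # level flight, gentle forward acceleration
        pos, vel = propagate(pos, vel, f, yaw=0.3, dt=0.01)
        print(pos, vel)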

  9. Ultra-Wideband Tracking System Design for Relative Navigation

    NASA Technical Reports Server (NTRS)

    Ni, Jianjun David; Arndt, Dickey; Bgo, Phong; Dekome, Kent; Dusl, John

    2011-01-01

    This presentation briefly discusses a design effort for a prototype ultra-wideband (UWB) time-difference-of-arrival (TDOA) tracking system that is currently under development at NASA Johnson Space Center (JSC). The system is being designed for use in localization and navigation of a rover in a GPS-deprived environment for surface missions. In one application enabled by the UWB tracking, a robotic vehicle carrying equipment can autonomously follow a crewed rover from work site to work site such that resources can be carried from one landing mission to the next, thereby saving up-mass. The UWB Systems Group at JSC has developed a UWB TDOA High Resolution Proximity Tracking System which can achieve sub-inch tracking accuracy of a target within the radius of the tracking baseline [1]. By extending the tracking capability beyond the radius of the tracking baseline, a tracking system is being designed to enable relative navigation between two vehicles for surface missions. A prototype UWB TDOA tracking system has been designed, implemented, tested, and proven feasible for relative navigation of robotic vehicles. Future work includes testing the system with the application code to increase the tracking update rate and evaluating the linear tracking baseline to improve the flexibility of antenna mounting on the following vehicle.
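
    To make the TDOA geometry concrete, the sketch below solves for a transmitter position from arrival-time differences measured at several receivers, assuming NumPy and SciPy are available. It is a generic multilateration sketch with assumed anchor positions and a noiseless measurement model, not JSC's implementation.

        # Minimal TDOA multilateration sketch (nonlinear least squares).
        import numpy as np
        from scipy.optimize import least_squares

        C = 299792458.0  # speed of light, m/s

        def tdoa_residuals(p, anchors, tdoas):
            ranges = np.linalg.norm(anchors - p, axis=1)
            return (ranges[1:] - ranges[0]) / C - tdoas

        anchors = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0],
                            [0.0, 0.0, 5.0], [10.0, 10.0, 3.0]])
        truth = np.array([25.0, 12.0, 1.0])        # target beyond the tracking baseline
        ranges = np.linalg.norm(anchors - truth, axis=1)
        tdoas = (ranges[1:] - ranges[0]) / C       # noiseless measurements for the demo
        sol = least_squares(tdoa_residuals, x0=np.array([20.0, 10.0, 0.0]),
                            args=(anchors, tdoas))
        print(sol.x)                               # expect a solution near `truth`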

  10. Automated low-thrust guidance for the orbital maneuvering vehicle

    NASA Technical Reports Server (NTRS)

    Rose, Richard E.; Schmeichel, Harry; Shortwell, Charles P.; Werner, Ronald A.

    1988-01-01

    This paper describes the highly autonomous OMV Guidance Navigation and Control system. Emphasis is placed on a key feature of the design, the low thrust guidance algorithm. The two guidance modes, orbit change guidance and rendezvous guidance, are discussed in detail. It is shown how OMV will automatically transfer from its initial orbit to an arbitrary target orbit and reach a specified rendezvous position relative to the target vehicle.

  11. Radar-based collision avoidance for unmanned surface vehicles

    NASA Astrophysics Data System (ADS)

    Zhuang, Jia-yuan; Zhang, Lei; Zhao, Shi-qi; Cao, Jian; Wang, Bo; Sun, Han-bing

    2016-12-01

    Unmanned surface vehicles (USVs) have become a focus of research because of their extensive applications. To ensure safety and reliability and to perform complex tasks autonomously, USVs are required to possess accurate perception of the environment and effective collision avoidance capabilities. Achieving these requires investigation into real-time marine radar target detection and autonomous collision avoidance technologies, aimed at solving the problems of noise jamming, uneven brightness, target loss, and blind areas in marine radar images. These technologies must also satisfy the real-time and reliability requirements associated with the high navigation speeds of USVs. Therefore, this study developed an embedded collision avoidance system based on marine radar, investigated a real-time target detection method that combines an adaptive smoothing algorithm with a robust segmentation algorithm, developed a stable and reliable dynamic local environment model to ensure the safety of USV navigation, and constructed a collision avoidance algorithm based on the velocity obstacle (V-obstacle) concept that adjusts the USV's heading and speed in real time. Sea-trial results for multi-obstacle avoidance first demonstrate the effectiveness and efficiency of the proposed avoidance system and then verify its adaptability and stability when the USV sails in a real, complex marine environment. The results improve the intelligence of USVs and help guarantee the safety of independent USV sailing.
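
    The velocity-obstacle test at the heart of such an algorithm can be sketched compactly: a candidate USV velocity is unsafe if the relative velocity points into the collision cone of an obstacle inflated by a combined safety radius. The sketch below is a minimal 2D illustration with assumed geometry, not the paper's full algorithm.

        # Minimal 2D velocity-obstacle (V-obstacle) membership test.
        import numpy as np

        def in_velocity_obstacle(v_usv, p_usv, v_obs, p_obs, radius):
            rel_p = p_obs - p_usv
            rel_v = v_usv - v_obs
            dist = np.linalg.norm(rel_p)
            if dist <= radius:
                return True                       # already inside the safety radius
            half_angle = np.arcsin(radius / dist)
            cos_b = rel_p @ rel_v / (dist * np.linalg.norm(rel_v) + 1e-12)
            return np.arccos(np.clip(cos_b, -1.0, 1.0)) < half_angle

        # A planner would score many candidate headings/speeds and keep the safe one
        # closest to the desired course; here we test a single candidate velocity.
        print(in_velocity_obstacle(np.array([5.0, 0.0]), np.array([0.0, 0.0]),
                                   np.array([0.0, 0.0]), np.array([100.0, 5.0]),
                                   radius=20.0))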

  12. Autonomous navigation and mobility for a planetary rover

    NASA Technical Reports Server (NTRS)

    Miller, David P.; Mishkin, Andrew H.; Lambert, Kenneth E.; Bickler, Donald; Bernard, Douglas E.

    1989-01-01

    This paper presents an overview of the onboard subsystems that will be used in guiding a planetary rover. Particular emphasis is placed on the planning and sensing systems and their associated costs, especially in computation. Issues relevant to evaluating trades between the navigation system and the mobility system are also presented.

  13. Autonomous Navigation With Ground Station One-Way Forward-Link Doppler Data

    NASA Technical Reports Server (NTRS)

    Horstkamp, G. M.; Niklewski, D. J.; Gramling, C. J.

    1996-01-01

    The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) has spent several years developing operational onboard navigation systems (ONS's) to provide real time autonomous, highly accurate navigation products for spacecraft using NASA's space and ground communication systems. The highly successful Tracking and Data Relay Satellite (TDRSS) ONS (TONS) experiment on the Explorer Platform/Extreme Ultraviolet (EP/EUV) spacecraft, launched on June 7, 1992, flight demonstrated the ONS for high accuracy navigation using TDRSS forward link communication services. In late 1994, a similar ONS experiment was performed using EP/EUV flight hardware (the ultrastable oscillator and Doppler extractor card in one of the TDRSS transponders) and ground system software to demonstrate the feasibility of using an ONS with ground station forward link communication services. This paper provides a detailed evaluation of ground station-based ONS performance of data collected over a 20 day period. The ground station ONS (GONS) experiment results are used to project the expected performance of an operational system. The GONS processes Doppler data derived from scheduled ground station forward link services using a sequential estimation algorithm enhanced by a sophisticated process noise model to provide onboard orbit and frequency determination. Analysis of the GONS experiment performance indicates that real time onboard position accuracies of better than 125 meters (1 sigma) are achievable with two or more 5-minute contacts per day for the EP/EUV 525 kilometer altitude, 28.5 degree inclination orbit. GONS accuracy is shown to be a function of the fidelity of the onboard propagation model, the frequency/geometry of the tracking contacts, and the quality of the tracking measurements. GONS provides a viable option for using autonomous navigation to reduce operational costs for upcoming spacecraft missions with moderate position accuracy requirements.
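
    The basic observable such a filter processes can be written down in a few lines. The sketch below evaluates a simple non-relativistic one-way Doppler model for a ground-station forward link; the frequency, station, and spacecraft states are illustrative values, and the GONS filter itself (dynamics and process noise model) is not reproduced.

        # Minimal one-way forward-link Doppler measurement model.
        import numpy as np

        C = 299792458.0  # speed of light, m/s

        def one_way_doppler(f_transmit, r_station, v_station, r_sc, v_sc):
            """Received frequency at the spacecraft, ignoring relativistic terms."""
            los = r_sc - r_station
            range_rate = (v_sc - v_station) @ los / np.linalg.norm(los)
            return f_transmit * (1.0 - range_rate / C)

        f_rx = one_way_doppler(2.1e9,
                               np.array([6378e3, 0.0, 0.0]), np.array([0.0, 465.0, 0.0]),
                               np.array([6903e3, 0.0, 0.0]), np.array([1000.0, 7500.0, 0.0]))
        print(f_rx - 2.1e9)  # Doppler shift in Hz for this geometry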

  14. Results of NASA's First Autonomous Formation Flying Experiment: Earth Observing-1 (EO-1)

    NASA Technical Reports Server (NTRS)

    Folta, David C.; Hawkins, Albin; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    NASA's first autonomous formation flying mission completed its primary goal of demonstrating an advanced technology called enhanced formation flying. To enable this technology, the Guidance, Navigation, and Control center at the Goddard Space Flight Center (GSFC) implemented a universal 3-axis formation flying algorithm in an autonomous executive flight code onboard the New Millennium Program's (NMP) Earth Observing-1 (EO-1) spacecraft. This paper describes the mathematical background of the autonomous formation flying algorithm and the onboard flight design and presents the validation results of this unique system. Results from functionality assessment through fully autonomous maneuver control are presented as comparisons between the onboard EO-1 operational autonomous control system called AutoCon(tm), its ground-based predecessor, and a standalone algorithm.

  15. Recent Advances in Bathymetric Surveying of Continental Shelf Regions Using Autonomous Vehicles

    NASA Astrophysics Data System (ADS)

    Holland, K. T.; Calantoni, J.; Slocum, D.

    2016-02-01

    Obtaining bathymetric observations within the continental shelf in areas closer to the shore is often time consuming and dangerous, especially when uncharted shoals and rocks present safety concerns to survey ships and launches. However, surveys in these regions are critically important to numerical simulation of oceanographic processes, as bathymetry serves as the bottom boundary condition in operational forecasting models. We will present recent progress in bathymetric surveying using both traditional vessels retrofitted for autonomous operations and relatively inexpensive, small team deployable, Autonomous Underwater Vehicles (AUV). Both systems include either high-resolution multibeam echo sounders or interferometric sidescan sonar sensors with integrated inertial navigation system capabilities consistent with present commercial-grade survey operations. The advantages and limitations of these two configurations employing both unmanned and autonomous strategies are compared using results from several recent survey operations. We will demonstrate how sensor data collected from unmanned platforms can augment or even replace traditional data collection technologies. Oceanographic observations (e.g., sound speed, temperature and currents) collected simultaneously with bathymetry using autonomous technologies provide additional opportunities for advanced data assimilation in numerical forecasts. Discussion focuses on our vision for unmanned and autonomous systems working in conjunction with manned or in-situ systems to optimally and simultaneously collect data in environmentally hostile or difficult to reach areas.

  16. Fusion of Building Information and Range Imaging for Autonomous Location Estimation in Indoor Environments

    PubMed Central

    Kohoutek, Tobias K.; Mautz, Rainer; Wegner, Jan D.

    2013-01-01

    We present a novel approach for autonomous location estimation and navigation in indoor environments using range images and prior scene knowledge from a GIS database (CityGML). What makes this task challenging is the arbitrary relative spatial relation between the GIS and the Time-of-Flight (ToF) range camera, further complicated by a markerless configuration. We propose to estimate the camera's pose solely based on matching of GIS objects and their detected locations in image sequences. We develop a coarse-to-fine matching strategy that is able to match point clouds without any initial parameters. Experiments with a state-of-the-art ToF point cloud show that our proposed method delivers an absolute camera position with decimeter accuracy, which is sufficient for many real-world applications (e.g., collision avoidance). PMID:23435055

  17. Cybersecurity for aerospace autonomous systems

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2015-05-01

    High-profile breaches have occurred across numerous information systems. One area where attacks are particularly problematic is autonomous control systems. This paper considers the aerospace information system, focusing on elements that interact with autonomous control systems (e.g., onboard UAVs). It discusses the trust placed in the autonomous systems and supporting systems (e.g., navigational aids) and how this trust can be validated. Approaches to remotely detecting UAV compromise without relying on the onboard software (which may itself be compromised) are discussed. How different levels of autonomy (task-based, goal-based, mission-based) impact this remote characterization is also considered.

  18. A Self-Tuning Kalman Filter for Autonomous Spacecraft Navigation

    NASA Technical Reports Server (NTRS)

    Truong, Son H.

    1998-01-01

    Most navigation systems currently operated by NASA are ground-based, and require extensive support to produce accurate results. Recently developed systems that use Kalman Filter and Global Positioning System (GPS) data for orbit determination greatly reduce dependency on ground support, and have potential to provide significant economies for NASA spacecraft navigation. Current techniques of Kalman filtering, however, still rely on manual tuning from analysts, and cannot help in optimizing autonomy without compromising accuracy and performance. This paper presents an approach to produce a high accuracy autonomous navigation system fully integrated with the flight system. The resulting system performs real-time state estimation by using an Extended Kalman Filter (EKF) implemented with high-fidelity state dynamics model, as does the GPS Enhanced Orbit Determination Experiment (GEODE) system developed by the NASA Goddard Space Flight Center. Augmented to the EKF is a sophisticated neural-fuzzy system, which combines the explicit knowledge representation of fuzzy logic with the learning power of neural networks. The fuzzy-neural system performs most of the self-tuning capability and helps the navigation system recover from estimation errors. The core requirement is a method of state estimation that handles uncertainties robustly, capable of identifying estimation problems, flexible enough to make decisions and adjustments to recover from these problems, and compact enough to run on flight hardware. The resulting system can be extended to support geosynchronous spacecraft and high-eccentricity orbits. Mathematical methodology, systems and operations concepts, and implementation of a system prototype are presented in this paper. Results from the use of the prototype to evaluate optimal control algorithms implemented are discussed. Test data and major control issues (e.g., how to define specific roles for fuzzy logic to support the self-learning capability) are also discussed. In addition, architecture of a complete end-to-end candidate flight system that provides navigation with highly autonomous control using data from GPS is presented.
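
    As a toy illustration of what "self-tuning" can mean in this context, the sketch below scales the process noise covariance when the filter's innovations become statistically inconsistent with their predicted covariance. This is a simple innovation-based heuristic with assumed thresholds, not the GEODE-based neural-fuzzy system the paper describes.

        # Minimal innovation-based process-noise adaptation sketch.
        import numpy as np

        def adapt_process_noise(Q, innovation, S, history, inflate=2.0, deflate=0.95):
            """Adjust Q using the normalized innovation squared (NIS) over a short window."""
            nis = float(innovation @ np.linalg.solve(S, innovation))
            history.append(nis)
            if len(history) > 20:
                history.pop(0)
            dof = innovation.size                 # expected NIS for a consistent filter
            avg = np.mean(history)
            if avg > 2.0 * dof:                   # innovations too large: trust the model less
                return Q * inflate
            if avg < 0.5 * dof:                   # innovations too small: tighten the model
                return Q * deflate
            return Q

        Q, history = np.eye(6) * 1e-6, []
        Q = adapt_process_noise(Q, innovation=np.array([120.0, 80.0, -60.0]),
                                S=np.diag([50.0**2] * 3), history=history)
        print(Q[0, 0])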

  19. State estimation for autonomous flight in cluttered environments

    NASA Astrophysics Data System (ADS)

    Langelaan, Jacob Willem

    Safe, autonomous operation in complex, cluttered environments is a critical challenge facing autonomous mobile systems. The research described in this dissertation was motivated by a particularly difficult example of autonomous mobility: flight of a small Unmanned Aerial Vehicle (UAV) through a forest. In cluttered environments (such as forests or natural and urban canyons), signals from navigation beacons such as GPS may frequently be occluded. Direct measurements of vehicle position are therefore unavailable, and information required for flight control, obstacle avoidance, and navigation must be obtained using only on-board sensors. However, payload limitations of small UAVs restrict both the mass and physical dimensions of sensors that can be carried. This dissertation describes the development and proof-of-concept demonstration of a navigation system that uses only a low-cost inertial measurement unit and a monocular camera. Microelectromechanical inertial measurement units are well suited to small UAV applications and provide measurements of acceleration and angular rate. However, they do not provide information about nearby obstacles (needed for collision avoidance) and their noise and bias characteristics lead to unbounded growth in computed position. A monocular camera can provide bearings to nearby obstacles and landmarks. These bearings can be used both to enable obstacle avoidance and to aid navigation. Presented here is a solution to the problem of estimating vehicle state (position, orientation and velocity) as well as positions of obstacles in the environment using only inertial measurements and bearings to obstacles. This is a highly nonlinear estimation problem, and standard estimation techniques such as the Extended Kalman Filter are prone to divergence in this application. In this dissertation, a Sigma Point Kalman Filter is implemented, resulting in an estimator which is able to cope with the significant nonlinearities in the system equations and uncertainty in state estimates while remaining tractable for real-time operation. In addition, the issues of data association and landmark initialization are addressed. Estimator performance is examined through Monte Carlo simulations in both two and three dimensions for scenarios involving UAV flight in cluttered environments. Hardware tests and simulations demonstrate navigation through an obstacle-strewn environment by a small Unmanned Ground Vehicle.
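
    The sigma-point idea the estimator relies on can be sketched in a few lines: deterministic sample points that capture the state mean and covariance are pushed through the nonlinear function instead of linearizing it. The sketch below uses a Julier-style unscented transform with an assumed tuning parameter; it is not the dissertation's filter.

        # Minimal unscented-transform sketch (sigma points and weighted mean).
        import numpy as np

        def sigma_points(mean, cov, kappa=1.0):
            n = mean.size
            sqrt_cov = np.linalg.cholesky((n + kappa) * cov)
            pts = [mean] + [mean + c for c in sqrt_cov.T] + [mean - c for c in sqrt_cov.T]
            weights = np.array([kappa / (n + kappa)] + [0.5 / (n + kappa)] * (2 * n))
            return np.array(pts), weights

        def unscented_mean(f, mean, cov):
            pts, w = sigma_points(mean, cov)
            return w @ np.array([f(p) for p in pts])

        # Example: expected bearing to a landmark at the origin from an uncertain 2D position.
        bearing = lambda p: np.arctan2(p[1], p[0])
        print(unscented_mean(bearing, np.array([10.0, 5.0]), np.eye(2) * 0.5))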

  20. Application of a distributed systems architecture for increased speed in image processing on an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Wright, Adam A.; Momin, Orko; Shin, Young Ho; Shakya, Rahul; Nepal, Kumud; Ahlgren, David J.

    2010-01-01

    This paper presents the application of a distributed systems architecture to an autonomous ground vehicle, Q, that participates in both the autonomous and navigation challenges of the Intelligent Ground Vehicle Competition. In the autonomous challenge the vehicle is required to follow a course, while avoiding obstacles and staying within the course boundaries, which are marked by white lines. For the navigation challenge, the vehicle is required to reach a set of target destinations, known as way points, with given GPS coordinates and avoid obstacles that it encounters in the process. Previously, the vehicle utilized a single laptop to execute all processing activities, including image processing, sensor interfacing and data processing, path planning and navigation algorithms, and motor control. National Instruments' (NI) LabVIEW served as the programming language for software implementation. As an upgrade to last year's design, an NI compact Reconfigurable Input/Output system (cRIO) was incorporated into the system architecture. The cRIO is NI's solution for rapid prototyping that is equipped with a real-time processor, an FPGA, and modular input/output. Under the current system, the real-time processor handles the path planning and navigation algorithms, while the FPGA gathers and processes sensor data. This setup leaves the laptop to focus on running the image processing algorithm. Image processing, as previously presented by Nepal et al., is a multi-step line extraction algorithm and constitutes the largest processor load. This distributed approach results in faster image processing, which was previously Q's bottleneck. Additionally, the path planning and navigation algorithms are executed more reliably on the real-time processor due to the deterministic nature of its operation. The implementation of this architecture required exploration of various inter-system communication techniques. Data transfer between the laptop and the real-time processor using UDP packets was established as the most reliable protocol after testing various options. Improvement can be made to the system by migrating more algorithms to the hardware-based FPGA to further speed up the operations of the vehicle.
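
    The inter-system link described above is plain UDP; a minimal sketch of streaming a vision result from the laptop to the real-time processor is shown below. The address, port, and payload layout are hypothetical assumptions, and the sketch is in Python rather than the LabVIEW used on Q.

        # Minimal UDP "fire-and-forget" datagram sketch (hypothetical address and payload).
        import socket
        import struct

        CRIO_ADDR = ("192.168.1.10", 5005)   # assumed cRIO IP address and port

        def send_lane_estimate(sock, left_offset_m, right_offset_m, heading_rad):
            payload = struct.pack("!3d", left_offset_m, right_offset_m, heading_rad)
            sock.sendto(payload, CRIO_ADDR)  # no handshake or retransmission

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        send_lane_estimate(sock, 1.2, -0.9, 0.05)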

  1. A Behavior-Based Strategy for Single and Multi-Robot Autonomous Exploration

    PubMed Central

    Cepeda, Jesus S.; Chaimowicz, Luiz; Soto, Rogelio; Gordillo, José L.; Alanís-Reyes, Edén A.; Carrillo-Arce, Luis C.

    2012-01-01

    In this paper, we consider the problem of autonomous exploration of unknown environments with single and multiple robots. This is a challenging task, with several potential applications. We propose a simple yet effective approach that combines a behavior-based navigation with an efficient data structure to store previously visited regions. This allows robots to safely navigate, disperse and efficiently explore the environment. A series of experiments performed using a realistic robotic simulator and a real testbed scenario demonstrate that our technique effectively distributes the robots over the environment and allows them to quickly accomplish their mission in large open spaces, narrow cluttered environments, dead-end corridors, as well as rooms with minimum exits.

  2. Implementation of Autonomous Navigation and Mapping using a Laser Line Scanner on a Tactical Unmanned Aerial Vehicle

    DTIC Science & Technology

    2011-12-01

    ... study new multi-agent algorithms to avoid collision and obstacles. Others, including Hanford et al. [2], have tried to build low-cost experimental ... 2007. [2] S. D. Hanford, L. N. Long, and J. F. Horn, “A Small Semi-Autonomous Rotary-Wing Unmanned Air Vehicle (UAV),” 2003 AIAA Atmospheric ...

  3. Design of an autonomous exterior security robot

    NASA Technical Reports Server (NTRS)

    Myers, Scott D.

    1994-01-01

    This paper discusses the requirements and preliminary design of a robotic vehicle designed to perform autonomous exterior perimeter security patrols around warehouse areas, ammunition supply depots, and industrial parks for the U.S. Department of Defense. The preliminary design allows for the operation of up to eight vehicles in a six-kilometer by six-kilometer zone with autonomous navigation and obstacle avoidance. In addition to detecting crawling intruders at 100 meters, the system must perform real-time inventory checking and database comparisons using a microwave tag system.

  4. Flight Test Performance of a High Precision Navigation Doppler Lidar

    NASA Technical Reports Server (NTRS)

    Pierrottet, Diego; Amzajerdian, Farzin; Petway, Larry; Barnes, Bruce; Lockard, George

    2009-01-01

    A navigation Doppler Lidar (DL) was developed at NASA Langley Research Center (LaRC) for high precision velocity measurements from a lunar or planetary landing vehicle in support of the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. A unique feature of this DL is that it has the capability to provide a precision velocity vector which can be easily separated into horizontal and vertical velocity components and high accuracy line of sight (LOS) range measurements. This dual mode of operation can provide useful information, such as vehicle orientation relative to the direction of travel, and vehicle attitude relative to the sensor footprint on the ground. System performance was evaluated in a series of helicopter flight tests over the California desert. This paper provides a description of the DL system and presents results obtained from these flight tests.

  5. The Personal Satellite Assistant: An Internal Spacecraft Autonomous Mobile Monitor

    NASA Technical Reports Server (NTRS)

    Dorais, Gregory A.; Gawdiak, Yuri; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper presents an overview of the research and development effort at the NASA Ames Research Center to create an internal spacecraft autonomous mobile monitor capable of performing intra-vehicular sensing activities by autonomously navigating onboard the International Space Station. We describe the capabilities, mission roles, rationale, high-level functional requirements, and design challenges for an autonomous mobile monitor. The rapid prototyping design methodology used, in which five prototypes of increasing fidelity are designed, is described, as well as the status of these prototypes, of which two are operational and being tested, and one is actively being designed. The physical test facilities used to perform ground testing are briefly described, including a micro-gravity test facility that permits a prototype to propel itself in 3 dimensions with 6 degrees of freedom as if it were in a micro-gravity environment. We also provide an overview of the autonomy framework and its components, including the software simulators used in the development process. Sample mission test scenarios are also described. The paper concludes with a discussion of future and related work, followed by a summary.

  6. Autonomous Navigation of a Satellite Cluster

    DTIC Science & Technology

    1990-12-01

    ... satellite’s velocity are determined by the Clohessy-Wiltshire equations (these equations will be introduced in the next section) and take the form: (8:80 ... transition matrix, is based upon the Clohessy-Wiltshire equations of motion. These equations describe "the relative motion of two satellites when one is in a ... discovery warranted a re-examination of the solutions to the Clohessy-Wiltshire equations. If the solutions for satellite #1 and #2 are subtracted ...
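
    For reference, the Clohessy-Wiltshire (Hill) equations cited in this record describe the linearized relative motion of one satellite about another in a circular orbit with mean motion n. In the standard radial/along-track/cross-track (x, y, z) frame they take the textbook form below; the record's own equation numbering (e.g., "(8:80") is not reconstructed here.

        \ddot{x} - 2n\dot{y} - 3n^{2}x = 0, \qquad
        \ddot{y} + 2n\dot{x} = 0, \qquad
        \ddot{z} + n^{2}z = 0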

  7. Rule-based navigation control design for autonomous flight

    NASA Astrophysics Data System (ADS)

    Contreras, Hugo; Bassi, Danilo

    2008-04-01

    This article describes a navigation control system design that is based on a set of rules for following a desired trajectory. The full control of the aircraft considered here comprises a low-level stability control loop, based on a classic PID controller, and the higher-level navigation, whose main job is to exercise lateral (course) and altitude control while trying to follow the desired trajectory. The rules and PID gains were adjusted systematically according to the results of flight simulation. In spite of its simplicity, the rule-based navigation control proved to be robust, even with large perturbations such as crosswinds.

  8. TDRSS Onboard Navigation System (TONS) experiment for the Explorer Platform (EP)

    NASA Astrophysics Data System (ADS)

    Gramling, C. J.; Hornstein, R. S.; Long, A. C.; Samii, M. V.; Elrod, B. D.

    A TDRSS Onboard Navigation System (TONS) is currently being developed by NASA to provide a high-accuracy autonomous spacecraft navigation capability for users of TDRSS and its successor, the Advanced TDRSS. A TONS experiment will be performed in conjunction with the Explorer Platform (EP)/EUV Explorer mission to flight-qualify TONS Block I. This paper presents an overview of TDRSS on-board navigation goals and plans and the technical objectives of the TONS experiment. The operations concept of the experiment is described, including the characteristics of the ultrastable oscillator, the Doppler extractor, the signal-acquisition process, the TONS ground-support system, and the navigation flight software. A description of the on-board navigation algorithms and the rationale for their selection is also presented.

  9. Preliminary Results of NASA's First Autonomous Formation Flying Experiment: Earth Observing-1 (EO-1)

    NASA Technical Reports Server (NTRS)

    Folta, David; Hawkins, Albin

    2001-01-01

    NASA's first autonomous formation flying mission is completing a primary goal of demonstrating an advanced technology called enhanced formation flying. To enable this technology, the Guidance, Navigation, and Control center at the Goddard Space Flight Center has implemented an autonomous universal three-axis formation flying algorithm in executive flight code onboard the New Millennium Program's (NMP) Earth Observing-1 (EO-1) spacecraft. This paper describes the mathematical background of the autonomous formation flying algorithm and the onboard design and presents the preliminary validation results of this unique system. Results from functionality assessment and autonomous maneuver control are presented as comparisons between the onboard EO-1 operational autonomous control system called AutoCon(tm), its ground-based predecessor, and a stand-alone algorithm.

  10. The Role of X-Rays in Future Space Navigation and Communication

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke M. B.; Gendreau, Keith C.; Hasouneh, Monther A.; Mitchell, Jason W.; Fong, Wai H.; Lee, Wing-Tsz; Gavriil, Fotis; Arzoumanian, Zaven

    2013-01-01

    In the near future, applications using X-rays will enable autonomous navigation and time distribution throughout the solar system, high capacity and low-power space data links, highly accurate attitude sensing, and extremely high-precision formation flying capabilities. Each of these applications alone has the potential to revolutionize mission capabilities, particularly beyond Earth orbit. This paper will outline the NASA Goddard Space Flight Center vision and efforts toward realizing the full potential of X-ray navigation and communications.

  11. Robust analysis of an underwater navigational strategy in electrically heterogeneous corridors.

    PubMed

    Dimble, Kedar D; Ranganathan, Badri N; Keshavan, Jishnu; Humbert, J Sean

    2016-08-01

    Obstacles and other global stimuli provide relevant navigational cues to a weakly electric fish. In this work, robust analysis of a control strategy based on electrolocation for performing obstacle avoidance in electrically heterogeneous corridors is presented and validated. Static output feedback control is shown to achieve the desired goal of reflexive obstacle avoidance in such environments in simulation and experimentation. The proposed approach is computationally inexpensive and readily implementable on a small scale underwater vehicle, making underwater autonomous navigation feasible in real-time.

  12. Light Detection and Ranging-Based Terrain Navigation: A Concept Exploration

    NASA Technical Reports Server (NTRS)

    Campbell, Jacob; UijtdeHaag, Maarten; vanGraas, Frank; Young, Steve

    2003-01-01

    This paper discusses the use of Airborne Light Detection And Ranging (LiDAR) equipment for terrain navigation. Airborne LiDAR is a relatively new technology used primarily by the geo-spatial mapping community to produce highly accurate and dense terrain elevation maps. In this paper, the term LiDAR refers to a scanning laser ranger rigidly mounted to an aircraft, as opposed to an integrated sensor system that consists of a scanning laser ranger integrated with Global Positioning System (GPS) and Inertial Measurement Unit (IMU) data. Data from the laser range scanner and IMU will be integrated with a terrain database to estimate the aircraft position, and data from the laser range scanner will be integrated with GPS to estimate the aircraft attitude. LiDAR data was collected using NASA Dryden's DC-8 flying laboratory in Reno, NV, and was used to test the proposed terrain navigation system. The results of LiDAR-based terrain navigation shown in this paper indicate that airborne LiDAR is a viable technology enabler for fully autonomous aircraft navigation. The navigation performance is highly dependent on the quality of the terrain databases used for positioning, and therefore high-resolution (2 m post-spacing) data was used as the terrain reference.
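
    A toy version of terrain-referenced positioning helps make the idea concrete: slide a measured ground-elevation profile along a database row and keep the offset with the smallest misfit. The sketch below is a one-dimensional illustration with assumed grid spacing and noise, not the paper's algorithm.

        # Minimal 1D terrain-profile matching sketch (TERCOM-like correlation).
        import numpy as np

        def best_offset(measured, dem_row, search=50):
            """Return the along-track grid offset with the smallest mean absolute error."""
            n = measured.size
            errors = [np.mean(np.abs(dem_row[k:k + n] - measured)) for k in range(search)]
            return int(np.argmin(errors))

        # Synthetic DEM row at an assumed 2 m post spacing; profile "measured" 23 posts in.
        dem_row = np.cumsum(np.random.default_rng(0).normal(0.0, 0.5, 400))
        measured = dem_row[23:123] + np.random.default_rng(1).normal(0.0, 0.05, 100)
        print(best_offset(measured, dem_row))  # expected: 23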

  13. SOVEREIGN: An autonomous neural system for incrementally learning planned action sequences to navigate towards a rewarded goal.

    PubMed

    Gnadt, William; Grossberg, Stephen

    2008-06-01

    How do reactive and planned behaviors interact in real time? How are sequences of such behaviors released at appropriate times during autonomous navigation to realize valued goals? Controllers for both animals and mobile robots, or animats, need reactive mechanisms for exploration, and learned plans to reach goal objects once an environment becomes familiar. The SOVEREIGN (Self-Organizing, Vision, Expectation, Recognition, Emotion, Intelligent, Goal-oriented Navigation) animat model embodies these capabilities, and is tested in a 3D virtual reality environment. SOVEREIGN includes several interacting subsystems which model complementary properties of cortical What and Where processing streams and which clarify similarities between mechanisms for navigation and arm movement control. As the animat explores an environment, visual inputs are processed by networks that are sensitive to visual form and motion in the What and Where streams, respectively. Position-invariant and size-invariant recognition categories are learned by real-time incremental learning in the What stream. Estimates of target position relative to the animat are computed in the Where stream, and can activate approach movements toward the target. Motion cues from animat locomotion can elicit head-orienting movements to bring a new target into view. Approach and orienting movements are alternately performed during animat navigation. Cumulative estimates of each movement are derived from interacting proprioceptive and visual cues. Movement sequences are stored within a motor working memory. Sequences of visual categories are stored in a sensory working memory. These working memories trigger learning of sensory and motor sequence categories, or plans, which together control planned movements. Predictively effective chunk combinations are selectively enhanced via reinforcement learning when the animat is rewarded. Selected planning chunks effect a gradual transition from variable reactive exploratory movements to efficient goal-oriented planned movement sequences. Volitional signals gate interactions between model subsystems and the release of overt behaviors. The model can control different motor sequences under different motivational states and learns more efficient sequences to rewarded goals as exploration proceeds.

  14. Insect-Based Vision for Autonomous Vehicles: A Feasibility Study

    NASA Technical Reports Server (NTRS)

    Srinivasan, Mandyam V.

    1999-01-01

    The aims of the project were to use a high-speed digital video camera to pursue two questions: (1) to explore the influence of temporal imaging constraints on the performance of vision systems for autonomous mobile robots; and (2) to study the fine structure of insect flight trajectories in order to better understand the characteristics of flight control, orientation and navigation.

  15. Insect-Based Vision for Autonomous Vehicles: A Feasibility Study

    NASA Technical Reports Server (NTRS)

    Srinivasan, Mandyam V.

    1999-01-01

    The aims of the project were to use a high-speed digital video camera to pursue two questions: (1) To explore the influence of temporal imaging constraints on the performance of vision systems for autonomous mobile robots; (2) To study the fine structure of insect flight trajectories in order to better understand the characteristics of flight control, orientation and navigation.

  16. Terrain discovery and navigation of a multi-articulated linear robot using map-seeking circuits

    NASA Astrophysics Data System (ADS)

    Snider, Ross K.; Arathorn, David W.

    2006-05-01

    A significant challenge in robotics is providing a robot with the ability to sense its environment and then autonomously move while accommodating obstacles. The DARPA Grand Challenge, one of the most visible examples, set the goal of driving a vehicle autonomously for over a hundred miles avoiding obstacles along a predetermined path. Map-Seeking Circuits have shown their biomimetic capability in both vision and inverse kinematics and here we demonstrate their potential usefulness for intelligent exploration of unknown terrain using a multi-articulated linear robot. A robot that could handle any degree of terrain complexity would be useful for exploring inaccessible crowded spaces such as rubble piles in emergency situations, patrolling/intelligence gathering in tough terrain, tunnel exploration, and possibly even planetary exploration. Here we simulate autonomous exploratory navigation by an interaction of terrain discovery using the multi-articulated linear robot to build a local terrain map and exploitation of that growing terrain map to solve the propulsion problem of the robot.

  17. BatSLAM: Simultaneous localization and mapping using biomimetic sonar.

    PubMed

    Steckel, Jan; Peremans, Herbert

    2013-01-01

    We propose to combine a biomimetic navigation model which solves a simultaneous localization and mapping task with a biomimetic sonar mounted on a mobile robot to address two related questions. First, can robotic sonar sensing lead to intelligent interactions with complex environments? Second, can we model sonar based spatial orientation and the construction of spatial maps by bats? To address these questions we adapt the mapping module of RatSLAM, a previously published navigation system based on computational models of the rodent hippocampus. We analyze the performance of the proposed robotic implementation operating in the real world. We conclude that the biomimetic navigation model operating on the information from the biomimetic sonar allows an autonomous agent to map unmodified (office) environments efficiently and consistently. Furthermore, these results also show that successful navigation does not require the readings of the biomimetic sonar to be interpreted in terms of individual objects/landmarks in the environment. We argue that the system has applications in robotics as well as in the field of biology as a simple, first order, model for sonar based spatial orientation and map building.

  18. BatSLAM: Simultaneous Localization and Mapping Using Biomimetic Sonar

    PubMed Central

    Steckel, Jan; Peremans, Herbert

    2013-01-01

    We propose to combine a biomimetic navigation model which solves a simultaneous localization and mapping task with a biomimetic sonar mounted on a mobile robot to address two related questions. First, can robotic sonar sensing lead to intelligent interactions with complex environments? Second, can we model sonar based spatial orientation and the construction of spatial maps by bats? To address these questions we adapt the mapping module of RatSLAM, a previously published navigation system based on computational models of the rodent hippocampus. We analyze the performance of the proposed robotic implementation operating in the real world. We conclude that the biomimetic navigation model operating on the information from the biomimetic sonar allows an autonomous agent to map unmodified (office) environments efficiently and consistently. Furthermore, these results also show that successful navigation does not require the readings of the biomimetic sonar to be interpreted in terms of individual objects/landmarks in the environment. We argue that the system has applications in robotics as well as in the field of biology as a simple, first order, model for sonar based spatial orientation and map building. PMID:23365647

  19. Autonomous Flight Rules - A Concept for Self-Separation in U.S. Domestic Airspace

    NASA Technical Reports Server (NTRS)

    Wing, David J.; Cotton, William B.

    2011-01-01

    Autonomous Flight Rules (AFR) are proposed as a new set of operating regulations in which aircraft navigate on tracks of their choice while self-separating from traffic and weather. AFR would exist alongside Instrument and Visual Flight Rules (IFR and VFR) as one of three available flight options for any appropriately trained and qualified operator with the necessary certified equipment. Historically, ground-based separation services evolved by necessity as aircraft began operating in the clouds and were unable to see each other. Today, technologies for global navigation, airborne surveillance, and onboard computing enable the functions of traffic conflict management to be fully integrated with navigation procedures onboard the aircraft. By self-separating, aircraft can operate with more flexibility and fewer restrictions than are required when using ground-based separation. The AFR concept is described in detail and provides practical means by which self-separating aircraft could share the same airspace as IFR and VFR aircraft without disrupting the ongoing processes of Air Traffic Control.

  20. Optimization design about gimbal structure of high-precision autonomous celestial navigation tracking mirror system

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Yang, Xiao-xu; Han, Jun-feng; Wei, Yu; Zhang, Jing; Xie, Mei-lin; Yue, Peng

    2016-01-01

    A high-precision celestial navigation tracking platform adopts a control-mirror servo structure to overcome the disadvantages of large volume, high rotational inertia, and slow response, improving the stability and tracking accuracy of the platform. Because the optical sensor and the mirror are mounted on the middle gimbal, the stiffness and resonant-frequency requirements for that gimbal are high. Based on finite element modal analysis theory, the dynamic characteristics of the middle gimbal were studied, and ANSYS was used for the finite element dynamic simulation analysis. The weak links of the structure were identified from the computed results, improvements were proposed, and the structure was reanalyzed. The lowest resonant frequency of the optimized middle gimbal avoids the bandwidth of the platform's servo mechanism and is much higher than the disturbance frequency of the carrier aircraft, reducing mechanical resonance of the framework. This provides a theoretical basis for the overall structural optimization design of a high-precision autonomous celestial navigation tracking mirror system.

  1. Improving CAR Navigation with a Vision-Based System

    NASA Astrophysics Data System (ADS)

    Kim, H.; Choi, K.; Lee, I.

    2015-08-01

    The real-time acquisition of accurate positions is very important for the proper operation of driver assistance systems and autonomous vehicles. Since current systems mostly depend on GPS and map-matching techniques, they show poor and unreliable performance in areas where GPS signals are blocked or weak. In this study, we propose a vision-oriented car navigation method based on sensor fusion of a GPS and in-vehicle sensors. We employ a single-photo resection process to derive the position and attitude of the camera and thus those of the car. These image georeferencing results are combined with the other sensory data in a sensor fusion framework for more accurate position estimation using an extended Kalman filter. The proposed system estimated positions with an accuracy of 15 m even though GPS signals were unavailable during the entire 15-minute test drive. The proposed vision-based system can be effectively utilized for the low-cost yet accurate and reliable navigation systems required for intelligent or autonomous vehicles.
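
    One way to picture the fusion step is a position filter that ingests GPS fixes when they are available and falls back to vision-derived (photo-resection) positions when they are not. The sketch below is a generic constant-velocity Kalman filter with assumed noise levels and a 2D state, not the authors' extended Kalman filter.

        # Minimal loosely coupled position-fix fusion sketch.
        import numpy as np

        def kf_step(x, P, z, R, dt=1.0, q=0.5):
            F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
                          [0, 0, 1, 0], [0, 0, 0, 1]], dtype=float)
            H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], dtype=float)
            x, P = F @ x, F @ P @ F.T + np.eye(4) * q        # predict
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)                   # update with a position fix
            x = x + K @ (z - H @ x)
            P = (np.eye(4) - K @ H) @ P
            return x, P

        x, P = np.zeros(4), np.eye(4) * 100.0
        R_gps, R_vision = np.eye(2) * 3.0**2, np.eye(2) * 15.0**2
        x, P = kf_step(x, P, z=np.array([5.0, 2.0]), R=R_gps)      # GPS available
        x, P = kf_step(x, P, z=np.array([10.2, 4.1]), R=R_vision)  # GPS blocked: use vision
        print(x[:2])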

  2. Improving Car Navigation with a Vision-Based System

    NASA Astrophysics Data System (ADS)

    Kim, H.; Choi, K.; Lee, I.

    2015-08-01

    The real-time acquisition of accurate positions is very important for the proper operation of driver assistance systems and autonomous vehicles. Since current systems mostly depend on GPS and map-matching techniques, they show poor and unreliable performance in areas where GPS signals are blocked or weak. In this study, we propose a vision-oriented car navigation method based on sensor fusion of a GPS and in-vehicle sensors. We employ a single-photo resection process to derive the position and attitude of the camera and thus those of the car. These image georeferencing results are combined with the other sensory data in a sensor fusion framework for more accurate position estimation using an extended Kalman filter. The proposed system estimated positions with an accuracy of 15 m even though GPS signals were unavailable during the entire 15-minute test drive. The proposed vision-based system can be effectively utilized for the low-cost yet accurate and reliable navigation systems required for intelligent or autonomous vehicles.

  3. OAST Space Theme Workshop. Volume 3: Working group summary. 1: Navigation, guidance, control (E-1) A. Statement. B. Technology needs (form 1). C. Priority assessment (form 2)

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The six themes identified by the Workshop have many common navigation guidance and control needs. All the earth orbit themes have a strong requirement for attitude, figure and stabilization control of large space structures, a requirement not currently being supported. All but the space transportation theme have need for precision pointing of spacecraft and instruments. In addition all the themes have requirements for increasing autonomous operations for such activities as spacecraft and experiment operations, onboard mission modification, rendezvous and docking, spacecraft assembly and maintenance, navigation and guidance, and self-checkout, test and repair. Major new efforts are required to conceptualize new approaches to large space antennas and arrays that are lightweight, readily deployable, and capable of precise attitude and figure control. Conventional approaches offer little hope of meeting these requirements. Functions that can benefit from increasing automation or autonomous operations are listed.

  4. Integrated Docking Simulation and Testing with the Johnson Space Center Six-Degree of Freedom Dynamic Test System

    NASA Technical Reports Server (NTRS)

    Mitchell, Jennifer D.; Cryan, Scott P.; Baker, Kenneth; Martin, Toby; Goode, Robert; Key, Kevin W.; Manning, Thomas; Chien, Chiun-Hong

    2008-01-01

    The Exploration Systems Architecture defines missions that require rendezvous, proximity operations, and docking (RPOD) of two spacecraft both in Low Earth Orbit (LEO) and in Low Lunar Orbit (LLO). Uncrewed spacecraft must perform automated and/or autonomous rendezvous, proximity operations and docking operations (commonly known as Automated Rendezvous and Docking, AR&D). The crewed versions may also perform AR&D, possibly with a different level of automation and/or autonomy, and must also provide the crew with relative navigation information for manual piloting. The capabilities of the RPOD sensors are critical to the success of the Constellation Program; this is carried as one of the CEV Project's top risks. The Exploration Technology Development Program (ETDP) AR&D Sensor Technology Project seeks to reduce this risk by increasing technology maturation of selected relative navigation sensor technologies through testing and simulation. One of the project activities is a series of "pathfinder" testing and simulation activities to integrate relative navigation sensors with the Johnson Space Center Six-Degree-of-Freedom Test System (SDTS). The SDTS will be the primary testing location for the Orion spacecraft's Low Impact Docking System (LIDS). Project team members have integrated the Orion simulation with the SDTS computer system so that real-time closed-loop testing can be performed with relative navigation sensors and the docking system in the loop during docking and undocking scenarios. Two relative navigation sensors are being used as part of a "pathfinder" activity in order to pave the way for future testing with the actual Orion sensors. This paper describes the test configuration and test results.

  5. A method of real-time detection for distant moving obstacles by monocular vision

    NASA Astrophysics Data System (ADS)

    Jia, Bao-zhi; Zhu, Ming

    2013-12-01

    In this paper, we propose an approach for detecting distant moving obstacles, such as cars and bicycles, with a monocular camera that cooperates with ultrasonic sensors under low-cost conditions. We aim to detect distant obstacles that move toward our autonomous navigation car in order to raise an alarm and keep away from them. A frame-differencing method is applied to find obstacles after compensating for the camera's ego-motion. Meanwhile, each obstacle is separated from the others into an independent region and given a confidence level indicating whether it is approaching. Results on an open dataset and on our own autonomous navigation car show that the method is effective for real-time detection of distant moving obstacles.
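
    A compact sketch of this style of pipeline is shown below: estimate the camera's ego-motion from tracked features, warp the previous frame so the static background aligns, and difference the frames. It assumes OpenCV is available and that a planar-scene homography is an adequate motion model for distant scenery; the thresholds are illustrative, and this is not the authors' exact method.

        # Minimal ego-motion-compensated frame-differencing sketch (OpenCV).
        import cv2
        import numpy as np

        def moving_obstacle_mask(prev_gray, curr_gray, diff_thresh=25):
            # Track sparse corners between frames to estimate the camera's motion.
            pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                               qualityLevel=0.01, minDistance=7)
            pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                           pts_prev, None)
            good_prev = pts_prev[status.ravel() == 1]
            good_curr = pts_curr[status.ravel() == 1]
            H, _ = cv2.findHomography(good_prev, good_curr, cv2.RANSAC, 3.0)
            # Warp the previous frame so the static background aligns, then difference.
            warped = cv2.warpPerspective(prev_gray, H, prev_gray.shape[::-1])
            diff = cv2.absdiff(curr_gray, warped)
            _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
            return mask  # connected regions in the mask are candidate moving obstacles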

  6. Learning for autonomous navigation : extrapolating from underfoot to the far field

    NASA Technical Reports Server (NTRS)

    Matthies, Larry; Turmon, Michael; Howard, Andrew; Angelova, Anelia; Tang, Benyang; Mjolsness, Eric

    2005-01-01

    Autonomous off-road navigation of robotic ground vehicles has important applications on Earth and in space exploration. Progress in this domain has been retarded by the limited lookahead range of 3-D sensors and by the difficulty of preprogramming systems to understand the traversability of the wide variety of terrain they can encounter. Enabling robots to learn from experience may alleviate both of these problems. We define two paradigms for this, learning from 3-D geometry and learning from proprioception, and describe initial instantiations of them we have developed under DARPA and NASA programs. Field test results show promise for learning traversability of vegetated terrain, learning to extend the lookahead range of the vision system, and learning how slip varies with slope.

  7. Design and test of a simulation system for autonomous optic-navigated planetary landing

    NASA Astrophysics Data System (ADS)

    Cai, Sheng; Yin, Yanhe; Liu, Yanjun; He, Fengyun

    2018-02-01

    In this paper, a simulation system based on a commercial projector is proposed to test optical navigation algorithms for autonomous planetary landing in laboratory scenarios. The optical, mechanical, and synchronization-control design work is described, and the complete simulation system is set up and tested. Through calibration of the system, two main problems are resolved: synchronization between the projector and the CCD, and pixel-level shifting caused by the low repeatability of the DMD used in the projector. The experimental results show that the RMS errors of the pitch, yaw, and roll angles are 0.78', 0.48', and 2.95' compared with the theoretical calculation, which fulfills the requirement of laboratory experimental simulation for planetary landing.

  8. Fast and reliable obstacle detection and segmentation for cross-country navigation

    NASA Technical Reports Server (NTRS)

    Talukder, A.; Manduchi, R.; Rankin, A.; Matthies, L.

    2002-01-01

    Obstacle detection is one of the main components of the control system of autonomous vehicles. In the case of indoor/urban navigation, obstacles are typically defined as surface points that are higher than the ground plane. This characterization, however, cannot be used in cross-country and unstructured environments, where the notion of ground plane is often not meaningful.
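
    The baseline indoor/urban characterization mentioned above can be sketched in a few lines (fit a ground plane, flag points above it); the cross-country generalization the paper develops is deliberately not reproduced here, and the height threshold is an assumption.

        # Minimal ground-plane obstacle labeling sketch (the baseline, not the paper's method).
        import numpy as np

        def obstacle_points(cloud, height_thresh=0.3):
            """Flag points more than height_thresh above a least-squares ground plane."""
            A = np.c_[cloud[:, 0], cloud[:, 1], np.ones(len(cloud))]
            coeffs, *_ = np.linalg.lstsq(A, cloud[:, 2], rcond=None)  # z = a*x + b*y + c
            heights = cloud[:, 2] - A @ coeffs
            return heights > height_thresh

        rng = np.random.default_rng(0)
        ground = np.c_[rng.uniform(0, 10, (500, 2)), rng.normal(0.0, 0.02, 500)]
        rock = np.c_[rng.uniform(4, 5, (20, 2)), rng.uniform(0.4, 0.8, 20)]
        cloud = np.vstack([ground, rock])
        print(int(obstacle_points(cloud).sum()))  # roughly the 20 elevated points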

  9. Obstacle avoidance and concealed target detection using the Army Research Lab ultra-wideband synchronous impulse reconstruction (UWB SIRE) forward imaging radar

    NASA Astrophysics Data System (ADS)

    Nguyen, Lam; Wong, David; Ressler, Marc; Koenig, Francois; Stanton, Brian; Smith, Gregory; Sichina, Jeffrey; Kappra, Karl

    2007-04-01

    The U.S. Army Research Laboratory (ARL), as part of a mission and customer funded exploratory program, has developed a new low-frequency, ultra-wideband (UWB) synthetic aperture radar (SAR) for forward imaging to support the Army's vision of an autonomous navigation system for robotic ground vehicles. These unmanned vehicles, equipped with an array of imaging sensors, will be tasked to help detect man-made obstacles such as concealed targets, enemy minefields, and booby traps, as well as other natural obstacles such as ditches, and bodies of water. The ability of UWB radar technology to help detect concealed objects has been documented in the past and could provide an important obstacle avoidance capability for autonomous navigation systems, which would improve the speed and maneuverability of these vehicles and consequently increase the survivability of the U. S. forces on the battlefield. One of the primary features of the radar is the ability to collect and process data at combat pace in an affordable, compact, and lightweight package. To achieve this, the radar is based on the synchronous impulse reconstruction (SIRE) technique where several relatively slow and inexpensive analog-to-digital (A/D) converters are used to sample the wide bandwidth of the radar signals. We conducted an experiment this winter at Aberdeen Proving Ground (APG) to support the phenomenological studies of the backscatter from positive and negative obstacles for autonomous robotic vehicle navigation, as well as the detection of concealed targets of interest to the Army. In this paper, we briefly describe the UWB SIRE radar and the test setup in the experiment. We will also describe the signal processing and the forward imaging techniques used in the experiment. Finally, we will present imagery of man-made obstacles such as barriers, concertina wires, and mines.

  10. AUV SLAM and Experiments Using a Mechanical Scanning Forward-Looking Sonar

    PubMed Central

    He, Bo; Liang, Yan; Feng, Xiao; Nian, Rui; Yan, Tianhong; Li, Minghui; Zhang, Shujing

    2012-01-01

    Navigation technology is one of the most important challenges in the application of autonomous underwater vehicles (AUVs), which navigate in the complex undersea environment. The ability to localize a robot and accurately map its surroundings simultaneously, namely the simultaneous localization and mapping (SLAM) problem, is a key prerequisite of truly autonomous robots. In this paper, a modified-FastSLAM algorithm is proposed and used in the navigation of our C-Ranger research platform, an open-frame AUV. A mechanical scanning imaging sonar is chosen as the active sensor for the AUV. The modified-FastSLAM implements its update relying on the on-board sensors of the C-Ranger. The algorithm employs a data association scheme that combines the single-particle maximum likelihood method with a modified negative evidence method, and uses rank-based resampling to overcome the particle depletion problem. In order to verify the feasibility of the proposed methods, both simulation experiments and sea trials of the C-Ranger were conducted. The experimental results show that the modified-FastSLAM employed for the navigation of the C-Ranger AUV is much more effective and accurate than the traditional methods. PMID:23012549
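
    To illustrate the resampling step that particle-depletion countermeasures target, the sketch below shows standard low-variance (systematic) resampling as commonly used in FastSLAM-style filters. The paper's rank-based scheme modifies the weights before this step and is not reproduced here; the particle structure is an assumption.

        # Minimal systematic (low-variance) resampling sketch for a particle filter.
        import numpy as np

        def systematic_resample(particles, weights, rng=None):
            rng = rng or np.random.default_rng()
            n = len(particles)
            positions = (rng.random() + np.arange(n)) / n
            cumulative = np.cumsum(weights / np.sum(weights))
            return [particles[i] for i in np.searchsorted(cumulative, positions)]

        # Each FastSLAM particle carries a vehicle pose and its own landmark map.
        particles = [{"pose": np.array([x, 0.0, 0.0]), "landmarks": {}} for x in range(5)]
        weights = np.array([0.05, 0.10, 0.60, 0.20, 0.05])
        print([p["pose"][0] for p in systematic_resample(particles, weights)])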

  11. AUV SLAM and experiments using a mechanical scanning forward-looking sonar.

    PubMed

    He, Bo; Liang, Yan; Feng, Xiao; Nian, Rui; Yan, Tianhong; Li, Minghui; Zhang, Shujing

    2012-01-01

    Navigation technology is one of the most important challenges in the application of autonomous underwater vehicles (AUVs), which navigate in the complex undersea environment. The ability to localize a robot and accurately map its surroundings simultaneously, namely the simultaneous localization and mapping (SLAM) problem, is a key prerequisite of truly autonomous robots. In this paper, a modified-FastSLAM algorithm is proposed and used in the navigation of our C-Ranger research platform, an open-frame AUV. A mechanical scanning imaging sonar is chosen as the active sensor for the AUV. The modified-FastSLAM implements its update relying on the on-board sensors of the C-Ranger. The algorithm employs a data association scheme that combines the single-particle maximum likelihood method with a modified negative evidence method, and uses rank-based resampling to overcome the particle depletion problem. In order to verify the feasibility of the proposed methods, both simulation experiments and sea trials of the C-Ranger were conducted. The experimental results show that the modified-FastSLAM employed for the navigation of the C-Ranger AUV is much more effective and accurate than the traditional methods.

  12. Navigation Architecture for a Space Mobile Network

    NASA Technical Reports Server (NTRS)

    Valdez, Jennifer E.; Ashman, Benjamin; Gramling, Cheryl; Heckler, Gregory W.; Carpenter, Russell

    2016-01-01

    The Tracking and Data Relay Satellite System (TDRSS) Augmentation Service for Satellites (TASS) is a proposed beacon service to provide a global, space based GPS augmentation service based on the NASA Global Differential GPS (GDGPS) System. The TASS signal will be tied to the GPS time system and usable as an additional ranging and Doppler radiometric source. Additionally, it will provide data vital to autonomous navigation in the near Earth regime, including space weather information, TDRS ephemerides, Earth Orientation Parameters (EOP), and forward commanding capability. TASS benefits include enhancing situational awareness, enabling increased autonomy, and providing near real-time command access for user platforms. As NASA Headquarters' Space Communication and Navigation Office (SCaN) begins to move away from a centralized network architecture and towards a Space Mobile Network (SMN) that allows for user initiated services, autonomous navigation will be a key part of such a system. This paper explores how a TASS beacon service enables the Space Mobile Networking paradigm, what a typical user platform would require, and provides an in-depth analysis of several navigation scenarios and operations concepts. This paper provides an overview of the TASS beacon and its role within the SMN and user community. Supporting navigation analysis is presented for two user mission scenarios: an Earth observing spacecraft in low earth orbit (LEO), and a highly elliptical spacecraft in a lunar resonance orbit. These diverse flight scenarios indicate the breadth of applicability of the TASS beacon for upcoming users within the current network architecture and in the SMN.

  13. Preliminary Operational Results of the TDRSS Onboard Navigation System (TONS) for the Terra Mission

    NASA Technical Reports Server (NTRS)

    Gramling, Cheryl; Lorah, John; Santoro, Ernest; Work, Kevin; Chambers, Robert; Bauer, Frank H. (Technical Monitor)

    2000-01-01

    The Earth Observing System Terra spacecraft was launched on December 18, 1999, to provide data for the characterization of the terrestrial and oceanic surfaces, clouds, radiation, aerosols, and radiative balance. The Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (ONS) (TONS) flying on Terra provides the spacecraft with an operational real-time navigation solution. TONS is a passive system that makes judicious use of Terra's communication and computer subsystems. An objective of the ONS developed by NASA's Goddard Space Flight Center (GSFC) Guidance, Navigation and Control Center is to provide autonomous navigation with minimal power, weight, and volume impact on the user spacecraft. TONS relies on extracting tracking measurements onboard from a TDRSS forward-link communication signal and processing these measurements in an onboard extended Kalman filter to estimate Terra's current state. Terra is the first NASA low-Earth-orbiting mission to fly autonomous navigation that produces accurate results. The science orbital accuracy requirements for Terra are 150 meters (m) (3-sigma) per axis, with a goal of 5 m (1-sigma) RSS, which TONS is expected to meet. The TONS solutions are telemetered in real time to the mission scientists along with their science data for immediate processing. Once set in the operational mode, TONS eliminates the need for ground orbit determination and allows for a smooth flow from the spacecraft telemetry to planning products for the mission team. This paper presents the preliminary results of the operational TONS solution available from Terra.
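
    The record describes extracting Doppler measurements from the forward-link signal and processing them in an onboard extended Kalman filter. The following is a minimal, generic scalar EKF measurement update of the kind such a filter performs at each measurement epoch; the planar four-element state, the range-rate measurement model, and all noise values are illustrative assumptions, not the TONS design.

      import numpy as np

      def range_rate(x):
          """Range rate from a tracking station at the origin: h(x) = (r . v) / |r|."""
          r, v = x[:2], x[2:]
          return float(r @ v / np.linalg.norm(r))

      def range_rate_jacobian(x):
          r, v = x[:2], x[2:]
          rn = np.linalg.norm(r)
          dr = v / rn - (r @ v) * r / rn**3
          dv = r / rn
          return np.concatenate([dr, dv]).reshape(1, 4)

      def ekf_update(x, P, z, R):
          """One scalar EKF measurement update with the range-rate model above."""
          H = range_rate_jacobian(x)
          S = float(H @ P @ H.T) + R             # innovation variance
          K = (P @ H.T) / S                      # Kalman gain, 4 x 1
          x_new = x + (K * (z - range_rate(x))).ravel()
          P_new = (np.eye(4) - K @ H) @ P
          return x_new, P_new

      # Planar state [x, y, vx, vy] in m and m/s, with an illustrative prior and one measurement.
      x = np.array([7.0e6, 1.0e6, -500.0, 7400.0])
      P = np.diag([1.0e6, 1.0e6, 4.0, 4.0])
      z = range_rate(x) + 0.3                    # measured range rate, slightly off the prediction
      x_post, P_post = ekf_update(x, P, z, R=0.04)
      print(np.round(x_post - x, 3))             # state correction applied by the update
      print(np.round(np.sqrt(np.diag(P_post)), 3))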

  14. Design of all-weather celestial navigation system

    NASA Astrophysics Data System (ADS)

    Sun, Hongchi; Mu, Rongjun; Du, Huajun; Wu, Peng

    2018-03-01

    In order to realize autonomous navigation in the atmosphere, an all-weather celestial navigation system is designed. The research covers a comentropy (information entropy) based discrimination method and an adaptive navigation algorithm based on the P value. The comentropy discrimination method enables independent switching between the two celestial navigation modes, starlight and radio. An adaptive filtering algorithm based on the P value is then proposed, which greatly improves the disturbance rejection capability of the system. The experimental results show that the three-axis attitude accuracy is better than 10″ and that the system can operate in all weather. In a perturbed environment, the position accuracy of the integrated navigation system is increased by 20% compared with the traditional method. The system basically meets the requirements of all-weather celestial navigation, offering stability, reliability, high accuracy, and strong anti-interference capability.
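
    The abstract does not define its comentropy discriminator or P-value filter in detail. Purely to illustrate the idea of entropy-driven mode switching, the sketch below computes the Shannon entropy of star-tracker residuals over fixed histogram bins and switches from the starlight mode to the radio mode when the entropy, a rough proxy for measurement disorder (for example under cloud cover), exceeds a threshold. The residual model, bin edges, and threshold are all assumptions.

      import numpy as np

      def shannon_entropy(samples, bin_edges):
          """Shannon entropy (bits) of samples over fixed histogram bins."""
          hist, _ = np.histogram(samples, bins=bin_edges)
          p = hist / hist.sum()
          p = p[p > 0]
          return float(-np.sum(p * np.log2(p)))

      def select_mode(star_residuals, entropy_threshold=3.0):
          """Illustrative discriminator: disordered star-tracker residuals -> switch to radio mode."""
          edges = np.linspace(-60.0, 60.0, 17)        # fixed bins so entropies are comparable
          h = shannon_entropy(star_residuals, edges)
          return ("radio", h) if h > entropy_threshold else ("starlight", h)

      rng = np.random.default_rng(1)
      clean = rng.normal(0.0, 1.0, 500)               # tight residuals: star measurements usable
      degraded = rng.uniform(-50.0, 50.0, 500)        # spread-out residuals: e.g. obscured sky
      print(select_mode(clean))      # -> ("starlight", ~1 bit)
      print(select_mode(degraded))   # -> ("radio", ~3.7 bits)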

  15. Fuzzy Logic Based Control for Autonomous Mobile Robot Navigation

    PubMed Central

    Masmoudi, Mohamed Slim; Masmoudi, Mohamed

    2016-01-01

    This paper describes the design and implementation of a trajectory tracking controller using fuzzy logic for a mobile robot navigating in indoor environments. Most previous works used two independent controllers for navigation and obstacle avoidance. The main contribution of this paper is the use of a single fuzzy controller for both navigation and obstacle avoidance. The mobile robot is equipped with DC motors, nine infrared range (IR) sensors to measure the distance to obstacles, and two optical encoders to provide the actual position and speeds. To evaluate the performance of the intelligent navigation algorithms, different trajectories are used and simulated using MATLAB and the SIMIAM navigation platform. Simulation results show the performance of the intelligent navigation algorithms in terms of simulation time and travelled path. PMID:27688748
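
    A minimal sketch of the single-controller idea follows: one fuzzy rule base whose inputs are the heading error toward the goal and the distance to the nearest obstacle ahead, and whose output is a steering command. The membership functions, rule base, output values, and sign convention are illustrative assumptions rather than the controller reported in the paper.

      import numpy as np

      def tri(x, a, b, c):
          """Triangular membership function peaking at b, zero outside [a, c]."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x < b else (c - x) / (c - b)

      def fuzzy_steer(heading_error_deg, obstacle_dist_m):
          """Single fuzzy controller that blends goal seeking and obstacle avoidance.

          Output: steering rate in deg/s (positive = turn left)."""
          # Input memberships
          err_left  = tri(heading_error_deg, 0, 45, 90)     # goal is to the left
          err_zero  = tri(heading_error_deg, -30, 0, 30)
          err_right = tri(heading_error_deg, -90, -45, 0)
          near      = tri(obstacle_dist_m, -0.1, 0.0, 1.0)  # obstacle close ahead
          far       = tri(obstacle_dist_m, 0.5, 2.0, 1e9)   # open-ended "far" set: huge right shoulder

          # Rule base: (firing strength, crisp output of the rule consequent)
          rules = [
              (min(err_left,  far), +20.0),   # goal left & path clear  -> turn left
              (min(err_right, far), -20.0),   # goal right & path clear -> turn right
              (min(err_zero,  far),   0.0),   # on course & clear       -> go straight
              (near,                +35.0),   # obstacle close          -> strong avoidance turn
          ]
          num = sum(w * out for w, out in rules)
          den = sum(w for w, _ in rules)
          return num / den if den > 1e-9 else 0.0   # weighted-average defuzzification

      print(fuzzy_steer(30.0, 3.0))   # path clear: turn left toward the goal (+20 deg/s)
      print(fuzzy_steer(-10.0, 0.3))  # obstacle close: avoidance rule dominates (+35 deg/s)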

  16. Optimal scheme of star observation of missile-borne inertial navigation system/stellar refraction integrated navigation

    NASA Astrophysics Data System (ADS)

    Lu, Jiazhen; Yang, Lie

    2018-05-01

    To achieve accurate and completely autonomous navigation for spacecraft, inertial/celestial integrated navigation gets increasing attention. In this study, a missile-borne inertial/stellar refraction integrated navigation scheme is proposed. Position Dilution of Precision (PDOP) for stellar refraction is introduced and the corresponding equation is derived. Based on the condition when PDOP reaches the minimum value, an optimized observation scheme is proposed. To verify the feasibility of the proposed scheme, numerical simulation is conducted. The results of the Extended Kalman Filter (EKF) and Unscented Kalman Filter (UKF) are compared and impact factors of navigation accuracy are studied in the simulation. The simulation results indicated that the proposed observation scheme has an accurate positioning performance, and the results of EKF and UKF are similar.

  17. Optimal scheme of star observation of missile-borne inertial navigation system/stellar refraction integrated navigation.

    PubMed

    Lu, Jiazhen; Yang, Lie

    2018-05-01

    To achieve accurate and completely autonomous navigation for spacecraft, inertial/celestial integrated navigation gets increasing attention. In this study, a missile-borne inertial/stellar refraction integrated navigation scheme is proposed. Position Dilution of Precision (PDOP) for stellar refraction is introduced and the corresponding equation is derived. Based on the condition when PDOP reaches the minimum value, an optimized observation scheme is proposed. To verify the feasibility of the proposed scheme, numerical simulation is conducted. The results of the Extended Kalman Filter (EKF) and Unscented Kalman Filter (UKF) are compared and impact factors of navigation accuracy are studied in the simulation. The simulation results indicated that the proposed observation scheme has an accurate positioning performance, and the results of EKF and UKF are similar.
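
    The paper derives a PDOP expression specific to the stellar-refraction geometry; as background, the sketch below shows the generic PDOP computation from a measurement geometry matrix, which is the quantity whose minimum drives the proposed observation scheme. The example line-of-sight sets are arbitrary.

      import numpy as np

      def pdop(unit_los, estimate_clock=False):
          """Position Dilution of Precision from unit line-of-sight vectors (one per measurement).

          H has one row per measurement; appending a column of ones models a clock/bias state.
          PDOP is the root-sum-square of the position variances in (H^T H)^-1.
          """
          H = np.atleast_2d(np.asarray(unit_los, dtype=float))
          if estimate_clock:
              H = np.hstack([H, np.ones((H.shape[0], 1))])
          cov = np.linalg.inv(H.T @ H)
          return float(np.sqrt(np.trace(cov[:3, :3])))

      # Well-spread directions give a small PDOP; nearly coplanar ones give a poor (large) PDOP.
      spread = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [0.577, 0.577, 0.577]])
      flat = np.array([[1, 0, 0.05], [0, 1, 0.05], [-0.7, 0.7, 0.05], [0.7, 0.7, 0.05]], dtype=float)
      flat /= np.linalg.norm(flat, axis=1, keepdims=True)
      print("well-spread PDOP:", round(pdop(spread), 2))
      print("nearly coplanar PDOP:", round(pdop(flat), 2))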

  18. Conceptual Design of a Communication-Based Deep Space Navigation Network

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan J.; Chuang, C. H.

    2012-01-01

    As the need grows for increased autonomy and position knowledge accuracy to support missions beyond Earth orbit, engineers must develop more advanced navigation sensors and systems that operate independently of Earth-based analysis and processing. Several spacecraft are approaching this problem using inter-spacecraft radiometric tracking and onboard autonomous optical navigation methods. This paper proposes an alternative implementation to aid in spacecraft position fixing. The proposed Network-Based Navigation technique takes advantage of the communication data sent between spacecraft, and between spacecraft and ground control, to embed navigation information. The navigation system uses these packets to provide navigation estimates to an onboard navigation filter to augment traditional ground-based radiometric tracking techniques. Rather than using digital signal measurements to capture information inherent in the transmitted signal itself, this method relies on the embedded navigation packet headers to calculate a navigation estimate. This method is heavily dependent on clock accuracy, and initial results show the promising performance of a notional system.
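
    As an illustration of how navigation information embedded in packet headers can yield a range-like measurement, the sketch below converts header time stamps into a pseudorange, with any residual clock error mapping directly into range error. The PacketHeader fields are hypothetical; the record does not specify the actual header format.

      from dataclasses import dataclass

      C = 299_792_458.0  # speed of light, m/s

      @dataclass
      class PacketHeader:
          """Illustrative navigation fields assumed to be carried in a communication packet header."""
          transmit_time: float              # sender clock time at transmission, s
          receive_time: float               # receiver clock time at reception, s
          sender_clock_bias: float = 0.0    # known/estimated sender clock offset, s
          receiver_clock_bias: float = 0.0  # known/estimated receiver clock offset, s

      def pseudorange_from_header(pkt: PacketHeader) -> float:
          """Range estimate implied by the packet time stamps, corrected for known clock biases.

          Any residual (unmodeled) clock error maps into range error at roughly 0.3 m per ns,
          which is why the record stresses clock accuracy.
          """
          flight_time = ((pkt.receive_time - pkt.receiver_clock_bias)
                         - (pkt.transmit_time - pkt.sender_clock_bias))
          return C * flight_time

      # A packet that truly took 1.0 ms to travel (~300 km), with a 10 ns unmodeled receiver bias.
      pkt = PacketHeader(transmit_time=100.000000, receive_time=100.001000010)
      print(pseudorange_from_header(pkt))   # ~299 795 m: the 10 ns bias adds ~3 m of error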

  19. Multiple Integrated Navigation Sensors for Improved Occupancy Grid FastSLAM

    DTIC Science & Technology

    2011-03-01

    ... autonomous vehicle exploration with applications to search and rescue. To current knowledge, this research presents the first SLAM solution to ... solution is a key component of an autonomous vehicle, especially one whose mission involves gaining knowledge of unknown areas. It provides the ability ...

  20. Autonomous interplanetary constellation design

    NASA Astrophysics Data System (ADS)

    Chow, Cornelius Channing, II

    According to NASA's integrated space technology roadmaps, space-based infrastructures are envisioned as necessary ingredients to a sustained effort in continuing space exploration. Whether it be for extra-terrestrial habitats, roving/cargo vehicles, or space tourism, autonomous space networks will provide a vital communications lifeline for both future robotic and human missions alike. Projecting that the Moon will be a bustling hub of activity within a few decades, a near-term opportunity for in-situ infrastructure development is within reach. This dissertation addresses the anticipated need for in-space infrastructure by investigating a general design methodology for autonomous interplanetary constellations; to illustrate the theory, this manuscript presents results from an application to the Earth-Moon neighborhood. The constellation design methodology is formulated as an optimization problem, involving a trajectory design step followed by a spacecraft placement sequence. Modeling the dynamics as a restricted 3-body problem, the investigated design space consists of families of periodic orbits which play host to the constellations, punctuated by arrangements of spacecraft autonomously guided by a navigation strategy called LiAISON (Linked Autonomous Interplanetary Satellite Orbit Navigation). Instead of more traditional exhaustive search methods, a numerical continuation approach is implemented to map the admissible configuration space. In particular, Keller's pseudo-arclength technique is used to follow folding/bifurcating solution manifolds, which are otherwise inaccessible with other parameter continuation schemes. A succinct characterization of the underlying structure of the local, as well as global, extrema is thus achievable with little a priori intuition of the solution space. Furthermore, the proposed design methodology offers benefits in computation speed plus the ability to handle mildly stochastic systems. An application of the constellation design methodology to the restricted Earth-Moon system reveals optimal pairwise configurations for various L1, L2, and L5 (halo, axial, and vertical) periodic orbit families. Navigation accuracies ranging from O(10^-1) to O(10^+1) meters in position space are obtained for the optimal Earth-Moon constellations, given measurement noise on the order of 1 meter.
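
    Keller's pseudo-arclength technique named in the abstract can be illustrated on a scalar problem with a fold, where naive continuation in the parameter alone fails. The toy function, step size, and starting point below are assumptions chosen purely to show the predictor-corrector structure.

      import numpy as np

      def f(y):
          x, lam = y
          return lam - x**2            # solution branch: lam = x^2, with a fold at the origin

      def jac(y):
          x, _ = y
          return np.array([-2.0 * x, 1.0])   # [df/dx, df/dlam]

      def tangent(y, prev_t):
          t = np.array([1.0, 2.0 * y[0]])    # null vector of jac(y) for this particular f
          t /= np.linalg.norm(t)
          return t if np.dot(t, prev_t) >= 0 else -t   # keep a consistent orientation

      def continue_branch(y0, t0, ds=0.2, steps=15, newton_iters=8):
          """Keller pseudo-arclength continuation: predictor along the tangent, Newton corrector
          on the augmented system [f(y); t_prev . (y - y_prev) - ds] = 0."""
          ys, y, t = [np.array(y0, float)], np.array(y0, float), np.array(t0, float)
          for _ in range(steps):
              t = tangent(y, t)
              y_new = y + ds * t                      # predictor
              for _ in range(newton_iters):           # corrector
                  G = np.array([f(y_new), np.dot(t, y_new - y) - ds])
                  DG = np.vstack([jac(y_new), t])
                  y_new = y_new - np.linalg.solve(DG, G)
              y = y_new
              ys.append(y.copy())
          return np.array(ys)

      # Start on the branch at (x, lam) = (1, 1), heading toward the fold; the path turns
      # smoothly around (0, 0) onto the x < 0 branch, which continuation in lam alone cannot follow.
      branch = continue_branch([1.0, 1.0], t0=[-1.0, -2.0])
      print(np.round(branch, 3))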

  1. A simplified satellite navigation system for an autonomous Mars roving vehicle.

    NASA Technical Reports Server (NTRS)

    Janosko, R. E.; Shen, C. N.

    1972-01-01

    The use of a retroflecting satellite and a laser rangefinder to navigate a Martian roving vehicle is considered in this paper. It is shown that a simple system can be employed to perform this task. An error analysis is performed on the navigation equations and it is shown that the error inherent in the scheme proposed can be minimized by the proper choice of measurement geometry. A nonlinear programming approach is used to minimize the navigation error subject to constraints that are due to geometric and laser requirements. The problem is solved for a particular set of laser parameters and the optimal solution is presented.
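
    As a toy analogue of minimizing navigation error subject to geometric and laser constraints, the sketch below uses a nonlinear programming solver to choose two laser sighting elevations that minimize the RSS position error of a 2-D fix, subject to illustrative elevation bounds and a minimum angular separation. The constraint values and noise level are assumptions, not the paper's parameters.

      import numpy as np
      from scipy.optimize import minimize

      SIGMA_RANGE = 1.0   # laser range noise, m (illustrative)

      def position_error_rss(elevations_deg):
          """RSS position error of a 2-D least-squares fix from two range measurements
          taken along the given elevation angles: sigma * sqrt(trace((H^T H)^-1))."""
          e = np.radians(elevations_deg)
          H = np.column_stack([np.cos(e), np.sin(e)])     # unit line-of-sight rows
          cov = SIGMA_RANGE**2 * np.linalg.inv(H.T @ H)
          return float(np.sqrt(np.trace(cov)))

      # Constraints standing in for "geometric and laser requirements": each sighting between
      # 10 and 80 degrees elevation, and the two sightings at least 5 degrees apart.
      result = minimize(
          position_error_rss,
          x0=[20.0, 40.0],
          method="SLSQP",
          bounds=[(10.0, 80.0), (10.0, 80.0)],
          constraints=[{"type": "ineq", "fun": lambda e: (e[1] - e[0]) - 5.0}],
      )
      print("optimal elevations (deg):", np.round(result.x, 1))
      print("minimum RSS position error (m):", round(result.fun, 2))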

  2. Autonomous navigation accuracy using simulated horizon sensor and sun sensor observations

    NASA Technical Reports Server (NTRS)

    Pease, G. E.; Hendrickson, H. T.

    1980-01-01

    A relatively simple autonomous system which would use horizon crossing indicators, a sun sensor, a quartz oscillator, and a microprogrammed computer is discussed. The sensor combination is required only to effectively measure the angle between the centers of the Earth and the Sun. Simulations for a particular orbit indicate that 2 km r.m.s. orbit determination uncertainties may be expected from a system with 0.06 deg measurement uncertainty. A key finding is that knowledge of the satellite orbit plane orientation can be maintained to this level because of the annual motion of the Sun and the predictable effects of Earth oblateness. The basic system described can be updated periodically by transits of the Moon through the IR horizon crossing indicator fields of view.
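
    The single measured quantity in this scheme is the angle between the Earth and Sun directions as seen from the satellite. The sketch below computes that angle for an assumed geometry and shows a rough single-measurement geometric sensitivity; the 2 km r.m.s. result in the record comes from filtering many such measurements over the orbit, not from this one-shot bound.

      import numpy as np

      def earth_sun_angle(r_sat_eci, sun_dir_eci):
          """Angle (rad) between the Earth-center and Sun directions seen from the satellite,
          i.e. the single quantity the horizon crossing indicators plus sun sensor measure."""
          to_earth = -r_sat_eci / np.linalg.norm(r_sat_eci)
          to_sun = sun_dir_eci / np.linalg.norm(sun_dir_eci)
          return float(np.arccos(np.clip(np.dot(to_earth, to_sun), -1.0, 1.0)))

      # Illustrative geometry: a satellite at roughly 6700 km radius, Sun along +X.
      r_sat = np.array([5000.0e3, 4000.0e3, 2000.0e3])
      sun = np.array([1.0, 0.0, 0.0])
      print("Earth-Sun angle: %.3f deg" % np.degrees(earth_sun_angle(r_sat, sun)))

      # Rough single-measurement sensitivity: an angular error d_theta displaces the implied
      # in-plane satellite position by roughly r * d_theta; the filter then averages many
      # such measurements using the Sun's annual motion and Earth-oblateness dynamics.
      d_theta = np.radians(0.06)
      print("~%.1f km position sensitivity per measurement" % (np.linalg.norm(r_sat) * d_theta / 1e3))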

  3. Modular Software for Spacecraft Navigation Using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, S. H.; Hartman, K. R.; Weidow, D. A.; Berry, D. L.; Oza, D. H.; Long, A. C.; Joyce, E.; Steger, W. L.

    1996-01-01

    The Goddard Space Flight Center Flight Dynamics and Mission Operations Divisions have jointly investigated the feasibility of engineering modular Global Positioning System (GPS) navigation software to support both real-time flight and ground postprocessing configurations. The goals of this effort are to define standard GPS data interfaces and to engineer standard, reusable navigation software components that can be used to build a broad range of GPS navigation support applications. The paper discusses the GPS modular software (GMOD) system and operations concepts, major requirements, candidate software architecture, feasibility assessment and recommended software interface standards. In addition, ongoing efforts to broaden the scope of the initial study and to develop modular software to support autonomous navigation using GPS are addressed.

  4. The Ship Movement Trajectory Prediction Algorithm Using Navigational Data Fusion.

    PubMed

    Borkowski, Piotr

    2017-06-20

    It is essential for the marine navigator conducting maneuvers of his ship at sea to know future positions of himself and target ships in a specific time span to effectively solve collision situations. This article presents an algorithm of ship movement trajectory prediction, which, through data fusion, takes into account measurements of the ship's current position from a number of doubled autonomous devices. This increases the reliability and accuracy of prediction. The algorithm has been implemented in NAVDEC, a navigation decision support system and practically used on board ships.

  5. The Ship Movement Trajectory Prediction Algorithm Using Navigational Data Fusion

    PubMed Central

    Borkowski, Piotr

    2017-01-01

    It is essential for the marine navigator conducting maneuvers of his ship at sea to know future positions of himself and target ships in a specific time span to effectively solve collision situations. This article presents an algorithm of ship movement trajectory prediction, which, through data fusion, takes into account measurements of the ship’s current position from a number of doubled autonomous devices. This increases the reliability and accuracy of prediction. The algorithm has been implemented in NAVDEC, a navigation decision support system and practically used on board ships. PMID:28632176
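
    The record does not spell out the NAVDEC fusion algorithm; as a generic illustration of combining position fixes from doubled devices, the sketch below applies inverse-covariance weighting to two independent fixes, a standard way to obtain a more reliable current position before extrapolating a trajectory. The positions and covariances are made up.

      import numpy as np

      def fuse_positions(p1, cov1, p2, cov2):
          """Fuse two independent position estimates by inverse-covariance weighting.

          The fused covariance is smaller than either input, so the predicted trajectory
          starts from a more reliable current position."""
          w1, w2 = np.linalg.inv(cov1), np.linalg.inv(cov2)
          fused_cov = np.linalg.inv(w1 + w2)
          fused_pos = fused_cov @ (w1 @ p1 + w2 @ p2)
          return fused_pos, fused_cov

      # Two GNSS-like fixes of the same ship position (metres in a local ENU frame).
      p_a, cov_a = np.array([105.0, -42.0]), np.diag([4.0, 4.0])
      p_b, cov_b = np.array([103.0, -40.0]), np.diag([9.0, 9.0])
      p_f, cov_f = fuse_positions(p_a, cov_a, p_b, cov_b)
      print("fused position:", np.round(p_f, 2))
      print("fused std dev (m):", np.round(np.sqrt(np.diag(cov_f)), 2))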

  6. The Design of a Navigator for a Testbed Autonomous Underwater Vehicle

    DTIC Science & Technology

    1989-12-01

    Naval Postgraduate School thesis (AD-A231 733), Monterey, California.

  7. Intelligent Behavioral Action Aiding for Improved Autonomous Image Navigation

    DTIC Science & Technology

    2012-09-13

    ... odometry, SICK laser scanning unit (Lidar), Inertial Measurement Unit (IMU), and ultrasonic distance measurement system. The Lidar, IMU ...

  8. Wind-based navigation of a hot-air balloon on Titan: a feasibility study

    NASA Astrophysics Data System (ADS)

    Furfaro, Roberto; Lunine, Jonathan I.; Elfes, Alberto; Reh, Kim

    2008-04-01

    Current analysis of data streamed back to Earth by the Cassini spacecraft features Titan as one of the most exciting places in the solar system. NASA centers and universities around the US, as well as the European Space Agency, are studying the possibility of sending, as part of the next mission to this giant moon of Saturn, a hot-air balloon (Montgolfier-type) for further and more in-depth exploration. The basic idea would be to design a reliable, semi-autonomous, and yet cheap Montgolfier capable of using continuous flow of waste heat from a power source to lift the balloon and sustain its altitude in the Titan environment. In this paper we study the problem of locally navigating a hot-air balloon in the nitrogen-based Titan atmosphere. The basic idea is to define a strategy (i.e. design of a suitable guidance system) that allows autonomous and semi-autonomous navigation of the balloon using the available (and partial) knowledge of the wind structure blowing on the saturnian satellite surface. Starting from first principles we determined the appropriate thermal and dynamical models describing (a) the vertical dynamics of the balloon and (b) the dynamics of the balloon moving on a vertical plane (2-D motion). Next, various non-linear fuzzy-based control strategies have been evaluated, analyzed and implemented in MATLAB to numerically simulate the capability of the system to simultaneously maintain altitude, as well as a scientifically desirable trajectory. We also looked at the ability of the balloon to perform station keeping. The results of the simulation are encouraging and show the effectiveness of such a system to cheaply and effectively perform semi-autonomous exploration of Titan.

  9. An interactive physics-based unmanned ground vehicle simulator leveraging open source gaming technology: progress in the development and application of the virtual autonomous navigation environment (VANE) desktop

    NASA Astrophysics Data System (ADS)

    Rohde, Mitchell M.; Crawford, Justin; Toschlog, Matthew; Iagnemma, Karl D.; Kewlani, Guarav; Cummins, Christopher L.; Jones, Randolph A.; Horner, David A.

    2009-05-01

    It is widely recognized that simulation is pivotal to vehicle development, whether manned or unmanned. There are few dedicated choices, however, for those wishing to perform realistic, end-to-end simulations of unmanned ground vehicles (UGVs). The Virtual Autonomous Navigation Environment (VANE), under development by US Army Engineer Research and Development Center (ERDC), provides such capabilities but utilizes a High Performance Computing (HPC) Computational Testbed (CTB) and is not intended for on-line, real-time performance. A product of the VANE HPC research is a real-time desktop simulation application under development by the authors that provides a portal into the HPC environment as well as interaction with wider-scope semi-automated force simulations (e.g. OneSAF). This VANE desktop application, dubbed the Autonomous Navigation Virtual Environment Laboratory (ANVEL), enables analysis and testing of autonomous vehicle dynamics and terrain/obstacle interaction in real-time with the capability to interact within the HPC constructive geo-environmental CTB for high fidelity sensor evaluations. ANVEL leverages rigorous physics-based vehicle and vehicle-terrain interaction models in conjunction with high-quality, multimedia visualization techniques to form an intuitive, accurate engineering tool. The system provides an adaptable and customizable simulation platform that allows developers a controlled, repeatable testbed for advanced simulations. ANVEL leverages several key technologies not common to traditional engineering simulators, including techniques from the commercial video-game industry. These enable ANVEL to run on inexpensive commercial, off-the-shelf (COTS) hardware. In this paper, the authors describe key aspects of ANVEL and its development, as well as several initial applications of the system.

  10. Immune systems are not just for making you feel better: they are for controlling autonomous robots

    NASA Astrophysics Data System (ADS)

    Rosenblum, Mark

    2005-05-01

    The typical algorithm for robot autonomous navigation in off-road complex environments involves building a 3D map of the robot's surrounding environment using a 3D sensing modality such as stereo vision or active laser scanning, and generating an instantaneous plan to navigate around hazards. Although there has been steady progress using these methods, these systems suffer from several limitations that cannot be overcome with 3D sensing and planning alone. Geometric sensing alone has no ability to distinguish between compressible and non-compressible materials. As a result, these systems have difficulty in heavily vegetated environments and require sensitivity adjustments across different terrain types. On the planning side, these systems have no ability to learn from their mistakes and avoid problematic environmental situations on subsequent encounters. We have implemented an adaptive terrain classification system based on the Artificial Immune System (AIS) computational model, which is loosely based on the biological immune system, that combines various forms of imaging sensor inputs to produce a "feature labeled" image of the scene categorizing areas as benign or detrimental for autonomous robot navigation. Because of the qualities of the AIS computation model, the resulting system will be able to learn and adapt on its own through interaction with the environment by modifying its interpretation of the sensor data. The feature labeled results from the AIS analysis are inserted into a map and can then be used by a planner to generate a safe route to a goal point. The coupling of diverse visual cues with the malleable AIS computational model will lead to autonomous robotic ground vehicles that require less human intervention for deployment in novel environments and more robust operation as a result of the system's ability to improve its performance through interaction with the environment.

  11. Semi autonomous mine detection system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Douglas Few; Roelof Versteeg; Herman Herman

    2010-04-01

    CMMAD is a risk reduction effort for the AMDS program. As part of CMMAD, multiple instances of semi-autonomous robotic mine detection systems were created. Each instance consists of a robotic vehicle equipped with sensors required for navigation and marking, countermine sensors, and a number of integrated software packages which provide for real-time processing of the countermine sensor data as well as integrated control of the robotic vehicle, the sensor actuator, and the sensor. These systems were used to investigate critical interest functions (CIF) related to countermine robotic systems. To address the autonomy CIF, the INL-developed RIK was extended to allow for interaction with a mine sensor processing code (MSPC). In limited field testing this system performed well in detecting, marking, and avoiding both AT and AP mines. Based on the results of the CMMAD investigation we conclude that autonomous robotic mine detection is feasible. In addition, CMMAD contributed critical technical advances with regard to sensing, data processing, and sensor manipulation, which will advance the performance of future fieldable systems. As a result, no substantial technical barriers exist which, from an autonomous robotic perspective, preclude the rapid development and deployment of fieldable systems.

  12. Agent Based Software for the Autonomous Control of Formation Flying Spacecraft

    NASA Technical Reports Server (NTRS)

    How, Jonathan P.; Campbell, Mark; Dennehy, Neil (Technical Monitor)

    2003-01-01

    Distributed satellite systems is an enabling technology for many future NASA/DoD earth and space science missions, such as MMS, MAXIM, Leonardo, and LISA [1, 2, 3]. While formation flying offers significant science benefits, to reduce the operating costs for these missions it will be essential that these multiple vehicles effectively act as a single spacecraft by performing coordinated observations. Autonomous guidance, navigation, and control as part of a coordinated fleet-autonomy is a key technology that will help accomplish this complex goal. This is no small task, as most current space missions require significant input from the ground for even relatively simple decisions such as thruster burns. Work for the NMP DS1 mission focused on the development of the New Millennium Remote Agent (NMRA) architecture for autonomous spacecraft control systems. NMRA integrates traditional real-time monitoring and control with components for constraint-based planning, robust multi-threaded execution, and model-based diagnosis and reconfiguration. The complexity of using an autonomous approach for space flight software was evident when most of its capabilities were stripped off prior to launch (although more capability was uplinked subsequently, and the resulting demonstration was very successful).

  13. New High-Altitude GPS Navigation Results from the Magnetospheric Multiscale Spacecraft and Simulations at Lunar Distances

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke B.; Bamford, William A.; Price, Samuel R.

    2017-01-01

    As reported in a companion work, in its first phase, NASA's 2015 highly elliptic Magnetospheric Multiscale (MMS) mission set a record for the highest altitude operational use of on-board GPS-based navigation, returning state estimates at 12 Earth radii. In early 2017 MMS transitioned to its second phase, which doubled the apogee distance to 25 Earth radii, approaching halfway to the Moon. This paper will present results for GPS observability and navigation performance achieved in MMS Phase 2. Additionally, it will provide simulation results predicting the performance of the MMS navigation system applied to a pair of concept missions at lunar distances. These studies will demonstrate how high-sensitivity GPS (or GNSS) receivers paired with onboard navigation software, as in the MMS navigation system, can extend the envelope of autonomous onboard GPS navigation far from the Earth.

  14. Experiments in teleoperator and autonomous control of space robotic vehicles

    NASA Technical Reports Server (NTRS)

    Alexander, Harold L.

    1991-01-01

    A program of research embracing teleoperator and automatic navigational control of freely flying satellite robots is presented. Current research goals include: (1) developing visual operator interfaces for improved vehicle teleoperation; (2) determining the effects of different visual interface system designs on operator performance; and (3) achieving autonomous vision-based vehicle navigation and control. This research program combines virtual-environment teleoperation studies and neutral-buoyancy experiments using a space-robot simulator vehicle currently under development. Visual-interface design options under investigation include monoscopic versus stereoscopic displays and cameras, helmet-mounted versus panel-mounted display monitors, head-tracking versus fixed or manually steerable remote cameras, and the provision of vehicle-fixed visual cues, or markers, in the remote scene for improved sensing of vehicle position, orientation, and motion.

  15. Autonomous navigation of structured city roads

    NASA Astrophysics Data System (ADS)

    Aubert, Didier; Kluge, Karl C.; Thorpe, Chuck E.

    1991-03-01

    Autonomous road following is a domain which spans a range of complexity from poorly defined, unmarked dirt roads to well defined, well marked, highly structured highways. The YARF system (for Yet Another Road Follower) is designed to operate in the middle of this range of complexity, driving on urban streets. Our research program has focused on the use of feature- and situation-specific segmentation techniques driven by an explicit model of the appearance and geometry of the road features in the environment. We report results in robust detection of white and yellow painted stripes, fitting a road model to detected feature locations to determine vehicle position and local road geometry, and automatic location of road features in an initial image. We also describe our planned extensions to include intersection navigation.

  16. A Conceptual Design of an Inertial Navigation System for an Autonomous Submersible Testbed Vehicle.

    DTIC Science & Technology

    1987-09-01

    ... new stronger hull material. A simple change in hull material can have a dramatic effect on diving performance. A stronger material directly relates to a ... continuous turns, with a negligible effect on system performance. 4. The possibility of coning motion rectification due to synchronous boat oscillations ... Distribution is unlimited.

  17. Vision-based control for flight relative to dynamic environments

    NASA Astrophysics Data System (ADS)

    Causey, Ryan Scott

    The concept of autonomous systems has been considered an enabling technology for a diverse group of military and civilian applications. The current direction for autonomous systems is increased capabilities through more advanced systems that are useful for missions that require autonomous avoidance, navigation, tracking, and docking. To facilitate this level of mission capability, passive sensors, such as cameras, and complex software are added to the vehicle. By incorporating an on-board camera, visual information can be processed to interpret the surroundings. This information allows decision making with increased situational awareness without the cost of a sensor signature, which is critical in military applications. The concepts presented in this dissertation address the issues inherent in vision-based state estimation of moving objects for a monocular camera configuration. The process consists of several stages involving image processing such as detection, estimation, and modeling. The detection algorithm segments the motion field through a least-squares approach and classifies motions not obeying the dominant trend as independently moving objects. An approach to state estimation of moving targets is derived using a homography approach. The algorithm requires knowledge of the camera motion, a reference motion, and additional feature point geometry for both the target and reference objects. The target state estimates are then observed over time to model the dynamics using a probabilistic technique. The effects of uncertainty on state estimation due to camera calibration are considered through a bounded deterministic approach. The system framework focuses on an aircraft platform, for which the system dynamics are derived to relate vehicle states to image plane quantities. Control designs using standard guidance and navigation schemes are then applied to the tracking and homing problems using the derived state estimation. Four simulations are implemented in MATLAB that build on the image concepts presented in this dissertation. The first two simulations deal with feature point computations and the effects of uncertainty. The third simulation demonstrates the open-loop estimation of a target ground vehicle in pursuit, whereas the fourth implements a homing control design for Autonomous Aerial Refueling (AAR) using target estimates as feedback.

  18. The Mathematics of Navigating the Solar System

    NASA Technical Reports Server (NTRS)

    Hintz, Gerald

    2000-01-01

    In navigating spacecraft throughout the solar system, the space navigator relies on three academic disciplines - optimization, estimation, and control - that work on mathematical models of the real world. Thus, the navigator determines the flight path that will consume propellant and other resources in an efficient manner, determines where the craft is and predicts where it will go, and transfers it onto the optimal trajectory that meets operational and mission constraints. Mission requirements, for example, demand that observational measurements be made with sufficient precision that relativity must be modeled in collecting and fitting (the estimation process) the data, and propagating the trajectory. Thousands of parameters are now determined in near real-time to model the gravitational forces acting on a spacecraft in the vicinity of an irregularly shaped body. Completing these tasks requires mathematical models, analyses, and processing techniques. Newton, Gauss, Lambert, Legendre, and others are justly famous for their contributions to the mathematics of these tasks. More recently, graduate students participated in research to update the gravity model of the Saturnian system, including higher order gravity harmonics, tidal effects, and the influence of the rings. This investigation was conducted for the Cassini project to incorporate new trajectory modeling features in the navigation software. The resulting trajectory model will be used in navigating the 4-year tour of the Saturnian satellites. Also, undergraduate students are determining the ephemerides (locations versus time) of asteroids that will be used as reference objects in navigating the New Millennium's Deep Space 1 spacecraft autonomously.

  19. Error Analysis System for Spacecraft Navigation Using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, S. H.; Hart, R. C.; Hartman, K. R.; Tomcsik, T. L.; Searl, J. E.; Bernstein, A.

    1997-01-01

    The Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) is currently developing improved space-navigation filtering algorithms to use the Global Positioning System (GPS) for autonomous real-time onboard orbit determination. In connection with a GPS technology demonstration on the Small Satellite Technology Initiative (SSTI)/Lewis spacecraft, FDD analysts and programmers have teamed with the GSFC Guidance, Navigation, and Control Branch to develop the GPS Enhanced Orbit Determination Experiment (GEODE) system. The GEODE system consists of a Kalman filter operating as a navigation tool for estimating the position, velocity, and additional states required to accurately navigate the orbiting Lewis spacecraft by using astrodynamic modeling and GPS measurements from the receiver. A parallel effort at the FDD is the development of a GPS Error Analysis System (GEAS) that will be used to analyze and improve navigation filtering algorithms during development phases and during in-flight calibration. For GEAS, the Kalman filter theory is extended to estimate the errors in position, velocity, and other error states of interest. The estimation of errors in physical variables at regular intervals will allow the time, cause, and effect of navigation system weaknesses to be identified. In addition, by modeling a sufficient set of navigation system errors, a system failure that causes an observed error anomaly can be traced and accounted for. The GEAS software is formulated using Object Oriented Design (OOD) techniques implemented in the C++ programming language on a Sun SPARC workstation. The Phase 1 of this effort is the development of a basic system to be used to evaluate navigation algorithms implemented in the GEODE system. This paper presents the GEAS mathematical methodology, systems and operations concepts, and software design and implementation. Results from the use of the basic system to evaluate navigation algorithms implemented on GEODE are also discussed. In addition, recommendations for generalization of GEAS functions and for new techniques to optimize the accuracy and control of the GPS autonomous onboard navigation are presented.

  20. An integrated autonomous rendezvous and docking system architecture using Centaur modern avionics

    NASA Technical Reports Server (NTRS)

    Nelson, Kurt

    1991-01-01

    The avionics system for the Centaur upper stage is in the process of being modernized with the current state-of-the-art in strapdown inertial guidance equipment. This equipment includes an integrated flight control processor with a ring laser gyro based inertial guidance system. This inertial navigation unit (INU) uses two MIL-STD-1750A processors and communicates over the MIL-STD-1553B data bus. Commands are translated into load activation through a Remote Control Unit (RCU) which incorporates the use of solid state relays. Also, a programmable data acquisition system replaces separate multiplexer and signal conditioning units. This modern avionics suite is currently being enhanced through independent research and development programs to provide autonomous rendezvous and docking capability using advanced cruise missile image processing technology and integrated GPS navigational aids. A system concept was developed to combine these technologies in order to achieve a fully autonomous rendezvous, docking, and autoland capability. The current system architecture and the evolution of this architecture using advanced modular avionics concepts being pursued for the National Launch System are discussed.

  1. PRIMUS: autonomous navigation in open terrain with a tracked vehicle

    NASA Astrophysics Data System (ADS)

    Schaub, Guenter W.; Pfaendner, Alfred H.; Schaefer, Christoph

    2004-09-01

    The German experimental robotics program PRIMUS (PRogram for Intelligent Mobile Unmanned Systems) is focused on solutions for autonomous driving in unknown open terrain, over several project phases under specific realization aspects for more than 12 years. The main task of the program is to develop algorithms for a high degree of autonomous navigation skills with off-the-shelf available hardware/sensor technology and to integrate this into military vehicles. For obstacle detection a Dornier-3D-LADAR is integrated on a tracked vehicle "Digitized WIESEL 2". For road-following a digital video camera and a visual perception module from the Universitaet der Bundeswehr Munchen (UBM) has been integrated. This paper gives an overview of the PRIMUS program with a focus on the last program phase D (2001 - 2003). This includes the system architecture, the description of the modes of operation and the technology development with the focus on obstacle avoidance and obstacle classification using a 3-D LADAR. A collection of experimental results and a short look at the next steps in the German robotics program will conclude the paper.

  2. Autonomous Navigation by a Mobile Robot

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance; Aghazarian, Hrand

    2005-01-01

    ROAMAN is a computer program for autonomous navigation of a mobile robot on a long (as much as hundreds of meters) traversal of terrain. Developed for use aboard a robotic vehicle (rover) exploring the surface of a remote planet, ROAMAN could also be adapted to similar use on terrestrial mobile robots. ROAMAN implements a combination of algorithms for (1) long-range path planning based on images acquired by mast-mounted, wide-baseline stereoscopic cameras, and (2) local path planning based on images acquired by body-mounted, narrow-baseline stereoscopic cameras. The long-range path-planning algorithm autonomously generates a series of waypoints that are passed to the local path-planning algorithm, which plans obstacle-avoiding legs between the waypoints. Both the long- and short-range algorithms use an occupancy-grid representation in computations to detect obstacles and plan paths. Maps that are maintained by the long- and short-range portions of the software are not shared because substantial localization errors can accumulate during any long traverse. ROAMAN is not guaranteed to generate an optimal shortest path, but does maintain the safety of the rover.
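
    The ROAMAN planners themselves are not published in this record; as a generic illustration of planning an obstacle-avoiding leg between two waypoints on an occupancy grid, here is a minimal A* sketch on a binary grid. The cell costs, connectivity, and map are illustrative assumptions.

      import heapq

      def astar(grid, start, goal):
          """A* on an occupancy grid (0 = free, 1 = obstacle); 4-connected moves,
          Manhattan-distance heuristic. Returns a list of cells from start to goal."""
          rows, cols = len(grid), len(grid[0])
          h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])
          open_set = [(h(start), 0, start, None)]
          came_from, g_best = {}, {start: 0}
          while open_set:
              _, g, cell, parent = heapq.heappop(open_set)
              if cell in came_from:
                  continue
              came_from[cell] = parent
              if cell == goal:
                  path = []
                  while cell is not None:
                      path.append(cell)
                      cell = came_from[cell]
                  return path[::-1]
              r, c = cell
              for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                  nxt = (nr, nc)
                  if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                      if g + 1 < g_best.get(nxt, float("inf")):
                          g_best[nxt] = g + 1
                          heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, cell))
          return None   # no obstacle-free path between the waypoints

      # A small map with a wall and a gap; plan a leg between two waypoints.
      grid = [[0, 0, 0, 0, 0],
              [0, 1, 1, 1, 0],
              [0, 0, 0, 1, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 0, 0, 0]]
      print(astar(grid, (0, 0), (4, 4)))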

  3. Bio-Inspired Navigation of Chemical Plumes

    DTIC Science & Technology

    2006-07-01

    Bio-Inspired Navigation of Chemical Plumes Maynard J. Porter III, Captain, USAF Department of Electrical and Computer Engineering Air Force Institute...Li. " Chemical plume tracing via an autonomous underwater vehicle". IEEE Journal of Ocean Engineering , 30(2):428— 442, 2005. [6] G. A. Nevitt...Electrical and Computer Engineering Air Force Institute of Technology Dayton, OH 45433-7765, U.S.A. juan.vasquez@afit.edu May 31, 2006 Abstract - The

  4. Autonomous Robot Navigation in Human-Centered Environments Based on 3D Data Fusion

    NASA Astrophysics Data System (ADS)

    Steinhaus, Peter; Strand, Marcus; Dillmann, Rüdiger

    2007-12-01

    Efficient navigation of mobile platforms in dynamic human-centered environments is still an open research topic. We have already proposed an architecture (MEPHISTO) for a navigation system that is able to fulfill the main requirements of efficient navigation: fast and reliable sensor processing, extensive global world modeling, and distributed path planning. Our architecture uses a distributed system of sensor processing, world modeling, and path planning units. In this article, we present the implemented methods for data-fusion-based 3D world modeling and real-time path planning. We also show results of the prototype application of the system at the museum ZKM (Center for Art and Media) in Karlsruhe.

  5. Autonomous Spacecraft Navigation Using Above-the-Constellation GPS Signals

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke

    2017-01-01

    GPS-based spacecraft navigation offers many performance and cost benefits, and GPS receivers are now standard GNC components for LEO missions. Recently, more and more high-altitude missions are taking advantage of the benefits of GPS navigation as well. High-altitude applications pose challenges, however, because receivers operating above the GPS constellations are subject to reduced signal strength and availability, and uncertain signal quality. This presentation will present the history and state-of-the-art in high-altitude GPS spacecraft navigation, including early experiments, current missions and receivers, and efforts to characterize and protect signals available to high-altitude users. Recent results from the very-high altitude MMS mission are also provided.

  6. Design and validation of a GNC system for missions to asteroids: the AIM scenario

    NASA Astrophysics Data System (ADS)

    Pellacani, A.; Kicman, P.; Suatoni, M.; Casasco, M.; Gil, J.; Carnelli, I.

    2017-12-01

    Deep space missions, and in particular missions to asteroids, impose a certain level of autonomy that depends on the mission objectives. If the mission requires the spacecraft to perform close approaches to the target body (the extreme case being a landing scenario), the autonomy level must be increased to guarantee the fast and reactive response which is required in both nominal and contingency operations. The GNC system must be designed in accordance with the required level of autonomy. The GNC system designed and tested in the frame of ESA's Asteroid Impact Mission (AIM) system studies (Phase A/B1 and Consolidation Phase) is an example of an autonomous GNC system that meets the challenging objectives of AIM. The paper reports the design of such a GNC system and its validation through a DDVV plan that includes Model-in-the-Loop and Hardware-in-the-Loop testing. The main focus is translational navigation, which is able to provide online relative state estimation with respect to the target body using exclusively cameras as relative navigation sensors. The relative navigation outputs are meant to be used for nominal spacecraft trajectory corrections, as well as to estimate the collision risk with the asteroid and, if needed, to command the execution of a collision avoidance manoeuvre to guarantee spacecraft safety.

  7. High-Fidelity Flash Lidar Model Development

    NASA Technical Reports Server (NTRS)

    Hines, Glenn D.; Pierrottet, Diego F.; Amzajerdian, Farzin

    2014-01-01

    NASA's Autonomous Landing and Hazard Avoidance Technologies (ALHAT) project is currently developing the critical technologies to safely and precisely navigate and land crew, cargo and robotic spacecraft vehicles on and around planetary bodies. One key element of this project is a high-fidelity Flash Lidar sensor that can generate three-dimensional (3-D) images of the planetary surface. These images are processed with hazard detection and avoidance and hazard relative navigation algorithms, and are subsequently used by the Guidance, Navigation and Control subsystem to generate an optimal navigation solution. A complex, high-fidelity model of the Flash Lidar was developed in order to evaluate the performance of the sensor and its interaction with the interfacing ALHAT components on vehicles with different configurations and under different flight trajectories. The model contains a parameterized, general approach to Flash Lidar detection and reflects physical attributes such as range and electronic noise sources, and laser pulse temporal and spatial profiles. It also provides the realistic interaction of the laser pulse with terrain features that include varying albedo, boulders, craters, slopes, and shadows. This paper gives a description of the Flash Lidar model and presents results from the Lidar operating under different scenarios.
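
    The high-fidelity model described above is far richer than anything that fits here; purely to illustrate the basic ingredients of a per-pixel range image with noise and dropouts, a toy Flash Lidar frame model follows. Pulse shape, albedo, and terrain-interaction effects are deliberately omitted, and all parameter values are assumptions.

      import numpy as np

      def flash_lidar_frame(base_range_m, n_pixels=(8, 8), range_noise_m=0.05,
                            dropout_prob=0.02, rng=np.random.default_rng(0)):
          """Toy per-pixel range image: a gently sloped surface plus Gaussian range noise,
          with occasional dropouts standing in for pixels with no usable return."""
          rows, cols = n_pixels
          slope = 0.5 * np.arange(cols)                    # metres of extra range per column
          truth = base_range_m + np.tile(slope, (rows, 1))
          frame = truth + rng.normal(0.0, range_noise_m, size=(rows, cols))
          frame[rng.uniform(size=(rows, cols)) < dropout_prob] = np.nan
          return frame

      frame = flash_lidar_frame(500.0)
      valid = frame[~np.isnan(frame)]
      print("mean range %.2f m, std %.3f m, dropouts: %d"
            % (valid.mean(), valid.std(), int(np.isnan(frame).sum())))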

  8. Compute Element and Interface Box for the Hazard Detection System

    NASA Technical Reports Server (NTRS)

    Villalpando, Carlos Y.; Khanoyan, Garen; Stern, Ryan A.; Some, Raphael R.; Bailey, Erik S.; Carson, John M.; Vaughan, Geoffrey M.; Werner, Robert A.; Salomon, Phil M.; Martin, Keith E.; hide

    2013-01-01

    The Autonomous Landing and Hazard Avoidance Technology (ALHAT) program is building a sensor that enables a spacecraft to evaluate autonomously a potential landing area to generate a list of hazardous and safe landing sites. It will also provide navigation inputs relative to those safe sites. The Hazard Detection System Compute Element (HDS-CE) box combines a field-programmable gate array (FPGA) board for sensor integration and timing, with a multicore computer board for processing. The FPGA does system-level timing and data aggregation, and acts as a go-between, removing the real-time requirements from the processor and labeling events with a high resolution time. The processor manages the behavior of the system, controls the instruments connected to the HDS-CE, and services the "heavy lifting" computational requirements for analyzing the potential landing spots.

  9. Sample Return Robot Centennial Challenge

    NASA Image and Video Library

    2012-06-16

    NASA Deputy Administrator Lori Garver, left, listens as Worcester Polytechnic Institute (WPI) Robotics Resource Center Director and NASA-WPI Sample Return Robot Centennial Challenge Judge Ken Stafford points out how the robots navigate the playing field during the challenge on Saturday, June 16, 2012 in Worcester, Mass. Teams were challenged to build autonomous robots that can identify, collect and return samples. NASA needs autonomous robotic capability for future planetary exploration. Photo Credit: (NASA/Bill Ingalls)

  10. Sample Return Robot Centennial Challenge

    NASA Image and Video Library

    2012-06-16

    NASA Deputy Administrator Lori Garver, right, listens as Worcester Polytechnic Institute (WPI) Robotics Resource Center Director and NASA-WPI Sample Return Robot Centennial Challenge Judge Ken Stafford points out how the robots navigate the playing field during the challenge on Saturday, June 16, 2012 in Worcester, Mass. Teams were challenged to build autonomous robots that can identify, collect and return samples. NASA needs autonomous robotic capability for future planetary exploration. Photo Credit: (NASA/Bill Ingalls)

  11. Autonomous Underwater Vehicle Navigation

    DTIC Science & Technology

    2008-02-01

    ... autonomous underwater vehicle with six degrees of freedom. We approach this problem using an error state formulation of the Kalman filter. Integration ... each position fix, but is this ad-hoc method optimal? Here, we present an approach using an error state formulation of the Kalman filter to provide an ...

  12. On-Line Point Positioning with Single Frame Camera Data

    DTIC Science & Technology

    1992-03-15

    ...tion algorithms and methods will be found in robotics and industrial quality control. 1. Project data: The project has been defined as "On-line point ... development and use of the OLT algorithms and methods for applications in robotics, industrial quality control and autonomous vehicle navigation ... Of particular interest in robotics and autonomous vehicle navigation is, for example, the task of determining the position and orientation of a mobile ...

  13. EnEx-RANGE - Robust autonomous Acoustic Navigation in Glacial icE

    NASA Astrophysics Data System (ADS)

    Heinen, Dirk; Eliseev, Dmitry; Henke, Christoph; Jeschke, Sabina; Linder, Peter; Reuter, Sebastian; Schönitz, Sebastian; Scholz, Franziska; Weinstock, Lars Steffen; Wickmann, Stefan; Wiebusch, Christopher; Zierke, Simon

    2017-03-01

    Within the Enceladus Explorer Initiative of the DLR Space Administration navigation technologies for a future space mission are in development. Those technologies are the basis for the search for extraterrestrial life on the Saturn moon Enceladus. An autonomous melting probe, the EnEx probe, aims to extract a liquid sample from a water reservoir below the icy crust. A first EnEx probe was developed and demonstrated in a terrestrial scenario at the Bloodfalls, Taylor Glacier, Antarctica in November 2014. To enable navigation in glacier ice two acoustic systems were integrated into the probe in addition to conventional navigation technologies. The first acoustic system determines the position of the probe during the run based on propagation times of acoustic signals from emitters at reference positions at the glacier surface to receivers in the probe. The second system provides information about the forefield of the probe. It is based on sonographic principles with phased array technology integrated in the probe's melting head. Information about obstacles or sampling regions in the probe's forefield can be acquired. The development of both systems is now continued in the project EnEx-RANGE. The emitters of the localization system are replaced by a network of intelligent acoustic enabled melting probes. These localize each other by means of acoustic signals and create the reference system for the EnEx probe. This presentation includes the discussion of the intelligent acoustic network, the acoustic navigation systems of the EnEx probe and results of terrestrial tests.
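
    As an illustration of the first acoustic system's principle, localizing a receiver from propagation times to emitters at known reference positions, the sketch below solves the time-of-arrival problem with a nonlinear least-squares fit. The sound speed, emitter layout, and noise level are assumptions, not EnEx-RANGE values.

      import numpy as np
      from scipy.optimize import least_squares

      SOUND_SPEED_ICE = 3800.0   # m/s, an assumed value for acoustic waves in glacial ice

      def locate_probe(emitter_positions, propagation_times, x0):
          """Estimate the probe position from one-way acoustic propagation times between
          surface emitters at known positions and a receiver in the probe."""
          measured_ranges = SOUND_SPEED_ICE * np.asarray(propagation_times)

          def residuals(p):
              return np.linalg.norm(emitter_positions - p, axis=1) - measured_ranges

          return least_squares(residuals, x0).x

      # Four emitters on the glacier surface (z = 0) and a probe about 80 m down.
      emitters = np.array([[0.0, 0.0, 0.0], [120.0, 0.0, 0.0],
                           [0.0, 150.0, 0.0], [130.0, 140.0, 0.0]])
      true_probe = np.array([60.0, 70.0, -80.0])
      times = np.linalg.norm(emitters - true_probe, axis=1) / SOUND_SPEED_ICE
      times += np.random.default_rng(2).normal(0.0, 2e-5, size=4)   # ~8 cm of timing noise
      # Start the solver below the surface to resolve the up/down mirror ambiguity.
      print(np.round(locate_probe(emitters, times, x0=np.array([50.0, 50.0, -10.0])), 2))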

  14. Nonlinear unbiased minimum-variance filter for Mars entry autonomous navigation under large uncertainties and unknown measurement bias.

    PubMed

    Xiao, Mengli; Zhang, Yongbo; Fu, Huimin; Wang, Zhihua

    2018-05-01

    A high-precision navigation algorithm is essential for the future Mars pinpoint landing mission. The unknown inputs caused by large uncertainties of atmospheric density and aerodynamic coefficients, as well as unknown measurement biases, may cause large estimation errors in conventional Kalman filters. This paper proposes a derivative-free version of the nonlinear unbiased minimum-variance filter for Mars entry navigation. The filter solves this problem by estimating the state and the unknown measurement biases simultaneously in a derivative-free manner, leading to a high-precision algorithm for Mars entry navigation. IMU/radio-beacon integrated navigation is introduced in the simulation, and the results show that, with or without radio blackout, the proposed filter achieves accurate state estimation, much better than the conventional unscented Kalman filter, demonstrating its capability as a high-precision Mars entry navigation algorithm. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  15. Integrated orbit and attitude hardware-in-the-loop simulations for autonomous satellite formation flying

    NASA Astrophysics Data System (ADS)

    Park, Han-Earl; Park, Sang-Young; Kim, Sung-Woo; Park, Chandeok

    2013-12-01

    Development and experiment of an integrated orbit and attitude hardware-in-the-loop (HIL) simulator for autonomous satellite formation flying are presented. The integrated simulator system consists of an orbit HIL simulator for orbit determination and control, and an attitude HIL simulator for attitude determination and control. The integrated simulator involves four processes (orbit determination, orbit control, attitude determination, and attitude control), which interact with each other in the same way as actual flight processes do. Orbit determination is conducted by a relative navigation algorithm using double-difference GPS measurements based on the extended Kalman filter (EKF). Orbit control is performed by a state-dependent Riccati equation (SDRE) technique that is utilized as a nonlinear controller for the formation control problem. Attitude is determined from an attitude heading reference system (AHRS) sensor, and a proportional-derivative (PD) feedback controller is used to control the attitude HIL simulator using three momentum wheel assemblies. Integrated orbit and attitude simulations are performed for a formation reconfiguration scenario. By performing the four processes adequately, the desired formation reconfiguration from a baseline of 500-1000 m was achieved with meter-level position error and millimeter-level relative position navigation. This HIL simulation demonstrates the performance of the integrated HIL simulator and the feasibility of the applied algorithms in a real-time environment. Furthermore, the integrated HIL simulator system developed in the current study can be used as a ground-based testing environment to reproduce possible actual satellite formation operations.
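
    The relative navigation in this simulator is driven by double-difference GPS measurements; the sketch below shows how double differences are formed from two receivers' pseudoranges and why common clock biases cancel. The ranges and biases are simulated values, and carrier-phase ambiguities are not modeled.

      import numpy as np

      def double_difference(range_chief, range_deputy):
          """Form GPS double differences between two receivers (chief, deputy) and pairs of
          GPS satellites: DD removes both receiver clock errors and satellite clock errors,
          which is why it is attractive for precise relative navigation."""
          sd = range_deputy - range_chief        # single differences, one per GPS satellite
          return sd[1:] - sd[0]                  # double differences w.r.t. the first satellite

      # Simulated geometric ranges (m) from 4 GPS satellites to two formation-flying receivers,
      # each pseudorange corrupted by its own receiver clock bias (expressed in metres).
      rng = np.random.default_rng(3)
      geom_chief = np.array([21.0e6, 22.5e6, 20.7e6, 23.1e6])
      geom_deputy = geom_chief + rng.normal(0.0, 400.0, 4)     # deputy a few hundred metres away
      clock_chief, clock_deputy = 1500.0, -900.0
      pr_chief = geom_chief + clock_chief
      pr_deputy = geom_deputy + clock_deputy

      dd_measured = double_difference(pr_chief, pr_deputy)
      dd_true = double_difference(geom_chief, geom_deputy)
      print(np.round(dd_measured - dd_true, 6))   # ~0: the receiver clock biases cancel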

  16. Deep Impact Autonomous Navigation : the trials of targeting the unknown

    NASA Technical Reports Server (NTRS)

    Kubitschek, Daniel G.; Mastrodemos, Nickolaos; Werner, Robert A.; Kennedy, Brian M.; Synnott, Stephen P.; Null, George W.; Bhaskaran, Shyam; Riedel, Joseph E.; Vaughan, Andrew T.

    2006-01-01

    On July 4, 2005 at 05:44:34.2 UTC the Impactor Spacecraft (s/c) impacted comet Tempel 1 with a relative speed of 10.3 km/s capturing high-resolution images of the surface of a cometary nucleus just seconds before impact. Meanwhile, the Flyby s/c captured the impact event using both the Medium Resolution Imager (MRI) and the High Resolution Imager (HRI) and tracked the nucleus for the entire 800 sec period between impact and shield attitude transition. The objective of the Impactor s/c was to impact in an illuminated area viewable from the Flyby s/c and capture high-resolution context images of the impact site. This was accomplished by using autonomous navigation (AutoNav) algorithms and precise attitude information from the attitude determination and control subsystem (ADCS). The Flyby s/c had two primary objectives: 1) capture the impact event with the highest temporal resolution possible in order to observe the ejecta plume expansion dynamics; and 2) track the impact site for at least 800 sec to observe the crater formation and capture the highest resolution images possible of the fully developed crater. These two objectives were met by estimating the Flyby s/c trajectory relative to Tempel 1 using the same AutoNav algorithms along with precise attitude information from ADCS and independently selecting the best impact site. This paper describes the AutoNav system, what happened during the encounter with Tempel 1 and what could have happened.

  17. Autonomous navigation method for substation inspection robot based on travelling deviation

    NASA Astrophysics Data System (ADS)

    Yang, Guoqing; Xu, Wei; Li, Jian; Fu, Chongguang; Zhou, Hao; Zhang, Chuanyou; Shao, Guangting

    2017-06-01

    A new edge-detection method for the substation environment is proposed to enable autonomous navigation of a substation inspection robot. First, a road image is obtained with an image acquisition device. Second, noise in a region of interest selected from the road image is removed with digital image processing, road edges are extracted with the Canny operator, and the road boundaries are extracted with the Hough transform. Finally, the distances between the robot and the left and right boundaries are calculated and the travelling deviation is obtained. The robot's route is controlled according to this deviation and a preset threshold. Experimental results show that the proposed method detects the road area in real time with high accuracy and stable performance.
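
    A minimal sketch of the boundary-detection and deviation step, assuming OpenCV is available; the region of interest, Canny thresholds, and Hough parameters below are illustrative guesses, not the values used in the paper.

```python
import cv2
import numpy as np

def travel_deviation(bgr_image):
    """Estimate how far the robot is from the road centerline: denoise an
    ROI, run Canny edge detection and the probabilistic Hough transform,
    split the segments into left/right boundaries by slope sign, and
    compare the boundary midpoint with the image center."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape
    roi = cv2.GaussianBlur(gray[h // 2:, :], (5, 5), 0)   # lower half as ROI (assumed)
    edges = cv2.Canny(roi, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                            minLineLength=40, maxLineGap=10)
    if lines is None:
        return None
    left_x, right_x = [], []
    for x1, y1, x2, y2 in lines[:, 0]:
        if x2 == x1:
            continue
        slope = (y2 - y1) / (x2 - x1)
        (left_x if slope < 0 else right_x).append((x1 + x2) / 2.0)
    if not left_x or not right_x:
        return None
    road_center = (np.mean(left_x) + np.mean(right_x)) / 2.0
    return road_center - w / 2.0   # pixels; positive means the road center is to the right
```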

  18. Vision-based mapping with cooperative robots

    NASA Astrophysics Data System (ADS)

    Little, James J.; Jennings, Cullen; Murray, Don

    1998-10-01

    Two stereo-vision-based mobile robots navigate and autonomously explore their environment safely while building occupancy grid maps of the environment. The robots maintain position estimates within a global coordinate frame using landmark recognition. This allows them to build a common map by sharing position information and stereo data. Stereo vision processing and map updates are done at 3 Hz and the robots move at speeds of 200 cm/s. Cooperative mapping is achieved through autonomous exploration of unstructured and dynamic environments. The map is constructed conservatively, so as to be useful for collision-free path planning. Each robot maintains a separate copy of a shared map, and then posts updates to the common map when it returns to observe a landmark at home base. Issues include synchronization, mutual localization, navigation, exploration, registration of maps, merging repeated views (fusion), centralized vs decentralized maps.
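
    As a sketch of the kind of occupancy grid referred to above, the snippet below implements a minimal log-odds grid updated along simulated stereo rays. The class name, cell size, and inverse sensor model values are assumptions for illustration, not the authors' implementation, which additionally handles map sharing and conservative fusion.

```python
import numpy as np

class OccupancyGrid:
    """Minimal log-odds occupancy grid updated from range returns."""
    def __init__(self, size=200, resolution=0.1):
        self.res = resolution                  # meters per cell (assumed)
        self.logodds = np.zeros((size, size))  # 0 = unknown (p = 0.5)
        self.l_occ, self.l_free = 0.85, -0.4   # toy inverse sensor model

    def world_to_cell(self, x, y):
        return int(x / self.res), int(y / self.res)

    def update_ray(self, robot_xy, hit_xy):
        """Mark cells along the ray as free and the end cell as occupied."""
        r0, c0 = self.world_to_cell(*robot_xy)
        r1, c1 = self.world_to_cell(*hit_xy)
        n = max(abs(r1 - r0), abs(c1 - c0), 1)
        for i in range(n):                     # coarse ray trace
            r = r0 + round(i * (r1 - r0) / n)
            c = c0 + round(i * (c1 - c0) / n)
            self.logodds[r, c] += self.l_free
        self.logodds[r1, c1] += self.l_occ

    def probability(self):
        return 1.0 - 1.0 / (1.0 + np.exp(self.logodds))

grid = OccupancyGrid()
grid.update_ray((1.0, 1.0), (4.0, 3.0))        # one simulated stereo return
print(grid.probability()[grid.world_to_cell(4.0, 3.0)])
```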

  19. A Survey of LIDAR Technology and Its Use in Spacecraft Relative Navigation

    NASA Technical Reports Server (NTRS)

    Christian, John A.; Cryan, Scott P.

    2013-01-01

    This paper provides a survey of modern LIght Detection And Ranging (LIDAR) sensors from the perspective of how they can be used for spacecraft relative navigation. In addition to LIDAR technology commonly used in space applications today (e.g. scanning, flash), this paper reviews emerging LIDAR technologies gaining traction in other non-aerospace fields. The discussion includes an overview of sensor operating principles and specific pros/cons for each type of LIDAR. This paper provides a comprehensive review of LIDAR technology as applied specifically to spacecraft relative navigation. The problem of orbital rendezvous and docking has been a consistent challenge for complex space missions since before the Gemini 8 spacecraft performed the first successful on-orbit docking of two spacecraft in 1966. Over the years, a great deal of effort has been devoted to advancing technology associated with all aspects of the rendezvous, proximity operations, and docking (RPOD) flight phase. After years of perfecting the art of crewed rendezvous with the Gemini, Apollo, and Space Shuttle programs, NASA began investigating the problem of autonomous rendezvous and docking (AR&D) to support a host of different mission applications. Some of these applications include autonomous resupply of the International Space Station (ISS), robotic servicing/refueling of existing orbital assets, and on-orbit assembly. The push towards a robust AR&D capability has led to an intensified interest in a number of different sensors capable of providing insight into the relative state of two spacecraft. The present work focuses on exploring the state of the art in one of these sensors: LIght Detection And Ranging (LIDAR) sensors. It should be noted that the military community frequently uses the acronym LADAR (LAser Detection And Ranging) to refer to what this paper calls LIDARs. A LIDAR is an active remote sensing device that is typically used in space applications to obtain the range to one or more points on a target spacecraft. As the name suggests, LIDAR sensors use light (typically a laser) to illuminate the target and measure the time it takes for the emitted signal to return to the sensor. Because the light must travel from the source, to ...
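
    The time-of-flight principle described in the last sentences reduces to a one-line computation; the snippet below is a generic illustration, not tied to any specific sensor in the survey.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_range(round_trip_time_s):
    """Range from a time-of-flight LIDAR return: the pulse travels to the
    target and back, so the one-way range is c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A return arriving 667 ns after emission corresponds to roughly 100 m.
print(f"{tof_range(667e-9):.1f} m")
```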

  20. Navigation Doppler lidar sensor for precision altitude and vector velocity measurements: flight test results

    NASA Astrophysics Data System (ADS)

    Pierrottet, Diego; Amzajerdian, Farzin; Petway, Larry; Barnes, Bruce; Lockard, George; Hines, Glenn

    2011-06-01

    An all fiber Navigation Doppler Lidar (NDL) system is under development at NASA Langley Research Center (LaRC) for precision descent and landing applications on planetary bodies. The sensor produces high-resolution line of sight range, altitude above ground, ground relative attitude, and high precision velocity vector measurements. Previous helicopter flight test results demonstrated the NDL measurement concepts, including measurement precision, accuracies, and operational range. This paper discusses the results obtained from a recent campaign to test the improved sensor hardware, and various signal processing algorithms applicable to real-time processing. The NDL was mounted in an instrumentation pod aboard an Erickson Air-Crane helicopter and flown over various terrains. The sensor was one of several sensors tested in this field test by NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) project.
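
    The velocity-vector measurement mentioned in this abstract can be sketched as follows: each lidar beam measures a line-of-sight velocity from its Doppler shift, and three or more non-coplanar beam directions let the full platform velocity be recovered by least squares. The wavelength, beam geometry, and Doppler shifts below are assumed values, not NDL parameters.

```python
import numpy as np

WAVELENGTH = 1.55e-6  # m, a typical fiber-laser wavelength (assumed, not from the paper)

def los_velocity(doppler_shift_hz):
    """Line-of-sight velocity from a Doppler shift: v = lambda * f_d / 2."""
    return WAVELENGTH * doppler_shift_hz / 2.0

def velocity_vector(beam_directions, doppler_shifts_hz):
    """Recover the platform velocity vector from three or more beam
    directions by least squares: each beam measures the projection of the
    velocity onto its (approximately unit) direction vector."""
    A = np.asarray(beam_directions, dtype=float)
    A = A / np.linalg.norm(A, axis=1, keepdims=True)   # normalize directions
    v_los = np.array([los_velocity(f) for f in doppler_shifts_hz])
    v, *_ = np.linalg.lstsq(A, v_los, rcond=None)
    return v

# Three beams canted away from nadir (illustrative geometry, fabricated shifts).
beams = [[0.35, 0.0, 0.94], [-0.17, 0.30, 0.94], [-0.17, -0.30, 0.94]]
shifts = [2.6e6, 2.4e6, 2.5e6]   # Hz
print(velocity_vector(beams, shifts))
```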

  1. Navigation Doppler Lidar Sensor for Precision Altitude and Vector Velocity Measurements Flight Test Results

    NASA Technical Reports Server (NTRS)

    Pierrottet, Diego F.; Lockhard, George; Amzajerdian, Farzin; Petway, Larry B.; Barnes, Bruce; Hines, Glenn D.

    2011-01-01

    An all fiber Navigation Doppler Lidar (NDL) system is under development at NASA Langley Research Center (LaRC) for precision descent and landing applications on planetary bodies. The sensor produces high resolution line of sight range, altitude above ground, ground relative attitude, and high precision velocity vector measurements. Previous helicopter flight test results demonstrated the NDL measurement concepts, including measurement precision, accuracies, and operational range. This paper discusses the results obtained from a recent campaign to test the improved sensor hardware, and various signal processing algorithms applicable to real-time processing. The NDL was mounted in an instrumentation pod aboard an Erickson Air-Crane helicopter and flown over vegetation free terrain. The sensor was one of several sensors tested in this field test by NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) project.

  2. Analysis of key technologies in geomagnetic navigation

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoming; Zhao, Yan

    2008-10-01

    Because of the high cost and error accumulation of high-precision Inertial Navigation Systems (INS) and the vulnerability of Global Navigation Satellite Systems (GNSS), geomagnetic navigation, a passive autonomous navigation method, is receiving renewed attention. The geomagnetic field is a natural spatial physical field and is a function of position and time in near-Earth space. Navigation based on the geomagnetic field is being researched for a wide range of commercial and military applications. This paper presents the main features and the state of the art of the Geomagnetic Navigation System (GMNS). Geomagnetic field models and reference maps are described. Obtaining, modeling, and updating accurate magnetic anomaly field information is an important step toward high-precision geomagnetic navigation. In addition, the errors of geomagnetic measurements using strapdown magnetometers are analyzed. Precise geomagnetic data are obtained by means of magnetometer calibration and vehicle magnetic field compensation. From the measurement data and a reference map or model of the geomagnetic field, the vehicle's position and attitude can be obtained using a matching algorithm or a state-estimation method. Trends in geomagnetic navigation for the near future are discussed at the end of this paper.
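
    A toy version of the map-matching step mentioned above: slide a measured 1-D magnetic anomaly profile along a reference map and pick the offset with the smallest mean-squared difference. The profiles are fabricated, and real systems use considerably more sophisticated matching and state estimation.

```python
import numpy as np

def match_profile(reference_map, measured_profile):
    """Brute-force 1-D magnetic profile matching: return the offset along
    the reference map that minimizes the mean-squared difference with the
    measured anomaly profile."""
    n, m = len(reference_map), len(measured_profile)
    costs = [np.mean((reference_map[i:i + m] - measured_profile) ** 2)
             for i in range(n - m + 1)]
    return int(np.argmin(costs))

# Fabricated reference anomaly map (nT) and a noisy measurement of a slice of it.
rng = np.random.default_rng(1)
reference = np.cumsum(rng.normal(0, 5, 500))        # smooth-ish anomaly profile
truth_offset = 217
measured = reference[truth_offset:truth_offset + 60] + rng.normal(0, 1, 60)
print(match_profile(reference, measured), "vs true offset", truth_offset)
```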

  3. Precise relative navigation using augmented CDGPS

    NASA Astrophysics Data System (ADS)

    Park, Chan-Woo

    2001-10-01

    Autonomous formation flying of multiple vehicles is a revolutionary enabling technology for many future space and earth science missions that require distributed measurements, such as sparse aperture radars and stellar interferometry. The techniques developed for the space applications will also have a significant impact on many terrestrial formation flying missions. One of the key requirements of formation flying is accurate knowledge of the relative positions and velocities between the vehicles. Several researchers have shown that the GPS is a viable sensor to perform this relative navigation. However, there are several limitations in the use of GPS because it requires adequate visibility to the NAVSTAR constellation. For some mission scenarios, such as MEO, GEO and tight formation missions, the visibility/geometry of the constellation may not be sufficient to accurately estimate the relative states. One solution to these problems is to include an RF ranging device onboard the vehicles in the formation and form a local constellation that augments the existing NAVSTAR constellation. These local range measurements, combined with the GPS measurements, can provide a sufficient number of measurements and adequate geometry to solve for the relative states. Furthermore, these RF ranging devices can be designed to provide substantially more accurate measures of the vehicle relative states than the traditional GPS pseudolites. The local range measurements also allow relative vehicle motion to be used to efficiently solve for the cycle ambiguities in real-time. This dissertation presents the development of an onboard ranging sensor and the extension of several related algorithms for a formation of vehicles with both GPS and local transmitters. Key among these are a robust cycle ambiguity estimation method and a decentralized relative navigation filter. The efficient decentralized approach to the GPS-only relative navigation problem is extended to an iterative cascade extended Kalman filtering (ICEKF) algorithm when the vehicles have onboard transmitters. Several ground testbeds were developed to demonstrate the feasibility of the augmentation concept and the relative navigation algorithms. The testbed includes the Stanford Pseudolite Transceiver Crosslink (SPTC), which was developed and extensively tested with a formation of outdoor ground vehicles.

  4. Hybrid optical navigation by crater detection for lunar pin-point landing: trajectories from helicopter flight tests

    NASA Astrophysics Data System (ADS)

    Trigo, Guilherme F.; Maass, Bolko; Krüger, Hans; Theil, Stephan

    2018-01-01

    Accurate autonomous navigation capabilities are essential for future lunar robotic landing missions with a pin-point landing requirement, since in the absence of a direct line of sight to ground control during the critical approach and landing phases, or when facing long signal delays, this capability is needed to establish a guidance solution that reaches the landing site reliably. This paper focuses on the processing and evaluation of data collected from flight tests consisting of scaled descent scenarios in which an unmanned helicopter of approximately 85 kg approached a landing site from altitudes of 50 m down to 1 m over a downrange distance of 200 m. Printed crater targets were distributed along the ground track and their detection provided earth-fixed measurements. The Crater Navigation (CNav) algorithm used to detect and match the crater targets is an unmodified method used for real lunar imagery. We analyze the absolute position and attitude solutions of CNav obtained and recorded during these flight tests, and investigate the attainable quality of vehicle pose estimation using both CNav and measurements from a tactical-grade inertial measurement unit. The navigation filter proposed for this purpose corrects and calibrates the high-rate inertial propagation with the less frequent crater navigation fixes through a closed-loop, loosely coupled hybrid setup. Finally, the attainable accuracy of the fused solution is evaluated by comparison with the on-board ground-truth solution of a dual-antenna high-grade GNSS receiver. It is shown that CNav is an enabler for building autonomous navigation systems of high quality and suitability for exploration mission scenarios.

  5. Integrating Terrain Maps Into a Reactive Navigation Strategy

    NASA Technical Reports Server (NTRS)

    Howard, Ayanna; Werger, Barry; Seraji, Homayoun

    2006-01-01

    An improved method of processing information for autonomous navigation of a robotic vehicle across rough terrain involves the integration of terrain maps into a reactive navigation strategy. Somewhat more precisely, the method involves the incorporation, into the navigation logic, of data equivalent to regional traversability maps. The terrain is characterized using a fuzzy-logic representation of the difficulty of traversing it. The method is robust in that it integrates a global path-planning strategy with sensor-based regional and local navigation strategies to ensure a high probability of success in reaching a destination and avoiding obstacles along the way. The sensor-based strategies use cameras aboard the vehicle to observe the regional terrain, defined as the area of terrain in the immediate vicinity of the vehicle out to a specified distance of a few meters.
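
    As an illustration of a fuzzy-logic traversability representation (not the published rule base), the sketch below combines "slope is low" and "roughness is low" memberships into a single traversability index using trapezoidal membership functions with made-up breakpoints.

```python
import numpy as np

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function rising over [a, b] and falling over [c, d]."""
    return float(np.clip(min((x - a) / (b - a + 1e-9), 1.0,
                             (d - x) / (d - c + 1e-9)), 0.0, 1.0))

def traversability_index(slope_deg, roughness_m):
    """Toy fuzzy traversability measure in [0, 1]: the fuzzy AND (minimum)
    of 'slope is low' and 'roughness is low'.  Breakpoints are illustrative."""
    slope_low = trapezoid(slope_deg, -1, 0, 5, 15)
    rough_low = trapezoid(roughness_m, -1, 0, 0.05, 0.20)
    return min(slope_low, rough_low)

for s, r in [(2, 0.02), (10, 0.10), (25, 0.05)]:
    print(f"slope={s:4.1f} deg, roughness={r:.2f} m -> traversability {traversability_index(s, r):.2f}")
```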

  6. Comparative analysis of ROS-based monocular SLAM methods for indoor navigation

    NASA Astrophysics Data System (ADS)

    Buyval, Alexander; Afanasyev, Ilya; Magid, Evgeni

    2017-03-01

    This paper presents a comparison of four of the most recent ROS-based monocular SLAM-related methods: ORB-SLAM, REMODE, LSD-SLAM, and DPPTAM, and analyzes their feasibility for mobile robot applications in indoor environments. We tested these methods using video data recorded from a conventional wide-angle full HD webcam with a rolling shutter. The camera was mounted on a human-operated prototype of an unmanned ground vehicle, which followed a closed-loop trajectory. Both feature-based methods (ORB-SLAM, REMODE) and direct SLAM-related algorithms (LSD-SLAM, DPPTAM) demonstrated reasonably good results in detecting volumetric objects, corners, obstacles, and other local features. However, we encountered difficulties in reconstructing the homogeneously colored walls typical of offices, since all of these methods left empty spaces in the reconstructed sparse 3D scene. This may cause collisions of an autonomously guided robot with featureless walls and thus limits the applicability of maps obtained by the considered monocular SLAM-related methods for indoor robot navigation.

  7. An Autonomous Control System for an Intra-Vehicular Spacecraft Mobile Monitor Prototype

    NASA Technical Reports Server (NTRS)

    Dorais, Gregory A.; Desiano, Salvatore D.; Gawdiak, Yuri; Nicewarner, Keith

    2003-01-01

    This paper presents an overview of an ongoing research and development effort at the NASA Ames Research Center to create an autonomous control system for an internal spacecraft autonomous mobile monitor. Its primary functions are to provide crew support and perform intra-vehicular sensing activities by autonomously navigating onboard the International Space Station. We describe the mission roles and high-level functional requirements for an autonomous mobile monitor. The mobile monitor prototypes, of which two are operational and one is actively being designed, the physical test facilities used to perform ground testing, including a 3D micro-gravity test facility, and the simulators are briefly described. We provide an overview of the autonomy framework and describe each of its components, including those used for automated planning, goal-oriented task execution, diagnosis, and fault recovery. A sample mission test scenario is also described.

  8. Autonomous landing and ingress of micro-air-vehicles in urban environments based on monocular vision

    NASA Astrophysics Data System (ADS)

    Brockers, Roland; Bouffard, Patrick; Ma, Jeremy; Matthies, Larry; Tomlin, Claire

    2011-06-01

    Unmanned micro air vehicles (MAVs) will play an important role in future reconnaissance and search and rescue applications. In order to conduct persistent surveillance and to conserve energy, MAVs need the ability to land, and they need the ability to enter (ingress) buildings and other structures to conduct reconnaissance. To be safe and practical under a wide range of environmental conditions, landing and ingress maneuvers must be autonomous, using real-time, onboard sensor feedback. To address these key behaviors, we present a novel method for vision-based autonomous MAV landing and ingress using a single camera for two urban scenarios: landing on an elevated surface, representative of a rooftop, and ingress through a rectangular opening, representative of a door or window. Real-world scenarios will not include special navigation markers, so we rely on tracking arbitrary scene features; however, we do currently exploit planarity of the scene. Our vision system uses a planar homography decomposition to detect navigation targets and to produce approach waypoints as inputs to the vehicle control algorithm. Scene perception, planning, and control run onboard in real-time; at present we obtain aircraft position knowledge from an external motion capture system, but we expect to replace this in the near future with a fully self-contained, onboard, vision-aided state estimation algorithm. We demonstrate autonomous vision-based landing and ingress target detection with two different quadrotor MAV platforms. To our knowledge, this is the first demonstration of onboard, vision-based autonomous landing and ingress algorithms that do not use special purpose scene markers to identify the destination.
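
    A minimal sketch of the planar-homography idea, assuming OpenCV: estimate the homography between tracked reference and current feature sets, decompose it into candidate rotation/translation/plane-normal solutions, and derive an approach waypoint from the plane normal. Feature tracking, solution disambiguation, and metric scale recovery are omitted, and the function name, parameters, and synthetic geometry are hypothetical.

```python
import cv2
import numpy as np

def approach_waypoint_from_homography(pts_ref, pts_cur, K, standoff_m=1.0):
    """Estimate the homography between two views of a planar target,
    decompose it, and place a waypoint a fixed standoff along the first
    candidate plane normal.  A real system would disambiguate the
    candidates with cheirality checks and additional views."""
    H, _ = cv2.findHomography(np.float32(pts_ref), np.float32(pts_cur), cv2.RANSAC, 3.0)
    _, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
    R, t, n = rotations[0], translations[0], normals[0]
    waypoint_cam = -standoff_m * n.ravel()   # point standoff_m in front of the plane
    return R, t.ravel(), waypoint_cam

# Synthetic usage: build a ground-truth homography from a known plane and camera motion.
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
angle = np.radians(5.0)
R_true = np.array([[np.cos(angle), 0, np.sin(angle)], [0, 1, 0], [-np.sin(angle), 0, np.cos(angle)]])
t_true, n_true, d_true = np.array([0.1, 0.0, 0.05]), np.array([0.0, 0.0, 1.0]), 2.0
H_true = K @ (R_true + np.outer(t_true, n_true) / d_true) @ np.linalg.inv(K)
pts_ref = np.float32([[100, 100], [400, 100], [400, 300], [100, 300], [250, 150], [180, 260]])
pts_cur = cv2.perspectiveTransform(pts_ref.reshape(-1, 1, 2), H_true).reshape(-1, 2)
R, t, wp = approach_waypoint_from_homography(pts_ref, pts_cur, K)
print("candidate plane-normal waypoint in the camera frame:", wp)
```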

  9. Autonomous Landing and Ingress of Micro-Air-Vehicles in Urban Environments Based on Monocular Vision

    NASA Technical Reports Server (NTRS)

    Brockers, Roland; Bouffard, Patrick; Ma, Jeremy; Matthies, Larry; Tomlin, Claire

    2011-01-01

    Unmanned micro air vehicles (MAVs) will play an important role in future reconnaissance and search and rescue applications. In order to conduct persistent surveillance and to conserve energy, MAVs need the ability to land, and they need the ability to enter (ingress) buildings and other structures to conduct reconnaissance. To be safe and practical under a wide range of environmental conditions, landing and ingress maneuvers must be autonomous, using real-time, onboard sensor feedback. To address these key behaviors, we present a novel method for vision-based autonomous MAV landing and ingress using a single camera for two urban scenarios: landing on an elevated surface, representative of a rooftop, and ingress through a rectangular opening, representative of a door or window. Real-world scenarios will not include special navigation markers, so we rely on tracking arbitrary scene features; however, we do currently exploit planarity of the scene. Our vision system uses a planar homography decomposition to detect navigation targets and to produce approach waypoints as inputs to the vehicle control algorithm. Scene perception, planning, and control run onboard in real-time; at present we obtain aircraft position knowledge from an external motion capture system, but we expect to replace this in the near future with a fully self-contained, onboard, vision-aided state estimation algorithm. We demonstrate autonomous vision-based landing and ingress target detection with two different quadrotor MAV platforms. To our knowledge, this is the first demonstration of onboard, vision-based autonomous landing and ingress algorithms that do not use special purpose scene markers to identify the destination.

  10. An autonomous rendezvous and docking system using cruise missile technologies

    NASA Technical Reports Server (NTRS)

    Jones, Ruel Edwin

    1991-01-01

    In November 1990 the Autonomous Rendezvous & Docking (AR&D) system was first demonstrated for members of NASA's Strategic Avionics Technology Working Group. This simulation utilized prototype hardware from the Cruise Missile and Advanced Centaur Avionics systems. The objective was to show that all the accuracy, reliability, and operational requirements established for a spacecraft to dock with Space Station Freedom could be met by the proposed system. The rapid prototyping capabilities of the Advanced Avionics Systems Development Laboratory were used to evaluate the proposed system in a real-time, hardware-in-the-loop simulation of the rendezvous and docking reference mission. The simulation permits manual, supervised automatic, and fully autonomous operations to be evaluated. It is also being upgraded to be able to test an Autonomous Approach and Landing (AA&L) system. The AA&L and AR&D systems are very similar. Both use inertial guidance and control systems supplemented by GPS. Both use an Image Processing System (IPS) for target recognition and tracking. The IPS includes a general-purpose multiprocessor computer and a selected suite of sensors that provide the required relative position and orientation data. Graphic displays can also be generated by the computer, providing the astronaut/operator with real-time guidance and navigation data with enhanced video or sensor imagery.

  11. Autonomous Aerial Refueling Ground Test Demonstration—A Sensor-in-the-Loop, Non-Tracking Method

    PubMed Central

    Chen, Chao-I; Koseluk, Robert; Buchanan, Chase; Duerner, Andrew; Jeppesen, Brian; Laux, Hunter

    2015-01-01

    An essential capability for an unmanned aerial vehicle (UAV) to extend its airborne duration without increasing the size of the aircraft is autonomous aerial refueling (AAR). This paper proposes a sensor-in-the-loop, non-tracking method for probe-and-drogue style autonomous aerial refueling tasks by combining sensitivity adjustments of a 3D Flash LIDAR camera with computer-vision-based image-processing techniques. The method overcomes the inherent ambiguity issues of reconstructing 3D information from traditional 2D images by taking advantage of ready-to-use 3D point cloud data from the camera, followed by well-established computer vision techniques. These techniques include curve-fitting algorithms and outlier removal with the random sample consensus (RANSAC) algorithm to reliably estimate the drogue center in 3D space, as well as to establish the relative position between the probe and the drogue. To demonstrate the feasibility of the proposed method on a real system, a ground navigation robot was designed and fabricated. Results presented in the paper show that, using images acquired from a 3D Flash LIDAR camera as real-time visual feedback, the ground robot is able to track a moving simulated drogue and continuously narrow the gap between the robot and the target autonomously. PMID:25970254
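
    The RANSAC-based drogue-center estimate can be sketched as a robust circle fit. The snippet below assumes the 3D points have already been projected onto the drogue plane (that step is omitted) and fits a circle to noisy rim points contaminated with outliers; tolerances, iteration counts, and geometry are fabricated.

```python
import numpy as np

def circle_from_3_points(p1, p2, p3):
    """Center and radius of the circle through three 2-D points (None if collinear)."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-9:
        return None
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay) + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx) + (cx**2 + cy**2) * (bx - ax)) / d
    center = np.array([ux, uy])
    return center, np.linalg.norm(center - np.array(p1))

def ransac_circle(points, iters=200, tol=0.02, seed=2):
    """RANSAC circle fit: repeatedly fit a circle to a random 3-point
    sample and keep the hypothesis with the most inliers."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, float)
    best, best_inliers = None, 0
    for _ in range(iters):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        fit = circle_from_3_points(*sample)
        if fit is None:
            continue
        center, radius = fit
        inliers = np.sum(np.abs(np.linalg.norm(pts - center, axis=1) - radius) < tol)
        if inliers > best_inliers:
            best, best_inliers = (center, radius), inliers
    return best

# Noisy rim points (radius 0.30 m, center (1.0, -0.5)) plus a few outliers.
rng = np.random.default_rng(3)
theta = rng.uniform(0, 2 * np.pi, 80)
rim = np.c_[0.30 * np.cos(theta) + 1.0, 0.30 * np.sin(theta) - 0.5] + rng.normal(0, 0.005, (80, 2))
cloud = np.vstack([rim, rng.uniform(-1, 2, (10, 2))])
print(ransac_circle(cloud))
```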

  12. SPARTAN: A High-Fidelity Simulation for Automated Rendezvous and Docking Applications

    NASA Technical Reports Server (NTRS)

    Turbe, Michael A.; McDuffie, James H.; DeKock, Brandon K.; Betts, Kevin M.; Carrington, Connie K.

    2007-01-01

    bd Systems (a subsidiary of SAIC) has developed the Simulation Package for Autonomous Rendezvous Test and ANalysis (SPARTAN), a high-fidelity on-orbit simulation featuring multiple six-degree-of-freedom (6DOF) vehicles. SPARTAN has been developed in a modular fashion in Matlab/Simulink to test next-generation automated rendezvous and docking guidance, navigation, and control algorithms for NASA's new Vision for Space Exploration. SPARTAN includes autonomous state-based mission manager algorithms responsible for sequencing the vehicle through various flight phases based on onboard sensor inputs and closed-loop guidance algorithms, including Lambert transfers, Clohessy-Wiltshire maneuvers, and glideslope approaches. The guidance commands are implemented using an integrated translation and attitude control system to provide 6DOF control of each vehicle in the simulation. SPARTAN also includes high-fidelity representations of a variety of absolute and relative navigation sensors that may be used for NASA missions, including radio frequency, lidar, and video-based rendezvous sensors. Proprietary navigation sensor fusion algorithms have been developed that allow the integration of these sensor measurements through an extended Kalman filter framework to create a single optimal estimate of the relative state of the vehicles. SPARTAN provides capability for Monte Carlo dispersion analysis, allowing for rigorous evaluation of the performance of the complete proposed AR&D system, including software, sensors, and mechanisms. SPARTAN also supports hardware-in-the-loop testing through conversion of the algorithms to C code using Real-Time Workshop so that they can be hosted in a mission computer engineering development unit running an embedded real-time operating system. SPARTAN also contains both a runtime TCP/IP socket interface and post-processing compatibility with bdStudio, a visualization tool developed by bd Systems, allowing for intuitive evaluation of simulation results. A description of the SPARTAN architecture and capabilities is provided, along with details on the models and algorithms utilized and results from representative missions.
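
    The Clohessy-Wiltshire building block named above has a standard closed form; the sketch below propagates a relative state with the CW state transition matrix for a circular reference orbit. This is a generic illustration, not SPARTAN code, and the orbit and initial state are arbitrary.

```python
import numpy as np

def cw_state_transition(n, t):
    """Clohessy-Wiltshire state transition matrix for relative motion about
    a circular reference orbit, state = [x, y, z, vx, vy, vz] with x radial,
    y along-track, z cross-track, and n the reference mean motion (rad/s)."""
    s, c = np.sin(n * t), np.cos(n * t)
    phi = np.zeros((6, 6))
    phi[:3, :3] = [[4 - 3 * c, 0, 0],
                   [6 * (s - n * t), 1, 0],
                   [0, 0, c]]
    phi[:3, 3:] = [[s / n, 2 * (1 - c) / n, 0],
                   [-2 * (1 - c) / n, (4 * s - 3 * n * t) / n, 0],
                   [0, 0, s / n]]
    phi[3:, :3] = [[3 * n * s, 0, 0],
                   [-6 * n * (1 - c), 0, 0],
                   [0, 0, -n * s]]
    phi[3:, 3:] = [[c, 2 * s, 0],
                   [-2 * s, 4 * c - 3, 0],
                   [0, 0, c]]
    return phi

# Propagate a chaser 50 m above and 100 m behind the target for a quarter orbit in LEO;
# the radial offset causes the familiar along-track drift.
mu, a = 398600.4418, 6778.0                    # km^3/s^2, km (assumed orbit)
n = np.sqrt(mu / a**3)                         # mean motion, rad/s
state0 = np.array([0.05, -0.1, 0.0, 0.0, 0.0, 0.0])   # km, km/s
quarter_period = 0.25 * 2 * np.pi / n
print(cw_state_transition(n, quarter_period) @ state0)
```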

  13. Control of autonomous robot using neural networks

    NASA Astrophysics Data System (ADS)

    Barton, Adam; Volna, Eva

    2017-07-01

    The aim of the article is to design a method of control of an autonomous robot using artificial neural networks. The introductory part describes control issues from the perspective of autonomous robot navigation and the current mobile robots controlled by neural networks. The core of the article is the design of the controlling neural network, and generation and filtration of the training set using ART1 (Adaptive Resonance Theory). The outcome of the practical part is an assembled Lego Mindstorms EV3 robot solving the problem of avoiding obstacles in space. To verify models of an autonomous robot behavior, a set of experiments was created as well as evaluation criteria. The speed of each motor was adjusted by the controlling neural network with respect to the situation in which the robot was found.

  14. VisNAV 100: a robust, compact imaging sensor for enabling autonomous air-to-air refueling of aircraft and unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Katake, Anup; Choi, Heeyoul

    2010-01-01

    To enable autonomous air-to-air refueling of manned and unmanned vehicles, a robust, high-speed relative navigation sensor capable of providing high-accuracy 3DOF information in diverse operating conditions is required. To help address this problem, StarVision Technologies Inc. has been developing a compact, high-update-rate (100 Hz), wide field-of-view (90 deg) direction and range estimation imaging sensor called VisNAV 100. The sensor is fully autonomous, requiring no communication from the tanker aircraft, and contains high-reliability embedded avionics to provide range, azimuth, elevation (a 3-degrees-of-freedom, 3DOF, solution), and closing speed relative to the tanker aircraft. The sensor is capable of providing a 3DOF solution with an error of 1% in range and 0.1 deg in azimuth/elevation up to a range of 30 m, and a 1 deg error in direction for ranges up to 200 m, at 100 Hz update rates. In this paper we discuss the algorithms that were developed in-house to enable robust beacon pattern detection, outlier rejection, and 3DOF estimation in adverse conditions, and present the results of several outdoor tests. Results from the long-range single-beacon detection tests are also discussed.

  15. Flight Test Results from the Low Power Transceiver Communications and Navigation Demonstration on Shuttle (CANDOS)

    NASA Technical Reports Server (NTRS)

    Rush, John; Israel, David; Harlacher, Marc; Haas, Lin

    2003-01-01

    The Low Power Transceiver (LPT) is an advanced signal processing platform that offers a configurable and reprogrammable capability for supporting communications, navigation, and sensor functions for mission applications ranging from spacecraft TT&C and autonomous orbit determination to sophisticated networks that use crosslinks to support communications and real-time relative navigation for formation flying. The LPT is the result of extensive collaborative research under NASA/GSFC's Advanced Technology Program and ITT Industries' internal research and development efforts. Its modular, multi-channel design currently enables transmitting and receiving communication signals on L- or S-band frequencies and processing GPS L-band signals for precision navigation. The LPT flew as a part of the GSFC Hitchhiker payload named Fast Reaction Experiments Enabling Science Technology And Research (FREESTAR) on board Space Shuttle Columbia's final mission. The experiment demonstrated functionality in GPS-based navigation and orbit determination, NASA STDN Ground Network communications, space relay communications via the NASA TDRSS, on-orbit reconfiguration of the software radio, the use of the Internet Protocol (IP) for TT&C, and communication concepts for space-based range safety. All data from the experiment were recovered and, as a result, all primary and secondary objectives of the experiment were met. This paper presents the results of the LPT's maiden space flight as a part of STS-107.

  16. Autonomous Rovers for Polar Science Campaigns

    NASA Astrophysics Data System (ADS)

    Lever, J. H.; Ray, L. E.; Williams, R. M.; Morlock, A. M.; Burzynski, A. M.

    2012-12-01

    We have developed and deployed two over-snow autonomous rovers able to conduct remote science campaigns on Polar ice sheets. Yeti is an 80-kg, four-wheel-drive (4WD) battery-powered robot with 3 - 4 hr endurance, and Cool Robot is a 60-kg 4WD solar-powered robot with unlimited endurance during Polar summers. Both robots navigate using GPS waypoint-following to execute pre-planned courses autonomously, and they can each carry or tow 20 - 160 kg instrument payloads over typically firm Polar snowfields. In 2008 - 12, we deployed Yeti to conduct autonomous ground-penetrating radar (GPR) surveys to detect hidden crevasses to help establish safe routes for overland resupply of research stations at South Pole, Antarctica, and Summit, Greenland. We also deployed Yeti with GPR at South Pole in 2011 to identify the locations of potentially hazardous buried buildings from the original 1950's-era station. Autonomous surveys remove personnel from safety risks posed during manual GPR surveys by undetected crevasses or buried buildings. Furthermore, autonomous surveys can yield higher quality and more comprehensive data than manual ones: Yeti's low ground pressure (20 kPa) allows it to cross thinly bridged crevasses or other voids without interrupting a survey, and well-defined survey grids allow repeated detection of buried voids to improve detection reliability and map their extent. To improve survey efficiency, we have automated the mapping of detected hazards, currently identified via post-survey manual review of the GPR data. Additionally, we are developing machine-learning algorithms to detect crevasses autonomously in real time, with reliability potentially higher than manual real-time detection. These algorithms will enable the rover to relay crevasse locations to a base station for near real-time mapping and decision-making. We deployed Cool Robot at Summit Station in 2005 to verify its mobility and power budget over Polar snowfields. Using solar power, this zero-emissions rover could travel more than 500 km per week during Polar summers and provide 100 - 200 W to power instrument payloads to help investigate the atmosphere, magnetosphere, glaciology and sub-glacial geology in Antarctica and Greenland. We are currently upgrading Cool Robot's navigation and solar-power systems and will deploy it during 2013 to map the emissions footprint around Summit Station to demonstrate its potential to execute long-endurance Polar science campaigns. These rovers could assist science traverses to chart safe routes into the interior of Antarctica and Greenland or conduct autonomous, remote science campaigns to extend spatial and temporal coverage for data collection. Our goals include 1,000 - 2,000-km summertime traverses of Antarctica and Greenland, safe navigation through 0.5-m amplitude sastrugi fields, survival in blizzards, and rover-network adaptation to research events of opportunity. We are seeking Polar scientists interested in autonomous, mobile data collection and can adapt the rovers to meet their requirements.

  17. Spaceflight dynamics 1993; AAS/NASA International Symposium, 8th, Greenbelt, MD, Apr. 26-30, 1993, Parts 1 & 2

    NASA Technical Reports Server (NTRS)

    Teles, Jerome (Editor); Samii, Mina V. (Editor)

    1993-01-01

    A conference on spaceflight dynamics produced papers in the areas of orbit determination, spacecraft tracking, autonomous navigation, the Deep Space Program Science Experiment Mission (DSPSE), the Global Positioning System, attitude control, geostationary satellites, interplanetary missions and trajectories, applications of estimation theory, flight dynamics systems, low-Earth orbit missions, orbital mechanics, mission experience in attitude dynamics, mission experience in sensor studies, attitude dynamics theory and simulations, and orbit-related experience. These papers covered NASA, European, Russian, Japanese, Chinese, and Brazilian space programs and hardware.

  18. Search Problems in Mission Planning and Navigation of Autonomous Aircraft. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Krozel, James A.

    1988-01-01

    An architecture for the control of an autonomous aircraft is presented. The architecture is a hierarchical system representing an anthropomorphic breakdown of the control problem into planner, navigator, and pilot systems. The planner system determines high-level global plans from overall mission objectives. This abstract mission planning is investigated by focusing on the Traveling Salesman Problem with variations on local and global constraints. Tree search techniques are applied, including the breadth-first, depth-first, and best-first algorithms. The minimum column and row entries of the Traveling Salesman Problem cost matrix provide a powerful heuristic to guide these search techniques. Mission planning subgoals are directed from the planner to the navigator for planning routes in mountainous terrain with threats. Terrain/threat information is abstracted into a graph of possible paths for which graph searches are performed. It is shown that paths can be well represented by a search graph based on the Voronoi diagram of points representing the vertices of mountain boundaries. A comparison of Dijkstra's dynamic programming algorithm and the A* graph search algorithm from artificial intelligence/operations research is performed for several navigation path planning examples. These examples illustrate paths that minimize a combination of distance and exposure to threats. Finally, the pilot system synthesizes the flight trajectory by creating the control commands to fly the aircraft.
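
    A compact version of the A* graph search compared in the thesis, run over a small hand-built graph; in the navigation application the nodes would be Voronoi-graph vertices and edge costs would combine distance with threat exposure. The graph, costs, and heuristic values here are purely illustrative.

```python
import heapq

def a_star(graph, heuristic, start, goal):
    """Plain A* over a weighted graph: graph[node] is a list of
    (neighbor, edge_cost) pairs, heuristic(node) is an admissible estimate
    of the remaining cost to the goal (e.g., straight-line distance)."""
    open_set = [(heuristic(start), 0.0, start, None)]
    came_from, g_best = {}, {start: 0.0}
    while open_set:
        _, g, node, parent = heapq.heappop(open_set)
        if node in came_from:          # already expanded with a better cost
            continue
        came_from[node] = parent
        if node == goal:               # reconstruct the path back to start
            path = [node]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return list(reversed(path)), g
        for neighbor, cost in graph.get(node, []):
            g_new = g + cost
            if g_new < g_best.get(neighbor, float("inf")):
                g_best[neighbor] = g_new
                heapq.heappush(open_set, (g_new + heuristic(neighbor), g_new, neighbor, node))
    return None, float("inf")

# Tiny example graph; edge costs could encode distance plus threat exposure.
graph = {"A": [("B", 2.0), ("C", 5.0)], "B": [("C", 1.0), ("D", 4.0)],
         "C": [("D", 1.0)], "D": []}
h = {"A": 3.0, "B": 2.0, "C": 1.0, "D": 0.0}
print(a_star(graph, h.get, "A", "D"))
```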

  19. Experiment D005: Star occultation navigation

    NASA Technical Reports Server (NTRS)

    Silva, R. M.; Jorris, T. R.; Vallerie, E. M., III

    1971-01-01

    The usefulness of star occultation measurements for space navigation and the determination of a horizon density profile which could be used to update atmospheric models for horizon-based measurement systems were studied. The time of occultation of a known star by a celestial body, as seen by an orbiting observer, determines a cylinder of position, the axis of which is the line through the star and the body center, and the radius of which is equal to the occulting-body radius. The dimming percentage, with respect to the altitude of this grazing ray from the star to the observer, is a percentage altitude for occultation. That is, the star can be assumed to be occulted when it reaches a predetermined percentage of its unattenuated value. The procedure used was to measure this attenuation with respect to time to determine the usefulness of the measurements for autonomous space navigation. In this experiment, the crewmembers had to accomplish star acquisition, identification, calibration, and tracking. Instrumentation was required only for measurement of the relative intensity of the star as it set into the atmosphere.

  20. Autonomous Navigation and Control

    DTIC Science & Technology

    1988-10-01

    Ball Aerospace, Cincinnati Electronics, COMSAT, DSI, Harris Corporation, M/A-COM, Qualcomm, RCA, Rockwell and SPACECOM. The objective of the satellite...constellation was to provide global prioritized data-voice service during peacetime and essential communications during crises. This was to be

  1. Design considerations for imaging charge-coupled device

    NASA Astrophysics Data System (ADS)

    1981-04-01

    The image dissector tube, which was formerly used as the detector in star trackers, will be replaced by solid-state imaging devices. The technology advances of charge transfer devices, like the charge-coupled device (CCD) and the charge-injection device (CID), have made their application to star trackers an immediate reality. In 1979 the Air Force funded an American aerospace company to develop an imaging CCD (ICCD) star sensor for the Multimission Attitude Determination and Autonomous Navigation (MADAN) system. The MADAN system is a technology development for a strapdown attitude and navigation system which can be used on all Air Force 3-axis stabilized satellites. The system will be autonomous and will provide real-time satellite attitude and position information. The star sensor accuracy provides an overall MADAN attitude accuracy of 2 arcsec for star rates up to 300 arcsec/sec. The ICCD is basically an integrating device. Its pixel resolution is not yet satisfactory for precision applications.

  2. A Self Contained Method for Safe and Precise Lunar Landing

    NASA Technical Reports Server (NTRS)

    Paschall, Stephen C., II; Brady, Tye; Cohanim, Babak; Sostaric, Ronald

    2008-01-01

    The return of humans to the Moon will require increased capability beyond that of the previous Apollo missions. Longer stay times and a greater flexibility with regards to landing locations are among the many improvements planned. A descent and landing system that can land the vehicle more accurately than Apollo with a greater ability to detect and avoid hazards is essential to the development of a Lunar Outpost, and also for increasing the number of potentially reachable Lunar Sortie locations. This descent and landing system should allow landings in more challenging terrain and provide more flexibility with regards to mission timing and lighting considerations, while maintaining safety as the top priority. The lunar landing system under development by the ALHAT (Autonomous precision Landing and Hazard detection Avoidance Technology) project is addressing this by providing terrain-relative navigation measurements to enhance global-scale precision, an onboard hazard-detection system to select safe landing locations, and an Autonomous GNC (Guidance, Navigation, and Control) capability to process these measurements and safely direct the vehicle to this landing location. This ALHAT landing system will enable safe and precise lunar landings without requiring lunar infrastructure in the form of navigation aids or a priori identified hazard-free landing locations. The safe landing capability provided by ALHAT uses onboard active sensing to detect hazards that are large enough to be a danger to the vehicle but too small to be detected from orbit, given currently planned orbital terrain resolution limits. Algorithms to interpret raw active sensor terrain data and generate hazard maps as well as identify safe sites and recalculate new trajectories to those sites are included as part of the ALHAT System. These improvements to descent and landing will help contribute to repeated safe and precise landings for a wide variety of terrain on the Moon.
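
    As a rough illustration of onboard hazard detection (not the ALHAT algorithms), the sketch below flags cells of a synthetic elevation map whose local slope or roughness exceeds lander limits; the thresholds, window size, and terrain are all assumed values.

```python
import numpy as np

def hazard_map(dem, cell_size_m, max_slope_deg=10.0, max_roughness_m=0.3, window=5):
    """Toy hazard-detection pass over a digital elevation map: a cell is
    flagged hazardous if the local slope or the local roughness (elevation
    spread inside a small window) exceeds the given limits."""
    gy, gx = np.gradient(dem, cell_size_m)
    slope_deg = np.degrees(np.arctan(np.hypot(gx, gy)))
    rows, cols = dem.shape
    rough = np.zeros_like(dem)
    half = window // 2
    for r in range(rows):
        for c in range(cols):
            patch = dem[max(0, r - half):r + half + 1, max(0, c - half):c + half + 1]
            rough[r, c] = patch.max() - patch.min()
    return (slope_deg > max_slope_deg) | (rough > max_roughness_m)

# Synthetic 20 m x 20 m terrain patch with a boulder and a tilted region.
rng = np.random.default_rng(4)
dem = rng.normal(0, 0.02, (40, 40))
dem[25:28, 25:28] += 0.6                 # boulder
dem[:, :10] += np.linspace(0, 3.0, 10)   # steep slope on the left
hazards = hazard_map(dem, cell_size_m=0.5)
print(f"{hazards.mean() * 100:.1f}% of cells flagged hazardous")
```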

  3. Optimal Path Planning Program for Autonomous Speed Sprayer in Orchard Using Order-Picking Algorithm

    NASA Astrophysics Data System (ADS)

    Park, T. S.; Park, S. J.; Hwang, K. Y.; Cho, S. I.

    This study was conducted to develop a software program that computes an optimal path for autonomous navigation in an orchard, especially for a speed sprayer. The feasibility of autonomous navigation in orchards has been shown by other studies, which minimized the distance error between the planned path and the performed path, but research on planning an optimal path for a speed sprayer in an orchard is hardly found. In this study, a digital map and a database for an orchard were designed that contain GPS coordinate information (coordinates of trees and the boundary of the orchard) and entity information (heights and widths of trees, radius of the main stem of trees, diseases of trees). An order-picking algorithm, which has been used for warehouse management, was used to calculate an optimal path based on the digital map. The database for the digital map was created using Microsoft Access, and the graphic interface for the database was made using Microsoft Visual C++ 6.0. Based on the digital map it was possible to search and display information about the boundary of the orchard, the locations of trees, and the daily plan for scattering chemicals, and to plan optimal paths for different orchards and circumstances (starting the speed sprayer at different locations, scattering chemicals for only selected trees).
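
    Since the specific order-picking algorithm is not detailed in the abstract, the sketch below substitutes a simple nearest-neighbor routing heuristic over hypothetical tree coordinates, just to show the flavor of ordering selected spray targets into a route.

```python
import math

def nearest_neighbor_route(start, targets):
    """Greedy route through the trees selected for spraying: repeatedly
    visit the nearest unvisited tree and accumulate the travelled distance."""
    route, current, remaining = [start], start, list(targets)
    total = 0.0
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        total += math.dist(current, nxt)
        route.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return route, total

trees = [(2, 8), (5, 3), (9, 9), (4, 7), (8, 2)]   # hypothetical tree coordinates (m)
route, length = nearest_neighbor_route((0, 0), trees)
print(route, f"{length:.1f} m")
```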

  4. Preliminary study of a millimeter wave FMCW InSAR for UAS indoor navigation.

    PubMed

    Scannapieco, Antonio F; Renga, Alfredo; Moccia, Antonio

    2015-01-22

    Small autonomous unmanned aerial systems (UAS) could be used for indoor inspection in emergency missions, such as damage assessment or the search for survivors in dangerous environments, e.g., power plants, underground railways, mines and industrial warehouses. Two basic functions are required to carry out these tasks, that is autonomous GPS-denied navigation with obstacle detection and high-resolution 3D mapping with moving target detection. State-of-the-art sensors for UAS are very sensitive to environmental conditions and often fail in the case of poor visibility caused by dust, fog, smoke, flames or other factors that are met as nominal mission scenarios when operating indoors. This paper is a preliminary study concerning an innovative radar sensor based on the interferometric Synthetic Aperture Radar (SAR) principle, which has the potential to satisfy stringent requirements set by indoor autonomous operation. An architectural solution based on a frequency-modulated continuous wave (FMCW) scheme is proposed after a detailed analysis of existing compact and lightweight SAR. A preliminary system design is obtained, and the main imaging peculiarities of the novel sensor are discussed, demonstrating that high-resolution, high-quality observation of an assigned control volume can be achieved.
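
    The FMCW ranging principle behind the proposed sensor reduces to simple relations between beat frequency, sweep bandwidth, and sweep time; the numbers below are assumed, not the system parameters derived in the paper.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def fmcw_range(beat_frequency_hz, sweep_bandwidth_hz, sweep_time_s):
    """Range from an FMCW beat frequency: R = c * f_b * T / (2 * B)."""
    return SPEED_OF_LIGHT * beat_frequency_hz * sweep_time_s / (2.0 * sweep_bandwidth_hz)

def fmcw_range_resolution(sweep_bandwidth_hz):
    """Ideal FMCW range resolution: dR = c / (2 * B)."""
    return SPEED_OF_LIGHT / (2.0 * sweep_bandwidth_hz)

B, T = 1.0e9, 1.0e-3        # 1 GHz sweep over 1 ms (assumed parameters)
print(f"range for a 200 kHz beat: {fmcw_range(200e3, B, T):.1f} m")
print(f"range resolution: {fmcw_range_resolution(B) * 100:.1f} cm")
```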

  5. Preliminary Study of a Millimeter Wave FMCW InSAR for UAS Indoor Navigation

    PubMed Central

    Scannapieco, Antonio F.; Renga, Alfredo; Moccia, Antonio

    2015-01-01

    Small autonomous unmanned aerial systems (UAS) could be used for indoor inspection in emergency missions, such as damage assessment or the search for survivors in dangerous environments, e.g., power plants, underground railways, mines and industrial warehouses. Two basic functions are required to carry out these tasks, that is autonomous GPS-denied navigation with obstacle detection and high-resolution 3D mapping with moving target detection. State-of-the-art sensors for UAS are very sensitive to environmental conditions and often fail in the case of poor visibility caused by dust, fog, smoke, flames or other factors that are met as nominal mission scenarios when operating indoors. This paper is a preliminary study concerning an innovative radar sensor based on the interferometric Synthetic Aperture Radar (SAR) principle, which has the potential to satisfy stringent requirements set by indoor autonomous operation. An architectural solution based on a frequency-modulated continuous wave (FMCW) scheme is proposed after a detailed analysis of existing compact and lightweight SAR. A preliminary system design is obtained, and the main imaging peculiarities of the novel sensor are discussed, demonstrating that high-resolution, high-quality observation of an assigned control volume can be achieved. PMID:25621606

  6. Real-time adaptive off-road vehicle navigation and terrain classification

    NASA Astrophysics Data System (ADS)

    Muller, Urs A.; Jackel, Lawrence D.; LeCun, Yann; Flepp, Beat

    2013-05-01

    We are developing a complete, self-contained autonomous navigation system for mobile robots that learns quickly, uses commodity components, and has the added benefit of emitting no radiation signature. It builds on the autonomous navigation technology developed by Net-Scale and New York University during the Defense Advanced Research Projects Agency (DARPA) Learning Applied to Ground Robots (LAGR) program and takes advantage of recent scientific advancements achieved during the DARPA Deep Learning program. In this paper we will present our approach and algorithms, show results from our vision system, discuss lessons learned from the past, and present our plans for further advancing vehicle autonomy.

  7. Coordinating teams of autonomous vehicles: an architectural perspective

    NASA Astrophysics Data System (ADS)

    Czichon, Cary; Peterson, Robert W.; Mettala, Erik G.; Vondrak, Ivo

    2005-05-01

    In defense-related robotics research, a mission level integration gap exists between mission tasks (tactical) performed by ground, sea, or air applications and elementary behaviors enacted by processing, communications, sensors, and weaponry resources (platform specific). The gap spans ensemble (heterogeneous team) behaviors, automatic MOE/MOP tracking, and tactical task modeling/simulation for virtual and mixed teams comprised of robotic and human combatants. This study surveys robotic system architectures, compares approaches for navigating problem/state spaces by autonomous systems, describes an architecture for an integrated, repository-based modeling, simulation, and execution environment, and outlines a multi-tiered scheme for robotic behavior components that is agent-based, platform-independent, and extendable via plug-ins. Tools for this integrated environment, along with a distributed agent framework for collaborative task performance are being developed by a U.S. Army funded SBIR project (RDECOM Contract N61339-04-C-0005).

  8. Reactive Sequencing for Autonomous Navigation Evolving from Phoenix Entry, Descent, and Landing

    NASA Technical Reports Server (NTRS)

    Grasso, Christopher A.; Riedel, Joseph E.; Vaughan, Andrew T.

    2010-01-01

    Virtual Machine Language (VML) is an award-winning advanced procedural sequencing language in use on NASA deep-space missions since 1997, and was used for the successful entry, descent, and landing (EDL) of the Phoenix spacecraft onto the surface of Mars. Phoenix EDL utilized a state-oriented operations architecture which executed within the constraints of the existing VML 2.0 flight capability, compatible with the linear "land or die" nature of the mission. The intricacies of Phoenix EDL included the planned discarding of portions of the vehicle, the complex communications management for relay through on-orbit assets, the presence of temporally indeterminate physical events, and the need to rapidly catch up four days of sequencing should a reboot of the spacecraft flight computer occur shortly before atmospheric entry. These formidable operational challenges led to new techniques for packaging and coordinating reusable sequences called blocks using one-way synchronization via VML sequencing global variable events. The coordinated blocks acted as an ensemble to land the spacecraft, while individually managing various elements in as simple a fashion as possible. This paper outlines prototype VML 2.1 flight capabilities that have evolved from the one-way synchronization techniques in order to implement even more ambitious autonomous mission capabilities. Target missions for these new capabilities include autonomous touch-and-go sampling of cometary and asteroidal bodies, lunar landing of robotic missions, and ultimately landing of crewed lunar vehicles. Close proximity guidance, navigation, and control operations, on-orbit rendezvous, and descent and landing events featured in these missions require elaborate abort capability, manifesting highly non-linear scenarios that are so complex as to overtax traditional sequencing, or even the sort of one-way coordinated sequencing used during EDL. Foreseeing advanced command and control needs for small body and lunar landing guidance, navigation and control scenarios, work began three years ago on substantial upgrades to VML that are now being exercised in scenarios for lunar landing and comet/asteroid rendezvous. The advanced state-based approach includes coordinated state transition machines with distributed decision-making logic. These state machines are not merely sequences - they are reactive logic constructs capable of autonomous decision making within a well-defined domain. Combined with the JPL's AutoNav software used on Deep Space 1 and Deep Impact, the system allows spacecraft to autonomously navigate to an unmapped surface, soft-contact, and either land or ascend. The state machine architecture enabled by VML 2.1 has successfully performed sampling missions and lunar descent missions in a simulated environment, and is progressing toward flight capability. The authors are also investigating using the VML 2.1 flight director architecture to perform autonomous activities like rendezvous with a passive hypothetical Mars sample return capsule. The approach being pursued is similar to the touch-and-go sampling state machines, with the added complications associated with the search for, physical capture of, and securing of a separate spacecraft. Complications include optically finding and tracking the Orbiting Sample Capsule (OSC), keeping the OSC illuminated, making orbital adjustments, and physically capturing the OSC. Other applications could include autonomous science collection and fault compensation.

  9. Integrity Analysis of Real-Time PPP Technique with IGS-RTS Service for Maritime Navigation

    NASA Astrophysics Data System (ADS)

    El-Diasty, M.

    2017-10-01

    Open sea and inland waterways are the most widely used mode for transporting goods worldwide. It is the International Maritime Organization (IMO) that defines the requirements for position-fixing equipment for a worldwide radio-navigation system, in terms of accuracy, integrity, continuity, availability, and coverage for the various phases of navigation. Satellite positioning systems can contribute to meeting these requirements, as well as optimize marine transportation. Marine navigation usually consists of three major phases identified as ocean/coastal/port approach/inland waterway, in-port navigation, and automatic docking, with alert limits ranging from 25 m to 0.25 m. GPS positioning is widely used for many applications and is currently recognized by IMO for future maritime navigation. With the advancement of autonomous GPS positioning techniques such as Precise Point Positioning (PPP) and with the advent of new real-time GNSS correction services such as the IGS Real-Time Service (RTS), it is necessary to investigate the integrity of the PPP-based positioning technique along with the IGS-RTS service in terms of availability and reliability for safe navigation in maritime applications. This paper monitors the integrity of an autonomous real-time PPP-based GPS positioning system using the IGS real-time service (RTS) for maritime applications that require a minimum availability of integrity of 99.8% to fulfil the IMO integrity standards. To examine the integrity of the real-time IGS-RTS PPP-based technique for maritime applications, kinematic data from a dual-frequency GPS receiver were collected onboard a vessel and processed with the real-time IGS-RTS PPP-based GPS positioning technique. It is shown that the availability of integrity of the real-time IGS-RTS PPP-based GPS solution is 100% for all navigation phases and therefore fulfils the IMO integrity standards (99.8% availability) immediately (after 1 second), after 2 minutes, and after 42 minutes of convergence time for ocean/coastal/port approach/inland waterway, in-port navigation, and automatic docking, respectively. Moreover, the misleading information is about 2% for all navigation phases, which is considered less safe but not immediately dangerous because the horizontal position error remains below the navigation alert limits.
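
    The availability-of-integrity figure quoted above can be illustrated by counting the epochs whose horizontal position error stays inside the alert limit of each navigation phase. The error series and the intermediate alert limit below are fabricated placeholders; only the 25 m and 0.25 m limits are mentioned in the abstract.

```python
import numpy as np

def integrity_availability(horizontal_errors_m, alert_limit_m):
    """Fraction of epochs whose horizontal position error stays within the
    alert limit; compare against the 99.8% IMO availability requirement."""
    errors = np.asarray(horizontal_errors_m)
    return float(np.mean(errors <= alert_limit_m))

rng = np.random.default_rng(5)
errors = np.abs(rng.normal(0, 3.0, 86_400))        # one day of 1 Hz PPP errors (toy data)
for phase, alert_limit in [("ocean/coastal", 25.0), ("in-port", 2.5), ("docking", 0.25)]:
    avail = integrity_availability(errors, alert_limit) * 100
    print(f"{phase:13s} alert limit {alert_limit:5.2f} m -> availability {avail:6.2f}%")
```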

  10. Navigation of an autonomous vehicle around an asteroid

    NASA Astrophysics Data System (ADS)

    Dionne, Karine

    Planetary exploration missions use spacecraft to acquire the scientific data that advance our knowledge of the solar system. Since the 1990s, these missions have targeted not only planets but also smaller celestial bodies such as asteroids. These bodies pose a particular challenge for navigation systems because their dynamic environment is complex. A space probe must react quickly to the gravitational perturbations present, otherwise its safety could be compromised. Since communication delays with Earth can often reach several tens of minutes, software enabling greater operational autonomy must be developed for this type of mission. This thesis presents an autonomous navigation system that determines the position and velocity of a satellite orbiting an asteroid. It is a three-degree-of-freedom adaptive extended Kalman filter. The proposed system relies on optical imaging to detect previously mapped "landmarks", which may be craters, boulders, or any physical feature discernible by the camera. The research focuses on the state-estimation techniques specific to autonomous navigation; suitable image-processing software is therefore assumed to exist. The main research contribution is the inclusion, at each estimation cycle, of a range measurement to improve navigation performance. An adaptive state estimator is required to process these measurements because their accuracy varies over time due to pointing error. Secondary contributions relate to an observability analysis of the system and a sensitivity analysis of six main design parameters. Simulation results show that adding one range measurement per update cycle leads to a significant improvement in navigation performance: it reduces the estimation error and the periods of non-observability, and counters the dilution of precision of the measurements. The sensitivity analyses confirm the contribution of range measurements to the overall reduction of the estimation error over a wide range of design parameters. They also indicate that mapping error is a critical parameter for the performance of the navigation system developed. Keywords: state estimation, adaptive Kalman filter, optical navigation, lidar, asteroid, numerical simulations

  11. Aspect-dependent radiated noise analysis of an underway autonomous underwater vehicle.

    PubMed

    Gebbie, John; Siderius, Martin; Allen, John S

    2012-11-01

    This paper presents an analysis of the acoustic emissions from an underway REMUS-100 autonomous underwater vehicle (AUV), obtained near Honolulu Harbor, HI, using a fixed, bottom-mounted horizontal line array (HLA). Spectral analysis, beamforming, and cross-correlation facilitate identification of independent sources of noise originating from the AUV. Fusion of navigational records from the AUV with acoustic data from the HLA allows for an aspect-dependent presentation of calculated source levels of the strongest propulsion tone.

  12. Autonomous Navigation, Dynamic Path and Work Flow Planning in Multi-Agent Robotic Swarms Project

    NASA Technical Reports Server (NTRS)

    Falker, John; Zeitlin, Nancy; Leucht, Kurt; Stolleis, Karl

    2015-01-01

    Kennedy Space Center has teamed up with the Biological Computation Lab at the University of New Mexico to create a swarm of small, low-cost, autonomous robots, called Swarmies, to be used as a ground-based research platform for in-situ resource utilization missions. The behavior of the robot swarm mimics the central-place foraging strategy of ants to find and collect resources in an unknown environment and return those resources to a central site.

  13. Integration for navigation on the UMASS mobile perception lab

    NASA Technical Reports Server (NTRS)

    Draper, Bruce; Fennema, Claude; Rochwerger, Benny; Riseman, Edward; Hanson, Allen

    1994-01-01

    Integration of real-time visual procedures for use on the Mobile Perception Lab (MPL) was presented. The MPL is an autonomous vehicle designed for testing visually guided behavior. Two critical areas of focus in the system design were data storage/exchange and process control. The Intermediate Symbolic Representation (ISR3) supported data storage and exchange, and the MPL script monitor provided process control. Resource allocation, inter-process communication, and real-time control are difficult problems which must be solved in order to construct strong autonomous systems.

  14. A Robust Mechanical Sensing System for Unmanned Sea Surface Vehicles

    NASA Technical Reports Server (NTRS)

    Kulczycki, Eric A.; Magnone, Lee J.; Huntsberger, Terrance; Aghazarian, Hrand; Padgett, Curtis W.; Trotz, David C.; Garrett, Michael S.

    2009-01-01

    The need for autonomous navigation and intelligent control of unmanned sea surface vehicles requires a mechanically robust sensing architecture that is watertight, durable, and insensitive to vibration and shock loading. The sensing system developed here comprises four black-and-white cameras and a single color camera. The cameras are rigidly mounted to a camera bar that can be reconfigured for mounting on multiple vehicles, and the cameras act as both navigational cameras and application cameras. The cameras are housed in watertight casings to protect them and their electronics from moisture and wave splashes. Two of the black-and-white cameras are positioned to provide lateral vision. They are angled away from the front of the vehicle at horizontal angles to provide ideal fields of view for mapping and autonomous navigation. The other two black-and-white cameras are positioned at an angle into the color camera's field of view to support vehicle applications. These two cameras provide an overlap, as well as a backup to the front camera. The color camera is positioned directly in the middle of the bar, aimed straight ahead. This system is applicable to any sea-going vehicle, both on Earth and in space.

  15. Flight Analysis of an Autonomously Navigated Experimental Lander for High Altitude Recovery

    NASA Technical Reports Server (NTRS)

    Chin, Jeffrey; Niehaus, Justin; Goodenow, Debra; Dunker, Storm; Montague, David

    2016-01-01

    First steps have been taken to qualify a family of parafoil systems capable of increasing the survivability and reusability of high-altitude balloon payloads. The research is motivated by the common risk facing balloon payloads where expensive flight hardware can often land in inaccessible areas that make them difficult or impossible to recover. The Autonomously Navigated Experimental Lander (ANGEL) flight test introduced a commercial Guided Parachute Aerial Delivery System (GPADS) to a previously untested environment at 108,000 ft MSL to determine its high-altitude survivability and capabilities. Following release, ANGEL descended under a drogue until approximately 25,000 ft, at which point the drogue was jettisoned and the main parachute was deployed, commencing navigation. Multiple data acquisition platforms were used to characterize the return-to-point technology performance and help determine its suitability for returning future scientific payloads ranging from 180 to 10,000 lbs to safer and more convenient landing locations. This report describes the test vehicle design, and summarizes the captured sensor data. Various post-flight analyses are used to quantify the system's performance, gondola load data, and serve as a reference point for subsequent missions.

  16. Flight Analysis of an Autonomously Navigated Experimental Lander

    NASA Technical Reports Server (NTRS)

    Chin, Jeffrey; Niehaus, Justin; Goodenow, Debra; Dunker, Storm; Montague, David

    2016-01-01

    First steps have been taken to qualify a family of parafoil systems capable of increasing the survivability and reusability of high-altitude balloon payloads. The research is motivated by the common risk facing balloon payloads where expensive flight hardware can often land in inaccessible areas that make them difficult or impossible to recover. The Autonomously Navigated Experimental Lander (ANGEL) flight test introduced a commercial Guided Parachute Aerial Delivery System (GPADS) to a previously untested environment at 108,000 feet Mean Sea Level (MSL) to determine its high-altitude survivability and capabilities. Following release, ANGEL descended under a drogue until approximately 25,000 feet, at which point the drogue was jettisoned and the main parachute was deployed, commencing navigation. Multiple data acquisition platforms were used to characterize the return-to-point technology performance and help determine its suitability for returning future scientific payloads ranging from 180 to 10,000 pounds to safer and more convenient landing locations. This report describes the test vehicle design, and summarizes the captured sensor data. Various post-flight analyses are used to quantify the system's performance, gondola load data, and serve as a reference point for subsequent missions.

  17. Under-vehicle autonomous inspection through undercarriage signatures

    NASA Astrophysics Data System (ADS)

    Schoenherr, Edward; Smuda, Bill

    2005-05-01

    Increased threats to gate security have caused recent need for improved vehicle inspection methods at security checkpoints in various fields of defense and security. A fast, reliable system of under-vehicle inspection that detects possibly harmful or unwanted materials hidden on vehicle undercarriages and notifies the user of the presence of these materials, while allowing the user a safe standoff distance from the inspection site, is desirable. An autonomous under-vehicle inspection system would provide for this. The proposed system would function as follows: a low-clearance tele-operated robotic platform would be equipped with sonar/laser range-finding sensors as well as a video camera. As a vehicle to be inspected enters a checkpoint, the robot would autonomously navigate under the vehicle, using algorithms to detect tire locations as waypoints. During this navigation, data would be collected from the sonar/laser range-finding hardware. This range data would be used to compile an impression of the vehicle undercarriage. Once this impression is complete, the system would compare it to a database of pre-scanned undercarriage impressions. Based on vehicle makes and models, any variance between the undercarriage being inspected and the impression compared against in the database would be marked as potentially threatening. If such variances exist, the robot would navigate to these locations and position the video camera so that the location in question can be viewed from a standoff position through a TV monitor. At this point, manual control of the robot navigation and camera can be taken to allow further, more detailed inspection of the area or materials in question. After-market vehicle modifications would present some difficulty, yet with enough pre-screening of such modifications, the system should still prove accurate. Also, impression scans taken in the field can be stored and tagged with a vehicle's license plate number, and future inspections of that vehicle can be compared to already screened and cleared impressions of the same vehicle in order to search for variance.
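
    A minimal sketch of the impression-comparison step described above, assuming the undercarriage scan has already been gridded into a clearance map; the grid size, threshold, and function names are illustrative, not the proposed system's parameters.

```python
# Illustrative sketch only (not the proposed system): compare a scanned
# undercarriage clearance map against a stored reference impression for the same
# make and model, and flag grid cells whose deviation exceeds a threshold.
import numpy as np

def flag_variances(scan_m, reference_m, threshold_m=0.05):
    """Return indices of grid cells where the scan deviates from the reference."""
    diff = np.abs(np.asarray(scan_m, float) - np.asarray(reference_m, float))
    return np.argwhere(diff > threshold_m)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    reference = rng.uniform(0.2, 0.6, size=(40, 20))              # stored impression (metres)
    scan = reference + rng.normal(0.0, 0.005, size=reference.shape)
    scan[25, 10] -= 0.15                                          # simulated hidden object
    print("cells to re-inspect:", flag_variances(scan, reference).tolist())
```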

  18. POSTMAN: Point of Sail Tacking for Maritime Autonomous Navigation

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance L.; Reinhart, Felix

    2012-01-01

    Waves apply significant forces to small boats, in particular when such vessels are moving at high speed in severe sea conditions. In addition, small high-speed boats run the risk of diving with the bow into the next wave crest during operations in the wavelengths and wave speeds that are typical for shallow water. In order to mitigate the issues of autonomous navigation in rough water, a hybrid controller called POSTMAN combines the concept of POS (point of sail) tack planning from the sailing domain with a standard PID (proportional-integral-derivative) controller that implements reliable target reaching for the motorized small-boat control task. This is an embedded, adaptive software controller that uses look-ahead sensing in a closed-loop method to perform path planning for safer navigation in rough waters. State-of-the-art controllers for small boats are based on complex models of the vessel's kinematics and dynamics. They enable the vessel to follow preplanned paths accurately and can theoretically control all of the small boat's six degrees of freedom. However, the problems of bow diving and other undesirable incidents are not addressed, and it is questionable whether a six-DOF controller with essentially a single actuator is possible at all. POSTMAN builds an adaptive capability into the controller based on sensed wave characteristics. This software will bring a much-needed capability to unmanned small boats moving at high speeds. Previously, this class of boat was limited to wave heights of less than one meter in the sea states in which it could operate. POSTMAN is a major advance in autonomous safety for small maritime craft.
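
    The PID half of the hybrid controller described above can be sketched as a simple heading regulator; the gains, time step, and commanded heading are illustrative assumptions and the point-of-sail tack planner is not reproduced here.

```python
# Minimal sketch of the PID half of a hybrid controller like the one described
# above; the point-of-sail tack planner is not reproduced, and the gains, time
# step, and commanded heading are assumptions rather than POSTMAN's tuning.
import math

class HeadingPID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, heading_cmd_rad, heading_meas_rad):
        """Return a rudder command computed from the wrapped heading error."""
        error = math.atan2(math.sin(heading_cmd_rad - heading_meas_rad),
                           math.cos(heading_cmd_rad - heading_meas_rad))
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

if __name__ == "__main__":
    pid = HeadingPID(kp=1.2, ki=0.05, kd=0.3, dt=0.1)
    print(pid.step(heading_cmd_rad=0.5, heading_meas_rad=0.0))   # one control step
```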

  19. The Deep Space Atomic Clock: Ushering in a New Paradigm for Radio Navigation and Science

    NASA Technical Reports Server (NTRS)

    Ely, Todd; Seubert, Jill; Prestage, John; Tjoelker, Robert

    2013-01-01

    The Deep Space Atomic Clock (DSAC) mission will demonstrate the on-orbit performance of a high-accuracy, high-stability miniaturized mercury ion atomic clock during a year-long experiment in Low Earth Orbit. DSAC's timing error requirement provides the frequency stability necessary to perform deep space navigation based solely on one-way radiometric tracking data. Compared to a two-way tracking paradigm, DSAC-enabled one-way tracking will benefit navigation and radio science by increasing the quantity and quality of tracking data. DSAC also enables fully autonomous onboard navigation, which is useful for time-sensitive situations. The technology behind the mercury ion atomic clock and a DSAC mission overview are presented. Example deep space applications of DSAC, including navigation of a Mars orbiter and Europa flyby gravity science, highlight the benefits of DSAC-enabled one-way Doppler tracking.

  20. A Spatial Cognitive Map and a Human-Like Memory Model Dedicated to Pedestrian Navigation in Virtual Urban Environments

    NASA Astrophysics Data System (ADS)

    Thomas, Romain; Donikian, Stéphane

    Many articles dealing with agent navigation in an urban environment involve the use of various heuristics. Among them, one is prevalent: the search for the shortest path between two points. This strategy impairs the realism of the resulting behaviour. Indeed, psychological studies state that such navigation behaviour is conditioned by the knowledge the subject has of its environment. Furthermore, the path a city dweller follows may be influenced by many factors, such as daily habits or path simplicity in terms of a minimum number of direction changes. It therefore appeared interesting to us to investigate how to mimic human navigation behaviour with an autonomous agent. The solution we propose relies on an architecture based on a generic model of an informed environment and on a spatial cognitive map model merged with a human-like memory model, which represents the temporal knowledge of the environment that the agent gains through its navigation experiences.

  1. Fast Kalman Filtering for Relative Spacecraft Position and Attitude Estimation for the Raven ISS Hosted Payload

    NASA Technical Reports Server (NTRS)

    Galante, Joseph M.; Van Eepoel, John; D'Souza, Chris; Patrick, Bryan

    2016-01-01

    The Raven ISS Hosted Payload will feature several pose measurement sensors on a pan/tilt gimbal which will be used to autonomously track resupply vehicles as they approach and depart the International Space Station. This paper discusses the derivation of a Relative Navigation Filter (RNF) to fuse measurements from the different pose measurement sensors to produce relative position and attitude estimates. The RNF relies on relative translation and orientation kinematics and careful pose sensor modeling to eliminate dependence on orbital position information and associated orbital dynamics models. The filter state is augmented with sensor biases to provide a mechanism for the filter to estimate and mitigate the offset between the measurements from different pose sensors.
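
    The bias-augmentation idea mentioned above can be illustrated with a toy linear update in which the state carries both the relative position and a per-sensor bias; the dimensions, noise values, and measurement model below are assumptions, not Raven's filter design.

```python
# Schematic of the state-augmentation idea described above, not Raven's filter:
# the relative state is extended with a per-sensor bias so the filter can absorb
# systematic offsets between pose sensors. Dimensions and noise values are illustrative.
import numpy as np

N_POSE = 6    # relative position (3) and attitude error (3)
N_BIAS = 3    # assumed translational bias for one pose sensor

x = np.zeros(N_POSE + N_BIAS)            # augmented state
P = np.eye(N_POSE + N_BIAS) * 1e-2       # augmented covariance

# Measurement model for a pose sensor whose reported relative position carries
# its own bias: z = relative position + sensor bias + noise.
H = np.hstack([np.eye(3), np.zeros((3, 3)), np.eye(3)])
R = np.eye(3) * 1e-4

def update(x, P, z):
    """Standard linear measurement update on the augmented state."""
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P

x, P = update(x, P, z=np.array([0.10, -0.02, 0.05]))
print("position estimate:", x[:3], " bias attributed to the sensor:", x[6:])
```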

  2. Fast Kalman Filtering for Relative Spacecraft Position and Attitude Estimation for the Raven ISS Hosted Payload

    NASA Technical Reports Server (NTRS)

    Galante, Joseph M.; Van Eepoel, John; D' Souza, Chris; Patrick, Bryan

    2016-01-01

    The Raven ISS Hosted Payload will feature several pose measurement sensors on a pan/tilt gimbal which will be used to autonomously track resupply vehicles as they approach and depart the International Space Station. This paper discusses the derivation of a Relative Navigation Filter (RNF) to fuse measurements from the different pose measurement sensors to produce relative position and attitude estimates. The RNF relies on relative translation and orientation kinematics and careful pose sensor modeling to eliminate dependence on orbital position information and associated orbital dynamics models. The filter state is augmented with sensor biases to provide a mechanism for the filter to estimate and mitigate the offset between the measurements from different pose sensors.

  3. Canoe: An Autonomous Infrastructure-Free Indoor Navigation System.

    PubMed

    Dong, Kai; Wu, Wenjia; Ye, Haibo; Yang, Ming; Ling, Zhen; Yu, Wei

    2017-04-30

    The development of the Internet of Things (IoT) has accelerated research in indoor navigation systems, a majority of which rely on adequate wireless signals and sources. Nonetheless, deploying such a system requires periodic site-survey, which is time consuming and labor intensive. To address this issue, in this paper we present Canoe, an indoor navigation system that considers shopping mall scenarios. In our system, we do not assume any prior knowledge, such as floor-plan or the shop locations, access point placement or power settings, historical RSS measurements or fingerprints, etc. Instead, Canoe requires only that the shop owners collect and publish RSS values at the entrances of their shops and can direct a consumer to any of these shops by comparing the observed RSS values. The locations of the consumers and the shops are estimated using maximum likelihood estimation. In doing this, the direction of the target shop relative to the current orientation of the consumer can be precisely computed, such that the direction that a consumer should move can be determined. We have conducted extensive simulations using a real-world dataset. Our experiments in a real shopping mall demonstrate that if 50% of the shops publish their RSS values, Canoe can precisely navigate a consumer within 30 s, with an error rate below 9%.

  4. Canoe: An Autonomous Infrastructure-Free Indoor Navigation System

    PubMed Central

    Dong, Kai; Wu, Wenjia; Ye, Haibo; Yang, Ming; Ling, Zhen; Yu, Wei

    2017-01-01

    The development of the Internet of Things (IoT) has accelerated research in indoor navigation systems, a majority of which rely on adequate wireless signals and sources. Nonetheless, deploying such a system requires periodic site-survey, which is time consuming and labor intensive. To address this issue, in this paper we present Canoe, an indoor navigation system that considers shopping mall scenarios. In our system, we do not assume any prior knowledge, such as floor-plan or the shop locations, access point placement or power settings, historical RSS measurements or fingerprints, etc. Instead, Canoe requires only that the shop owners collect and publish RSS values at the entrances of their shops and can direct a consumer to any of these shops by comparing the observed RSS values. The locations of the consumers and the shops are estimated using maximum likelihood estimation. In doing this, the direction of the target shop relative to the current orientation of the consumer can be precisely computed, such that the direction that a consumer should move can be determined. We have conducted extensive simulations using a real-world dataset. Our experiments in a real shopping mall demonstrate that if 50% of the shops publish their RSS values, Canoe can precisely navigate a consumer within 30 s, with an error rate below 9%. PMID:28468291
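
    A hedged sketch of the kind of likelihood-based position estimate described in these two records, using an assumed log-distance path-loss model and a grid search; the model, parameters, and function names are not taken from the Canoe paper.

```python
# Hedged illustration of a likelihood-based position estimate of the kind described
# above, not the Canoe algorithm: an assumed log-distance path-loss model predicts
# RSS at candidate positions, and the best-scoring candidate is returned.
import numpy as np

def rss_model(pos, ap_pos, tx_dbm=-30.0, path_loss_exp=2.5):
    """Assumed log-distance path-loss model (not taken from the paper)."""
    d = max(np.linalg.norm(np.asarray(pos) - np.asarray(ap_pos)), 0.5)
    return tx_dbm - 10.0 * path_loss_exp * np.log10(d)

def locate(observed_rss, ap_positions, candidates, sigma_db=4.0):
    """Return the candidate position with the highest Gaussian RSS likelihood."""
    best, best_ll = None, -np.inf
    for cand in candidates:
        predicted = np.array([rss_model(cand, ap) for ap in ap_positions])
        ll = -np.sum((observed_rss - predicted) ** 2) / (2.0 * sigma_db**2)
        if ll > best_ll:
            best, best_ll = cand, ll
    return best

if __name__ == "__main__":
    aps = [(0.0, 0.0), (20.0, 0.0), (0.0, 20.0)]           # assumed access point positions
    truth = (12.0, 7.0)
    observed = np.array([rss_model(truth, ap) for ap in aps])
    grid = [(x, y) for x in range(21) for y in range(21)]  # 1 m candidate grid
    print("estimated position:", locate(observed, aps, grid))
```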

  5. Autonomous integrated GPS/INS navigation experiment for OMV. Phase 1: Feasibility study

    NASA Technical Reports Server (NTRS)

    Upadhyay, Triveni N.; Priovolos, George J.; Rhodehamel, Harley

    1990-01-01

    The phase 1 research focused on the experiment definition. A tightly integrated Global Positioning System/Inertial Navigation System (GPS/INS) navigation filter design was analyzed and was shown, via detailed computer simulation, to provide precise position, velocity, and attitude (alignment) data to support navigation and attitude control requirements of future NASA missions. The integrated filter was also shown to provide the opportunity to calibrate inertial instrument errors, which is particularly useful in reducing INS error growth during GPS outages. While the Orbital Maneuvering Vehicle (OMV) provides a good target platform for demonstration and for possible flight implementation to provide improved capability, a successful proof-of-concept ground demonstration can be obtained using simulated mission scenario data from any vehicle, such as the Space Transfer Vehicle, Shuttle-C, or Space Station.
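
    The benefit described above, using GPS updates to calibrate inertial instrument errors so that INS drift grows more slowly during outages, can be illustrated with a deliberately simplified one-dimensional error-state filter; the state layout, noise values, and measurement are assumptions, not the OMV experiment design.

```python
# Deliberately simplified one-dimensional sketch of the error-state idea behind a
# tightly integrated GPS/INS filter: GPS updates both the navigation errors and an
# accelerometer-bias estimate, so the inertial solution drifts more slowly during
# GPS outages. Illustrative only; not the OMV experiment's filter design.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt, 0.5 * dt**2],      # position error
              [0.0, 1.0, dt],              # velocity error
              [0.0, 0.0, 1.0]])            # accelerometer bias (modeled as constant)
Q = np.diag([1e-4, 1e-3, 1e-8])
H = np.array([[1.0, 0.0, 0.0]])            # GPS observes the position error
R = np.array([[25.0]])                     # assumed 5 m GPS noise

x = np.zeros(3)
P = np.diag([100.0, 1.0, 1e-2])

def propagate(x, P):
    return F @ x, F @ P @ F.T + Q

def gps_update(x, P, z):
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + (K @ y).flatten(), (np.eye(3) - K @ H) @ P

for _ in range(60):                        # one minute with GPS available
    x, P = propagate(x, P)
    x, P = gps_update(x, P, z=np.array([0.5]))
print("estimated states [pos err, vel err, accel bias]:", x)
```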

  6. Characteristic changes in the physiological components of cybersickness.

    PubMed

    Kim, Young Youn; Kim, Hyun Ju; Kim, Eun Nam; Ko, Hee Dong; Kim, Hyun Taek

    2005-09-01

    We investigated the characteristic changes in the physiology of cybersickness when subjects were exposed to virtual reality. Sixty-one participants experienced a virtual navigation for a total of 9.5 min and were required to detect specific virtual objects. Three questionnaires for sickness susceptibility and immersive tendency were administered before the navigation. Sixteen electrophysiological signals were recorded before, during, and after the navigation. The severity of cybersickness experienced by participants was reported with a simulator sickness questionnaire after the navigation. The total severity of cybersickness had a significant positive correlation with gastric tachyarrhythmia, eyeblink rate, heart period, and EEG delta wave, and a negative correlation with EEG beta wave. These results suggest that cybersickness is accompanied by changes in the activity of the central and autonomic nervous systems.

  7. 78 FR 23226 - 36(b)(1) Arms Sales Notification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-18

    ..., Communication, Computer and Intelligence/Communication, Navigational and Identification (C4I/CNI); Autonomic.../ integration, aircraft ferry and tanker support, support equipment, tools and test equipment, communication... aircraft equipment includes: Electronic Warfare Systems; Command, Control, Communication, Computer and...

  8. Flight Testing a Real-Time Hazard Detection System for Safe Lunar Landing on the Rocket-Powered Morpheus Vehicle

    NASA Technical Reports Server (NTRS)

    Trawny, Nikolas; Huertas, Andres; Luna, Michael E.; Villalpando, Carlos Y.; Martin, Keith E.; Carson, John M.; Johnson, Andrew E.; Restrepo, Carolina; Roback, Vincent E.

    2015-01-01

    The Hazard Detection System (HDS) is a component of the ALHAT (Autonomous Landing and Hazard Avoidance Technology) sensor suite, which together provide a lander Guidance, Navigation and Control (GN&C) system with the relevant measurements necessary to enable safe precision landing under any lighting conditions. The HDS consists of a stand-alone compute element (CE), an Inertial Measurement Unit (IMU), and a gimbaled flash LIDAR sensor that are used, in real-time, to generate a Digital Elevation Map (DEM) of the landing terrain, detect candidate safe landing sites for the vehicle through Hazard Detection (HD), and generate hazard-relative navigation (HRN) measurements used for safe precision landing. Following an extensive ground and helicopter test campaign, ALHAT was integrated onto the Morpheus rocket-powered terrestrial test vehicle in March 2014. Morpheus and ALHAT then performed five successful free flights at the simulated lunar hazard field constructed at the Shuttle Landing Facility (SLF) at Kennedy Space Center, for the first time testing the full system on a lunar-like approach geometry in a relevant dynamic environment. During these flights, the HDS successfully generated DEMs, correctly identified safe landing sites and provided HRN measurements to the vehicle, marking the first autonomous landing of a NASA rocket-powered vehicle in hazardous terrain. This paper provides a brief overview of the HDS architecture and describes its in-flight performance.
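
    A conceptual stand-in for the DEM-based hazard screening described above: fit a local plane in a sliding window, threshold slope and roughness, and mark the remaining cells as candidate safe sites. The window size and thresholds are assumptions, not ALHAT parameters.

```python
# Conceptual sketch of DEM-based hazard screening (slope and roughness thresholds),
# standing in for the HDS processing chain described above; thresholds and the
# plane-fit window are assumptions, not ALHAT parameters.
import numpy as np

def safe_sites(dem, cell_m, max_slope_deg=10.0, max_rough_m=0.3, win=5):
    """Return a boolean map of cells whose local slope and roughness are acceptable."""
    h, w = dem.shape
    safe = np.zeros((h, w), dtype=bool)
    r = win // 2
    for i in range(r, h - r):
        for j in range(r, w - r):
            patch = dem[i - r:i + r + 1, j - r:j + r + 1]
            ys, xs = np.mgrid[0:win, 0:win] * cell_m
            A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(win * win)])
            coef, *_ = np.linalg.lstsq(A, patch.ravel(), rcond=None)   # local plane fit
            slope = np.degrees(np.arctan(np.hypot(coef[0], coef[1])))
            rough = np.max(np.abs(patch.ravel() - A @ coef))           # residual height
            safe[i, j] = slope <= max_slope_deg and rough <= max_rough_m
    return safe

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    dem = rng.normal(0.0, 0.02, size=(60, 60))
    dem[20:30, 20:30] += 1.0                    # synthetic boulder
    print("fraction of safe cells:", safe_sites(dem, cell_m=0.1).mean())
```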

  9. Analysis of the Pointing Accuracy of a 6U CubeSat Mission for Proximity Operations and Resident Space Object Imaging

    DTIC Science & Technology

    2013-05-29

    ... and visual navigation to maneuver autonomously to reduce the size of the... successful orbit and three-dimensional imaging of an RSO, using passive visual-only navigation and real-time near-optimal guidance. The mission design... Kit (STK) in the Earth-centered Earth-fixed (ECF) coordinate system, loaded to Simulink and transformed to the BFF for calculation of the SRP

  10. Real-Time and High-Fidelity Simulation Environment for Autonomous Ground Vehicle Dynamics

    NASA Technical Reports Server (NTRS)

    Cameron, Jonathan; Myint, Steven; Kuo, Calvin; Jain, Abhi; Grip, Havard; Jayakumar, Paramsothy; Overholt, Jim

    2013-01-01

    This paper reports on a collaborative project between U.S. Army TARDEC and the Jet Propulsion Laboratory (JPL) to develop an unmanned ground vehicle (UGV) simulation model using the ROAMS vehicle modeling framework. Besides modeling the physical suspension of the vehicle, the sensing and navigation of the HMMWV vehicle are simulated. Using models of urban and off-road environments, the HMMWV simulation was tested in several ways, including navigation in an urban environment with obstacle avoidance and the performance of a lane change maneuver.

  11. Teaching Young Adults with Intellectual and Developmental Disabilities Community-Based Navigation Skills to Take Public Transportation.

    PubMed

    Price, Richard; Marsh, Abbie J; Fisher, Marisa H

    2018-03-01

    Facilitating the use of public transportation enhances opportunities for independent living and competitive, community-based employment for individuals with intellectual and developmental disabilities (IDD). Four young adults with IDD were taught through total-task chaining to use the Google Maps application, a self-prompting, visual navigation system, to take the bus to locations around a college campus and the community. Three of four participants learned to use Google Maps to independently navigate public transportation. Google Maps may be helpful in supporting independent travel, highlighting the importance of future research in teaching navigation skills. Learning to independently use public transportation increases access to autonomous activities, such as opportunities to work and to attend postsecondary education programs on large college campuses. Individuals with IDD can be taught through chaining procedures to use the Google Maps application to navigate public transportation. Mobile map applications are an effective and functional modern tool that can be used to teach community navigation.

  12. Precision analysis of autonomous orbit determination using star sensor for Beidou MEO satellite

    NASA Astrophysics Data System (ADS)

    Shang, Lin; Chang, Jiachao; Zhang, Jun; Li, Guotong

    2018-04-01

    This paper focuses on the autonomous orbit determination accuracy of Beidou MEO satellite using the onboard observations of the star sensors and infrared horizon sensor. A polynomial fitting method is proposed to calibrate the periodic error in the observation of the infrared horizon sensor, which will greatly influence the accuracy of autonomous orbit determination. Test results show that the periodic error can be eliminated using the polynomial fitting method. The User Range Error (URE) of Beidou MEO satellite is less than 2 km using the observations of the star sensors and infrared horizon sensor for autonomous orbit determination. The error of the Right Ascension of Ascending Node (RAAN) is less than 60 μrad and the observations of star sensors can be used as a spatial basis for Beidou MEO navigation constellation.
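
    The calibration idea described above, fitting and removing a slowly varying error from the horizon-sensor observations, can be sketched with an ordinary least-squares polynomial fit; the polynomial order, time base, and error model below are assumptions for illustration.

```python
# Hedged sketch of the calibration idea described above: fit a low-order polynomial
# to the slowly varying error in a horizon-sensor observation series and subtract it.
# The order, time base, and error model are assumptions, not the paper's settings.
import numpy as np

def calibrate(t, obs, order=4):
    """Fit a low-order polynomial to the observation series and subtract it."""
    coeffs = np.polyfit(t, obs, order)
    return obs - np.polyval(coeffs, t)

if __name__ == "__main__":
    t = np.linspace(0.0, 6.0, 2000)                    # one ~6 h observation arc (hours)
    slow_error = 0.002 * np.sin(2 * np.pi * t / 6.0)   # synthetic slowly varying error
    obs = slow_error + np.random.default_rng(3).normal(0.0, 1e-4, t.size)
    residual = calibrate(t, obs)
    print("RMS before / after:", np.sqrt(np.mean(obs**2)), np.sqrt(np.mean(residual**2)))
```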

  13. A Super-Resolution Algorithm for Enhancement of FLASH LIDAR Data: Flight Test Results

    NASA Technical Reports Server (NTRS)

    Bulyshev, Alexander; Amzajerdian, Farzin; Roback, Eric; Reisse, Robert

    2014-01-01

    This paper describes the results of a 3D super-resolution algorithm applied to the range data obtained from a recent Flash Lidar helicopter flight test. The flight test was conducted by NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) project over a simulated lunar terrain facility at NASA Kennedy Space Center. ALHAT is developing the technology for safe autonomous landing on the surface of celestial bodies: Moon, Mars, asteroids. One of the test objectives was to verify the ability of the 3D super-resolution technique to generate high-resolution digital elevation models (DEMs) and to determine time-resolved relative positions and orientations of the vehicle. The 3D super-resolution algorithm was developed earlier and tested in computational modeling, laboratory experiments, and a few dynamic experiments using a moving truck. Prior to the helicopter flight test campaign, a 100 m x 100 m hazard field was constructed containing most of the relevant extraterrestrial hazards: slopes, rocks, and craters of different sizes. Data were collected during the flight and then processed by the super-resolution code. A detailed DEM of the hazard field was constructed using independent measurements to be used for comparison. ALHAT navigation system data were used to verify the ability of the super-resolution method to provide accurate relative navigation information; namely, the 6-degree-of-freedom state vector of the instrument as a function of time was reconstructed from the super-resolution data. The results of the comparisons show that the super-resolution method can construct high-quality DEMs and allows for identifying hazards such as rocks and craters in accordance with ALHAT requirements.

  14. A super-resolution algorithm for enhancement of flash lidar data: flight test results

    NASA Astrophysics Data System (ADS)

    Bulyshev, Alexander; Amzajerdian, Farzin; Roback, Eric; Reisse, Robert

    2013-03-01

    This paper describes the results of a 3D super-resolution algorithm applied to the range data obtained from a recent Flash Lidar helicopter flight test. The flight test was conducted by NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) project over a simulated lunar terrain facility at NASA Kennedy Space Center. ALHAT is developing the technology for safe autonomous landing on the surface of celestial bodies: Moon, Mars, asteroids. One of the test objectives was to verify the ability of the 3D super-resolution technique to generate high-resolution digital elevation models (DEMs) and to determine time-resolved relative positions and orientations of the vehicle. The 3D super-resolution algorithm was developed earlier and tested in computational modeling, laboratory experiments, and a few dynamic experiments using a moving truck. Prior to the helicopter flight test campaign, a 100 m x 100 m hazard field was constructed containing most of the relevant extraterrestrial hazards: slopes, rocks, and craters of different sizes. Data were collected during the flight and then processed by the super-resolution code. A detailed DEM of the hazard field was constructed using independent measurements to be used for comparison. ALHAT navigation system data were used to verify the ability of the super-resolution method to provide accurate relative navigation information; namely, the 6-degree-of-freedom state vector of the instrument as a function of time was reconstructed from the super-resolution data. The results of the comparisons show that the super-resolution method can construct high-quality DEMs and allows for identifying hazards such as rocks and craters in accordance with ALHAT requirements.

  15. Proceedings of the 1989 CESAR/CEA (Center for Engineering Systems Advanced Research/Commissariat a l'Energie Atomique) workshop on autonomous mobile robots (May 30--June 1, 1989)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harber, K.S.; Pin, F.G.

    1990-03-01

    The US DOE Center for Engineering Systems Advanced Research (CESAR) at the Oak Ridge National Laboratory (ORNL) and the Commissariat a l'Energie Atomique's (CEA) Office de Robotique et Productique within the Directorat a la Valorization are working toward a long-term cooperative agreement and relationship in the area of Intelligent Systems Research (ISR). This report presents the proceedings of the first CESAR/CEA Workshop on Autonomous Mobile Robots which took place at ORNL on May 30, 31 and June 1, 1989. The purpose of the workshop was to present and discuss methodologies and algorithms under development at the two facilities in the area of perception and navigation for autonomous mobile robots in unstructured environments. Experimental demonstration of the algorithms and comparison of some of their features were proposed to take place within the framework of a previously mutually agreed-upon demonstration scenario or "base-case." The base-case scenario, described in detail in Appendix A, involved autonomous navigation by the robot in an a priori unknown environment with dynamic obstacles, in order to reach a predetermined goal. From the intermediate goal location, the robot had to search for and locate a control panel, move toward it, and dock in front of the panel face. The CESAR demonstration was successfully accomplished using the HERMIES-IIB robot, while subsets of the CEA demonstration performed using the ARES robot simulation and animation system were presented. The first session of the workshop focused on these experimental demonstrations and on the needs and considerations for establishing "benchmarks" for testing autonomous robot control algorithms.

  16. Raven: An On-Orbit Relative Navigation Demonstration Using International Space Station Visiting Vehicles

    NASA Technical Reports Server (NTRS)

    Strube, Matthew; Henry, Ross; Skeleton, Eugene; Eepoel, John Van; Gill, Nat; McKenna, Reed

    2015-01-01

    Since the last Hubble Servicing Mission five years ago, the Satellite Servicing Capabilities Office (SSCO) at the NASA Goddard Space Flight Center (GSFC) has been focusing on maturing the technologies necessary to robotically service orbiting legacy assets-spacecraft not necessarily designed for in-flight service. Raven, SSCO's next orbital experiment to the International Space Station (ISS), is a real-time autonomous non-cooperative relative navigation system that will mature the estimation algorithms required for rendezvous and proximity operations for a satellite-servicing mission. Raven will fly as a hosted payload as part of the Space Test Program's STP-H5 mission, which will be mounted on an external ExPRESS Logistics Carrier (ELC) and will image the many visiting vehicles arriving and departing from the ISS as targets for observation. Raven will host multiple sensors: a visible camera with a variable field of view lens, a long-wave infrared camera, and a short-wave flash lidar. This sensor suite can be pointed via a two-axis gimbal to provide a wide field of regard to track the visiting vehicles as they make their approach. Various real-time vision processing algorithms will produce range, bearing, and six degree of freedom pose measurements that will be processed in a relative navigation filter to produce an optimal relative state estimate. In this overview paper, we will cover top-level requirements, experimental concept of operations, system design, and the status of Raven integration and test activities.

  17. Autonomous sensor-transponder RFID with supply energy conditioning for object navigation systems

    NASA Astrophysics Data System (ADS)

    Skoczylas, M.; Kamuda, K.; Jankowski-Mihułowicz, P.; Kalita, W.; Weglarski, Mariusz

    2014-08-01

    The properties of energy conditioning electrical circuits developed for powering additional functional blocks of autonomous RFID transponders working in the HF band have been analyzed and presented in the paper. The concept of autonomy is realized by implementing extra functions in the typical transponder. First of all, the autonomous system should harvest energy, e.g. from the electromagnetic field of read/write devices, but it should also be able to gather information about the environment, e.g. by measuring different kinds of physical quantities. In such an electrical device, the crucial problem is energy conditioning, because the output voltage-current characteristic of the front-end (antenna with matching and harvesting circuit), as well as the total and instantaneous power load generated by internal circuits, depend strongly not only on the realized function but also on the energy and communication conditions in the RFID interface. A properly designed solution should improve harvesting efficiency, the current leakage of the supply storage, and the matching between the antenna and input circuits, in order to save energy and increase operating time in such a battery-free system. The authors present methods to increase the autonomous operation time even with advanced measuring algorithms. A measuring system with a wide spectrum of sensors dedicated to different quantities (physical, chemical, etc.) has also been presented. The results of model calculations and experimental verifications are discussed on the basis of investigations conducted in a unique laboratory stand for object navigation systems.

  18. Fully Self-Contained Vision-Aided Navigation and Landing of a Micro Air Vehicle Independent from External Sensor Inputs

    NASA Technical Reports Server (NTRS)

    Brockers, Roland; Susca, Sara; Zhu, David; Matthies, Larry

    2012-01-01

    Direct-lift micro air vehicles have important applications in reconnaissance. In order to conduct persistent surveillance in urban environments, it is essential that these systems can perform autonomous landing maneuvers on elevated surfaces that provide high vantage points without the help of any external sensor and with a fully contained on-board software solution. In this paper, we present a micro air vehicle that uses vision feedback from a single down looking camera to navigate autonomously and detect an elevated landing platform as a surrogate for a roof top. Our method requires no special preparation (labels or markers) of the landing location. Rather, leveraging the planar character of urban structure, the landing platform detection system uses a planar homography decomposition to detect landing targets and produce approach waypoints for autonomous landing. The vehicle control algorithm uses a Kalman filter based approach for pose estimation to fuse visual SLAM (PTAM) position estimates with IMU data to correct for high latency SLAM inputs and to increase the position estimate update rate in order to improve control stability. Scale recovery is achieved using inputs from a sonar altimeter. In experimental runs, we demonstrate a real-time implementation running on-board a micro aerial vehicle that is fully self-contained and independent from any external sensor information. With this method, the vehicle is able to search autonomously for a landing location and perform precision landing maneuvers on the detected targets.

  19. 2014 NASA Centennial Challenges Sample Return Robot Challenge

    NASA Image and Video Library

    2014-06-11

    Team KuuKulgur waits to begin the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)

  20. Quantifying navigational information: The catchment volumes of panoramic snapshots in outdoor scenes.

    PubMed

    Murray, Trevor; Zeil, Jochen

    2017-01-01

    Panoramic views of natural environments provide visually navigating animals with two kinds of information: they define locations, because image differences increase smoothly with distance from a reference location, and they provide compass information, because image differences increase smoothly with rotation away from a reference orientation. The range over which a given reference image can provide navigational guidance (its 'catchment area') has to date been quantified from the perspective of walking animals by determining how image differences develop across the ground plane of natural habitats. However, to understand the information available to flying animals there is a need to characterize the 'catchment volumes' within which panoramic snapshots can provide navigational guidance. We used recently developed camera-based methods for constructing 3D models of natural environments and rendered panoramic views at defined locations within these models with the aim of mapping navigational information in three dimensions. We find that in relatively open woodland habitats, catchment volumes are surprisingly large, extending for metres depending on the sensitivity of the viewer to image differences. The size and the shape of catchment volumes depend on the distance of visual features in the environment. Catchment volumes are smaller for reference images close to the ground and become larger for reference images at some distance from the ground and in more open environments. Interestingly, catchment volumes become smaller when only above-horizon views are used and also when views include a 1 km distant panorama. We discuss the current limitations of mapping navigational information in natural environments and the relevance of our findings for our understanding of visual navigation in animals and autonomous robots.

  1. Quantifying navigational information: The catchment volumes of panoramic snapshots in outdoor scenes

    PubMed Central

    Zeil, Jochen

    2017-01-01

    Panoramic views of natural environments provide visually navigating animals with two kinds of information: they define locations, because image differences increase smoothly with distance from a reference location, and they provide compass information, because image differences increase smoothly with rotation away from a reference orientation. The range over which a given reference image can provide navigational guidance (its ‘catchment area’) has to date been quantified from the perspective of walking animals by determining how image differences develop across the ground plane of natural habitats. However, to understand the information available to flying animals there is a need to characterize the ‘catchment volumes’ within which panoramic snapshots can provide navigational guidance. We used recently developed camera-based methods for constructing 3D models of natural environments and rendered panoramic views at defined locations within these models with the aim of mapping navigational information in three dimensions. We find that in relatively open woodland habitats, catchment volumes are surprisingly large, extending for metres depending on the sensitivity of the viewer to image differences. The size and the shape of catchment volumes depend on the distance of visual features in the environment. Catchment volumes are smaller for reference images close to the ground and become larger for reference images at some distance from the ground and in more open environments. Interestingly, catchment volumes become smaller when only above-horizon views are used and also when views include a 1 km distant panorama. We discuss the current limitations of mapping navigational information in natural environments and the relevance of our findings for our understanding of visual navigation in animals and autonomous robots. PMID:29088300
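
    The rotational image difference function that underlies this kind of snapshot comparison can be computed very simply for a cylindrical panorama, as in the sketch below; the toy panorama and its resolution are assumptions, and the code does not reproduce the authors' 3D-model rendering pipeline.

```python
# Illustrative computation of a rotational image difference function for a
# cylindrical panorama, not the authors' pipeline: compare a reference panorama
# with an azimuth-shifted view and report the pixel RMS difference per rotation.
import numpy as np

def rotational_idf(reference, test):
    """RMS image difference for every azimuth shift of a cylindrical panorama."""
    cols = reference.shape[1]
    return np.array([np.sqrt(np.mean((np.roll(test, s, axis=1) - reference) ** 2))
                     for s in range(cols)])

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    pano = rng.random((32, 180))                            # toy panorama: 2 deg per column
    idf = rotational_idf(pano, np.roll(pano, 45, axis=1))   # same view rotated by 90 deg
    print("best-matching column shift:", int(np.argmin(idf)))
```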

  2. In-motion initial alignment and positioning with INS/CNS/ODO integrated navigation system for lunar rovers

    NASA Astrophysics Data System (ADS)

    Lu, Jiazhen; Lei, Chaohua; Yang, Yanqiang; Liu, Ming

    2017-06-01

    Many countries have been paying great attention to space exploration, especially of the Moon and Mars. Autonomous, high-accuracy navigation systems are needed for probes and rovers to accomplish their missions. Inertial navigation system (INS)/celestial navigation system (CNS) based navigation systems have been used widely on lunar rovers. Initialization is a particularly important step for navigation. This paper presents an in-motion alignment and positioning method for lunar rovers using INS/CNS/odometer integrated navigation. The method can estimate not only the position and attitude errors, but also the biases of the accelerometers and gyros, using a standard Kalman filter. The differences between the platform star azimuth and elevation angles and the computed star azimuth and elevation angles, together with the difference between the velocity measured by the odometer and the velocity measured by the inertial sensors, are taken as measurements. Semi-physical experiments demonstrate that the position error can be reduced to 10 m and the attitude error to within 2″ within 5 min. The experiment results show that this is an effective and attractive initialization approach for lunar rovers.

  3. MSR Fetch Rover Capability Development at the Canadian Space Agency

    NASA Astrophysics Data System (ADS)

    Picard, M.; Hipkin, V.; Gingras, D.; Allard, P.; Lamarche, T.; Rocheleau, S. G.; Gemme, S.

    2018-04-01

    Describes Fetch Rover technology testing during CSA's 2016 Mars Sample Return Analogue Deployment which demonstrated autonomous navigation to 'cache depots' of M-2020-like sample tubes, acquisition of six such tubes, and transfer to a MAV mock up.

  4. Semi-autonomous parking for enhanced safety and efficiency.

    DOT National Transportation Integrated Search

    2017-06-01

    This project focuses on the use of tools from a combination of computer vision and localization based navigation schemes to aid the process of efficient and safe parking of vehicles in high density parking spaces. The principles of collision avoidanc...

  5. Autonomous terrain characterization and modelling for dynamic control of unmanned vehicles

    NASA Technical Reports Server (NTRS)

    Talukder, A.; Manduchi, R.; Castano, R.; Owens, K.; Matthies, L.; Castano, A.; Hogg, R.

    2002-01-01

    This end-to-end obstacle negotiation system is envisioned to be useful in optimized path planning and vehicle navigation in terrain conditions cluttered with vegetation, bushes, rocks, etc. Results on natural terrain with various natural materials are presented.

  6. GPS/GLONASS RAIM augmentation to WAAS for CAT 1 precision approach

    DOT National Transportation Integrated Search

    1997-06-30

    This paper deals with the potential use of Receiver Autonomous Integrity Monitoring (RAIM) to supplement the FAA's Wide Area Augmentation System (WAAS). Integrity refers to the capability of a navigation or landing system to provide a timely warning...

  7. Interaction dynamics of multiple mobile robots with simple navigation strategies

    NASA Technical Reports Server (NTRS)

    Wang, P. K. C.

    1989-01-01

    The global dynamic behavior of multiple interacting autonomous mobile robots with simple navigation strategies is studied. Here, the effective spatial domain of each robot is taken to be a closed ball about its mass center. It is assumed that each robot has a specified cone of visibility such that interaction with other robots takes place only when they enter its visibility cone. Based on a particle model for the robots, various simple homing and collision-avoidance navigation strategies are derived. Then, an analysis of the dynamical behavior of the interacting robots in unbounded spatial domains is made. The article concludes with the results of computer simulation studies of two or more interacting robots.
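
    A toy particle-model simulation in the spirit of the strategies analyzed above, combining simple homing with collision avoidance restricted to a visibility cone; the gains, cone angle, and dynamics are assumptions for illustration only.

```python
# Toy particle-model simulation in the spirit of the strategies analyzed above:
# each robot heads toward its home position and veers away from neighbours that
# fall inside its visibility cone. Gains, cone angle, and dynamics are assumptions.
import numpy as np

def step(pos, goal, dt=0.1, speed=1.0, cone_deg=60.0, avoid_radius=2.0, k_avoid=2.0):
    """Advance every robot by one time step."""
    new = pos.copy()
    for i in range(len(pos)):
        heading = goal[i] - pos[i]
        heading = heading / (np.linalg.norm(heading) + 1e-9)
        for j in range(len(pos)):
            if i == j:
                continue
            rel = pos[j] - pos[i]
            dist = np.linalg.norm(rel)
            angle = np.degrees(np.arccos(np.clip(np.dot(heading, rel / (dist + 1e-9)), -1.0, 1.0)))
            if dist < avoid_radius and angle < cone_deg / 2.0:
                heading = heading - k_avoid * rel / (dist**2 + 1e-9)   # repulsion from neighbour
        heading = heading / (np.linalg.norm(heading) + 1e-9)
        new[i] = pos[i] + speed * dt * heading
    return new

pos = np.array([[0.0, 0.0], [10.0, 0.5]])       # two robots on nearly head-on courses
goal = np.array([[10.0, 0.0], [0.0, 0.0]])
for _ in range(200):
    pos = step(pos, goal)
print("final positions:", pos)
```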

  8. Autonomous Flight Safety System

    NASA Technical Reports Server (NTRS)

    Ferrell, Bob; Santuro, Steve; Simpson, James; Zoerner, Roger; Bull, Barton; Lanzi, Jim

    2004-01-01

    Autonomous Flight Safety System (AFSS) is an independent flight safety system designed for small- to medium-sized expendable launch vehicles launching from, or needing range safety protection while overflying, relatively remote locations. AFSS replaces the need for a man-in-the-loop to make decisions for flight termination. AFSS could also serve as the prototype for an autonomous manned flight crew escape advisory system. AFSS utilizes onboard sensors and processors to emulate the human decision-making process using rule-based software logic and can dramatically reduce safety response time during critical launch phases. The Range Safety flight path nominal trajectory, its deviation allowances, limit zones, and other flight safety rules are stored in the onboard computers. Position, velocity and attitude data obtained from onboard global positioning system (GPS) and inertial navigation system (INS) sensors are compared with these rules to determine the appropriate action to ensure that people and property are not jeopardized. The final system will be fully redundant and independent with multiple processors, sensors, and dead-man switches to prevent inadvertent flight termination. AFSS is currently in Phase III, which includes updated algorithms, integrated GPS/INS sensors, large-scale simulation testing, and initial aircraft flight testing.
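
    The rule-based decision logic described above can be caricatured as an ordered set of checks of the current GPS/INS state against stored limit zones and deviation allowances; the rule names and values below are placeholders, not actual AFSS flight rules.

```python
# Schematic of the rule-based decision logic described above (not the AFSS flight
# code): compare the current GPS/INS state against stored limit zones and deviation
# allowances to pick an action. All rule values here are placeholders.
from dataclasses import dataclass

@dataclass
class State:
    cross_track_m: float
    altitude_m: float
    speed_mps: float

RULES = {
    "max_cross_track_m": 5000.0,     # placeholder deviation allowance
    "destruct_line_m": 8000.0,       # placeholder limit zone
    "min_safe_speed_mps": 50.0,
}

def evaluate(state: State) -> str:
    """Return 'nominal', 'arm', or 'terminate' from simple, ordered rules."""
    if state.cross_track_m > RULES["destruct_line_m"]:
        return "terminate"
    if state.cross_track_m > RULES["max_cross_track_m"] or \
       state.speed_mps < RULES["min_safe_speed_mps"]:
        return "arm"
    return "nominal"

print(evaluate(State(cross_track_m=6000.0, altitude_m=12000.0, speed_mps=900.0)))
```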

  9. 3D photo mosaicing of Tagiri shallow vent field by an autonomous underwater vehicle (3rd report) - Mosaicing method based on navigation data and visual features -

    NASA Astrophysics Data System (ADS)

    Maki, Toshihiro; Ura, Tamaki; Singh, Hanumant; Sakamaki, Takashi

    Large-area seafloor imaging will bring significant benefits to various fields such as academia, resource surveying, marine development, security, and search-and-rescue. The authors have proposed a navigation method for an autonomous underwater vehicle for seafloor imaging and verified its performance by mapping tubeworm colonies over an area of 3,000 square meters using the AUV Tri-Dog 1 at the Tagiri vent field, Kagoshima Bay, Japan (Maki et al., 2008, 2009). This paper proposes a post-processing method to build a natural photo mosaic from a number of pictures taken by an underwater platform. The method first removes lens distortion and non-uniformities of color and lighting from each image, and then performs ortho-rectification based on the camera pose and seafloor geometry estimated from navigation data. The image alignment is based on both navigation data and visual characteristics, implemented as an extension of the image-based method (Pizarro et al., 2003). Using the two types of information yields an image alignment that is consistent both globally and locally, and makes the method applicable to data sets with few visual cues. The method was evaluated using a data set obtained by the AUV Tri-Dog 1 at the vent field in September 2009. A seamless, uniformly illuminated photo mosaic covering an area of around 500 square meters was created from 391 pictures, which captures unique features of the field such as bacteria mats and tubeworm colonies.

  10. Handling Trajectory Uncertainties for Airborne Conflict Management

    NASA Technical Reports Server (NTRS)

    Barhydt, Richard; Doble, Nathan A.; Karr, David; Palmer, Michael T.

    2005-01-01

    Airborne conflict management is an enabling capability for NASA's Distributed Air-Ground Traffic Management (DAG-TM) concept. DAG-TM has the goal of significantly increasing capacity within the National Airspace System, while maintaining or improving safety. Under DAG-TM, autonomous aircraft maintain separation from each other and from managed aircraft unequipped for autonomous flight. NASA Langley Research Center has developed the Autonomous Operations Planner (AOP), an onboard decision support system that provides airborne conflict management (ACM) and strategic flight planning support for autonomous aircraft pilots. The AOP performs conflict detection, prevention, and resolution with respect to nearby traffic aircraft and area hazards. Traffic trajectory information is assumed to be provided by Automatic Dependent Surveillance Broadcast (ADS-B). Reliable trajectory prediction is a key capability for providing effective ACM functions. Trajectory uncertainties due to environmental effects, differences in aircraft systems and performance, and unknown intent information lead to prediction errors that can adversely affect AOP performance. To accommodate these uncertainties, the AOP has been enhanced to create cross-track, vertical, and along-track buffers along the predicted trajectories of both ownship and traffic aircraft. These buffers will be structured based on prediction errors noted in previous simulations, such as a recent Joint Experiment between NASA Ames and Langley Research Centers, and in other outside studies. Currently defined ADS-B parameters related to navigation capability, trajectory type, and path conformance will be used to support the algorithms that generate the buffers.
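
    A sketch of buffered conflict detection in the spirit of the enhancement described above: grow each predicted trajectory point by along-track, cross-track, and vertical buffers and test same-time points for overlap. The frame convention, buffer sizes, and trajectories are assumptions, not the AOP algorithms.

```python
# Sketch of buffered conflict detection in the spirit of the enhancement described
# above, not the AOP algorithms: grow predicted positions by along-track, cross-track,
# and vertical buffers and test same-time points for overlap in a common track frame
# (an assumption made for this sketch).
import numpy as np

def conflict(own_traj, traffic_traj, along_m=3000.0, cross_m=1500.0, vert_m=300.0):
    """Return True if any pair of same-time predicted points violates the buffers."""
    d = np.abs(np.asarray(own_traj, float) - np.asarray(traffic_traj, float))
    # Buffers on both aircraft add, which is approximated here by doubling them.
    return bool(np.any((d[:, 0] < 2 * along_m) &
                       (d[:, 1] < 2 * cross_m) &
                       (d[:, 2] < 2 * vert_m)))

t = np.arange(0, 600, 10, dtype=float)          # 10 min look-ahead at 10 s steps
# Columns: [along_track_m, cross_track_m, altitude_m]
own = np.column_stack([240.0 * t, 0.0 * t, 10000.0 + 0.0 * t])
intruder = np.column_stack([210.0 * t + 20000.0, 500.0 + 0.0 * t, 10000.0 + 0.0 * t])
print("predicted conflict:", conflict(own, intruder))
```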

  11. Advancing Navigation, Timing, and Science with the Deep Space Atomic Clock

    NASA Technical Reports Server (NTRS)

    Ely, Todd A.; Seubert, Jill; Bell, Julia

    2014-01-01

    NASA's Deep Space Atomic Clock mission is developing a small, highly stable mercury ion atomic clock with an Allan deviation of at most 1e-14 at one day, and with current estimates near 3e-15. This stability enables one-way radiometric tracking data with accuracy equivalent to and, in certain conditions, better than current two-way deep space tracking data; allowing a shift to a more efficient and flexible one-way deep space navigation architecture. DSAC-enabled one-way tracking will benefit navigation and radio science by increasing the quantity and quality of tracking data. Additionally, DSAC would be a key component to fully-autonomous onboard radio navigation useful for time-sensitive situations. Potential deep space applications of DSAC are presented, including orbit determination of a Mars orbiter and gravity science on a Europa flyby mission.
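
    For context on the stability figures quoted above, the sketch below estimates a simple, non-overlapping Allan deviation from fractional-frequency samples; the synthetic white-frequency-noise data are an assumption, and a real clock analysis would use more careful overlapping statistics.

```python
# Minimal, non-overlapping Allan deviation estimator for fractional-frequency
# samples, for context on the stability figures quoted above. The synthetic white
# frequency noise is an assumption; real clock analyses use overlapping statistics
# and careful data screening.
import numpy as np

def allan_deviation(y, m):
    """Allan deviation at an averaging factor m (i.e., tau = m * sample interval)."""
    y = np.asarray(y, dtype=float)
    n_blocks = len(y) // m
    block_means = y[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
    avar = 0.5 * np.mean(np.diff(block_means) ** 2)
    return np.sqrt(avar)

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    y = rng.normal(0.0, 1e-13, size=100_000)      # white frequency noise, 1 s samples
    for m in (1, 10, 100, 1000):
        print(f"tau = {m:5d} s   sigma_y = {allan_deviation(y, m):.2e}")
```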

  12. A Rover Mobility Platform with Autonomous Capability to Enable Mars Sample Return

    NASA Astrophysics Data System (ADS)

    Fulford, P.; Langley, C.; Shaw, A.

    2018-04-01

    The next step in understanding Mars is sample return. In Fall 2016, the CSA conducted an analogue deployment using the Mars Exploration Science Rover. An objective was to demonstrate the maturity of the rover's guidance, navigation, and control.

  13. CERTAIN: City Environment Range Testing for Autonomous Integrated Navigation

    NASA Technical Reports Server (NTRS)

    Brown, Jill

    2016-01-01

    This is an informational presentation to the DOI UAS Interagency group, sharing publicly available and previously released information on UAS regulations and on some of the UAS operations at Langley Research Center, to create awareness among other government agencies.

  14. Titan Aerial Daughtercraft (TAD) for Surface Studies from a Lander or Balloon

    NASA Astrophysics Data System (ADS)

    Matthies, L.; Tokumaru, P.; Sherrit, S.; Beauchamp, P.

    2014-06-01

    Recent rapid progress on autonomous navigation of micro air vehicles for terrestrial applications opens new possibilities for a small aerial vehicle that could deploy from a Titan lander or balloon to acquire samples for analysis on the mothership.

  15. Constrained navigation for unmanned systems

    NASA Astrophysics Data System (ADS)

    Vasseur, Laurent; Gosset, Philippe; Carpentier, Luc; Marion, Vincent; Morillon, Joel G.; Ropars, Patrice

    2005-05-01

    The French Military Robotic Study Program (introduced at Aerosense 2003), sponsored by the French Defense Procurement Agency and managed by Thales as the prime contractor, focuses on about 15 robotic themes which can provide an immediate "operational add-on value". The paper details the "constrained navigation" study (named TEL2), whose main goal is to identify and test a well-balanced task sharing between man and machine to accomplish a robotic task that cannot currently be performed autonomously because of technological limitations. The chosen function is "obstacle avoidance" on rough ground at quite high speed (40 km/h). State-of-the-art algorithms have been implemented to perform autonomous obstacle avoidance and following of forest borders, using a scanning laser sensor and standard localization functions. Such an "obstacle avoidance" function works well most of the time, but sometimes fails. The study analyzed how the remote operator can manage such failures so that the system remains fully operationally reliable; the operator can act in two ways: (a) finely adjust the vehicle's current heading; (b) take control of the vehicle "on the fly" (without stopping) and hand it back to autonomous behavior when motion is secured again. The paper also presents the results obtained from the military acceptance tests performed on the French 4x4 DARDS ATD.

  16. Localization system for use in GPS denied environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trueblood, J. J.

    The military uses autonomous platforms to complete missions and to provide standoff for the warfighters. However, autonomous platforms rely on GPS to provide their global position. In many mission spaces the autonomous platforms may encounter GPS-denied environments, which limits where the platform can operate and requires the warfighters to take its place. GPS-denied environments can occur when tall buildings, trees, or canyon walls block the GPS satellite signals, or from a lack of coverage. An Inertial Navigation System (INS) uses sensors to detect the vehicle's movement and direction of travel in order to calculate the vehicle's position. One of the biggest challenges with an INS is the accuracy of the sensors and the accumulation of their errors over time. If these challenges can be overcome, the INS would provide accurate positioning information to the autonomous vehicle in GPS-denied environments and allow it to provide the desired standoff for the warfighters.

  17. GPS/DR Error Estimation for Autonomous Vehicle Localization.

    PubMed

    Lee, Byung-Hyun; Song, Jong-Hwa; Im, Jun-Hyuck; Im, Sung-Hyuck; Heo, Moon-Beom; Jee, Gyu-In

    2015-08-21

    Autonomous vehicles require highly reliable navigation capabilities. For example, a lane-following method cannot be applied in an intersection without lanes, and since typical lane detection is performed using a straight-line model, errors can occur when the lateral distance is estimated in curved sections due to a model mismatch. Therefore, this paper proposes a localization method that uses GPS/DR error estimation based on a lane detection method with curved lane models, stop line detection, and curve matching in order to improve the performance during waypoint following procedures. The advantage of using the proposed method is that position information can be provided for autonomous driving through intersections, in sections with sharp curves, and in curved sections following a straight section. The proposed method was applied in autonomous vehicles at an experimental site to evaluate its performance, and the results indicate that the positioning achieved accuracy at the sub-meter level.
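
    The paper's filter fuses dead reckoning with lane, stop-line, and curve observations; as a much-reduced illustration of the underlying GPS/DR idea, the sketch below runs a toy 2-D Kalman filter in which dead reckoning predicts position and occasional GPS fixes correct it. The class name and all noise values are assumptions for the example, not the authors' configuration.

        import numpy as np

        class GpsDrFusion:
            """Toy 2-D position filter: dead-reckoning prediction corrected by GPS fixes."""

            def __init__(self, x0, p0, q, r):
                self.x = np.array(x0, dtype=float)   # [east, north] estimate (m)
                self.P = np.eye(2) * p0              # estimate covariance
                self.Q = np.eye(2) * q               # DR process noise added per step
                self.R = np.eye(2) * r               # GPS measurement noise

            def predict(self, dr_delta):
                # Dead-reckoning step: add the wheel/heading-derived displacement.
                self.x = self.x + np.asarray(dr_delta, dtype=float)
                self.P = self.P + self.Q

            def update(self, gps_pos):
                # Standard Kalman update with a direct position measurement.
                z = np.asarray(gps_pos, dtype=float)
                S = self.P + self.R
                K = self.P @ np.linalg.inv(S)
                self.x = self.x + K @ (z - self.x)
                self.P = (np.eye(2) - K) @ self.P
                return self.x

        # Example: predict with one DR displacement, then correct with a GPS fix.
        f = GpsDrFusion(x0=[0.0, 0.0], p0=4.0, q=0.05, r=2.0)
        f.predict([1.0, 0.2])
        estimate = f.update([1.3, 0.1])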

  18. GPS/DR Error Estimation for Autonomous Vehicle Localization

    PubMed Central

    Lee, Byung-Hyun; Song, Jong-Hwa; Im, Jun-Hyuck; Im, Sung-Hyuck; Heo, Moon-Beom; Jee, Gyu-In

    2015-01-01

    Autonomous vehicles require highly reliable navigation capabilities. For example, a lane-following method cannot be applied in an intersection without lanes, and since typical lane detection is performed using a straight-line model, errors can occur when the lateral distance is estimated in curved sections due to a model mismatch. Therefore, this paper proposes a localization method that uses GPS/DR error estimation based on a lane detection method with curved lane models, stop line detection, and curve matching in order to improve the performance during waypoint following procedures. The advantage of using the proposed method is that position information can be provided for autonomous driving through intersections, in sections with sharp curves, and in curved sections following a straight section. The proposed method was applied in autonomous vehicles at an experimental site to evaluate its performance, and the results indicate that the positioning achieved accuracy at the sub-meter level. PMID:26307997

  19. Wind-Based Navigation of a Hot-air Balloon on Titan: A Feasibility Study

    NASA Technical Reports Server (NTRS)

    Furfaro, Roberto; Lunine, Jonathan I.; Elfes, Alberto; Reh, Kim

    2008-01-01

    Current analysis of data streamed back to Earth by the Cassini spacecraft features Titan as one of the most exciting places in the solar system. NASA centers and universities around the US, as well as the European Space Agency, are studying the possibility of sending, as part of the next mission to this giant moon of Saturn, a hot-air balloon (Montgolfier-type) for further and more in-depth exploration. The basic idea would be to design a reliable, semi-autonomous, and yet cheap Montgolfier capable of using a continuous flow of waste heat from a power source to lift the balloon and sustain its altitude in the Titan environment. In this paper we study the problem of locally navigating a hot-air balloon in the nitrogen-based Titan atmosphere. The basic idea is to define a strategy (i.e., the design of a suitable guidance system) that allows autonomous and semi-autonomous navigation of the balloon using the available (and partial) knowledge of the wind structure blowing over the surface of the Saturnian satellite. Starting from first principles, we determined the appropriate thermal and dynamical models describing (a) the vertical dynamics of the balloon and (b) the dynamics of the balloon moving in a vertical plane (2-D motion). Next, various nonlinear fuzzy-based control strategies were evaluated, analyzed, and implemented in MATLAB to numerically simulate the capability of the system to simultaneously maintain altitude as well as a scientifically desirable trajectory. We also looked at the ability of the balloon to perform station keeping. The results of the simulation are encouraging and show the ability of such a system to cheaply and effectively perform semi-autonomous exploration of Titan.

  20. Autonomous spacecraft maintenance study group

    NASA Technical Reports Server (NTRS)

    Marshall, M. H.; Low, G. D.

    1981-01-01

    A plan to incorporate autonomous spacecraft maintenance (ASM) capabilities into Air Force spacecraft by 1989 is outlined. It includes the successful operation of the spacecraft without ground operator intervention for extended periods of time. Mechanisms, along with a fault-tolerant data processing system (including a nonvolatile backup memory) and an autonomous navigation capability, are needed to replace the routine servicing that is presently performed by the ground system. The state-of-the-art fault handling capabilities of various spacecraft and computers are described, and a set of conceptual design requirements needed to achieve ASM is established. Implementation plans for the near-term technology development needed for an ASM proof-of-concept demonstration by 1985, and a research agenda addressing long-range academic research for an advanced ASM system for the 1990s, are established.

  1. Theseus: tethered distributed robotics (TDR)

    NASA Astrophysics Data System (ADS)

    Digney, Bruce L.; Penzes, Steven G.

    2003-09-01

    Defence Research and Development Canada's (DRDC) Autonomous Intelligent Systems program conducts research to increase the independence and effectiveness of military vehicles and systems. DRDC-Suffield's Autonomous Land Systems (ALS) group is creating new concept vehicles and autonomous control systems for use in outdoor areas, urban streets, urban interiors and urban subspaces. This paper will first give an overview of the ALS program and then give a specific description of the work being done for mobility in urban subspaces. Discussed will be the Theseus: Tethered Distributed Robotics (TDR) system, which will not only manage an unavoidable tether but exploit it for mobility and navigation. Also discussed will be the prototype robot called the Hedgehog, which uses conformal 3D mobility in ducts, sewer pipes, collapsed rubble voids and chimneys.

  2. Efforts toward an autonomous wheelchair - biomed 2011.

    PubMed

    Barrett, Steven; Streeter, Robert

    2011-01-01

    An autonomous wheelchair is in development to provide mobility to those with significant physical challenges. The overall goal of the project is to develop a wheelchair that is fully autonomous, with the ability to navigate about an environment and negotiate obstacles. As a starting point for the project, we have reverse engineered the joystick control system of an off-the-shelf, commercially available wheelchair. The joystick control has been replaced with a microcontroller-based system. The microcontroller has the capability to interface with a number of subsystems currently under development, including wheel odometers, obstacle avoidance sensors, and ultrasonic-based wall sensors. This paper will discuss the microcontroller-based system and provide a detailed system description. Results of this study may be adapted to commercial or military robot control.

  3. Development for SSV on a parallel processing system (PARAGON)

    NASA Astrophysics Data System (ADS)

    Gothard, Benny M.; Allmen, Mark; Carroll, Michael J.; Rich, Dan

    1995-12-01

    A goal of the surrogate semi-autonomous vehicle (SSV) program is to have multiple vehicles navigate autonomously and cooperatively with other vehicles. This paper describes the process and tools used in porting UGV/SSV (unmanned ground vehicle) autonomous mobility and target recognition algorithms from a SISD (single instruction single data) processor architecture (i.e., a Sun SPARC workstation running C/UNIX) to a MIMD (multiple instruction multiple data) parallel processor architecture (i.e., PARAGON-a parallel set of i860 processors running C/UNIX). It discusses the gains in performance and the pitfalls of such a venture. It also examines the merits of this processor architecture (based on this conceptual prototyping effort) and programming paradigm to meet the final SSV demonstration requirements.

  4. For Spacious Skies: Self-Separation with "Autonomous Flight Rules" in US Domestic Airspace

    NASA Technical Reports Server (NTRS)

    Wing, David J.; Cotton, William B.

    2011-01-01

    Autonomous Flight Rules (AFR) are proposed as a new set of operating regulations in which aircraft navigate on tracks of their choice while self-separating from traffic and weather. AFR would exist alongside Instrument and Visual Flight Rules (IFR and VFR) as one of three available flight options for any appropriately trained and qualified operator with the necessary certified equipment. Historically, ground-based separation services evolved by necessity as aircraft began operating in the clouds and were unable to see each other. Today, technologies for global precision navigation, emerging airborne surveillance, and onboard computing enable traffic conflict management to be fully integrated with navigation procedures onboard the aircraft. By self-separating, aircraft can operate with more flexibility and fewer flight restrictions than are required when using ground-based separation. The AFR concept proposes a practical means in which self-separating aircraft could share the same airspace as IFR and VFR aircraft without disrupting the ongoing processes of Air Traffic Control. The paper discusses the context and motivation for implementing self-separation in US domestic airspace. It presents a historical perspective on separation, the proposed way forward in AFR, the rationale behind mixed operations, and the expected benefits of AFR for the airspace user community.

  5. A positional estimation technique for an autonomous land vehicle in an unstructured environment

    NASA Technical Reports Server (NTRS)

    Talluri, Raj; Aggarwal, J. K.

    1990-01-01

    This paper presents a solution to the positional estimation problem of an autonomous land vehicle navigating in an unstructured mountainous terrain. A Digital Elevation Map (DEM) of the area in which the robot is to navigate is assumed to be given. It is also assumed that the robot is equipped with a camera that can be panned and tilted, and a device to measure the elevation of the robot above the ground surface. No recognizable landmarks are assumed to be present in the environment in which the robot is to navigate. The solution presented makes use of the DEM information, and structures the problem as a heuristic search in the DEM for the possible robot location. The shape and position of the horizon line in the image plane and the known camera geometry of the perspective projection are used as parameters to search the DEM. Various heuristics drawn from the geometric constraints are used to prune the search space significantly. The algorithm is made robust to errors in the imaging process by accounting for worst-case errors. The approach is tested using DEM data of areas in Colorado and Texas. The method is suitable for use in outdoor mobile robots and planetary rovers.

  6. Survey of computer vision technology for UAV navigation

    NASA Astrophysics Data System (ADS)

    Xie, Bo; Fan, Xiang; Li, Sijian

    2017-11-01

    Navigation based on computer vision technology, which has the characteristics of strong independence and high precision and is not susceptible to electrical interference, has attracted more and more attention in the field of UAV navigation research. Early navigation projects based on computer vision technology were mainly applied to autonomous ground robots. In recent years, visual navigation systems have been widely applied to unmanned aircraft, deep space probes and underwater robots, which further stimulates research on integrated navigation algorithms based on computer vision. In China, with the development of many types of UAVs and the start of the third phase of the lunar exploration program, there has been significant progress in the study of visual navigation. The paper reviews the development of computer-vision-based navigation in the field of UAV navigation research and concludes that visual navigation is mainly applied to three aspects, as follows. (1) Acquisition of UAV navigation parameters: the parameters, including UAV attitude, position and velocity information, can be obtained from the relationship between sensor images and the carrier's attitude, the relationship between instant matching images and reference images, and the relationship between the carrier's velocity and the characteristics of sequential images. (2) Autonomous obstacle avoidance: there are many ways to achieve obstacle avoidance in UAV navigation; the methods based on computer vision, including feature matching, template matching and image-frame methods, are mainly introduced. (3) Target tracking and positioning: using the obtained images, UAV position is calculated with the optical flow method, the MeanShift algorithm, the CamShift algorithm, Kalman filtering and particle filter algorithms. The paper also describes three kinds of mainstream visual systems. (1) High-speed visual systems, which use a parallel structure in which image detection and processing are carried out at high speed; such systems are applied in rapid-response applications. (2) Distributed-network visual systems, in which several discrete image acquisition sensors at different locations transmit image data to a node processor to increase the sampling rate. (3) Visual systems combined with observers, which pair image sensors with external observers to make up for shortcomings of the visual equipment. To some degree, these systems overcome the shortcomings of early visual systems, including low frequency, low processing efficiency and strong noise. Finally, the difficulties of computer-vision-based navigation in practical application are briefly discussed: (1) because of the huge workload of image operations, the real-time performance of the system is poor; (2) because of the large environmental impact, the anti-interference ability of the system is poor; (3) because the system works only in particular environments, its adaptability is poor.

  7. COBALT Flight Demonstrations Fuse Technologies

    NASA Image and Video Library

    2017-06-07

    This 5-minute, 50-second video shows how the CoOperative Blending of Autonomous Landing Technologies (COBALT) system pairs new landing sensor technologies that promise to yield the highest precision navigation solution ever tested for NASA space landing applications. The technologies included a navigation doppler lidar (NDL), which provides ultra-precise velocity and line-of-sight range measurements, and the Lander Vision System (LVS), which provides terrain-relative navigation. Through flight campaigns conducted in March and April 2017 aboard Masten Space Systems' Xodiac, a rocket-powered vertical takeoff, vertical landing (VTVL) platform, the COBALT system was flight tested to collect sensor performance data for NDL and LVS and to check the integration and communication between COBALT and the rocket. The flight tests provided excellent performance data for both sensors, as well as valuable information on the integrated performance with the rocket that will be used for subsequent COBALT modifications prior to follow-on flight tests. Based at NASA’s Armstrong Flight Research Center in Edwards, CA, the Flight Opportunities program funds technology development flight tests on commercial suborbital space providers of which Masten is a vendor. The program has previously tested the LVS on the Masten rocket and validated the technology for the Mars 2020 rover.

  8. Biologically-inspired navigation and flight control for Mars flyer missions

    NASA Technical Reports Server (NTRS)

    Thakoor, S.; Chahl, J.; Hine, B.; Zornetzer, S.

    2003-01-01

    Bioinspired Engineering Exploration Systems (BEES) is enabling new bioinspired sensors for autonomous exploration of Mars. The steps towards autonomy in the development of these BEES flyers are described. A future set of Mars missions uniquely enabled by such flyers is finally described.

  9. A Robot to Help Make the Rounds

    NASA Technical Reports Server (NTRS)

    2003-01-01

    This paper presents a discussion of the Pyxis HelpMate SecurePak (SP) trackless robotic courier, designed by Transitions Research Corporation to navigate autonomously throughout medical facilities, transporting pharmaceuticals, laboratory specimens, equipment, supplies, meals, medical records, and radiology films between support departments and nursing floors.

  10. Loosely Coupled GPS-Aided Inertial Navigation System for Range Safety

    NASA Technical Reports Server (NTRS)

    Heatwole, Scott; Lanzi, Raymond J.

    2010-01-01

    The Autonomous Flight Safety System (AFSS) aims to replace the human element of range safety operations, as well as reduce reliance on expensive, downrange assets for launches of expendable launch vehicles (ELVs). The system consists of multiple navigation sensors and flight computers that provide a highly reliable platform. It is designed to ensure that single-event failures in a flight computer or sensor will not bring down the whole system. The flight computer uses a rules-based structure derived from range safety requirements to make decisions whether or not to destroy the rocket.
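
    The abstract describes a rules-based decision structure; the fragment below is a hypothetical sketch of that idea only. The rule names, limits, and state fields are invented for illustration and do not reflect the actual AFSS rule set or telemetry interface.

        # Hypothetical illustration of a rules-based termination check; the rule
        # names, limits, and state fields are invented and do not reflect the
        # real AFSS rule set.

        def evaluate_termination_rules(state, rules):
            """Return the names of all violated rules for the current vehicle state."""
            return [name for name, predicate in rules if predicate(state)]

        # Example rules derived from notional range-safety limits (illustrative only).
        rules = [
            ("instantaneous impact point inside hazard area",
             lambda s: s["iip_in_hazard_area"]),
            ("vehicle beyond destruct-line cross-range limit",
             lambda s: abs(s["cross_range_km"]) > 15.0),
        ]

        state = {"iip_in_hazard_area": False, "cross_range_km": 3.2}
        violations = evaluate_termination_rules(state, rules)
        terminate = len(violations) > 0   # destroy decision only if at least one rule fires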

  11. Laser range measurement for a satellite navigation scheme and mid-range path selection and obstacle avoidance. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Zuraski, G. D.

    1972-01-01

    The functions of a laser rangefinder on board an autonomous Martian roving vehicle are discussed. The functions are: (1) navigation by means of a passive satellite and (2) mid-range path selection and obstacle avoidance. The feasibility of using a laser to make the necessary range measurements is explored and a preliminary design is presented. The two uses of the rangefinder dictate widely different operating parameters, making it impossible to use the same system for both functions.

  12. Multiple estimation channel decoupling and optimization method based on inverse system

    NASA Astrophysics Data System (ADS)

    Wu, Peng; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

    This paper addresses the intelligent autonomous navigation requirements of an intelligent deformation missile. Based on dynamics and kinematics modeling of the missile, a navigation subsystem solution method, and error modeling, it focuses on the corresponding data fusion and decision fusion technology: the sensitive channels of the filter input are decoupled through an inverse system designed from the dynamics, reducing the influence of sudden changes in the measurement information on the filter input. A series of simulation experiments is then carried out, verifying the feasibility and effectiveness of the inverse-system decoupling algorithm.

  13. Terrain classification in navigation of an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Dodds, David R.

    1991-03-01

    In this paper we describe a method of path planning that integrates terrain classification (by means of fractals), the certainty grid method of spatial representation, Kehtarnavaz-Griswold collision zones, Dubois-Prade fuzzy temporal and spatial knowledge, and non-point-sized qualitative navigational planning. An initially planned ("end-to-end") path is piece-wise modified to accommodate known and inferred moving obstacles, and includes attention to time-varying multiple subgoals which may influence a section of path at a time after the robot has begun traversing that planned path.

  14. A Long Range Science Rover For Future Mars Missions

    NASA Technical Reports Server (NTRS)

    Hayati, Samad

    1997-01-01

    This paper describes the design and implementation currently underway at the Jet Propulsion Laboratory of a long range science rover for future missions to Mars. The small rover prototype, called Rocky 7, is capable of long traverse, autonomous navigation, and science instrument control, carries three science instruments, and can be commanded from any computer platform and any location using the World Wide Web. In this paper we describe the mobility system, the sampling system, the sensor suite, navigation and control, onboard science instruments, and the ground command and control system.

  15. Visual control of navigation in insects and its relevance for robotics.

    PubMed

    Srinivasan, Mandyam V

    2011-08-01

    Flying insects display remarkable agility, despite their diminutive eyes and brains. This review describes our growing understanding of how these creatures use visual information to stabilize flight, avoid collisions with objects, regulate flight speed, detect and intercept other flying insects such as mates or prey, navigate to a distant food source, and orchestrate flawless landings. It also outlines the ways in which these insights are now being used to develop novel, biologically inspired strategies for the guidance of autonomous, airborne vehicles. Copyright © 2011 Elsevier Ltd. All rights reserved.

  16. Dual RF Astrodynamic GPS Orbital Navigator Satellite

    NASA Technical Reports Server (NTRS)

    Kanipe, David B.; Provence, Robert Steve; Straube, Timothy M.; Reed, Helen; Bishop, Robert; Lightsey, Glenn

    2009-01-01

    Dual RF Astrodynamic GPS Orbital Navigator Satellite (DRAGONSat) will demonstrate autonomous rendezvous and docking (ARD) in low Earth orbit (LEO) and gather flight data with a global positioning system (GPS) receiver strictly designed for space applications. ARD is the capability of two independent spacecraft to rendezvous in orbit and dock without crew intervention. DRAGONSat consists of two picosatellites (one built by the University of Texas and one built by Texas A and M University) and the Space Shuttle Payload Launcher (SSPL); this project will ultimately demonstrate ARD in LEO.

  17. Design of a wheeled articulating land rover

    NASA Technical Reports Server (NTRS)

    Stauffer, Larry; Dilorenzo, Mathew; Yandle, Barbara

    1994-01-01

    The WALRUS is a wheeled articulating land rover that will provide Ames Research Center with a reliable, autonomous vehicle for demonstrating and evaluating advanced technologies. The vehicle is one component of the Ames Research Center's on-going Human Exploration Demonstration Project. Ames Research Center requested a system capable of traversing a broad spectrum of surface types and obstacles. In addition, this vehicle must have an autonomous navigation and control system on board and its own source of power. The resulting design is a rover that articulates in two planes of motion to allow for increased mobility and stability. The rover is driven by six conical shaped aluminum wheels, each with an independent, internally coupled motor. Mounted on the rover are two housings and a removable remote control system. In the housings, the motor controller board, tilt sensor, navigation circuitry, and QED board are mounted. Finally, the rover's motors and electronics are powered by thirty C-cell rechargeable batteries, which are located in the rover wheels and recharged by a specially designed battery charger.

  18. Demonstration of coherent Doppler lidar for navigation in GPS-denied environments

    NASA Astrophysics Data System (ADS)

    Amzajerdian, Farzin; Hines, Glenn D.; Pierrottet, Diego F.; Barnes, Bruce W.; Petway, Larry B.; Carson, John M.

    2017-05-01

    A coherent Doppler lidar has been developed to address NASA's need for a high-performance, compact, and cost-effective velocity and altitude sensor onboard its landing vehicles. Future robotic and manned missions to solar system bodies require precise ground-relative velocity vector and altitude data to execute complex descent maneuvers and safe, soft landing at a pre-designated site. This lidar sensor, referred to as a Navigation Doppler Lidar (NDL), meets the required performance of the landing missions while complying with vehicle size, mass, and power constraints. Operating from up to four kilometers altitude, the NDL obtains velocity and range precision measurements reaching 2 cm/sec and 2 meters, respectively, dominated by the vehicle motion. Terrestrial aerial vehicles will also benefit from NDL data products as enhancement or replacement to GPS systems when GPS is unavailable or redundancy is needed. The NDL offers a viable option to aircraft navigation in areas where the GPS signal can be blocked or jammed by intentional or unintentional interference. The NDL transmits three laser beams at different pointing angles toward the ground to measure range and velocity along each beam using a frequency modulated continuous wave (FMCW) technique. The three line-of-sight measurements are then combined in order to determine the three components of the vehicle velocity vector and its altitude relative to the ground. This paper describes the performance and capabilities that the NDL demonstrated through extensive ground tests, helicopter flight tests, and onboard an autonomous rocket-powered test vehicle while operating in closed-loop with a guidance, navigation, and control (GN&C) system.
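
    As a worked illustration of how three line-of-sight measurements yield the full velocity vector, the sketch below solves the small linear system u_i . v = v_los,i for an assumed three-beam geometry; the cone angle, azimuths, measured values, and altitude reduction are illustrative assumptions, not the NDL's actual configuration.

        import numpy as np

        # Assumed beam geometry: three unit vectors tilted from nadir (illustrative angles).
        def beam_unit_vector(cone_deg, azimuth_deg):
            c, a = np.radians(cone_deg), np.radians(azimuth_deg)
            # Body frame with the z axis toward the ground; beams tilted by the cone angle.
            return np.array([np.sin(c) * np.cos(a), np.sin(c) * np.sin(a), np.cos(c)])

        U = np.vstack([beam_unit_vector(22.5, az) for az in (0.0, 120.0, 240.0)])

        # Measured line-of-sight velocities along each beam (m/s), illustrative values.
        v_los = np.array([1.8, -0.4, 0.9])

        # Each measurement is u_i . v, so stack the rows and solve the 3x3 linear system.
        v_body, *_ = np.linalg.lstsq(U, v_los, rcond=None)

        # Each slant range projects to altitude through the beam's vertical component.
        ranges = np.array([950.0, 948.0, 952.0])      # slant ranges (m), illustrative
        altitude = float(np.mean(ranges * U[:, 2]))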

  19. Initial results of centralized autonomous orbit determination of the new-generation BDS satellites with inter-satellite link measurements

    NASA Astrophysics Data System (ADS)

    Tang, Chengpan; Hu, Xiaogong; Zhou, Shanshi; Liu, Li; Pan, Junyang; Chen, Liucheng; Guo, Rui; Zhu, Lingfeng; Hu, Guangming; Li, Xiaojie; He, Feng; Chang, Zhiqiao

    2018-01-01

    Autonomous orbit determination is the ability of navigation satellites to estimate the orbit parameters on-board using inter-satellite link (ISL) measurements. This study mainly focuses on data processing of the ISL measurements as a new measurement type and its application to the centralized autonomous orbit determination of the new-generation Beidou navigation satellite system satellites for the first time. The ISL measurements are dual one-way measurements that follow a time division multiple access (TDMA) structure. The ranging error of the ISL measurements is less than 0.25 ns. This paper proposes a derivation approach to the satellite clock offsets and the geometric distances from TDMA dual one-way measurements without a loss of accuracy. The derived clock offsets are used for time synchronization, and the derived geometry distances are used for autonomous orbit determination. The clock offsets from the ISL measurements are consistent with the L-band two-way satellite time and frequency transfer clock measurements, and the detrended residuals vary within 0.5 ns. The centralized autonomous orbit determination is conducted in a batch mode on a ground-capable server for the feasibility study. Constant hardware delays are present in the geometric distances and become the largest source of error in the autonomous orbit determination. Therefore, the hardware delays are estimated simultaneously with the satellite orbits. To avoid uncertainties in the constellation orientation, a ground anchor station that "observes" the satellites with on-board ISL payloads is introduced into the orbit determination. The root-mean-square values of orbit determination residuals are within 10.0 cm, and the standard deviation of the estimated ISL hardware delays is within 0.2 ns. The accuracy of the autonomous orbits is evaluated by analysis of overlap comparison and the satellite laser ranging (SLR) residuals and is compared with the accuracy of the L-band orbits. The results indicate that the radial overlap differences between the autonomous orbits are less than 15.0 cm for the inclined geosynchronous orbit (IGSO) satellites and less than 10.0 cm for the MEO satellites. The SLR residuals are approximately 15.0 cm for the IGSO satellites and approximately 10.0 cm for the MEO satellites, representing an improvement over the L-band orbits.
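
    A minimal sketch of the core of the dual one-way reduction described above: the half-sum of the two one-way pseudoranges isolates the geometry, while the half-difference isolates the relative clock offset. Hardware delays and the reduction of the two TDMA transmit epochs to a common time, which the paper handles explicitly, are ignored in this simplified illustration.

        C = 299_792_458.0  # speed of light, m/s

        def split_dual_one_way(rho_ab_m, rho_ba_m):
            """Separate geometry and clock from a pair of one-way ISL pseudoranges.

            rho_ab_m : pseudorange of the A->B one-way measurement (metres)
            rho_ba_m : pseudorange of the B->A one-way measurement (metres)

            Returns (geometric_distance_m, clock_offset_b_minus_a_s). Hardware
            delays and epoch reduction are ignored in this sketch.
            """
            geometric_distance = 0.5 * (rho_ab_m + rho_ba_m)
            clock_offset = 0.5 * (rho_ab_m - rho_ba_m) / C
            return geometric_distance, clock_offset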

  20. Advanced Integration of WiFi and Inertial Navigation Systems for Indoor Mobile Positioning

    NASA Astrophysics Data System (ADS)

    Evennou, Frédéric; Marx, François

    2006-12-01

    This paper presents an aided dead-reckoning navigation structure and signal processing algorithms for self-localization of an autonomous mobile device by fusing pedestrian dead reckoning and WiFi signal strength measurements. WiFi and inertial navigation systems (INS) are used for positioning and attitude determination in a wide range of applications. Over the last few years, a number of low-cost inertial sensors have become available. Although these sensors exhibit large errors, WiFi measurements can be used to correct the drift that weakens navigation based on this technology. On the other hand, INS sensors can interact with the WiFi positioning system as they provide high-accuracy real-time navigation. A structure based on a Kalman filter and a particle filter is proposed. It fuses the heterogeneous information coming from those two independent technologies. Finally, the benefits of the proposed architecture are evaluated and compared with the pure WiFi and INS positioning systems.
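
    A toy sketch of the fusion idea, assuming step-and-heading pedestrian dead reckoning corrected by occasional WiFi fixes; a fixed blending gain stands in for the paper's Kalman and particle filters, and the class name and values are illustrative assumptions.

        import math

        class PdrWifiFusion:
            """Toy fusion of pedestrian dead reckoning with WiFi position fixes."""

            def __init__(self, x, y, wifi_weight=0.3):
                self.x, self.y = x, y
                # Fixed blending gain standing in for the paper's Kalman/particle filters.
                self.k = wifi_weight

            def step(self, step_length_m, heading_rad):
                # PDR propagation: advance one detected step along the current heading.
                self.x += step_length_m * math.sin(heading_rad)
                self.y += step_length_m * math.cos(heading_rad)

            def wifi_fix(self, wx, wy):
                # Pull the drifting PDR track toward the noisier but drift-free WiFi fix.
                self.x += self.k * (wx - self.x)
                self.y += self.k * (wy - self.y)

        nav = PdrWifiFusion(0.0, 0.0)
        nav.step(0.7, math.radians(45.0))   # one detected step, heading north-east
        nav.wifi_fix(0.4, 0.6)              # noisy WiFi position estimate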

  1. Sextant X-Ray Pulsar Navigation Demonstration: Initial On-Orbit Results

    NASA Technical Reports Server (NTRS)

    Mitchell, Jason W.; Winternitz, Luke M.; Hassouneh, Munther A.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wolff, Michael T.; Kerr, Matthew; Wood, Kent S.

    2018-01-01

    The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a technology demonstration enhancement to the Neutron-star Interior Composition Explorer (NICER) mission. SEXTANT will be a first demonstration of in-space, autonomous, X-ray pulsar navigation (XNAV), navigating using millisecond X-ray pulsars, which could provide a GPS-like navigation capability available throughout our Solar System and beyond. NICER is a NASA Astrophysics Explorer Mission of Opportunity to the International Space Station that was launched and installed in June of 2017. During NICER's nominal 18-month base mission, SEXTANT will perform a number of experiments to demonstrate XNAV and advance the technology on a number of fronts. In this work, we review SEXTANT and its goals, and present early results from SEXTANT experiments conducted in the first six months of operation. With these results, SEXTANT has made significant progress toward meeting its primary and secondary mission goals. We also describe the SEXTANT flight operations, calibration activities, and initial results.

  2. Neural Network Based Sensory Fusion for Landmark Detection

    NASA Technical Reports Server (NTRS)

    Kumbla, Kishan -K.; Akbarzadeh, Mohammad R.

    1997-01-01

    NASA is planning to send numerous unmanned planetary missions to explore space. This requires autonomous robotic vehicles which can navigate in an unstructured, unknown, and uncertain environment. Landmark-based navigation is a new area of research which differs from traditional goal-oriented navigation, where a mobile robot starts from an initial point and reaches a destination in accordance with a pre-planned path. Landmark-based navigation has the advantage of allowing the robot to find its way without communication with the mission control station and without exact knowledge of its coordinates. Current algorithms based on landmark navigation, however, pose several constraints. First, they require large memories to store the images. Second, the task of comparing the images using traditional methods is computationally intensive, and consequently real-time implementation is difficult. The method proposed here consists of three stages. The first stage utilizes a heuristic-based algorithm to identify significant objects. The second stage utilizes a neural network (NN) to efficiently classify images of the identified objects. The third stage combines distance information with the classification results of the neural networks for efficient and intelligent navigation.

  3. Galileo: The Added Value for Integrity in Harsh Environments.

    PubMed

    Borio, Daniele; Gioia, Ciro

    2016-01-16

    Global navigation satellite system (GNSS)-based navigation is a challenging task in signal-degraded environments, where GNSS signals are distorted by multipath and attenuated by fading effects: the navigation solution may be inaccurate or unavailable. A possible approach to improve accuracy and availability is the joint use of measurements from different GNSSs and quality check algorithms; this approach is investigated here using live GPS and Galileo signals. A modified receiver autonomous integrity monitoring (RAIM) algorithm, including geometry and separability checks, is proposed to detect and exclude erroneous measurements: the multi-constellation approach provides redundant measurements, and RAIM exploits them to exclude distorted observations. The synergy between combined GPS/Galileo navigation and RAIM is analyzed using live data; the performance is compared to the accuracy and availability of a GPS-only solution. The tests performed demonstrate that the methods developed are effective techniques for GNSS-based navigation in signal-degraded environments. The joint use of the multi-constellation approach and of modified RAIM algorithms improves the performance of the navigation system in terms of both accuracy and availability.
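
    For orientation, the sketch below shows a generic residual-based RAIM fault detection and exclusion loop: a global chi-square test on the least-squares residuals, with the largest residual excluded on failure. It is a simplified stand-in, not the paper's modified algorithm with geometry and separability checks; the noise sigma and false-alarm probability are assumed values.

        import numpy as np
        from scipy.stats import chi2

        def raim_fde(H, y, sigma=1.0, p_fa=1e-3):
            """Minimal residual-based fault detection and exclusion (FDE) sketch.

            H : (n, 4) linearized geometry matrix (unit line-of-sight vectors plus clock column)
            y : (n,)   pseudorange residuals about the linearization point, in metres
            Returns (state_correction, kept_measurement_indices).
            """
            H = np.asarray(H, dtype=float)
            y = np.asarray(y, dtype=float)
            idx = list(range(len(y)))
            while len(idx) > 4:                              # need redundancy for a test
                Hs, ys = H[idx], y[idx]
                x, *_ = np.linalg.lstsq(Hs, ys, rcond=None)  # least-squares position/clock fix
                r = ys - Hs @ x                              # post-fit residuals
                test_stat = float(r @ r) / sigma**2
                if test_stat <= chi2.ppf(1.0 - p_fa, df=len(idx) - 4):
                    return x, idx                            # global test passed
                idx.pop(int(np.argmax(np.abs(r))))           # drop the worst residual, retry
            return None, idx                                 # not enough redundancy left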

  4. Galileo: The Added Value for Integrity in Harsh Environments

    PubMed Central

    Borio, Daniele; Gioia, Ciro

    2016-01-01

    Global navigation satellite system (GNSS)-based navigation is a challenging task in signal-degraded environments, where GNSS signals are distorted by multipath and attenuated by fading effects: the navigation solution may be inaccurate or unavailable. A possible approach to improve accuracy and availability is the joint use of measurements from different GNSSs and quality check algorithms; this approach is investigated here using live GPS and Galileo signals. A modified receiver autonomous integrity monitoring (RAIM) algorithm, including geometry and separability checks, is proposed to detect and exclude erroneous measurements: the multi-constellation approach provides redundant measurements, and RAIM exploits them to exclude distorted observations. The synergy between combined GPS/Galileo navigation and RAIM is analyzed using live data; the performance is compared to the accuracy and availability of a GPS-only solution. The tests performed demonstrate that the methods developed are effective techniques for GNSS-based navigation in signal-degraded environments. The joint use of the multi-constellation approach and of modified RAIM algorithms improves the performance of the navigation system in terms of both accuracy and availability. PMID:26784205

  5. Optical Navigation for the Orion Vehicle

    NASA Technical Reports Server (NTRS)

    Crain, Timothy; Getchius, Joel; D'Souza, Christopher

    2008-01-01

    The Orion vehicle is being designed to provide nominal crew transport to the lunar transportation stack in low Earth orbit, crew abort prior to and during transit to the Moon, and crew return to Earth once lunar orbit is achieved. One of the design requirements levied on the Orion vehicle is the ability to return the vehicle and crew to Earth in the case of loss of communications and command with the Mission Control Center. Central to fulfilling this requirement is the ability of Orion to navigate autonomously. In low-Earth orbit, this may be solved with the use of GPS, but in cis-lunar and lunar orbit this requires optical navigation. This paper documents the preliminary analyses performed by members of the Orion Orbit GN&C System team.

  6. Crew-Aided Autonomous Navigation Project

    NASA Technical Reports Server (NTRS)

    Holt, Greg

    2015-01-01

    Manual capability to perform star/planet-limb sightings provides a cheap, simple, and robust backup navigation source for exploration missions independent from the ground. Sextant sightings from spacecraft were first exercised in Gemini and flew as the loss-of-communications backup for all Apollo missions. This study seeks to procure and characterize error sources of navigation-grade sextants for feasibility of taking star and planetary limb sightings from inside a spacecraft. A series of similar studies was performed in the early/mid-1960s in preparation for Apollo missions, and one goal of this study is to modernize and update those findings. This technique has the potential to deliver significant risk mitigation, validation, and backup to more complex low-TRL automated systems under development involving cameras.

  7. Autonomous self-navigating drug-delivery vehicles: from science fiction to reality.

    PubMed

    Petrenko, Valery A

    2017-12-01

    Low efficacy of targeted nanomedicines in biological experiments forced us to challenge the traditional concept of drug targeting and to suggest a paradigm of 'addressed self-navigating drug-delivery vehicles,' in which affinity selection of targeting peptides and vasculature-directed in vivo phage screening is replaced by migration selection, which exploits the ability of 'promiscuous' phages and their proteins to migrate through the tumor-surrounding cellular barriers, using a 'hub and spoke' delivery strategy, and penetrate into the tumor, affecting the diverse tumor cell population. The 'self-navigating' drug-delivery paradigm can be used as a theoretical and technical platform in the design of a novel generation of molecular medications and imaging probes for precise and personal medicine. [Formula: see text].

  8. Autonomous Navigation Performance During The Hartley 2 Comet Flyby

    NASA Technical Reports Server (NTRS)

    Abrahamson, Matthew J; Kennedy, Brian A.; Bhaskaran, Shyam

    2012-01-01

    On November 4, 2010, the EPOXI spacecraft performed a 700-km flyby of the comet Hartley 2 as follow-on to the successful 2005 Deep Impact prime mission. EPOXI, an extended mission for the Deep Impact Flyby spacecraft, returned a wealth of visual and infrared data from Hartley 2, marking the fifth time that high-resolution images of a cometary nucleus have been captured by a spacecraft. The highest resolution science return, captured at closest approach to the comet nucleus, was enabled by use of an onboard autonomous navigation system called AutoNav. AutoNav estimates the comet-relative spacecraft trajectory using optical measurements from the Medium Resolution Imager (MRI) and provides this relative position information to the Attitude Determination and Control System (ADCS) for maintaining instrument pointing on the comet. For the EPOXI mission, AutoNav was tasked to enable continuous tracking of a smaller, more active Hartley 2, as compared to Tempel 1, through the full encounter while traveling at a higher velocity. To meet the mission goal of capturing the comet in all MRI science images, position knowledge accuracies of +/- 3.5 km (3-σ) cross track and +/- 0.3 seconds (3-σ) time of flight were required. A flight-code-in-the-loop Monte Carlo simulation assessed AutoNav's statistical performance under the Hartley 2 flyby dynamics and determined optimal configuration. The AutoNav performance at Hartley 2 was successful, capturing the comet in all of the MRI images. The maximum residual between observed and predicted comet locations was 20 MRI pixels, primarily influenced by the center of brightness offset from the center of mass in the observations and attitude knowledge errors. This paper discusses the Monte Carlo-based analysis that led to the final AutoNav configuration and a comparison of the predicted performance with the flyby performance.

  9. Polarized skylight navigation.

    PubMed

    Hamaoui, Moshe

    2017-01-20

    Vehicle state estimation is an essential prerequisite for navigation. The present approach seeks to use skylight polarization to facilitate state estimation under autonomous unconstrained flight conditions. Atmospheric scattering polarizes incident sunlight such that solar position is mathematically encoded in the resulting skylight polarization pattern. Indeed, several species of insects are able to sense skylight polarization and are believed to navigate polarimetrically. Sun-finding methodologies for polarized skylight navigation (PSN) have been proposed in the literature but typically rely on calibration updates to account for changing atmospheric conditions and/or are limited to 2D operation. To address this technology gap, a gradient-based PSN solution is developed based upon the Rayleigh sky model. The solution is validated in simulation, and effects of measurement error and changing atmospheric conditions are investigated. Finally, an experimental effort is described wherein polarimetric imagery is collected, ground-truth is established through independent imager-attitude measurement, the gradient-based PSN solution is applied, and results are analyzed.
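
    The forward model underlying the approach is the single-scattering Rayleigh sky: the degree of polarization depends on the scattering angle between the view and sun directions, and the E-vector lies perpendicular to the scattering plane. The sketch below evaluates that forward model for one sky point; the inverse, gradient-based sun-finding step described in the abstract is not shown, and p_max is an assumed parameter.

        import numpy as np

        def rayleigh_sky(view_dir, sun_dir, p_max=1.0):
            """Single-scattering Rayleigh sky model at one sky point.

            view_dir, sun_dir : 3-vectors from the observer toward the sky point
                                and toward the Sun.
            Returns (degree_of_polarization, e_vector), where e_vector is the
            polarization direction, perpendicular to the Sun-observer-view plane.
            """
            v = np.array(view_dir, dtype=float); v /= np.linalg.norm(v)
            s = np.array(sun_dir, dtype=float);  s /= np.linalg.norm(s)
            cos_g = float(np.clip(v @ s, -1.0, 1.0))           # scattering angle gamma
            dop = p_max * (1.0 - cos_g**2) / (1.0 + cos_g**2)  # Rayleigh degree of polarization
            e_vec = np.cross(v, s)                             # perpendicular to scattering plane
            norm = np.linalg.norm(e_vec)
            e_vec = e_vec / norm if norm > 0 else e_vec        # degenerate when looking at the Sun
            return dop, e_vec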

  10. A novel interplanetary optical navigation algorithm based on Earth-Moon group photos by Chang'e-5T1 probe

    NASA Astrophysics Data System (ADS)

    Bu, Yanlong; Zhang, Qiang; Ding, Chibiao; Tang, Geshi; Wang, Hang; Qiu, Rujin; Liang, Libo; Yin, Hejun

    2017-02-01

    This paper presents an interplanetary optical navigation algorithm based on two spherical celestial bodies. The remarkable characteristic of the method is that key navigation parameters can be estimated depending entirely on the known sizes and ephemerides of the two celestial bodies; in particular, positioning is realized through a single image and no longer relies on traditional terrestrial radio tracking. Actual Earth-Moon group photos captured by China's Chang'e-5T1 probe were used to verify the effectiveness of the algorithm. From 430,000 km away from the Earth, the camera pointing accuracy reaches 0.01° (one sigma) and the inertial positioning error is less than 200 km; meanwhile, the costs of ground control and human resources are greatly reduced. The algorithm is flexible, easy to implement, and can provide a reference for interplanetary autonomous navigation in the solar system.
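
    As a simplified sketch of how one image of two bodies with known ephemerides and sizes can yield a position fix: the apparent angular radius of each body gives its range, and subtracting the range-scaled direction from the body's ephemeris position gives the spacecraft position. The code below assumes the camera's inertial attitude is already known and simply averages the two single-body fixes, which is a cruder combination than the paper's algorithm.

        import numpy as np

        def position_from_body(body_pos_km, body_radius_km, unit_dir_inertial, angular_radius_rad):
            """Spacecraft inertial position from one imaged body of known size.

            body_pos_km        : body centre position from the ephemeris (km)
            body_radius_km     : known physical radius of the body (km)
            unit_dir_inertial  : unit vector from spacecraft to body centre, in the
                                 inertial frame (requires known camera attitude)
            angular_radius_rad : apparent angular radius measured in the image
            """
            rho = body_radius_km / np.sin(angular_radius_rad)   # range to the body centre
            return np.asarray(body_pos_km, float) - rho * np.asarray(unit_dir_inertial, float)

        # With both the Earth and the Moon in one frame, the two single-body fixes can
        # be averaged (or combined in a least-squares/filter step) into one estimate.
        def position_from_two_bodies(fix_earth_km, fix_moon_km):
            return 0.5 * (np.asarray(fix_earth_km, float) + np.asarray(fix_moon_km, float))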

  11. 3D Reconfigurable MPSoC for Unmanned Spacecraft Navigation

    NASA Astrophysics Data System (ADS)

    Dekoulis, George

    2016-07-01

    This paper describes the design of a new lightweight spacecraft navigation system for unmanned space missions. The system addresses the demands for more efficient autonomous navigation in the near-Earth environment or deep space. The proposed instrumentation is directly suitable for unmanned systems operation and testing of new airborne prototypes for remote sensing applications. The system features a new sensor technology and significant improvements over existing solutions. Fluxgate type sensors have been traditionally used in unmanned defense systems such as target drones, guided missiles, rockets and satellites, however, the guidance sensors' configurations exhibit lower specifications than the presented solution. The current implementation is based on a recently developed material in a reengineered optimum sensor configuration for unprecedented low-power consumption. The new sensor's performance characteristics qualify it for spacecraft navigation applications. A major advantage of the system is the efficiency in redundancy reduction achieved in terms of both hardware and software requirements.

  12. An Outdoor Navigation Platform with a 3D Scanner and Gyro-assisted Odometry

    NASA Astrophysics Data System (ADS)

    Yoshida, Tomoaki; Irie, Kiyoshi; Koyanagi, Eiji; Tomono, Masahiro

    This paper proposes a light-weight navigation platform that consists of gyro-assisted odometry, a 3D laser scanner and map-based localization for human-scale robots. The gyro-assisted odometry provides highly accurate positioning only by dead-reckoning. The 3D laser scanner has a wide field of view and uniform measuring-point distribution. The map-based localization is robust and computationally inexpensive by utilizing a particle filter on a 2D grid map generated by projecting 3D points on to the ground. The system uses small and low-cost sensors, and can be applied to a variety of mobile robots in human-scale environments. Outdoor navigation experiments were conducted at the Tsukuba Challenge held in 2009 and 2010, which is an open proving ground for human-scale robots. Our robot successfully navigated the assigned 1-km courses in a fully autonomous mode multiple times.
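
    A compressed sketch of the map-based localization step described above: particles are propagated with the gyro-assisted odometry increment and weighted by how well the projected scan points line up with occupied cells of the 2-D grid map. Array layouts, noise levels, and the scoring rule are assumptions for illustration, not the authors' implementation.

        import numpy as np

        def localize_step(particles, odom_delta, scan_xy, grid, resolution, rng):
            """One particle-filter update on a 2-D grid map (minimal sketch).

            particles  : (N, 3) array of [x, y, heading]
            odom_delta : (dx, dy, dheading) from gyro-assisted odometry, robot frame
            scan_xy    : (M, 2) scan endpoints in the robot frame (projected 3-D points)
            grid       : 2-D boolean occupancy array (True = obstacle) from the map
            resolution : metres per grid cell
            """
            n = len(particles)
            dx, dy, dth = odom_delta

            # Motion update: apply odometry in each particle's own frame, plus noise.
            c, s = np.cos(particles[:, 2]), np.sin(particles[:, 2])
            particles[:, 0] += c * dx - s * dy + rng.normal(0, 0.02, n)
            particles[:, 1] += s * dx + c * dy + rng.normal(0, 0.02, n)
            particles[:, 2] += dth + rng.normal(0, 0.01, n)

            # Measurement update: score each particle by how many transformed scan
            # points land on occupied cells of the map.
            weights = np.full(n, 1e-9)
            for i, (px, py, pth) in enumerate(particles):
                ct, st = np.cos(pth), np.sin(pth)
                gx = ((px + ct * scan_xy[:, 0] - st * scan_xy[:, 1]) / resolution).astype(int)
                gy = ((py + st * scan_xy[:, 0] + ct * scan_xy[:, 1]) / resolution).astype(int)
                ok = (gx >= 0) & (gx < grid.shape[0]) & (gy >= 0) & (gy < grid.shape[1])
                weights[i] += grid[gx[ok], gy[ok]].sum()

            # Resample in proportion to the scores.
            weights /= weights.sum()
            idx = rng.choice(n, size=n, p=weights)
            return particles[idx].copy()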

  13. Slime mold uses an externalized spatial “memory” to navigate in complex environments

    PubMed Central

    Reid, Chris R.; Latty, Tanya; Dussutour, Audrey; Beekman, Madeleine

    2012-01-01

    Spatial memory enhances an organism’s navigational ability. Memory typically resides within the brain, but what if an organism has no brain? We show that the brainless slime mold Physarum polycephalum constructs a form of spatial memory by avoiding areas it has previously explored. This mechanism allows the slime mold to solve the U-shaped trap problem—a classic test of autonomous navigational ability commonly used in robotics—requiring the slime mold to reach a chemoattractive goal behind a U-shaped barrier. Drawn into the trap, the organism must rely on other methods than gradient-following to escape and reach the goal. Our data show that spatial memory enhances the organism’s ability to navigate in complex environments. We provide a unique demonstration of a spatial memory system in a nonneuronal organism, supporting the theory that an externalized spatial memory may be the functional precursor to the internal memory of higher organisms. PMID:23045640

  14. Slime mold uses an externalized spatial "memory" to navigate in complex environments.

    PubMed

    Reid, Chris R; Latty, Tanya; Dussutour, Audrey; Beekman, Madeleine

    2012-10-23

    Spatial memory enhances an organism's navigational ability. Memory typically resides within the brain, but what if an organism has no brain? We show that the brainless slime mold Physarum polycephalum constructs a form of spatial memory by avoiding areas it has previously explored. This mechanism allows the slime mold to solve the U-shaped trap problem--a classic test of autonomous navigational ability commonly used in robotics--requiring the slime mold to reach a chemoattractive goal behind a U-shaped barrier. Drawn into the trap, the organism must rely on other methods than gradient-following to escape and reach the goal. Our data show that spatial memory enhances the organism's ability to navigate in complex environments. We provide a unique demonstration of a spatial memory system in a nonneuronal organism, supporting the theory that an externalized spatial memory may be the functional precursor to the internal memory of higher organisms.

  15. Ranging Consistency Based on Ranging-Compensated Temperature-Sensing Sensor for Inter-Satellite Link of Navigation Constellation

    PubMed Central

    Meng, Zhijun; Yang, Jun; Guo, Xiye; Zhou, Yongbin

    2017-01-01

    Global Navigation Satellite System performance can be significantly enhanced by introducing inter-satellite links (ISLs) in navigation constellation. The improvement in position, velocity, and time accuracy as well as the realization of autonomous functions requires ISL distance measurement data as the original input. To build a high-performance ISL, the ranging consistency among navigation satellites is an urgent problem to be solved. In this study, we focus on the variation in the ranging delay caused by the sensitivity of the ISL payload equipment to the ambient temperature in space and propose a simple and low-power temperature-sensing ranging compensation sensor suitable for onboard equipment. The experimental results show that, after the temperature-sensing ranging compensation of the ISL payload equipment, the ranging consistency becomes less than 0.2 ns when the temperature change is 90 °C. PMID:28608809
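
    The compensation itself can be as simple as a calibrated temperature-to-delay curve subtracted from each raw range. The sketch below fits a low-order polynomial to an assumed calibration sweep (the numbers are invented, not the paper's data) and applies it to incoming measurements.

        import numpy as np

        # Calibration: fit the ranging delay measured over a temperature sweep
        # (illustrative numbers only) with a low-order polynomial.
        cal_temp_c = np.array([-30.0, 0.0, 30.0, 60.0])
        cal_delay_ns = np.array([0.62, 0.41, 0.23, 0.08])
        coeffs = np.polyfit(cal_temp_c, cal_delay_ns, deg=2)

        def compensate_range(raw_range_ns, temp_c):
            """Subtract the temperature-dependent hardware delay from a raw ISL range."""
            return raw_range_ns - np.polyval(coeffs, temp_c)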

  16. Navigation system for a mobile robot with a visual sensor using a fish-eye lens

    NASA Astrophysics Data System (ADS)

    Kurata, Junichi; Grattan, Kenneth T. V.; Uchiyama, Hironobu

    1998-02-01

    Various position sensing and navigation systems have been proposed for the autonomous control of mobile robots. Some of these systems have been installed with an omnidirectional visual sensor system that proved very useful in obtaining information on the environment around the mobile robot for position reckoning. In this article, this type of navigation system is discussed. The sensor is composed of one TV camera with a fish-eye lens, using a reference target on a ceiling and hybrid image processing circuits. The position of the robot, with respect to the floor, is calculated by integrating the information obtained from a visual sensor and a gyroscope mounted in the mobile robot, and the use of a simple algorithm based on PTP control for guidance is discussed. An experimental trial showed that the proposed system was both valid and useful for the navigation of an indoor vehicle.

  17. Miniaturized Autonomous Extravehicular Robotic Camera (Mini AERCam)

    NASA Technical Reports Server (NTRS)

    Fredrickson, Steven E.

    2001-01-01

    The NASA Johnson Space Center (JSC) Engineering Directorate is developing the Autonomous Extravehicular Robotic Camera (AERCam), a low-volume, low-mass free-flying camera system. AERCam project team personnel recently initiated development of a miniaturized version of AERCam known as Mini AERCam. The Mini AERCam target design is a spherical "nanosatellite" free-flyer 7.5 inches in diameter and weighing 10 pounds. Mini AERCam is building on the success of the AERCam Sprint STS-87 flight experiment by adding new on-board sensing and processing capabilities while simultaneously reducing volume by 80%. Achieving enhanced capability in a smaller package depends on applying miniaturization technology across virtually all subsystems. Technology innovations being incorporated include micro electromechanical system (MEMS) gyros, "camera-on-a-chip" CMOS imagers, a rechargeable xenon gas propulsion system, a rechargeable lithium ion battery, custom avionics based on the PowerPC 740 microprocessor, GPS relative navigation, digital radio frequency communications and tracking, micropatch antennas, digital instrumentation, and dense mechanical packaging. The Mini AERCam free-flyer will initially be integrated into an approximate flight-like configuration for demonstration on an airbearing table. A pilot-in-the-loop and hardware-in-the-loop simulation to simulate on-orbit navigation and dynamics will complement the airbearing table demonstration. The Mini AERCam lab demonstration is intended to form the basis for future development of an AERCam flight system that provides beneficial on-orbit views unobtainable from fixed cameras, cameras on robotic manipulators, or cameras carried by EVA crewmembers.

  18. Volunteers Oriented Interface Design for the Remote Navigation of Rescue Robots at Large-Scale Disaster Sites

    NASA Astrophysics Data System (ADS)

    Yang, Zhixiao; Ito, Kazuyuki; Saijo, Kazuhiko; Hirotsune, Kazuyuki; Gofuku, Akio; Matsuno, Fumitoshi

    This paper aims at constructing an efficient interface, similar to those widely used in human daily life, to fulfill the need of many volunteer rescuers operating rescue robots at large-scale disaster sites. The developed system includes a force-feedback steering wheel interface and an artificial neural network (ANN) based mouse-screen interface. The former consists of a force-feedback steering control and a wall of six monitors. It provides manual operation, similar to driving a car, for navigating a rescue robot. The latter consists of a mouse and a camera view displayed in a monitor. It provides semi-autonomous operation by mouse clicking to navigate a rescue robot. Results of experiments show that a novice volunteer can skillfully navigate a tank rescue robot through either interface after 20 to 30 minutes of learning its operation. The steering wheel interface gives high navigating speed in open areas, without restriction by the terrain and surface conditions of a disaster site. The mouse-screen interface is good at exact navigation in complex structures, while bringing little tension to operators. The two interfaces are designed so that the operator can switch between them at any time, providing a combined, efficient navigation method.

  19. Natural Models for Autonomous Control of Spatial Navigation, Sensing, and Guidance

    DTIC Science & Technology

    2013-06-26

    mantis shrimp, inspired largely by our efforts, can be found at: The Oatmeal - http://theoatmeal.com/comics/mantis_shrimp. Courtesy of these guys... Ecology and Environmental Education (20-21 January, Tainan, Taiwan). 14 M How, NJ Marshall 2012 Polarisation vision, an unexplored channel for

  20. Cancellation of the Army’s Autonomous Navigation System

    DTIC Science & Technology

    2012-08-02

    Auto/Truck Various Vehicle Leader/Follower, Road Following Google Driverless Vehicle Google Road Following Source: GAO presentation of data from Red...both of which are estimated to cost over $300,000 per system. However, Google’s Driverless Vehicle and the Southwest Research Institute’s Mobile

  1. Natural Models for Autonomous Control of Spatial Navigation, Sensing, and Guidance, Part 1

    DTIC Science & Technology

    2011-06-16

    polarization sensitivity ten times more acute than previously documented in cephalopods (cuttlefish and octopus) and crustaceans (mantis shrimps and crabs... responded to perceived looming stimuli in a number of ways, including swimming movements, skin colour and texture changes, which were recorded using

  2. Drift Recovery and Station Keeping for the CanX-4 & CanX-5 Nanosatellite Formation Flying Mission

    NASA Astrophysics Data System (ADS)

    Newman, Joshua Zachary

    Canadian Advanced Nanospace eXperiments 4 & 5 (CanX-4&5) are a pair of formation flying nanosatellites that demonstrated autonomous sub-metre formation control at ranges of 1000 to 50 m. To facilitate the autonomous formation flight mission, it is necessary that the two spacecraft be brought within a few kilometres of one another, with a low relative velocity. Therefore, a system to calculate fuel-efficient recovery trajectories and produce the corresponding spacecraft commands was required. This system was also extended to provide station keeping capabilities. In this thesis, the overall drift recovery strategy is outlined, and the design of the controller is detailed. A method of putting the formation into a passively safe state, where the spacecraft cannot collide, is also presented. Monte-Carlo simulations are used to estimate the fuel losses associated with navigational and attitude errors. Finally, on-orbit results are presented, validating both the design and the error expectations.

  3. Maiden Voyage of the Under-Ice Float

    NASA Astrophysics Data System (ADS)

    Shcherbina, A.; D'Asaro, E. A.; Light, B.; Deming, J. W.; Rehm, E.

    2016-02-01

    The Under-Ice Float (UIF) is a new autonomous platform for sea ice and upper ocean observations in the marginal ice zone (MIZ). UIF is based on the Mixed Layer Lagrangian Float design, inheriting its accurate buoyancy control and relatively heavy payload capability. A major challenge for sustained autonomous observations in the MIZ is detection of open water for navigation and telemetry surfacings. UIF employs a new surface classification algorithm based on the spectral analysis of surface roughness sensed by an upward-looking sonar. A prototype UIF was deployed in the MIZ of the central Arctic Ocean in late August 2015. The main payload of the first UIF was a bio-optical suite consisting of upward- and downward-looking hyperspectral radiometers; temperature, salinity, chlorophyll, turbidity, and dissolved oxygen sensors; and a high-definition photo camera. In the early stages of its mission, the float successfully avoided ice, detected leads, surfaced in open water, and transmitted data and photographs. We will present the analysis of these observations from the full UIF mission extending into the freeze-up season.

  4. Adaptive Control for Autonomous Navigation of Mobile Robots Considering Time Delay and Uncertainty

    NASA Astrophysics Data System (ADS)

    Armah, Stephen Kofi

    Autonomous control of mobile robots has attracted considerable attention from researchers in the areas of robotics and autonomous systems during the past decades. One of the goals in the field of mobile robotics is the development of platforms that robustly operate in given, partially unknown, or unpredictable environments and offer desired services to humans. Autonomous mobile robots need to be equipped with effective, robust, and/or adaptive navigation control systems. In spite of the enormous body of reported work on autonomous navigation control systems for mobile robots, achieving this goal is still an open problem. Robustness and reliability of the controlled system can always be improved. The fundamental issues affecting the stability of the control systems include the undesired nonlinear effects introduced by actuator saturation, time delay in the controlled system, and uncertainty in the model. This research develops robustly stabilizing control systems by investigating and addressing such nonlinear effects through analysis, simulations, and experiments. The control systems are designed to meet specified transient and steady-state requirements. The systems used for this research are ground (Dr Robot X80SV) and aerial (Parrot AR.Drone 2.0) mobile robots. Firstly, an effective autonomous navigation control system is developed for the X80SV using logic control that combines 'go-to-goal', 'avoid-obstacle', and 'follow-wall' controllers. A MATLAB robot simulator is developed to implement this control algorithm, and experiments are conducted in a typical office environment. The next stage of the research develops autonomous position (x, y, and z) and attitude (roll, pitch, and yaw) controllers for a quadrotor, with PD-feedback control used to achieve stabilization. The quadrotor's nonlinear dynamics and kinematics are implemented using a MATLAB S-function to generate the state output. Secondly, white-box and black-box approaches are used to obtain linearized second-order altitude models for the AR.Drone 2.0 quadrotor. Proportional (P), pole placement or proportional plus velocity (PV), linear quadratic regulator (LQR), and model reference adaptive control (MRAC) controllers are designed and validated through simulations using MATLAB/Simulink. Control input saturation and time delay in the controlled systems are also studied. MATLAB graphical user interface (GUI) and Simulink programs are developed to implement the controllers on the drone. Thirdly, the time delay in the drone's control system is estimated using analytical and experimental methods. In the experimental approach, the transient properties of the experimental altitude responses are compared to those of simulated responses. The analytical approach makes use of the Lambert W function to obtain analytical solutions of scalar first-order delay differential equations (DDEs). A time-delayed P-feedback control system (retarded type) is used in estimating the time delay. Improved system performance is then obtained by incorporating the estimated time delay into the design of the PV control system (neutral type) and the PV-MRAC control system. Furthermore, the stability of a parametrically perturbed linear time-invariant (LTI) retarded-type system is studied. This is done by analytically calculating the stability radius of the system. Simulations of the control system are conducted to confirm stability. This robust control design and uncertainty analysis are conducted for first-order and second-order quadrotor models.
    Lastly, the robustly designed PV and PV-MRAC control systems are used to autonomously track multiple waypoints. The robustness of the PV-MRAC controller is also tested against a baseline PV controller using the drone's payload capability. It is shown that the PV-MRAC controller offers several benefits over the fixed-gain approach of the PV controller, and the adaptive control is found to offer enhanced robustness to payload fluctuations.
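
    For the scalar first-order DDE x'(t) = a*x(t) + b*x(t-h) mentioned above, the Lambert W function gives the characteristic roots in closed form as s_k = a + W_k(b*h*exp(-a*h))/h. The snippet below is a minimal sketch of that computation using SciPy's lambertw; the gain and delay are placeholder values, not the identified AR.Drone 2.0 model, and the principal branch (k = 0) is the one commonly taken as the rightmost, stability-determining root for this class of systems.

      import numpy as np
      from scipy.special import lambertw

      def dde_roots(a, b, h, branches=range(-2, 3)):
          """Characteristic roots s_k = a + W_k(b*h*exp(-a*h))/h of x'(t) = a*x(t) + b*x(t-h)."""
          z = b * h * np.exp(-a * h)
          return {k: a + lambertw(z, k) / h for k in branches}

      # Hypothetical delayed P-feedback altitude loop, x'(t) = -kp*x(t-h): a = 0, b = -kp.
      kp, h = 1.2, 0.4                    # placeholder gain [1/s] and loop delay [s]
      roots = dde_roots(a=0.0, b=-kp, h=h)
      s0 = roots[0]                       # principal-branch root, typically the rightmost
      print(f"dominant root: {s0:.4f} -> {'stable' if s0.real < 0.0 else 'unstable'}")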

  5. A Low Cost Sensors Approach for Accurate Vehicle Localization and Autonomous Driving Application.

    PubMed

    Vivacqua, Rafael; Vassallo, Raquel; Martins, Felipe

    2017-10-16

    Autonomous driving on public roads requires localization precise to within a few centimeters. Even the best current precise localization systems based on the Global Navigation Satellite System (GNSS) cannot always reach this level of precision, especially in urban environments, where the signal is disturbed by surrounding buildings and artifacts. Laser range finders and stereo vision have been successfully used for obstacle detection, mapping, and localization to solve the autonomous driving problem. Unfortunately, Light Detection and Ranging (LIDAR) sensors are very expensive, and stereo vision requires powerful dedicated hardware to process the camera data. In this context, this article presents a low-cost sensor architecture and data fusion algorithm capable of autonomous driving on narrow two-way roads. Our approach exploits a combination of a short-range visual lane-marking detector and a dead reckoning system to build a long and precise perception of the lane markings behind the vehicle. This information is used to localize the vehicle in a map that also contains the reference trajectory for autonomous driving. Experimental results show the successful application of the proposed system in a real autonomous driving situation.
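
    The paper's filter is not reproduced here; the sketch below, with invented names and a plain unicycle odometry model, only illustrates the dead-reckoning half of such a pipeline: integrating wheel odometry and re-expressing previously detected lane-marking points in the current vehicle frame, so that a short-range detector accumulates into a long rear-view perception of the markings.

      import numpy as np

      class DeadReckoner:
          """Integrates wheel-odometry increments (unicycle model) and keeps past
          lane-marking detections registered in the current vehicle frame."""

          def __init__(self):
              self.x = self.y = self.yaw = 0.0     # pose in an arbitrary odometry frame
              self.markings_world = []             # lane-marking points, odometry frame

          def update(self, ds, dyaw):
              """ds: travelled distance [m]; dyaw: heading change [rad] for one step."""
              self.yaw += dyaw
              self.x += ds * np.cos(self.yaw)
              self.y += ds * np.sin(self.yaw)

          def add_detection(self, pt_vehicle):
              """Store a lane-marking point detected in the current vehicle frame."""
              c, s = np.cos(self.yaw), np.sin(self.yaw)
              px, py = pt_vehicle
              self.markings_world.append((self.x + c*px - s*py, self.y + s*px + c*py))

          def markings_in_vehicle_frame(self):
              """Past detections re-expressed relative to the current pose (rear map)."""
              c, s = np.cos(self.yaw), np.sin(self.yaw)
              out = []
              for wx, wy in self.markings_world:
                  dx, dy = wx - self.x, wy - self.y
                  out.append((c*dx + s*dy, -s*dx + c*dy))
              return out

      dr = DeadReckoner()
      dr.add_detection((2.0, 1.5))          # marking 2 m ahead, 1.5 m to the left
      dr.update(ds=5.0, dyaw=0.05)          # drive 5 m with a slight left turn
      print(dr.markings_in_vehicle_frame()) # the marking now appears behind the vehicle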

  6. A Low Cost Sensors Approach for Accurate Vehicle Localization and Autonomous Driving Application

    PubMed Central

    Vassallo, Raquel

    2017-01-01

    Autonomous driving on public roads requires localization precise to within a few centimeters. Even the best current precise localization systems based on the Global Navigation Satellite System (GNSS) cannot always reach this level of precision, especially in urban environments, where the signal is disturbed by surrounding buildings and artifacts. Laser range finders and stereo vision have been successfully used for obstacle detection, mapping, and localization to solve the autonomous driving problem. Unfortunately, Light Detection and Ranging (LIDAR) sensors are very expensive, and stereo vision requires powerful dedicated hardware to process the camera data. In this context, this article presents a low-cost sensor architecture and data fusion algorithm capable of autonomous driving on narrow two-way roads. Our approach exploits a combination of a short-range visual lane-marking detector and a dead reckoning system to build a long and precise perception of the lane markings behind the vehicle. This information is used to localize the vehicle in a map that also contains the reference trajectory for autonomous driving. Experimental results show the successful application of the proposed system in a real autonomous driving situation. PMID:29035334

  7. Using Planning, Scheduling and Execution for Autonomous Mars Rover Operations

    NASA Technical Reports Server (NTRS)

    Estlin, Tara A.; Gaines, Daniel M.; Chouinard, Caroline M.; Fisher, Forest W.; Castano, Rebecca; Judd, Michele J.; Nesnas, Issa A.

    2006-01-01

    With each new rover mission to Mars, rovers are traveling significantly longer distances. This increase in distance not only raises the opportunities for science data collection, but also amplifies the amount of environmental and rover-state uncertainty that must be handled in rover operations. This paper describes how planning, scheduling, and execution techniques can be used onboard a rover to autonomously generate and execute rover activities and, in particular, to handle new science opportunities that have been identified dynamically. We also discuss some of the particular challenges we face in supporting autonomous rover decision-making. These include interaction with rover navigation and path-planning software and handling large amounts of uncertainty in state and resource estimations. Finally, we describe our experiences in testing this work using several Mars rover prototypes in a realistic environment.
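
    The onboard planner itself is not described in detail in this abstract; the fragment below is only a toy illustration of the kind of decision involved, greedily inserting a dynamically identified science observation into a simple activity list when time and energy margins allow, or bumping a lower-priority activity when they do not. Activity names, durations, and margins are invented.

      from dataclasses import dataclass

      @dataclass
      class Activity:
          name: str
          duration_s: float
          energy_wh: float
          priority: int               # higher value = more important

      def try_insert(plan, candidate, time_margin_s, energy_margin_wh):
          """Greedy opportunistic insertion: accept a new science activity if the remaining
          time and energy margins cover it, otherwise try to bump a lower-priority activity
          that frees enough resources; reject it if neither works."""
          if candidate.duration_s <= time_margin_s and candidate.energy_wh <= energy_margin_wh:
              plan.append(candidate)
              return plan, time_margin_s - candidate.duration_s, energy_margin_wh - candidate.energy_wh
          for victim in sorted(plan, key=lambda act: act.priority):
              if victim.priority >= candidate.priority:
                  break                                    # nothing less important left to bump
              if (time_margin_s + victim.duration_s >= candidate.duration_s and
                      energy_margin_wh + victim.energy_wh >= candidate.energy_wh):
                  plan.remove(victim)
                  plan.append(candidate)
                  return (plan,
                          time_margin_s + victim.duration_s - candidate.duration_s,
                          energy_margin_wh + victim.energy_wh - candidate.energy_wh)
          return plan, time_margin_s, energy_margin_wh     # rejected, plan unchanged

      plan = [Activity("drive_segment", 1800, 60, 5), Activity("routine_pancam", 300, 5, 2)]
      plan, t_left, e_left = try_insert(plan, Activity("dust_devil_followup", 400, 8, 4), 350, 10)
      print([a.name for a in plan], t_left, e_left)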

  8. Bioinspired magnetoreception and navigation using magnetic signatures as waypoints.

    PubMed

    Taylor, Brian K

    2018-05-15

    Diverse taxa use Earth's magnetic field in conjunction with other sensory modalities to accomplish navigation tasks ranging from local homing to long-distance migration across continents and ocean basins. However, despite extensive research, the mechanisms that underlie animal magnetoreception are not clearly understood, and how animals use Earth's magnetic field to navigate is an active area of investigation. Concurrently, Earth's magnetic field offers a signal that engineered systems can leverage for navigation in environments where man-made systems such as GPS are unavailable or unreliable. Using a proxy for Earth's magnetic field, and inspired by migratory animal behavior, this work implements a behavioral strategy that uses combinations of magnetic field properties as rare or unique signatures that mark specific locations. Using a discrete number of these signatures as goal waypoints, the strategy navigates through a closed set of points several times in a variety of environmental conditions and with various levels of sensor noise. The results from this engineering/quantitative biology approach support existing notions that some animals may use combinations of magnetic properties as navigational markers, and provide insights into the features and constraints that would enable navigational success or failure. The findings also offer insights into how autonomous engineered platforms might be designed to leverage the magnetic field as a navigational resource.
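
    The study's field model and behavioral rules are not given in the abstract, so the sketch below is purely schematic: it assumes a synthetic field in which intensity and inclination vary smoothly with position, treats a waypoint as a target (intensity, inclination) pair, and steps in whichever of 16 compass headings most reduces the mismatch between the sensed and target signature. All constants are invented.

      import numpy as np

      # Synthetic field map: intensity [nT] and inclination [deg] vary smoothly with
      # position (x, y in km). Purely illustrative, not geophysical data.
      def field(p):
          x, y = p
          return np.array([45000.0 + 3.0 * y + 0.5 * x,      # intensity
                           60.0 + 0.01 * y - 0.004 * x])     # inclination

      def signature_error(p, target, scale=(100.0, 0.1)):
          """Weighted mismatch between the sensed and target (intensity, inclination) pair."""
          return float(np.linalg.norm((field(p) - target) / np.asarray(scale)))

      def navigate_to_signature(start, target, step_km=1.0, tol=0.25, max_steps=5000):
          """Step in whichever of 16 headings most reduces the signature mismatch."""
          p = np.array(start, dtype=float)
          headings = [np.array([np.cos(a), np.sin(a)])
                      for a in np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)]
          for _ in range(max_steps):
              if signature_error(p, target) < tol:
                  return p, True
              p = min((p + step_km * h for h in headings),
                      key=lambda q: signature_error(q, target))
          return p, False

      goal_signature = field(np.array([120.0, -300.0]))   # signature that marks the goal waypoint
      pos, reached = navigate_to_signature(start=(0.0, 0.0), target=goal_signature)
      print(reached, pos)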

  9. Enhanced Formation Flying for the Earth Observing-1 (EO-1) New Millennium Mission

    NASA Technical Reports Server (NTRS)

    Folta, David; Quinn, David

    1997-01-01

    With scientific objectives for Earth observation programs becoming more ambitious and spacecraft becoming more autonomous, the need for new technical approaches to achieving and maintaining formations of spacecraft has come to the forefront. The trend toward small, low-cost spacecraft has led many scientists to recognize the advantage of flying several spacecraft in formation to achieve the correlated instrument measurements formerly possible only by flying many instruments on a single large platform. Yet formation flying imposes additional complications on orbit maintenance, especially when each spacecraft has its own orbit requirements. However, advances in automation proposed by GSFC Codes 550 and 712 allow more of the burden of maneuver planning and execution to be placed onboard the spacecraft, mitigating some of the associated operational concerns. The purpose of this analysis is to develop the fundamentals of formation flying mechanics, concepts for understanding the relative motion of free-flying spacecraft, and an operational control theory for formation maintenance of the Earth Observing-1 (EO-1) spacecraft, part of the New Millennium program. Results of this development can be used to determine the appropriateness of formation flying for a particular case as well as its operational impacts. Applications to the Mission to Planet Earth (MTPE) Earth Observing System (EOS) and New Millennium (NM) programs were considered throughout the analysis. This paper presents the proposed methods for guidance and control of the EO-1 spacecraft to fly in formation with the Landsat-7 spacecraft using autonomous closed-loop three-axis navigation control, GPS, and crosslink navigation support. Simulation results using various fidelity levels of modeling, algorithms developed and implemented in MATLAB, and autonomous 'fuzzy logic' control using AutoCon will be presented. The results of these analyses on the ability to meet mission and formation flying requirements will also be presented.
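
    For readers unfamiliar with the relative-motion fundamentals this kind of analysis builds on, the standard Clohessy-Wiltshire (Hill) equations describe a chaser's motion about a target in a circular reference orbit. The sketch below simply integrates them numerically; the orbit radius and initial offsets are arbitrary examples, not EO-1/Landsat-7 values, and this is the textbook model rather than the paper's specific formulation.

      import numpy as np
      from scipy.integrate import solve_ivp

      MU = 3.986004418e14            # Earth's gravitational parameter [m^3/s^2]
      a_ref = 7080.0e3               # reference circular orbit radius [m], illustrative only
      n = np.sqrt(MU / a_ref**3)     # mean motion of the reference orbit [rad/s]

      def cw_rhs(t, s):
          """Clohessy-Wiltshire equations: x radial, y along-track, z cross-track."""
          x, y, z, vx, vy, vz = s
          return [vx, vy, vz,
                  3.0 * n**2 * x + 2.0 * n * vy,
                  -2.0 * n * vx,
                  -n**2 * z]

      # 10 m radial and 1 km along-track offset, zero initial relative velocity (illustrative)
      s0 = [10.0, 1000.0, 0.0, 0.0, 0.0, 0.0]
      period = 2.0 * np.pi / n
      sol = solve_ivp(cw_rhs, (0.0, period), s0, max_step=10.0, rtol=1e-9)
      print("relative position after one orbit [m]:", sol.y[:3, -1])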

  10. A learning-based semi-autonomous controller for robotic exploration of unknown disaster scenes while searching for victims.

    PubMed

    Doroodgar, Barzin; Liu, Yugang; Nejat, Goldie

    2014-12-01

    Semi-autonomous control schemes can address the limitations of both teleoperation and fully autonomous robotic control of rescue robots in disaster environments by allowing a human operator to cooperate with a rescue robot and share such tasks as navigation, exploration, and victim identification. In this paper, we present a unique hierarchical reinforcement learning (HRL)-based semi-autonomous control architecture for rescue robots operating in cluttered and unknown urban search and rescue (USAR) environments. The aim of the controller is to enable a rescue robot to continuously learn from its own experiences in an environment in order to improve its overall performance in the exploration of unknown disaster scenes. A direction-based exploration technique is integrated into the controller to expand the search area of the robot via the classification of regions and the rubble piles within these regions. Both simulations and physical experiments in USAR-like environments verify the robustness of the proposed HRL-based semi-autonomous controller to unknown cluttered scenes with different sizes and varying types of configurations.
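
    The paper's hierarchical architecture is not specified in enough detail here to reproduce, so the fragment below is only a generic tabular Q-learning update of the kind such learning-based exploration controllers build on, with hypothetical direction actions and a stubbed reward standing in for area-coverage and rubble-classification feedback.

      import random
      from collections import defaultdict

      ACTIONS = ["north", "south", "east", "west"]          # hypothetical exploration directions

      class DirectionQLearner:
          """Generic tabular Q-learning over (region class, exploration direction) pairs."""

          def __init__(self, alpha=0.1, gamma=0.9, epsilon=0.2):
              self.q = defaultdict(float)                   # Q[(state, action)]
              self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

          def choose(self, state):
              if random.random() < self.epsilon:            # epsilon-greedy exploration
                  return random.choice(ACTIONS)
              return max(ACTIONS, key=lambda a: self.q[(state, a)])

          def update(self, state, action, reward, next_state):
              best_next = max(self.q[(next_state, a)] for a in ACTIONS)
              target = reward + self.gamma * best_next
              self.q[(state, action)] += self.alpha * (target - self.q[(state, action)])

      # Toy usage with a stubbed reward standing in for coverage / rubble feedback.
      agent = DirectionQLearner()
      state = "open_region"
      for _ in range(200):
          action = agent.choose(state)
          reward = 1.0 if action == "east" else -0.1
          agent.update(state, action, reward, state)
      print(max(ACTIONS, key=lambda a: agent.q[(state, a)]))   # most likely prints "east"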

  11. Tier-scalable reconnaissance: the future in autonomous C4ISR systems has arrived: progress towards an outdoor testbed

    NASA Astrophysics Data System (ADS)

    Fink, Wolfgang; Brooks, Alexander J.-W.; Tarbell, Mark A.; Dohm, James M.

    2017-05-01

    Autonomous reconnaissance missions are called for in extreme environments, as well as in potentially hazardous (e.g., theaters of conflict, disaster-stricken areas) or inaccessible operational areas (e.g., planetary surfaces, space). Such future missions will require increasing degrees of operational autonomy, especially when following up on transient events. Operational autonomy encompasses: (1) automatic characterization of operational areas from different vantages (i.e., spaceborne, airborne, surface, subsurface); (2) automatic sensor deployment and data gathering; (3) automatic feature extraction, including anomaly detection and region-of-interest identification; (4) automatic target prediction and prioritization; and (5) subsequent automatic (re-)deployment and navigation of robotic agents. This paper reports on progress towards several aspects of autonomous C4ISR systems, including: the Caltech-patented and NASA award-winning multi-tiered mission paradigm, robotic platform development (air, ground, and water-based), robotic behavior motifs as the building blocks for autonomous tele-commanding, and autonomous decision making based on a Caltech-patented framework comprising sensor data fusion (feature vectors), anomaly detection (clustering and principal component analysis), and target prioritization (hypothetical probing).
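
    As a minimal sketch of the anomaly-detection step named above (principal component analysis over fused sensor feature vectors), the code below fits principal axes to nominal feature vectors and flags samples whose reconstruction error from that subspace is unusually large. The feature dimensions, synthetic data, and threshold rule are invented for illustration.

      import numpy as np

      def fit_pca(reference, n_components=2):
          """Principal axes of nominal feature vectors (n_samples x n_dims)."""
          X = np.asarray(reference, dtype=float)
          mu = X.mean(axis=0)
          _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
          return mu, vt[:n_components]

      def anomaly_scores(features, mu, basis):
          """Distance of each feature vector from the principal subspace (reconstruction error)."""
          Xc = np.asarray(features, dtype=float) - mu
          recon = Xc @ basis.T @ basis
          return np.linalg.norm(Xc - recon, axis=1)

      rng = np.random.default_rng(1)
      nominal = rng.normal(0.0, 1.0, size=(200, 6))        # nominal fused feature vectors
      outliers = rng.normal(6.0, 1.0, size=(3, 6))         # e.g., candidate regions of interest
      mu, basis = fit_pca(nominal)
      scores = anomaly_scores(np.vstack([nominal, outliers]), mu, basis)
      threshold = scores[:200].mean() + 3.0 * scores[:200].std()
      print("flagged sample indices:", np.flatnonzero(scores > threshold))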

  12. Autonomous mobile robot teams

    NASA Technical Reports Server (NTRS)

    Agah, Arvin; Bekey, George A.

    1994-01-01

    This paper describes autonomous mobile robot teams performing tasks in unstructured environments. The behavior and intelligence of the group are distributed, and the system does not include a central command base or leader. The novel concept of the Tropism-Based Cognitive Architecture is introduced, which the robots use to produce behavior that transforms their sensory information into appropriate actions. The results of a number of simulation experiments are presented. These experiments include worlds where the robot teams must locate, decompose, and gather objects, and defend themselves against hostile predators, all while navigating around stationary and mobile obstacles.
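
    The Tropism-Based Cognitive Architecture is only named in the abstract; purely as an illustration of the general idea of a weighted mapping from sensed entities to actions whose strengths adapt with experience, consider the hypothetical sketch below. The entities, actions, and strengths are all invented.

      import random

      # Each tropism: (perceived entity, action, attraction strength). Values are invented.
      tropisms = [
          ("object",   "gather",   0.8),
          ("predator", "flee",     0.9),
          ("obstacle", "avoid",    0.6),
          ("teammate", "approach", 0.3),
      ]

      def select_action(percepts):
          """Pick the action whose matching tropism is strongest; wander if nothing is sensed."""
          candidates = [(s, a) for e, a, s in tropisms if e in percepts]
          if not candidates:
              return "wander"
          best = max(s for s, _ in candidates)
          return random.choice([a for s, a in candidates if s == best])

      def reinforce(entity, action, delta=0.05):
          """Strengthen a tropism after a successful outcome (simple adaptation rule)."""
          for i, (e, a, s) in enumerate(tropisms):
              if e == entity and a == action:
                  tropisms[i] = (e, a, min(1.0, s + delta))

      print(select_action({"object", "obstacle"}))   # -> "gather"
      reinforce("teammate", "approach")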

  13. A design strategy for autonomous systems

    NASA Technical Reports Server (NTRS)

    Forster, Pete

    1989-01-01

    Some solutions to crucial issues regarding the competent performance of an autonomously operating robot are identified, namely handling multiple and variable data sources containing overlapping information, and maintaining coherent operation while responding adequately to changes in the environment. Support for the ideas developed for the construction of such behavior is drawn from speculations in the study of cognitive psychology, an understanding of the behavior of controlled mechanisms, and the development of behavior-based robots in a few robot research laboratories. The validity of these ideas is supported by simple simulation experiments in the field of mobile robot navigation and guidance.

  14. Mission Executor for an Autonomous Underwater Vehicle

    DTIC Science & Technology

    1991-09-01

    which must control and interpret sensory output for navigation and recognition of various obstructions and provide adaptability strategies for local...

  15. End-to-end information system concept for the Mars Telecommunications Orbiter

    NASA Technical Reports Server (NTRS)

    Breidenthal, Julian C.; Edwards, Charles D.; Greenberg, Edward; Kazz, Greg J.; Noreen, Gary K.

    2006-01-01

    The Mars Telecommunications Orbiter (MTO) was intended to provide high-performance deep space relay links to landers, orbiters, sample-return missions, and approaching spacecraft in the vicinity of Mars, to demonstrate interplanetary laser communications, to demonstrate autonomous navigation, and to carry out its own science investigations.

  17. Gyro and Accelerometer Based Navigation System for a Mobile Autonomous Robot.

    DTIC Science & Technology

    1985-12-02


  18. Further development and flight test of an autonomous precision landing system using a parafoil

    NASA Technical Reports Server (NTRS)

    Murray, James E.; Sim, Alex G.; Neufeld, David C.; Rennich, Patrick K.; Norris, Stephen R.; Hughes, Wesley S.

    1994-01-01

    NASA Dryden Flight Research Center and NASA Johnson Space Center are jointly conducting a phased program to determine the feasibility of the autonomous recovery of a spacecraft using a ram-air parafoil system for the final stages of entry from space to a precision landing. The feasibility is being studied using a flight model of a spacecraft in the generic shape of a flattened biconic that weighs approximately 120 lb and is flown under a commercially available ram-air parafoil. Key components of the vehicle include the global positioning system (GPS) guidance for navigation, a flight control computer, an electronic compass, a yaw rate gyro, and an onboard data recorder. A flight test program is being used to develop and refine the vehicle. The primary flight goal is to demonstrate autonomous flight from an altitude of 3,000 m (10,000 ft) with a lateral offset of 1.6 km (1.0 mi) to a precision soft landing. This paper summarizes the progress to date. Much of the navigation system has been tested, including a heading tracker that was developed using parameter estimation techniques and a complementary filter. The autoland portion of the autopilot is still in development. The feasibility of conducting the flare maneuver without servoactuators was investigated as a means of significantly reducing the servoactuator rate and load requirements.
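
    The heading tracker is only summarized above; the sketch below shows the standard complementary-filter idea it alludes to, blending the integrated yaw-rate gyro (trusted at high frequency) with the electronic compass (drift-free but noisy). The blend time constant and the synthetic test signals are illustrative, not the flight system's values.

      import numpy as np

      def wrap_pi(angle):
          """Wrap an angle (or array of angles) to [-pi, pi)."""
          return (angle + np.pi) % (2.0 * np.pi) - np.pi

      def complementary_heading(gyro_rate, compass, dt, tau=2.0):
          """Fuse yaw-rate gyro samples [rad/s] with compass headings [rad].

          tau [s] sets the crossover frequency: a smaller tau trusts the compass more.
          """
          alpha = tau / (tau + dt)
          heading = compass[0]
          est = []
          for w, psi_mag in zip(gyro_rate, compass):
              pred = heading + w * dt                    # high-frequency path: integrate the gyro
              err = wrap_pi(psi_mag - pred)              # low-frequency correction from the compass
              heading = wrap_pi(pred + (1.0 - alpha) * err)
              est.append(heading)
          return np.array(est)

      # Synthetic check: true heading ramps at 0.1 rad/s, the gyro carries a 0.02 rad/s bias,
      # and the compass has 5 deg noise; the fused estimate should track the ramp without drifting.
      dt = 0.1
      t = np.arange(0.0, 60.0, dt)
      true_heading = 0.1 * t
      gyro = 0.1 + 0.02 + 0.01 * np.random.default_rng(2).standard_normal(t.size)
      compass = wrap_pi(true_heading + np.radians(5.0) * np.random.default_rng(3).standard_normal(t.size))
      est = complementary_heading(gyro, compass, dt)
      print("final heading error [deg]:", np.degrees(wrap_pi(est[-1] - true_heading[-1])))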

  19. Merge Fuzzy Visual Servoing and GPS-Based Planning to Obtain a Proper Navigation Behavior for a Small Crop-Inspection Robot.

    PubMed

    Bengochea-Guevara, José M; Conesa-Muñoz, Jesus; Andújar, Dionisio; Ribeiro, Angela

    2016-02-24

    The concept of precision agriculture, which proposes farming management adapted to crop variability, has emerged in recent years. To effectively implement precision agriculture, data must be gathered from the field in an automated manner at minimal cost. In this study, a small autonomous field inspection vehicle was developed to minimise the impact of the scouting on the crop and soil compaction. The proposed approach integrates a camera with a GPS receiver to obtain a set of basic behaviours required of an autonomous mobile robot to inspect a crop field with full coverage. A path planner considered the field contour and the crop type to determine the best inspection route. An image-processing method capable of extracting the central crop row under uncontrolled lighting conditions in real time from images acquired with a reflex camera positioned on the front of the robot was developed. Two fuzzy controllers were also designed and developed to achieve vision-guided navigation. A method for detecting the end of a crop row using camera-acquired images was developed. In addition, manoeuvres necessary for the robot to change rows were established. These manoeuvres enabled the robot to autonomously cover the entire crop by following a previously established plan and without stepping on the crop row, which is an essential behaviour for covering crops such as maize without damaging them.
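
    The two fuzzy controllers are not specified in the abstract; the following is a small Sugeno-style sketch of how fuzzy visual servoing on the detected crop row can work, with invented membership functions over the row's lateral offset and heading error and a weighted-average defuzzification. Sign conventions, breakpoints, and rule outputs are illustrative only.

      def tri(x, a, b, c):
          """Triangular membership function rising from a, peaking at b, falling to c."""
          rise = (x - a) / (b - a) if b != a else 1.0
          fall = (c - x) / (c - b) if c != b else 1.0
          return max(0.0, min(rise, fall))

      def fuzzy_steering(offset_m, heading_err_rad):
          """Steering command [rad, positive = turn left] from the detected crop row.

          offset_m: lateral error from the row centre (positive = robot right of the row).
          heading_err_rad: heading error from the row direction (positive = pointing right).
          """
          off = {"left":   tri(offset_m, -0.6, -0.3, 0.0),
                 "centre": tri(offset_m, -0.3, 0.0, 0.3),
                 "right":  tri(offset_m, 0.0, 0.3, 0.6)}
          hdg = {"left":    tri(heading_err_rad, -0.5, -0.25, 0.0),
                 "aligned": tri(heading_err_rad, -0.25, 0.0, 0.25),
                 "right":   tri(heading_err_rad, 0.0, 0.25, 0.5)}
          # Sugeno rules: (firing strength, crisp steering output in rad)
          rules = [(min(off["right"], hdg["aligned"]), +0.15),   # robot right of row -> steer left
                   (min(off["left"], hdg["aligned"]), -0.15),    # robot left of row -> steer right
                   (min(off["centre"], hdg["right"]), +0.10),    # drifting right -> steer left
                   (min(off["centre"], hdg["left"]), -0.10),     # drifting left -> steer right
                   (min(off["centre"], hdg["aligned"]), 0.0)]    # on the row -> go straight
          num = sum(w * u for w, u in rules)
          den = sum(w for w, _ in rules)
          return num / den if den > 1e-9 else 0.0

      print(fuzzy_steering(offset_m=0.15, heading_err_rad=0.05))   # small corrective left steer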

  20. Merge Fuzzy Visual Servoing and GPS-Based Planning to Obtain a Proper Navigation Behavior for a Small Crop-Inspection Robot

    PubMed Central

    Bengochea-Guevara, José M.; Conesa-Muñoz, Jesus; Andújar, Dionisio; Ribeiro, Angela

    2016-01-01

    The concept of precision agriculture, which proposes farming management adapted to crop variability, has emerged in recent years. To effectively implement precision agriculture, data must be gathered from the field in an automated manner at minimal cost. In this study, a small autonomous field inspection vehicle was developed to minimise the impact of the scouting on the crop and soil compaction. The proposed approach integrates a camera with a GPS receiver to obtain a set of basic behaviours required of an autonomous mobile robot to inspect a crop field with full coverage. A path planner considered the field contour and the crop type to determine the best inspection route. An image-processing method capable of extracting the central crop row under uncontrolled lighting conditions in real time from images acquired with a reflex camera positioned on the front of the robot was developed. Two fuzzy controllers were also designed and developed to achieve vision-guided navigation. A method for detecting the end of a crop row using camera-acquired images was developed. In addition, manoeuvres necessary for the robot to change rows were established. These manoeuvres enabled the robot to autonomously cover the entire crop by following a previously established plan and without stepping on the crop row, which is an essential behaviour for covering crops such as maize without damaging them. PMID:26927102
