Autonomous Relative Navigation for Formation-Flying Satellites Using GPS
NASA Technical Reports Server (NTRS)
Gramling, Cheryl; Carpenter, J. Russell; Long, Anne; Kelbel, David; Lee, Taesul
2000-01-01
The Goddard Space Flight Center is currently developing advanced spacecraft systems to provide autonomous navigation and control of formation flyers. This paper discusses autonomous relative navigation performance for a formation of four eccentric, medium-altitude Earth-orbiting satellites using Global Positioning System (GPS) Standard Positioning Service (SPS) and "GPS-like" intersatellite measurements. The performance of several candidate relative navigation approaches is evaluated. These analyses indicate that an autonomous relative navigation position accuracy of 1 meter root-mean-square can be achieved by differencing high-accuracy filtered solutions if only measurements from common GPS space vehicles are used in the independently estimated solutions.
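As an illustrative sketch of the differencing approach described above (the function name and data layout are hypothetical, not taken from the paper), the relative state can be formed by subtracting two independently filtered absolute solutions at epochs where both filters tracked a common set of GPS space vehicles:

```python
import numpy as np

def relative_state_by_differencing(sol_a, sol_b, svs_a, svs_b):
    """Difference two independently filtered absolute solutions, keeping only
    epochs at which both filters used measurements from common GPS space vehicles.

    sol_a, sol_b : dict mapping epoch -> 6-element state (position, velocity) [m, m/s]
    svs_a, svs_b : dict mapping epoch -> set of GPS PRNs processed at that epoch
    """
    relative = {}
    for t in sorted(sol_a.keys() & sol_b.keys()):
        common = svs_a.get(t, set()) & svs_b.get(t, set())
        if not common:
            continue  # skip epochs without common GPS space vehicles
        relative[t] = np.asarray(sol_b[t]) - np.asarray(sol_a[t])
    return relative
```

Restricting the difference to common-SV epochs lets errors common to both absolute solutions (for example, broadcast ephemeris errors) largely cancel, which is presumably why the abstract requires common GPS space vehicles in the independently estimated solutions.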
Autonomous Navigation Using Celestial Objects
NASA Technical Reports Server (NTRS)
Folta, David; Gramling, Cheryl; Leung, Dominic; Belur, Sheela; Long, Anne
1999-01-01
In the twenty-first century, National Aeronautics and Space Administration (NASA) Enterprises envision frequent low-cost missions to explore the solar system, observe the universe, and study our planet. Satellite autonomy is a key technology required to reduce satellite operating costs. The Guidance, Navigation, and Control Center (GNCC) at the Goddard Space Flight Center (GSFC) currently sponsors several initiatives associated with the development of advanced spacecraft systems to provide autonomous navigation and control. Autonomous navigation has the potential both to increase spacecraft navigation system performance and to reduce total mission cost. By eliminating the need for routine ground-based orbit determination and special tracking services, autonomous navigation can streamline spacecraft ground systems. Autonomous navigation products can be included in the science telemetry and forwarded directly to the scientific investigators. In addition, autonomous navigation products are available onboard to enable other autonomous capabilities, such as attitude control, maneuver planning and orbit control, and communications signal acquisition. Autonomous navigation is required to support advanced mission concepts such as satellite formation flying. GNCC has successfully developed high-accuracy autonomous navigation systems for near-Earth spacecraft using NASA's space and ground communications systems and the Global Positioning System (GPS). Recently, GNCC has expanded its autonomous navigation initiative to include satellite orbits that are beyond the regime in which use of GPS is possible. Currently, GNCC is assessing the feasibility of using standard spacecraft attitude sensors and communication components to provide autonomous navigation for missions including: libration point, gravity assist, high-Earth, and interplanetary orbits. The concept being evaluated uses a combination of star, Sun, and Earth sensor measurements along with forward-link Doppler measurements from the command link carrier to autonomously estimate the spacecraft's orbit and reference oscillator's frequency. To support autonomous attitude determination and control and maneuver planning and control, the orbit determination accuracy should be on the order of kilometers in position and centimeters per second in velocity. A less accurate solution (one hundred kilometers in position) could be used for acquisition purposes for command and science downloads. This paper provides performance results for both libration point orbiting and high Earth orbiting satellites as a function of sensor measurement accuracy, measurement types, measurement frequency, initial state errors, and dynamic modeling errors.
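As a rough sketch of the measurement models implied by this concept (all symbols, signs, and function names below are illustrative assumptions, not taken from the paper), the filter state could consist of inertial position, velocity, and a fractional reference-oscillator frequency offset, with Sun and Earth sensor unit vectors and forward-link Doppler as observables:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def predicted_measurements(r, v, frac_osc_offset, r_sun, r_station, v_station, f_uplink):
    """Predicted celestial and Doppler observables for a spacecraft state of
    inertial position r [m], velocity v [m/s], and fractional oscillator
    frequency offset frac_osc_offset (dimensionless)."""
    sun_dir = (r_sun - r) / np.linalg.norm(r_sun - r)   # Sun sensor unit vector
    nadir_dir = -r / np.linalg.norm(r)                   # Earth sensor (nadir) unit vector
    los = r - r_station
    rho = np.linalg.norm(los)
    rho_dot = np.dot(los, v - v_station) / rho           # range rate along the line of sight
    # One-way forward-link Doppler shift observed against the onboard reference
    # oscillator; the sign convention here is illustrative only.
    doppler = -f_uplink * rho_dot / C + f_uplink * frac_osc_offset
    return sun_dir, nadir_dir, doppler
```

An onboard estimator would compare these predictions with the actual sensor outputs and command-link Doppler extractions to correct the orbit and oscillator states.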
Autonomous Deep-Space Optical Navigation Project
NASA Technical Reports Server (NTRS)
D'Souza, Christopher
2014-01-01
This project will advance the autonomous deep-space navigation capability applied to the Autonomous Rendezvous and Docking (AR&D) Guidance, Navigation and Control (GNC) system by testing it on hardware, particularly in a flight processor, with a goal of limited testing in the Integrated Power, Avionics and Software (IPAS) facility with the ARCM (Asteroid Retrieval Crewed Mission) DRO (Distant Retrograde Orbit) Autonomous Rendezvous and Docking (AR&D) scenario. The technology to be harnessed is called 'optical flow', also known as 'visual odometry'. It is being matured in automotive and SLAM (Simultaneous Localization and Mapping) applications but has yet to be applied to spacecraft navigation. In light of the tremendous potential of this technique, we believe that NASA needs to design an optical navigation architecture that will use this technique. It is flexible enough to be applicable to navigating around planetary bodies, such as asteroids.
Navigation for the new millennium: Autonomous navigation for Deep Space 1
NASA Technical Reports Server (NTRS)
Reidel, J. E.; Bhaskaran, S.; Synnott, S. P.; Desai, S. D.; Bollman, W. E.; Dumont, P. J.; Halsell, C. A.; Han, D.; Kennedy, B. M.; Null, G. W.;
1997-01-01
The autonomous optical navigation system technology for the Deep Space 1 (DS1) mission is reported on. The DS1 navigation system will be the first to use autonomous navigation in deep space. The system's tasks are to: perform interplanetary cruise orbit determination using images of distant asteroids; control and maintain the orbit of the spacecraft with an ion propulsion system and conventional thrusters; and perform late knowledge updates of target position during close flybys in order to facilitate high-quality data return from asteroid McAuliffe and comet West-Kohoutek-Ikemura. To accomplish these tasks, the following functions are required: picture planning; image processing; dynamical modeling and integration; planetary ephemeris and star catalog handling; orbit determination; data filtering and estimation; maneuver estimation; and spacecraft ephemeris updating. These systems and functions are described and preliminary performance data are presented.
Autonomous assistance navigation for robotic wheelchairs in confined spaces.
Cheein, Fernando Auat; Carelli, Ricardo; De la Cruz, Celso; Muller, Sandra; Bastos Filho, Teodiano F
2010-01-01
In this work, a visual interface for the assistance of a robotic wheelchair's navigation is presented. The visual interface is developed for navigation in confined spaces such as narrow corridors or corridor ends. The interface supports two navigation modes: non-autonomous and autonomous. Non-autonomous driving of the robotic wheelchair is performed by means of a hand joystick, which directs the motion of the vehicle within the environment. Autonomous driving is performed when the user of the wheelchair has to turn (90, -90, or 180 degrees) within the environment. The turning strategy is performed by a maneuverability algorithm compatible with the kinematics of the wheelchair and by the SLAM (Simultaneous Localization and Mapping) algorithm. The SLAM algorithm provides the interface with information concerning the environment layout and the pose (position and orientation) of the wheelchair within the environment. Experimental and statistical results of the interface are also shown in this work.
Relative Navigation of Formation Flying Satellites
NASA Technical Reports Server (NTRS)
Long, Anne; Kelbel, David; Lee, Taesul; Leung, Dominic; Carpenter, Russell; Gramling, Cheryl; Bauer, Frank (Technical Monitor)
2002-01-01
The Guidance, Navigation, and Control Center (GNCC) at Goddard Space Flight Center (GSFC) has successfully developed high-accuracy autonomous satellite navigation systems using the National Aeronautics and Space Administration's (NASA's) space and ground communications systems and the Global Positioning System (GPS). In addition, an autonomous navigation system that uses celestial object sensor measurements is currently under development and has been successfully tested using real Sun and Earth horizon measurements. The GNCC has developed advanced spacecraft systems that provide autonomous navigation and control of formation flyers in near-Earth, high-Earth, and libration point orbits. To support this effort, the GNCC is assessing the relative navigation accuracy achievable for proposed formations using GPS, intersatellite crosslink, ground-to-satellite Doppler, and celestial object sensor measurements. This paper evaluates the performance of these relative navigation approaches for three proposed missions with two or more vehicles maintaining relatively tight formations. High-fidelity simulations were performed to quantify the absolute and relative navigation accuracy as a function of navigation algorithm and measurement type. Realistically simulated measurements were processed using the extended Kalman filter implemented in the GPS-Enhanced Onboard Navigation System (GEONS) flight software developed by the GSFC GNCC. Solutions obtained by simultaneously estimating all satellites in the formation were compared with the results obtained using a simpler approach based on differencing independently estimated state vectors.
Autonomous Navigation Improvements for High-Earth Orbiters Using GPS
NASA Technical Reports Server (NTRS)
Long, Anne; Kelbel, David; Lee, Taesul; Garrison, James; Carpenter, J. Russell; Bauer, F. (Technical Monitor)
2000-01-01
The Goddard Space Flight Center is currently developing autonomous navigation systems for satellites in high-Earth orbits where acquisition of the GPS signals is severely limited. This paper discusses autonomous navigation improvements for high-Earth orbiters and assesses projected navigation performance for these satellites using Global Positioning System (GPS) Standard Positioning Service (SPS) measurements. Navigation performance is evaluated as a function of signal acquisition threshold, measurement errors, and dynamic modeling errors using realistic GPS signal strength and user antenna models. These analyses indicate that an autonomous navigation position accuracy of better than 30 meters root-mean-square (RMS) can be achieved for high-Earth orbiting satellites using a GPS receiver with a very stable oscillator. This accuracy improves to better than 15 meters RMS if the GPS receiver's signal acquisition threshold can be reduced by 5 dB-Hertz to track weaker signals.
Visual Odometry for Autonomous Deep-Space Navigation
NASA Technical Reports Server (NTRS)
Robinson, Shane; Pedrotty, Sam
2016-01-01
Visual Odometry fills two critical needs shared by all future exploration architectures considered by NASA: Autonomous Rendezvous and Docking (AR&D), and autonomous navigation during loss of communications. To do this, a camera is combined with cutting-edge algorithms (called Visual Odometry) into a unit that provides an accurate relative pose between the camera and the object in the imagery. Recent simulation analyses have demonstrated the ability of this new technology to reliably, accurately, and quickly compute a relative pose. This project advances the technology by both preparing the system to process flight imagery and creating an activity to capture that imagery. This technology can provide a pioneering optical navigation platform capable of supporting a wide variety of future mission scenarios: deep space rendezvous, asteroid exploration, and loss-of-communications operations.
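A minimal sketch of the relative-pose computation behind visual odometry, using a conventional monocular feature-tracking and essential-matrix pipeline (offered purely for illustration; it is not the project's flight algorithm):

```python
import cv2
import numpy as np

def relative_pose_from_images(img_prev, img_next, K):
    """Estimate the camera rotation R and unit translation direction t between
    two grayscale frames. K is the 3x3 camera intrinsic matrix."""
    pts_prev = cv2.goodFeaturesToTrack(img_prev, maxCorners=500,
                                       qualityLevel=0.01, minDistance=7)
    pts_next, status, _ = cv2.calcOpticalFlowPyrLK(img_prev, img_next, pts_prev, None)
    good_prev = pts_prev[status.flatten() == 1]
    good_next = pts_next[status.flatten() == 1]
    E, mask = cv2.findEssentialMat(good_prev, good_next, K,
                                   method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, good_prev, good_next, K, mask=mask)
    return R, t
```

A monocular pipeline of this kind recovers translation only up to scale; a flight implementation would have to supply scale from a range sensor or from known target geometry.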
Learning for autonomous navigation
NASA Technical Reports Server (NTRS)
Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric
2005-01-01
Autonomous off-road navigation of robotic ground vehicles has important applications on Earth and in space exploration. Progress in this domain has been retarded by the limited lookahead range of 3-D sensors and by the difficulty of preprogramming systems to understand the traversability of the wide variety of terrain they can encounter.
Autonomous navigation and obstacle avoidance for unmanned surface vehicles
NASA Astrophysics Data System (ADS)
Larson, Jacoby; Bruch, Michael; Ebken, John
2006-05-01
The US Navy and other Department of Defense (DoD) and Department of Homeland Security (DHS) organizations are increasingly interested in the use of unmanned surface vehicles (USVs) for a variety of missions and applications. In order for USVs to fill these roles, they must be capable of a relatively high degree of autonomous navigation. Space and Naval Warfare Systems Center, San Diego is developing core technologies required for robust USV operation in a real-world environment, primarily focusing on autonomous navigation, obstacle avoidance, and path planning.
Autonomous Navigation for Deep Space Missions
NASA Technical Reports Server (NTRS)
Bhaskaran, Shyam
2012-01-01
Navigation (determining where the spacecraft is at any given time and controlling its path to achieve desired targets) has traditionally been performed using ground-in-the-loop techniques: (1) data include two-way radiometric measurements (Doppler, range), interferometric measurements (Delta-Differential One-way Range), and optical measurements (images of natural bodies taken by an onboard camera); (2) data are received on the ground and processed to determine the orbit, and commands are sent to execute maneuvers to control the orbit. A self-contained, onboard, autonomous navigation system can: (1) eliminate delays due to round-trip light time; (2) eliminate the human factors in ground-based processing; (3) reduce the turnaround time for a navigation update to minutes or even seconds; and (4) react to late-breaking data. At JPL, we have developed the framework and computational elements of an autonomous navigation system, called AutoNav. It was originally developed as one of the technologies for the Deep Space 1 mission, launched in 1998, and has subsequently been used on three other spacecraft for four different missions. The primary use has been on comet missions to track comets during flybys, and to impact one comet.
Development of Mission Enabling Infrastructure — Cislunar Autonomous Positioning System (CAPS)
NASA Astrophysics Data System (ADS)
Cheetham, B. W.
2017-10-01
Advanced Space, LLC is developing the Cislunar Autonomous Positioning System (CAPS) which would provide a scalable and evolvable architecture for navigation to reduce ground congestion and improve operations for missions throughout cislunar space.
Navigation Architecture For A Space Mobile Network
NASA Technical Reports Server (NTRS)
Valdez, Jennifer E.; Ashman, Benjamin; Gramling, Cheryl; Heckler, Gregory W.; Carpenter, Russell
2016-01-01
The Tracking and Data Relay Satellite System (TDRSS) Augmentation Service for Satellites (TASS) is a proposed beacon service to provide a global, space-based GPS augmentation service based on the NASA Global Differential GPS (GDGPS) System. The TASS signal will be tied to the GPS time system and usable as an additional ranging and Doppler radiometric source. Additionally, it will provide data vital to autonomous navigation in the near Earth regime, including space weather information, TDRS ephemerides, Earth Orientation Parameters (EOP), and forward commanding capability. TASS benefits include enhancing situational awareness, enabling increased autonomy, and providing near real-time command access for user platforms. As NASA Headquarters Space Communication and Navigation Office (SCaN) begins to move away from a centralized network architecture and towards a Space Mobile Network (SMN) that allows for user initiated services, autonomous navigation will be a key part of such a system. This paper explores how a TASS beacon service enables the Space Mobile Networking paradigm, what a typical user platform would require, and provides an in-depth analysis of several navigation scenarios and operations concepts.
Design and Development of the WVU Advanced Technology Satellite for Optical Navigation
NASA Astrophysics Data System (ADS)
Straub, Miranda
In order to meet the demands of future space missions, it is beneficial for spacecraft to have the capability to support autonomous navigation. This is true for both crewed and uncrewed vehicles. For crewed vehicles, autonomous navigation would allow the crew to safely navigate home in the event of a communication system failure. For uncrewed missions, autonomous navigation reduces the demand on ground-based infrastructure and could allow for more flexible operation. One promising technique for achieving these goals is through optical navigation. To this end, the present work considers how camera images of the Earth's surface could enable autonomous navigation of a satellite in low Earth orbit. Specifically, this study will investigate the use of coastlines and other natural land-water boundaries for navigation. Observed coastlines can be matched to a pre-existing coastline database in order to determine the location of the spacecraft. This paper examines how such measurements may be processed in an on-board extended Kalman filter (EKF) to provide completely autonomous estimates of the spacecraft state throughout the duration of the mission. In addition, future work includes implementing this work on a CubeSat mission within the WVU Applied Space Exploration Lab (ASEL). The mission titled WVU Advanced Technology Satellite for Optical Navigation (WATSON) will provide students with an opportunity to experience the life cycle of a spacecraft from design through operation while hopefully meeting the primary and secondary goals defined for mission success. The spacecraft design process, although simplified by CubeSat standards, will be discussed in this thesis as well as the current results of laboratory testing with the CubeSat model in the ASEL.
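As a hedged illustration of how a matched coastline landmark could feed the onboard EKF described above (the frames, function names, and bearing-type measurement are assumptions for illustration, not necessarily WATSON's design):

```python
import numpy as np

def predicted_landmark_bearing(r_sc_ecef, C_cam_from_ecef, r_landmark_ecef):
    """Predicted camera-frame unit vector to a coastline landmark whose
    Earth-fixed coordinates come from the coastline database."""
    los_ecef = r_landmark_ecef - r_sc_ecef
    los_cam = C_cam_from_ecef @ los_ecef
    return los_cam / np.linalg.norm(los_cam)

def bearing_residual(measured_dir_cam, r_sc_ecef, C_cam_from_ecef, r_landmark_ecef):
    """Innovation fed to the EKF update: observed minus predicted bearing to the
    matched landmark. The filter maps this residual into position and velocity
    corrections through the measurement Jacobian."""
    return measured_dir_cam - predicted_landmark_bearing(
        r_sc_ecef, C_cam_from_ecef, r_landmark_ecef)
```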
The JPL roadmap for Deep Space navigation
NASA Technical Reports Server (NTRS)
Martin-Mur, Tomas J.; Abraham, Douglas S.; Berry, David; Bhaskaran, Shyam; Cesarone, Robert J.; Wood, Lincoln
2006-01-01
This paper reviews the tentative set of deep space missions that will be supported by NASA's Deep Space Mission System in the next twenty-five years, and extracts the driving set of navigation capabilities that these missions will require. There will be many challenges including the support of new mission navigation approaches such as formation flying and rendezvous in deep space, low-energy and low-thrust orbit transfers, precise landing and ascent vehicles, and autonomous navigation. Innovative strategies and approaches will be needed to develop and field advanced navigation capabilities.
The Development of a Simulator System and Hardware Test Bed for Deep Space X-Ray Navigation
NASA Astrophysics Data System (ADS)
Doyle, Patrick T.
2013-03-01
Currently, there is considerable interest in developing technologies that will allow the use of photon measurements from celestial X-ray sources for deep space navigation. The impetus for this is that many envisioned future space missions will require spacecraft to have autonomous navigation capabilities. For missions close to Earth, Global Navigation Satellite Systems (GNSS) such as GPS are readily available for use, but for missions far from Earth, other alternatives must be provided. While existing systems such as the Deep Space Network (DSN) can be used, latencies associated with servicing a fleet of vehicles may not be compatible with some autonomous operations requiring timely updates of their navigation solution. Because of their somewhat predictable emissions, pulsars are ideal candidates for X-ray sources that can be used to provide key parameters for navigation. Algorithms and simulation tools that enable designing and analyzing X-ray navigation concepts are presented. The development of a compact X-ray detector system is pivotal to the eventual deployment of such navigation systems; therefore, results of a high-altitude balloon test to evaluate the design of a compact X-ray detector system are described as well.
Bourbakis, N G
1997-01-01
This paper presents a generic traffic priority language, called KYKLOFORTA, used by autonomous robots for collision-free navigation in a dynamic unknown or known navigation space. In a previous work by X. Grossmman (1988), a set of traffic control rules was developed for the navigation of robots on the lines of a two-dimensional (2-D) grid, and a control center coordinated and synchronized their movements. In this work, the robots are considered autonomous: they are moving anywhere and in any direction inside the free space, and there is no need for a central control to coordinate and synchronize them. The requirements for each robot are i) visual perception, ii) range sensors, and iii) the ability to detect other moving objects in the same free navigation space and determine the other objects' perceived size, velocity, and direction. Based on these assumptions, a traffic priority language is needed for each robot, enabling it to make decisions during navigation and avoid possible collisions with other moving objects. The traffic priority language proposed here is based on a primitive traffic priority alphabet and a set of rules which compose patterns of corridors for the application of the traffic priority rules.
NASA Technical Reports Server (NTRS)
Winternitz, Luke
2017-01-01
This talk will describe two first-of-their-kind technology demonstrations attached to ongoing NASA science missions, both of which aim to extend the range of autonomous spacecraft navigation far from the Earth. First, we will describe the onboard GPS navigation system for the Magnetospheric Multiscale (MMS) mission which is currently operating in elliptic orbits reaching nearly halfway to the Moon. The MMS navigation system is a key outgrowth of a larger effort at NASA Goddard Space Flight Center to advance high-altitude Global Navigation Satellite System (GNSS) navigation on multiple fronts, including developing Global Positioning System receivers and onboard navigation software, running simulation studies, and leading efforts to characterize and protect signals at high-altitude in the so-called GNSS Space-Service Volume (SSV). In the second part of the talk, we will describe the Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) mission that aims to make the first in-space demonstration of X-ray pulsar navigation (XNAV). SEXTANT is attached to the NASA astrophysics mission Neutron-star Interior Composition ExploreR (NICER) whose International Space Station mounted X-ray telescope is investigating the fundamental physics of extremes in gravity, material density, and electromagnetic fields found in neutron stars, and whose instrument provides a nearly ideal navigation sensor for XNAV.
Experiment D009: Simple navigation
NASA Technical Reports Server (NTRS)
Silva, R. M.; Jorris, T. R.; Vallerie, E. M., III
1971-01-01
Space position-fixing techniques have been investigated by collecting data on the observable phenomena of space flight that could be used to solve the problem of autonomous navigation by the use of optical data and manual computations to calculate the position of a spacecraft. After completion of the developmental and test phases, the product of the experiment would be a manual-optical technique of orbital space navigation that could be used as a backup to onboard and ground-based spacecraft-navigation systems.
The use of x-ray pulsar-based navigation method for interplanetary flight
NASA Astrophysics Data System (ADS)
Yang, Bo; Guo, Xingcan; Yang, Yong
2009-07-01
As interplanetary missions become increasingly complex, the existing mature interplanetary navigation method, based mainly on radiometric tracking techniques of the Deep Space Network, cannot meet the rising demands of autonomous real-time navigation. This paper studied the application to interplanetary flight of a new navigation technology under rapid development, X-ray pulsar-based navigation for spacecraft (XPNAV), and evaluated its performance with a computer simulation. XPNAV is an excellent autonomous real-time navigation method and can provide comprehensive navigation information, including position, velocity, attitude, attitude rate, and time. In the paper the fundamental principles and time transformation of XPNAV were analyzed, and then the Delta-correction XPNAV, blending the vehicle's trajectory dynamics with the pulse time-of-arrival differences at nominal and estimated spacecraft locations within an Unscented Kalman Filter (UKF), was discussed with a background mission of Mars Pathfinder during the heliocentric transfer orbit. XPNAV has an intractable problem of integer pulse phase cycle ambiguities similar to that of GPS carrier phase navigation. This article proposed a non-ambiguity assumption approach, based on an analysis of the search space array method, to resolve pulse phase cycle ambiguities between the nominal position and estimated position of the spacecraft. The simulation results show that the search space array method is computationally intensive and requires long processing times when the position errors are large, and that the non-ambiguity assumption method can solve the ambiguity problem quickly and reliably. It is deemed that an autonomous real-time integrated navigation system blending XPNAV with DSN tracking, celestial navigation, inertial navigation, and other sources will be the development direction of interplanetary flight navigation systems in the future.
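The geometry behind the Delta-correction and the cycle-ambiguity problem can be sketched as follows (a simplified, illustrative model; operational XPNAV processing also transfers photon arrival times to the solar-system barycenter and applies relativistic corrections):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def toa_difference(r_est, r_nom, pulsar_unit_vec):
    """Geometric pulse time-of-arrival difference between the estimated and
    nominal spacecraft positions, projected along the pulsar direction."""
    return np.dot(pulsar_unit_vec, r_est - r_nom) / C

def phase_and_cycles(delta_toa, pulse_period):
    """Split a TOA difference into integer pulse cycles (the ambiguity) and a
    fractional phase; only the fractional part is observed directly."""
    cycles = int(np.floor(delta_toa / pulse_period))
    frac = delta_toa / pulse_period - cycles
    return cycles, frac
```

Under the non-ambiguity assumption described above, the integer cycle count is taken to be zero, which is valid whenever the position uncertainty projected along the pulsar direction is smaller than the speed of light times the pulse period.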
Autonomous GPS/INS navigation experiment for Space Transfer Vehicle
NASA Technical Reports Server (NTRS)
Upadhyay, Triveni N.; Cotterill, Stephen; Deaton, A. W.
1993-01-01
An experiment to validate the concept of developing an autonomous integrated spacecraft navigation system using on board Global Positioning System (GPS) and Inertial Navigation System (INS) measurements is described. The feasibility of integrating GPS measurements with INS measurements to provide a total improvement in spacecraft navigation performance, i.e. improvement in position, velocity and attitude information, was previously demonstrated. An important aspect of this research is the automatic real time reconfiguration capability of the system designed to respond to changes in a spacecraft mission under the control of an expert system.
Autonomous GPS/INS navigation experiment for Space Transfer Vehicle (STV)
NASA Technical Reports Server (NTRS)
Upadhyay, Triveni N.; Cotterill, Stephen; Deaton, A. Wayne
1991-01-01
An experiment to validate the concept of developing an autonomous integrated spacecraft navigation system using on board Global Positioning System (GPS) and Inertial Navigation System (INS) measurements is described. The feasibility of integrating GPS measurements with INS measurements to provide a total improvement in spacecraft navigation performance, i.e. improvement in position, velocity and attitude information, was previously demonstrated. An important aspect of this research is the automatic real time reconfiguration capability of the system designed to respond to changes in a spacecraft mission under the control of an expert system.
Autonomous GPS/INS navigation experiment for Space Transfer Vehicle
NASA Astrophysics Data System (ADS)
Upadhyay, Triveni N.; Cotterill, Stephen; Deaton, A. W.
1993-07-01
An experiment to validate the concept of developing an autonomous integrated spacecraft navigation system using on board Global Positioning System (GPS) and Inertial Navigation System (INS) measurements is described. The feasibility of integrating GPS measurements with INS measurements to provide a total improvement in spacecraft navigation performance, i.e. improvement in position, velocity and attitude information, was previously demonstrated. An important aspect of this research is the automatic real time reconfiguration capability of the system designed to respond to changes in a spacecraft mission under the control of an expert system.
Navigation Concepts for the James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Long, Anne; Leung, Dominic; Kelbel, David; Beckman, Mark; Gramling, Cheryl
2003-01-01
This paper evaluates the performance that can be achieved using candidate ground and onboard navigation approaches for operation of the James Webb Space Telescope, which will be in an orbit about the Sun-Earth L2 libration point. The ground navigation approach processes standard range and Doppler measurements from the Deep Space Network. The onboard navigation approach processes celestial object measurements and/or ground-to-spacecraft Doppler measurements to autonomously estimate the spacecraft's position and velocity and Doppler reference frequency. Particular attention is given to assessing the absolute position and velocity accuracy that can be achieved in the presence of the frequent spacecraft reorientations and momentum unloads planned for this mission. The ground navigation approach provides stable navigation solutions using a tracking schedule of one 30-minute contact per day. The onboard navigation approach that uses only optical-quality celestial object measurements provides stable autonomous navigation solutions. This study indicates that unmodeled changes in the solar radiation pressure cross-sectional area and modeled momentum unload velocity changes are the major error sources. These errors can be mitigated by modeling these changes, by estimating corrections to compensate for the changes, or by including acceleration measurements.
Multi-Spacecraft Autonomous Positioning System
NASA Technical Reports Server (NTRS)
Anzalone, Evan
2015-01-01
As the number of spacecraft in simultaneous operation continues to grow, there is an increased dependency on ground-based navigation support. The current baseline system for deep space navigation utilizes Earth-based radiometric tracking, requiring long-duration observations to perform orbit determination and generate a state update. The age, complexity, and high utilization of the ground assets pose a risk to spacecraft navigation performance. In order to perform complex operations at large distances from Earth, such as extraterrestrial landing and proximity operations, autonomous systems are required. With increasingly complex mission operations, the need for frequent and Earth-independent navigation capabilities is further reinforced. The Multi-spacecraft Autonomous Positioning System (MAPS) takes advantage of the growing interspacecraft communication network and infrastructure to allow for Earth-autonomous state measurements to enable network-based space navigation. A notional concept of operations is given in figure 1. This network is already being implemented and routinely used in Martian communications through the use of the Mars Reconnaissance Orbiter and Mars Odyssey spacecraft as relays for surface assets. The growth of this communications architecture is continued through MAVEN and potential future commercial Mars telecom orbiters. This growing network provides an initial Mars-local capability for inter-spacecraft communication and navigation. These navigation updates are enabled by cross-communication between assets in the network, coupled with onboard navigation estimation routines to integrate packet travel time to generate ranging measurements. Inter-spacecraft communication allows for frequent state broadcasts and time updates from trusted references. The architecture is a software-based solution, enabling its implementation on a wide variety of current assets, with the operational constraints and measurement accuracy determined by onboard systems.
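A minimal sketch of how time-tagged packet exchanges might be turned into ranging measurements (conventions and function names are illustrative; MAPS's actual message formats and corrections are not described in the abstract):

```python
C = 299_792_458.0  # speed of light, m/s

def one_way_range(t_transmit, t_receive, clock_offset):
    """Pseudorange from a time-tagged packet passed between two networked
    spacecraft: light time scaled by c, corrected for the offset between
    the two onboard clocks (known or estimated in the filter state)."""
    return C * (t_receive - t_transmit - clock_offset)

def two_way_range(t_send, t_echo_receive, turnaround_delay):
    """Range from a round-trip packet exchange; the common clock bias cancels."""
    return C * (t_echo_receive - t_send - turnaround_delay) / 2.0
```

One-way measurements require the inter-spacecraft clock offset to be carried as part of the estimated state, while round-trip exchanges cancel the common bias at the cost of additional latency and traffic.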
NASA Technical Reports Server (NTRS)
Wagenknecht, J.; Fredrickson, S.; Manning, T.; Jones, B.
2003-01-01
Engineers at NASA Johnson Space Center have designed, developed, and tested a nanosatellite-class free-flyer intended for future external inspection and remote viewing of human spaceflight activities. The technology demonstration system, known as the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam), has been integrated into the approximate form and function of a flight system. The primary focus has been to develop a system capable of providing external views of the International Space Station. The Mini AERCam system is spherical, less than eight inches in diameter. It has a full suite of guidance, navigation, and control hardware and software, and is equipped with two digital video cameras and a high-resolution still image camera. The vehicle is designed for either remotely piloted operations or supervised autonomous operations. Tests have been performed in both a six degree-of-freedom closed-loop orbital simulation and on an air-bearing table. The Mini AERCam system can also be used as a test platform for evaluating algorithms and relative navigation for autonomous proximity operations and docking around the Space Shuttle Orbiter or the ISS.
Open-Loop Performance of COBALT Precision Landing Payload on a Commercial Sub-Orbital Rocket
NASA Technical Reports Server (NTRS)
Restrepo, Carolina I.; Carson, John M., III; Amzajerdian, Farzin; Seubert, Carl R.; Lovelace, Ronney S.; McCarthy, Megan M.; Tse, Teming; Stelling, Richard; Collins, Steven M.
2018-01-01
An open-loop flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) platform was conducted onboard the Masten Xodiac suborbital rocket testbed. The COBALT platform integrates NASA Guidance, Navigation and Control (GN&C) sensing technologies for autonomous, precise soft landing, including the Navigation Doppler Lidar (NDL) velocity and range sensor and the Lander Vision System (LVS) Terrain Relative Navigation (TRN) system. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a navigation solution that is independent of GPS and suitable for future, autonomous, planetary, landing systems. COBALT was a passive payload during the open loop tests. COBALT's sensors were actively taking data and processing it in real time, but the Xodiac rocket flew with its own GPS-navigation system as a risk reduction activity in the maturation of the technologies towards space flight. A future closed-loop test campaign is planned where the COBALT navigation solution will be used to fly its host vehicle.
Autonomous interplanetary constellation design
NASA Astrophysics Data System (ADS)
Chow, Cornelius Channing, II
According to NASA's integrated space technology roadmaps, space-based infrastructures are envisioned as necessary ingredients to a sustained effort in continuing space exploration. Whether it be for extra-terrestrial habitats, roving/cargo vehicles, or space tourism, autonomous space networks will provide a vital communications lifeline for both future robotic and human missions alike. Projecting that the Moon will be a bustling hub of activity within a few decades, a near-term opportunity for in-situ infrastructure development is within reach. This dissertation addresses the anticipated need for in-space infrastructure by investigating a general design methodology for autonomous interplanetary constellations; to illustrate the theory, this manuscript presents results from an application to the Earth-Moon neighborhood. The constellation design methodology is formulated as an optimization problem, involving a trajectory design step followed by a spacecraft placement sequence. Modeling the dynamics as a restricted 3-body problem, the investigated design space consists of families of periodic orbits which play host to the constellations, punctuated by arrangements of spacecraft autonomously guided by a navigation strategy called LiAISON (Linked Autonomous Interplanetary Satellite Orbit Navigation). Instead of more traditional exhaustive search methods, a numerical continuation approach is implemented to map the admissible configuration space. In particular, Keller's pseudo-arclength technique is used to follow folding/bifurcating solution manifolds, which are otherwise inaccessible with other parameter continuation schemes. A succinct characterization of the underlying structure of the local, as well as global, extrema is thus achievable with little a priori intuition of the solution space. Furthermore, the proposed design methodology offers benefits in computation speed plus the ability to handle mildly stochastic systems. An application of the constellation design methodology to the restricted Earth-Moon system reveals optimal pairwise configurations for various L1, L2, and L5 (halo, axial, and vertical) periodic orbit families. Navigation accuracies on the order of O(10^±1) meters in position space are obtained for the optimal Earth-Moon constellations, given measurement noise on the order of 1 meter.
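A compact sketch of the pseudo-arclength predictor-corrector attributed to Keller (illustrative only; the dissertation's parameterization and convergence safeguards are not reproduced here). The arclength constraint replaces stepping in the physical parameter, which is what allows the continuation to round folds and follow bifurcating branches:

```python
import numpy as np

def pseudo_arclength_step(F, jac, x, lam, tangent, ds, tol=1e-10, max_iter=25):
    """One predictor-corrector step for F(x, lam) = 0.
    F       : callable returning an n-vector
    jac     : callable returning the n x (n+1) Jacobian [dF/dx, dF/dlam]
    tangent : unit tangent (dx, dlam) to the solution branch, length n+1
    ds      : pseudo-arclength step size"""
    n = x.size
    y0 = np.concatenate([x, [lam]])
    y = y0 + ds * tangent                      # predictor: step along the tangent
    for _ in range(max_iter):
        xv, lv = y[:n], y[n]
        # corrector: Newton iteration on F = 0 augmented with the arclength constraint
        g = np.concatenate([F(xv, lv), [tangent @ (y - y0) - ds]])
        J = np.zeros((n + 1, n + 1))
        J[:n, :] = jac(xv, lv)
        J[n, :] = tangent
        delta = np.linalg.solve(J, -g)
        y = y + delta
        if np.linalg.norm(delta) < tol:
            break
    return y[:n], y[n]
```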
Experiments in teleoperator and autonomous control of space robotic vehicles
NASA Technical Reports Server (NTRS)
Alexander, Harold L.
1990-01-01
A research program and strategy are described which include fundamental teleoperation issues and autonomous-control issues of sensing and navigation for satellite robots. The program consists of developing interfaces for visual operation and studying the consequences of interface designs as well as developing navigation and control technologies based on visual interaction. A space-robot-vehicle simulator is under development for use in virtual-environment teleoperation experiments and neutral-buoyancy investigations. These technologies can be utilized in a study of visual interfaces to address tradeoffs between head-tracking and manual remote cameras, panel-mounted and helmet-mounted displays, and stereoscopic and monoscopic display systems. The present program can provide significant data for the development of control experiments for autonomously controlled satellite robots.
NASA Astrophysics Data System (ADS)
Croft, John; Deily, John; Hartman, Kathy; Weidow, David
1998-01-01
In the twenty-first century, NASA envisions frequent low-cost missions to explore the solar system, observe the universe, and study our planet. To realize NASA's goal, the Guidance, Navigation, and Control Center (GNCC) at the Goddard Space Flight Center sponsors technology programs that enhance spacecraft performance, streamline processes and ultimately enable cheaper science. Our technology programs encompass control system architectures, sensor and actuator components, electronic systems, design and development of algorithms, embedded systems and space vehicle autonomy. Through collaboration with government, universities, non-profit organizations, and industry, the GNCC incrementally develops key technologies that conquer NASA's challenges. This paper presents an overview of several innovative technology initiatives for the autonomous guidance, navigation, and control (GN&C) of satellites.
The Role of X-Rays in Future Space Navigation and Communication
NASA Technical Reports Server (NTRS)
Winternitz, Luke M. B.; Gendreau, Keith C.; Hasouneh, Monther A.; Mitchell, Jason W.; Fong, Wai H.; Lee, Wing-Tsz; Gavriil, Fotis; Arzoumanian, Zaven
2013-01-01
In the near future, applications using X-rays will enable autonomous navigation and time distribution throughout the solar system, high capacity and low-power space data links, highly accurate attitude sensing, and extremely high-precision formation flying capabilities. Each of these applications alone has the potential to revolutionize mission capabilities, particularly beyond Earth orbit. This paper will outline the NASA Goddard Space Flight Center vision and efforts toward realizing the full potential of X-ray navigation and communications.
Navigation Architecture for a Space Mobile Network
NASA Technical Reports Server (NTRS)
Valdez, Jennifer E.; Ashman, Benjamin; Gramling, Cheryl; Heckler, Gregory W.; Carpenter, Russell
2016-01-01
The Tracking and Data Relay Satellite System (TDRSS) Augmentation Service for Satellites (TASS) is a proposed beacon service to provide a global, space based GPS augmentation service based on the NASA Global Differential GPS (GDGPS) System. The TASS signal will be tied to the GPS time system and usable as an additional ranging and Doppler radiometric source. Additionally, it will provide data vital to autonomous navigation in the near Earth regime, including space weather information, TDRS ephemerides, Earth Orientation Parameters (EOP), and forward commanding capability. TASS benefits include enhancing situational awareness, enabling increased autonomy, and providing near real-time command access for user platforms. As NASA Headquarters' Space Communication and Navigation Office (SCaN) begins to move away from a centralized network architecture and towards a Space Mobile Network (SMN) that allows for user initiated services, autonomous navigation will be a key part of such a system. This paper explores how a TASS beacon service enables the Space Mobile Networking paradigm, what a typical user platform would require, and provides an in-depth analysis of several navigation scenarios and operations concepts. This paper provides an overview of the TASS beacon and its role within the SMN and user community. Supporting navigation analysis is presented for two user mission scenarios: an Earth observing spacecraft in low earth orbit (LEO), and a highly elliptical spacecraft in a lunar resonance orbit. These diverse flight scenarios indicate the breadth of applicability of the TASS beacon for upcoming users within the current network architecture and in the SMN.
Three-dimensional motor schema based navigation
NASA Technical Reports Server (NTRS)
Arkin, Ronald C.
1989-01-01
Reactive schema-based navigation is possible in space domains by extending the methods developed for ground-based navigation found within the Autonomous Robot Architecture (AuRA). Reformulation of two-dimensional motor schemas for three-dimensional applications is a straightforward process. The manifold advantages of schema-based control persist, including modular development, amenability to distributed processing, and responsiveness to environmental sensing. Simulation results show the feasibility of this methodology for space docking operations in a cluttered work area.
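The 2-D-to-3-D extension can be illustrated with a toy vector-field version of schema combination (the gains, radii, and particular schemas here are hypothetical; AuRA's actual schema set is richer):

```python
import numpy as np

def move_to_goal(pos, goal, gain=1.0):
    """Attractive schema: unit vector toward the goal, scaled by a gain."""
    d = goal - pos
    return gain * d / np.linalg.norm(d)

def avoid_obstacle(pos, obstacle, safety_radius, gain=1.0):
    """Repulsive schema: pushes away from an obstacle inside its sphere of influence."""
    d = pos - obstacle
    dist = np.linalg.norm(d)
    if dist >= safety_radius:
        return np.zeros(3)
    return gain * (safety_radius - dist) / safety_radius * d / dist

def combined_velocity(pos, goal, obstacles, safety_radius):
    """Motor-schema output: a simple vector sum of all active schema contributions.
    The same formulation works unchanged in two or three dimensions."""
    v = move_to_goal(pos, goal)
    for obs in obstacles:
        v += avoid_obstacle(pos, obs, safety_radius)
    return v
```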
NASA Technical Reports Server (NTRS)
1976-01-01
The six themes identified by the Workshop have many common navigation, guidance, and control needs. All the Earth orbit themes have a strong requirement for attitude, figure, and stabilization control of large space structures, a requirement not currently being supported. All but the space transportation theme have a need for precision pointing of spacecraft and instruments. In addition, all the themes have requirements for increasing autonomous operations for such activities as spacecraft and experiment operations, onboard mission modification, rendezvous and docking, spacecraft assembly and maintenance, navigation and guidance, and self-checkout, test, and repair. Major new efforts are required to conceptualize new approaches to large space antennas and arrays that are lightweight, readily deployable, and capable of precise attitude and figure control. Conventional approaches offer little hope of meeting these requirements. Functions that can benefit from increasing automation or autonomous operations are listed.
Autonomous Navigation Error Propagation Assessment for Lunar Surface Mobility Applications
NASA Technical Reports Server (NTRS)
Welch, Bryan W.; Connolly, Joseph W.
2006-01-01
The NASA Vision for Space Exploration is focused on the return of astronauts to the Moon. While navigation systems have already been proven in the Apollo missions to the Moon, the current exploration campaign will involve more extensive and extended missions requiring new concepts for lunar navigation. In this document, the results of an autonomous navigation error propagation assessment are provided. The analysis is intended to be the baseline error propagation analysis to which Earth-based and lunar-based radiometric data are added to compare these different architecture schemes, and to quantify the benefits of an integrated approach in handling lunar surface mobility applications near the lunar South Pole or on the lunar farside.
Autonomous vision-based navigation for proximity operations around binary asteroids
NASA Astrophysics Data System (ADS)
Gil-Fernandez, Jesus; Ortega-Hernando, Guillermo
2018-02-01
Future missions to small bodies demand a higher level of autonomy in the Guidance, Navigation and Control system for higher scientific return and lower operational costs. Different navigation strategies have been assessed for ESA's Asteroid Impact Mission (AIM). The main objective of AIM is the detailed characterization of the binary asteroid Didymos. The trajectories for the proximity operations shall be intrinsically safe, i.e., no collision in the presence of failures (e.g., spacecraft entering safe mode), perturbations (e.g., non-spherical gravity field), and errors (e.g., maneuver execution error). Hyperbolic arcs with sufficient hyperbolic excess velocity are designed to fulfil the safety, scientific, and operational requirements. The trajectory relative to the asteroid is determined using visual camera images. The ground-based trajectory prediction error at some points is comparable to the camera Field Of View (FOV); therefore, some images do not contain the entire asteroid. Autonomous navigation can update the state of the spacecraft relative to the asteroid at a higher frequency. The objective of the autonomous navigation is to improve the onboard knowledge compared to the ground prediction. The algorithms shall fit in off-the-shelf, space-qualified avionics. This note presents suitable image processing and relative-state filter algorithms for autonomous navigation in proximity operations around binary asteroids.
Autonomous vision-based navigation for proximity operations around binary asteroids
NASA Astrophysics Data System (ADS)
Gil-Fernandez, Jesus; Ortega-Hernando, Guillermo
2018-06-01
Future missions to small bodies demand a higher level of autonomy in the Guidance, Navigation and Control system for higher scientific return and lower operational costs. Different navigation strategies have been assessed for ESA's Asteroid Impact Mission (AIM). The main objective of AIM is the detailed characterization of the binary asteroid Didymos. The trajectories for the proximity operations shall be intrinsically safe, i.e., no collision in the presence of failures (e.g., spacecraft entering safe mode), perturbations (e.g., non-spherical gravity field), and errors (e.g., maneuver execution error). Hyperbolic arcs with sufficient hyperbolic excess velocity are designed to fulfil the safety, scientific, and operational requirements. The trajectory relative to the asteroid is determined using visual camera images. The ground-based trajectory prediction error at some points is comparable to the camera Field Of View (FOV); therefore, some images do not contain the entire asteroid. Autonomous navigation can update the state of the spacecraft relative to the asteroid at a higher frequency. The objective of the autonomous navigation is to improve the onboard knowledge compared to the ground prediction. The algorithms shall fit in off-the-shelf, space-qualified avionics. This note presents suitable image processing and relative-state filter algorithms for autonomous navigation in proximity operations around binary asteroids.
Autonomous Navigation of the SSTI/Lewis Spacecraft Using the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Hart, R. C.; Long, A. C.; Lee, T.
1997-01-01
The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) is pursuing the application of Global Positioning System (GPS) technology to improve the accuracy and economy of spacecraft navigation. High-accuracy autonomous navigation algorithms are being flight qualified in conjunction with GSFC's GPS Attitude Determination Flyer (GADFLY) experiment on the Small Satellite Technology Initiative (SSTI) Lewis spacecraft, which is scheduled for launch in 1997. Preflight performance assessments indicate that these algorithms can provide a real-time total position accuracy of better than 10 meters (1 sigma) and velocity accuracy of better than 0.01 meter per second (1 sigma), with selective availability at typical levels. This accuracy is projected to improve to the 2-meter level if corrections to be provided by the GPS Wide Area Augmentation System (WAAS) are included.
NASA Technical Reports Server (NTRS)
1975-01-01
User technology requirements are identified in relation to needed technology advancement for future space missions in the areas of navigation, guidance, and control. Emphasis is placed on: reduction of mission support cost by 50% through autonomous operation, a ten-fold increase in mission output through improved pointing and control, and a hundred-fold increase in human productivity in space through large-scale teleoperator applications.
First Results from a Hardware-in-the-Loop Demonstration of Closed-Loop Autonomous Formation Flying
NASA Technical Reports Server (NTRS)
Gill, E.; Naasz, Bo; Ebinuma, T.
2003-01-01
A closed-loop system for the demonstration of formation flying technologies has been developed at NASA's Goddard Space Flight Center. Making use of a GPS signal simulator with a dual radio frequency outlet, the system includes two GPS space receivers as well as a powerful onboard navigation processor dedicated to the GPS-based guidance, navigation, and control of a satellite formation in real time. The closed-loop system allows realistic simulations of autonomous formation flying scenarios, enabling research in the fields of tracking and orbit control strategies for a wide range of applications. A sample scenario has been set up where the autonomous transition of a satellite formation from an initial along-track separation of 800 m to a final distance of 100 m has been demonstrated. As a result, a typical control accuracy of about 5 m has been achieved, which proves the applicability of autonomous formation flying techniques to formations of satellites as close as 50 m.
INL Autonomous Navigation System
DOE Office of Scientific and Technical Information (OSTI.GOV)
2005-03-30
The INL Autonomous Navigation System provides instructions for autonomously navigating a robot. The system permits high-speed autonomous navigation including obstacle avoidance, waypoint navigation, and path planning in both indoor and outdoor environments.
Terminal Homing for Autonomous Underwater Vehicle Docking
2016-06-01
The use of docking stations for autonomous underwater vehicles (AUV) provides the ability to keep a vehicle on... Accurate navigation is required in the underwater domain; above the water, light and electromagnetic signals travel well through air and space, mediums that allow for a...
Embedded Relative Navigation Sensor Fusion Algorithms for Autonomous Rendezvous and Docking Missions
NASA Technical Reports Server (NTRS)
DeKock, Brandon K.; Betts, Kevin M.; McDuffie, James H.; Dreas, Christine B.
2008-01-01
bd Systems (a subsidiary of SAIC) has developed a suite of embedded relative navigation sensor fusion algorithms to enable NASA autonomous rendezvous and docking (AR&D) missions. Translational and rotational Extended Kalman Filters (EKFs) were developed for integrating measurements based on the vehicles' orbital mechanics and high-fidelity sensor error models and provide a solution with increased accuracy and robustness relative to any single relative navigation sensor. The filters were tested through stand-alone covariance analysis, closed-loop testing with a high-fidelity multi-body orbital simulation, and hardware-in-the-loop (HWIL) testing in the Marshall Space Flight Center (MSFC) Flight Robotics Laboratory (FRL).
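The fusion step can be sketched with generic EKF algebra (a sketch only, not bd Systems' implementation): each relative navigation sensor contributes its own measurement model and error covariance, and the measurements are folded into a single state sequentially:

```python
import numpy as np

def fuse_relative_nav_sensors(x, P, measurements):
    """Sequential EKF measurement updates over several relative-navigation sensors.
    measurements: list of (z, h_fn, H_fn, R) tuples, where z is the observation,
    h_fn(x) the predicted measurement, H_fn(x) its Jacobian, and R its noise covariance."""
    for z, h_fn, H_fn, R in measurements:
        H = H_fn(x)
        y = z - h_fn(x)                       # innovation
        S = H @ P @ H.T + R                   # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
        x = x + K @ y
        P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```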
NASA Astrophysics Data System (ADS)
Um, Jaeyong
2001-08-01
The Space Integrated GPS/INS (SIGI) sensor is the primary navigation and attitude determination source for the International Space Station (ISS). The SIGI was successfully demonstrated on-orbit for the first time in the SIGI Orbital Attitude Readiness (SOAR) demonstration on the Space Shuttle Atlantis in May 2000. Numerous proximity operations near the ISS have been and will be performed over the lifetime of the Station. The development of an autonomous relative navigation system is needed to improve the safety and efficiency of vehicle operations near the ISS. A hardware simulation study was performed for GPS-based relative navigation using the state vector difference approach and the interferometric approach in the absence of multipath. The interferometric approach, where the relative states are estimated directly, showed comparable results for a 1 km baseline. One of the most pressing current technical issues is the design of an autonomous relative navigation system in the proximity of the ISS, where GPS signals are blocked and maneuvers happen frequently. An integrated GPS/INS system is investigated for the possibility of a fully autonomous relative navigation system. Another application of GPS measurements is determination of the vehicle's orientation in space. This study used the SOAR experiment data to characterize the SIGI's on-orbit performance for attitude determination. A cold start initialization algorithm was developed for integer ambiguity resolution in any initial orientation. The original integer ambiguity resolution algorithm used in the SIGI was developed for terrestrial applications and had an operational limitation that reduced its effectiveness in space. The new algorithm was tested using the SOAR data and has been incorporated in the current SIGI flight software. The attitude estimation performance was examined using two different GPS/INS integration algorithms. The GPS/INS attitude solution using the SOAR data was as accurate as 0.06 deg (RMS) in all three axes with multipath mitigation. Other improvements to the attitude determination algorithm were the development of a faster integer ambiguity resolution method and the incorporation of line bias modeling.
Evaluation of Relative Navigation Algorithms for Formation-Flying Satellites
NASA Technical Reports Server (NTRS)
Kelbel, David; Lee, Taesul; Long, Anne; Carpenter, J. Russell; Gramling, Cheryl
2001-01-01
Goddard Space Flight Center is currently developing advanced spacecraft systems to provide autonomous navigation and control of formation flyers. This paper discusses autonomous relative navigation performance for formations in eccentric, medium, and high-altitude Earth orbits using Global Positioning System (GPS) Standard Positioning Service (SPS) and intersatellite range measurements. The performance of several candidate relative navigation approaches is evaluated. These analyses indicate that the relative navigation accuracy is primarily a function of the frequency of acquisition and tracking of the GPS signals. A relative navigation position accuracy of 0.5 meters root-mean-square (RMS) can be achieved for formations in medium-altitude eccentric orbits that can continuously track at least one GPS signal. A relative navigation position accuracy of better than 75 meters RMS can be achieved for formations in high-altitude eccentric orbits that have sparse tracking of the GPS signals. The addition of round-trip intersatellite range measurements can significantly improve relative navigation accuracy for formations with sparse tracking of the GPS signals.
Libration Point Navigation Concepts Supporting the Vision for Space Exploration
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Folta, David C.; Moreau, Michael C.; Quinn, David A.
2004-01-01
This work examines the autonomous navigation accuracy achievable for a lunar exploration trajectory from a translunar libration point lunar navigation relay satellite, augmented by signals from the Global Positioning System (GPS). We also provide a brief analysis comparing the libration point relay to lunar orbit relay architectures, and discuss some issues of GPS usage for cis-lunar trajectories.
Navigation strategies for multiple autonomous mobile robots moving in formation
NASA Technical Reports Server (NTRS)
Wang, P. K. C.
1991-01-01
The problem of deriving navigation strategies for a fleet of autonomous mobile robots moving in formation is considered. Here, each robot is represented by a particle with a spherical effective spatial domain and a specified cone of visibility. The global motion of each robot in the world space is described by the equations of motion of the robot's center of mass. First, methods for formation generation are discussed. Then, simple navigation strategies for robots moving in formation are derived. A sufficient condition for the stability of a desired formation pattern for a fleet of robots each equipped with the navigation strategy based on nearest neighbor tracking is developed. The dynamic behavior of robot fleets consisting of three or more robots moving in formation in a plane is studied by means of computer simulation.
Advancing Navigation, Timing, and Science with the Deep Space Atomic Clock
NASA Technical Reports Server (NTRS)
Ely, Todd A.; Seubert, Jill; Bell, Julia
2014-01-01
NASA's Deep Space Atomic Clock mission is developing a small, highly stable mercury ion atomic clock with an Allan deviation of at most 1e-14 at one day, and with current estimates near 3e-15. This stability enables one-way radiometric tracking data with accuracy equivalent to and, in certain conditions, better than current two-way deep space tracking data; allowing a shift to a more efficient and flexible one-way deep space navigation architecture. DSAC-enabled one-way tracking will benefit navigation and radio science by increasing the quantity and quality of tracking data. Additionally, DSAC would be a key component to fully-autonomous onboard radio navigation useful for time-sensitive situations. Potential deep space applications of DSAC are presented, including orbit determination of a Mars orbiter and gravity science on a Europa flyby mission.
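For readers unfamiliar with the Allan deviation figure quoted above (1e-14 or better at one day), the following back-of-the-envelope sketch shows how the non-overlapping Allan deviation is estimated from fractional-frequency samples. The synthetic white-frequency-noise data are an assumption purely for illustration.

```python
# Sketch: non-overlapping Allan deviation from fractional-frequency samples.
import numpy as np

def allan_deviation(y, tau_samples):
    """Allan deviation for averaging windows of `tau_samples` raw samples.
    y: fractional frequency samples (dimensionless)."""
    m = len(y) // tau_samples
    if m < 2:
        raise ValueError("need at least two averaging windows")
    ybar = y[:m * tau_samples].reshape(m, tau_samples).mean(axis=1)
    # sigma_y^2(tau) = 1/(2(M-1)) * sum (ybar_{i+1} - ybar_i)^2
    return np.sqrt(0.5 * np.mean(np.diff(ybar) ** 2))

rng = np.random.default_rng(0)
y = 1e-13 * rng.standard_normal(100_000)    # synthetic white frequency noise
for tau in (1, 10, 100, 1000):
    print(f"tau = {tau:5d} samples  sigma_y = {allan_deviation(y, tau):.2e}")
```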
The Deep Space Atomic Clock: Ushering in a New Paradigm for Radio Navigation and Science
NASA Technical Reports Server (NTRS)
Ely, Todd; Seubert, Jill; Prestage, John; Tjoelker, Robert
2013-01-01
The Deep Space Atomic Clock (DSAC) mission will demonstrate the on-orbit performance of a high-accuracy, high-stability miniaturized mercury ion atomic clock during a year-long experiment in Low Earth Orbit. DSAC's timing error requirement provides the frequency stability necessary to perform deep space navigation based solely on one-way radiometric tracking data. Compared to a two-way tracking paradigm, DSAC-enabled one-way tracking will benefit navigation and radio science by increasing the quantity and quality of tracking data. Additionally, DSAC also enables fully-autonomous onboard navigation useful for time-sensitive situations. The technology behind the mercury ion atomic clock and a DSAC mission overview are presented. Example deep space applications of DSAC, including navigation of a Mars orbiter and Europa flyby gravity science, highlight the benefits of DSAC-enabled one-way Doppler tracking.
Autonomous Navigation With Ground Station One-Way Forward-Link Doppler Data
NASA Technical Reports Server (NTRS)
Horstkamp, G. M.; Niklewski, D. J.; Gramling, C. J.
1996-01-01
The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) has spent several years developing operational onboard navigation systems (ONS's) to provide real time autonomous, highly accurate navigation products for spacecraft using NASA's space and ground communication systems. The highly successful Tracking and Data Relay Satellite (TDRSS) ONS (TONS) experiment on the Explorer Platform/Extreme Ultraviolet (EP/EUV) spacecraft, launched on June 7, 1992, flight demonstrated the ONS for high accuracy navigation using TDRSS forward link communication services. In late 1994, a similar ONS experiment was performed using EP/EUV flight hardware (the ultrastable oscillator and Doppler extractor card in one of the TDRSS transponders) and ground system software to demonstrate the feasibility of using an ONS with ground station forward link communication services. This paper provides a detailed evaluation of ground station-based ONS performance of data collected over a 20 day period. The ground station ONS (GONS) experiment results are used to project the expected performance of an operational system. The GONS processes Doppler data derived from scheduled ground station forward link services using a sequential estimation algorithm enhanced by a sophisticated process noise model to provide onboard orbit and frequency determination. Analysis of the GONS experiment performance indicates that real time onboard position accuracies of better than 125 meters (1 sigma) are achievable with two or more 5-minute contacts per day for the EP/EUV 525 kilometer altitude, 28.5 degree inclination orbit. GONS accuracy is shown to be a function of the fidelity of the onboard propagation model, the frequency/geometry of the tracking contacts, and the quality of the tracking measurements. GONS provides a viable option for using autonomous navigation to reduce operational costs for upcoming spacecraft missions with moderate position accuracy requirements.
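A small sketch of the observable behind the GONS approach is given below: a one-way forward-link Doppler measurement converted to range rate, the quantity a sequential orbit/frequency estimator would process. The transmit frequency and measured value are illustrative, and transmitter/receiver oscillator offsets are lumped into a single bias term.

```python
# Sketch: first-order (non-relativistic) range rate from a one-way Doppler measurement.
C = 299_792_458.0  # m/s

def range_rate_from_doppler(f_transmit_hz, f_received_hz, freq_bias_hz=0.0):
    """Positive result means the spacecraft is receding from the ground station."""
    doppler_hz = (f_received_hz - freq_bias_hz) - f_transmit_hz
    return -C * doppler_hz / f_transmit_hz

# Illustrative: S-band forward link at 2.1 GHz received 40 kHz low -> receding at ~5.7 km/s
print(range_rate_from_doppler(2.1e9, 2.1e9 - 40e3))
```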
Wei, Wenhui; Gao, Zhaohui; Gao, Shesheng; Jia, Ke
2018-04-09
To meet the autonomy and reliability requirements of the navigation system, a new scheme combining a Strapdown Inertial Navigation System (SINS), Spectral Redshift (SRS) velocity measurement based on the redshift of natural celestial bodies, and a Geomagnetic Navigation System (GNS) is designed for autonomous integrated navigation. The principle of this SINS/SRS/GNS autonomous integrated navigation system is explored, and the corresponding mathematical model is established. Furthermore, a robust adaptive central difference particle filtering algorithm is proposed for this integrated navigation system. Simulation experiments show that the designed SINS/SRS/GNS autonomous integrated navigation system possesses good autonomy, strong robustness, and high reliability, providing a new solution for autonomous navigation technology.
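A minimal sketch of the spectral-redshift velocity principle referenced above follows: the line-of-sight velocity relative to a celestial source is inferred from the measured wavelength shift. The wavelengths below are assumed values; the SINS/GNS fusion and the robust adaptive central difference particle filter itself are beyond this snippet.

```python
# Sketch: relativistic line-of-sight velocity from a measured spectral shift.
C = 299_792_458.0  # m/s

def los_velocity_from_redshift(lambda_observed_nm, lambda_rest_nm):
    """Line-of-sight velocity (m/s); positive means receding from the source."""
    z = (lambda_observed_nm - lambda_rest_nm) / lambda_rest_nm
    beta = ((1.0 + z) ** 2 - 1.0) / ((1.0 + z) ** 2 + 1.0)
    return beta * C

print(los_velocity_from_redshift(656.31, 656.28))   # small shift of an H-alpha line (assumed)
```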
Autonomous Navigation Results from the Mars Exploration Rover (MER) Mission
NASA Technical Reports Server (NTRS)
Maimone, Mark; Johnson, Andrew; Cheng, Yang; Willson, Reg; Matthies, Larry H.
2004-01-01
In January 2004, the Mars Exploration Rover (MER) mission landed two rovers, Spirit and Opportunity, on the surface of Mars. Several autonomous navigation capabilities were employed in space for the first time in this mission. In the Entry, Descent, and Landing (EDL) phase, both landers used a vision system called the Descent Image Motion Estimation System (DIMES) to estimate horizontal velocity during the last 2000 meters (m) of descent, by tracking features on the ground with a downlooking camera, in order to control retro-rocket firing to reduce horizontal velocity before impact. During surface operations, the rovers navigate autonomously using stereo vision for local terrain mapping and a local, reactive planning algorithm called Grid-based Estimation of Surface Traversability Applied to Local Terrain (GESTALT) for obstacle avoidance. In areas of high slip, stereo vision-based visual odometry has been used to estimate rover motion. As of mid-June, Spirit had traversed 3405 m, of which 1253 m were done autonomously; Opportunity had traversed 1264 m, of which 224 m were autonomous. These results have contributed substantially to the success of the mission and paved the way for increased levels of autonomy in future missions.
Small Body Landing Accuracy Using In-Situ Navigation
NASA Technical Reports Server (NTRS)
Bhaskaran, Shyam; Nandi, Sumita; Broschart, Stephen; Wallace, Mark; Olson, Corwin; Cangahuala, L. Alberto
2011-01-01
Spacecraft landings on small bodies (asteroids and comets) can require target accuracies too stringent to be met using ground-based navigation alone, especially if specific landing site requirements must be met for safety or to meet science goals. In-situ optical observations coupled with onboard navigation processing can meet the tighter accuracy requirements to enable such missions. Recent developments in deep space navigation capability include a self-contained autonomous navigation system (used in flight on three missions) and a landmark tracking system (used experimentally on the Japanese Hayabusa mission). The merging of these two technologies forms a methodology to perform autonomous onboard navigation around small bodies. This paper presents an overview of these systems, as well as the results from Monte Carlo studies to quantify the achievable landing accuracies by using these methods. Sensitivity of the results to variations in spacecraft maneuver execution error, attitude control accuracy, and unmodeled forces is examined. Cases for two bodies, a small asteroid and a mid-size comet, are presented.
Development of Navigation Doppler Lidar for Future Landing Mission
NASA Technical Reports Server (NTRS)
Amzajerdian, Farzin; Hines, Glenn D.; Petway, Larry B.; Barnes, Bruce W.; Pierrottet, Diego F.; Carson, John M., III
2016-01-01
A coherent Navigation Doppler Lidar (NDL) sensor has been developed under the Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) project to support future NASA missions to planetary bodies. This lidar sensor provides accurate surface-relative altitude and vector velocity data during the descent phase that can be used by an autonomous Guidance, Navigation, and Control (GN&C) system to precisely navigate the vehicle from a few kilometers above the ground to a designated location and execute a controlled soft touchdown. The operation and performance of the NDL was demonstrated through closed-loop flights onboard the rocket-propelled Morpheus vehicle in 2014. In Morpheus flights, conducted at the NASA Kennedy Space Center, the NDL data was used by an autonomous GN&C system to navigate and land the vehicle precisely at the selected location surrounded by hazardous rocks and craters. Since then, development efforts for the NDL have shifted toward enhancing performance, optimizing design, and addressing spaceflight size and mass constraints and environmental and reliability requirements. The next generation NDL, with expanded operational envelope and significantly reduced size, will be demonstrated in 2017 through a new flight test campaign onboard a commercial rocket-propelled test vehicle.
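The sketch below illustrates the measurement geometry that makes a Doppler lidar a vector-velocity sensor: each beam's Doppler shift gives a line-of-sight speed, and three non-coplanar beams allow the full surface-relative velocity vector to be solved for. The beam directions, wavelength, and shifts are assumed values, not NDL specifications.

```python
# Sketch: vector velocity from three Doppler-lidar line-of-sight measurements.
import numpy as np

WAVELENGTH = 1.55e-6   # m, a typical fiber-laser wavelength (assumption)

def beam_los_speed(doppler_shift_hz):
    """Line-of-sight speed from a round-trip Doppler shift: v = lambda * f_d / 2."""
    return WAVELENGTH * doppler_shift_hz / 2.0

def solve_velocity(beam_unit_vectors, doppler_shifts_hz):
    """Solve B v = v_los for the vehicle velocity v in the lidar frame."""
    v_los = np.array([beam_los_speed(f) for f in doppler_shifts_hz])
    return np.linalg.solve(np.asarray(beam_unit_vectors), v_los)

# Three beams canted 22.5 deg off nadir, 120 deg apart in azimuth (illustrative geometry)
cant = np.deg2rad(22.5)
beams = [[np.sin(cant) * np.cos(a), np.sin(cant) * np.sin(a), np.cos(cant)]
         for a in np.deg2rad([0.0, 120.0, 240.0])]
print(solve_velocity(beams, [7.5e7, 7.8e7, 7.6e7]))   # shifts in Hz, assumed
```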
Experiments in teleoperator and autonomous control of space robotic vehicles
NASA Technical Reports Server (NTRS)
Alexander, Harold L.
1991-01-01
A program of research embracing teleoperator and automatic navigational control of freely flying satellite robots is presented. Current research goals include: (1) developing visual operator interfaces for improved vehicle teleoperation; (2) determining the effects of different visual interface system designs on operator performance; and (3) achieving autonomous vision-based vehicle navigation and control. This research program combines virtual-environment teleoperation studies and neutral-buoyancy experiments using a space-robot simulator vehicle currently under development. Visual-interface design options under investigation include monoscopic versus stereoscopic displays and cameras, helmet-mounted versus panel-mounted display monitors, head-tracking versus fixed or manually steerable remote cameras, and the provision of vehicle-fixed visual cues, or markers, in the remote scene for improved sensing of vehicle position, orientation, and motion.
Real-Time Hazard Detection and Avoidance Demonstration for a Planetary Lander
NASA Technical Reports Server (NTRS)
Epp, Chirold D.; Robertson, Edward A.; Carson, John M., III
2014-01-01
The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project is chartered to develop and mature to a Technology Readiness Level (TRL) of six an autonomous system combining guidance, navigation and control with terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. In addition to precision landing close to a pre-mission defined landing location, the ALHAT System must be capable of autonomously identifying and avoiding surface hazards in real-time to enable a safe landing under any lighting conditions. This paper provides an overview of the recent results of the ALHAT closed loop hazard detection and avoidance flight demonstrations on the Morpheus Vertical Testbed (VTB) at the Kennedy Space Center, including results and lessons learned. This effort is also described in the context of a technology path in support of future crewed and robotic planetary exploration missions based upon the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN).
The Deep Space Atomic Clock Mission
NASA Technical Reports Server (NTRS)
Ely, Todd A.; Koch, Timothy; Kuang, Da; Lee, Karen; Murphy, David; Prestage, John; Tjoelker, Robert; Seubert, Jill
2012-01-01
The Deep Space Atomic Clock (DSAC) mission will demonstrate the space flight performance of a small, low-mass, high-stability mercury-ion atomic clock with long term stability and accuracy on par with that of the Deep Space Network. The timing stability introduced by DSAC allows for a 1-Way radiometric tracking paradigm for deep space navigation, with benefits including increased tracking via utilization of the DSN's Multiple Spacecraft Per Aperture (MSPA) capability and full ground station-spacecraft view periods, more accurate radio occultation signals, decreased single-frequency measurement noise, and the possibility for fully autonomous on-board navigation. Specific examples of navigation and radio science benefits to deep space missions are highlighted through simulations of Mars orbiter and Europa flyby missions. Additionally, this paper provides an overview of the mercury-ion trap technology behind DSAC, details of and options for the upcoming 2015/2016 space demonstration, and expected on-orbit clock performance.
Ribbon networks for modeling navigable paths of autonomous agents in virtual environments.
Willemsen, Peter; Kearney, Joseph K; Wang, Hongling
2006-01-01
This paper presents the Environment Description Framework (EDF) for modeling complex networks of intersecting roads and pathways in virtual environments. EDF represents information about the layout of streets and sidewalks, the rules that govern behavior on roads and walkways, and the locations of agents with respect to navigable structures. The framework serves as the substrate on which behavior programs for autonomous vehicles and pedestrians are built. Pathways are modeled as ribbons in space. The ribbon structure provides a natural coordinate frame for defining the local geometry of navigable surfaces. EDF includes a powerful runtime interface supported by robust and efficient code for locating objects on the ribbon network, for mapping between Cartesian and ribbon coordinates, and for determining behavioral constraints imposed by the environment.
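A minimal sketch of the ribbon-coordinate idea described above follows: a pathway is parameterized by distance along its spine (s) and a signed lateral offset (t), and mapping (s, t) back to Cartesian coordinates uses the local tangent and normal. The polyline spine below is an assumed example, not the EDF data model.

```python
# Sketch: mapping ribbon coordinates (arc length, lateral offset) to 2-D Cartesian.
import numpy as np

def ribbon_to_cartesian(spine, s, t):
    """Map (s, t) to Cartesian for a spine given as an (N, 2) polyline."""
    spine = np.asarray(spine, dtype=float)
    seg = np.diff(spine, axis=0)
    lengths = np.linalg.norm(seg, axis=1)
    cum = np.concatenate(([0.0], np.cumsum(lengths)))
    s = np.clip(s, 0.0, cum[-1])
    i = int(np.searchsorted(cum, s, side="right") - 1)
    i = min(i, len(seg) - 1)
    tangent = seg[i] / lengths[i]
    normal = np.array([-tangent[1], tangent[0]])    # left-hand normal of the spine
    return spine[i] + (s - cum[i]) * tangent + t * normal

spine = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)]      # an L-shaped street spine (assumed)
print(ribbon_to_cartesian(spine, s=12.0, t=-1.5))    # 12 m along, 1.5 m right of center
```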
Guidance and Navigation Software Architecture Design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) Test Bed
Eikenberry, Blake D.
2006-12-01
Single-Frequency GPS Relative Navigation in a High Ionosphere Orbital Environment
NASA Technical Reports Server (NTRS)
Conrad, Patrick R.; Naasz, Bo J.
2007-01-01
The Global Positioning System (GPS) provides a convenient source for space vehicle relative navigation measurements, especially for low Earth orbit formation flying and autonomous rendezvous mission concepts. For single-frequency GPS receivers, ionospheric path delay can be a significant error source if not properly mitigated. In particular, ionospheric effects are known to cause significant radial position error bias and add dramatically to relative state estimation error if the onboard navigation software does not force the use of measurements from common or shared GPS space vehicles. Results from GPS navigation simulations are presented for a pair of space vehicles flying in formation and using GPS pseudorange measurements to perform absolute and relative orbit determination. With careful measurement selection techniques, relative state estimation accuracy of better than 20 cm with standard GPS pseudorange processing and better than 10 cm with single-differenced pseudorange processing is shown.
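The sketch below illustrates why single-differencing pseudoranges between two nearby receivers helps: the GPS space vehicle clock error and, for short baselines, most of the ionospheric delay are common to both receivers and cancel in the difference. The geometry, clock, and delay values are invented for the example and are not from the paper.

```python
# Sketch: single-differenced pseudorange between two receivers to a common GPS SV.
import numpy as np

C = 299_792_458.0  # m/s

def pseudorange(receiver_pos, sv_pos, receiver_clock_s, sv_clock_s, iono_delay_m):
    geom = np.linalg.norm(np.asarray(sv_pos) - np.asarray(receiver_pos))
    return geom + C * (receiver_clock_s - sv_clock_s) + iono_delay_m

sv = np.array([15.0e6, 10.0e6, 20.0e6])
chief = np.array([6.8e6, 0.0, 0.0])
deputy = chief + np.array([600.0, 300.0, 0.0])        # ~0.7 km baseline (assumed)

# Same SV clock error, nearly the same ionospheric delay at both receivers
rho_chief = pseudorange(chief, sv, 2.0e-6, 5.0e-5, 8.0)
rho_deputy = pseudorange(deputy, sv, -1.0e-6, 5.0e-5, 8.2)

single_diff = rho_deputy - rho_chief
geom_diff = np.linalg.norm(sv - deputy) - np.linalg.norm(sv - chief)
print("single difference       :", single_diff)
print("geometric part          :", geom_diff)
print("residual (clocks + iono):", single_diff - geom_diff)   # SV clock cancels entirely
```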
Laser Range and Bearing Finder for Autonomous Missions
NASA Technical Reports Server (NTRS)
Granade, Stephen R.
2004-01-01
NASA has recently re-confirmed its interest in autonomous systems as an enabling technology for future missions. In order for autonomous missions to be possible, highly capable relative sensor systems are needed to determine an object's distance, direction, and orientation. This is true whether the mission is autonomous in-space assembly, rendezvous and docking, or rover surface navigation. Advanced Optical Systems, Inc. has developed a wide-angle laser range and bearing finder (RBF) for autonomous space missions. The laser RBF has a number of features that make it well suited for autonomous missions. It has an operating range of 10 m to 5 km, with a 5 deg field of view. Its wide field of view removes the need for scanning systems such as gimbals, eliminating moving parts and making the sensor simpler and space qualification easier. Its range accuracy is 1% or better. It is designed to operate either as a stand-alone sensor or in tandem with a sensor that returns range, bearing, and orientation at close ranges, such as NASA's Advanced Video Guidance Sensor. We have assembled the initial prototype and are currently testing it. We will discuss the laser RBF's design and specifications. Keywords: laser range and bearing finder, autonomous rendezvous and docking, space sensors, on-orbit sensors, advanced video guidance sensor.
Dual RF Astrodynamic GPS Orbital Navigator Satellite
NASA Technical Reports Server (NTRS)
Kanipe, David B.; Provence, Robert Steve; Straube, Timothy M.; Reed, Helen; Bishop, Robert; Lightsey, Glenn
2009-01-01
Dual RF Astrodynamic GPS Orbital Navigator Satellite (DRAGONSat) will demonstrate autonomous rendezvous and docking (ARD) in low Earth orbit (LEO) and gather flight data with a global positioning system (GPS) receiver strictly designed for space applications. ARD is the capability of two independent spacecraft to rendezvous in orbit and dock without crew intervention. DRAGONSat consists of two picosatellites (one built by the University of Texas and one built by Texas A&M University) and the Space Shuttle Payload Launcher (SSPL); this project will ultimately demonstrate ARD in LEO.
Autonomous integrated GPS/INS navigation experiment for OMV. Phase 1: Feasibility study
NASA Technical Reports Server (NTRS)
Upadhyay, Triveni N.; Priovolos, George J.; Rhodehamel, Harley
1990-01-01
The phase 1 research focused on the experiment definition. A tightly integrated Global Positioning System/Inertial Navigation System (GPS/INS) navigation filter design was analyzed and was shown, via detailed computer simulation, to provide precise position, velocity, and attitude (alignment) data to support navigation and attitude control requirements of future NASA missions. The application of the integrated filter was also shown to provide the opportunity to calibrate inertial instrument errors which is particularly useful in reducing INS error growth during times of GPS outages. While the Orbital Maneuvering Vehicle (OMV) provides a good target platform for demonstration and for possible flight implementation to provide improved capability, a successful proof-of-concept ground demonstration can be obtained using any simulated mission scenario data, such as Space Transfer Vehicle, Shuttle-C, Space Station.
Error Analysis System for Spacecraft Navigation Using the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Truong, S. H.; Hart, R. C.; Hartman, K. R.; Tomcsik, T. L.; Searl, J. E.; Bernstein, A.
1997-01-01
The Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) is currently developing improved space-navigation filtering algorithms to use the Global Positioning System (GPS) for autonomous real-time onboard orbit determination. In connection with a GPS technology demonstration on the Small Satellite Technology Initiative (SSTI)/Lewis spacecraft, FDD analysts and programmers have teamed with the GSFC Guidance, Navigation, and Control Branch to develop the GPS Enhanced Orbit Determination Experiment (GEODE) system. The GEODE system consists of a Kalman filter operating as a navigation tool for estimating the position, velocity, and additional states required to accurately navigate the orbiting Lewis spacecraft by using astrodynamic modeling and GPS measurements from the receiver. A parallel effort at the FDD is the development of a GPS Error Analysis System (GEAS) that will be used to analyze and improve navigation filtering algorithms during development phases and during in-flight calibration. For GEAS, the Kalman filter theory is extended to estimate the errors in position, velocity, and other error states of interest. The estimation of errors in physical variables at regular intervals will allow the time, cause, and effect of navigation system weaknesses to be identified. In addition, by modeling a sufficient set of navigation system errors, a system failure that causes an observed error anomaly can be traced and accounted for. The GEAS software is formulated using Object Oriented Design (OOD) techniques implemented in the C++ programming language on a Sun SPARC workstation. Phase 1 of this effort is the development of a basic system to be used to evaluate navigation algorithms implemented in the GEODE system. This paper presents the GEAS mathematical methodology, systems and operations concepts, and software design and implementation. Results from the use of the basic system to evaluate navigation algorithms implemented on GEODE are also discussed. In addition, recommendations for generalization of GEAS functions and for new techniques to optimize the accuracy and control of the GPS autonomous onboard navigation are presented.
A Behavior-Based Strategy for Single and Multi-Robot Autonomous Exploration
Cepeda, Jesus S.; Chaimowicz, Luiz; Soto, Rogelio; Gordillo, José L.; Alanís-Reyes, Edén A.; Carrillo-Arce, Luis C.
2012-01-01
In this paper, we consider the problem of autonomous exploration of unknown environments with single and multiple robots. This is a challenging task, with several potential applications. We propose a simple yet effective approach that combines a behavior-based navigation with an efficient data structure to store previously visited regions. This allows robots to safely navigate, disperse and efficiently explore the environment. A series of experiments performed using a realistic robotic simulator and a real testbed scenario demonstrate that our technique effectively distributes the robots over the environment and allows them to quickly accomplish their mission in large open spaces, narrow cluttered environments, dead-end corridors, as well as rooms with minimum exits.
Pulsar Timing and Its Application for Navigation and Gravitational Wave Detection
NASA Astrophysics Data System (ADS)
Becker, Werner; Kramer, Michael; Sesana, Alberto
2018-02-01
Pulsars are natural cosmic clocks. On long timescales they rival the precision of terrestrial atomic clocks. Using a technique called pulsar timing, the exact measurement of pulse arrival times allows a number of applications, ranging from testing theories of gravity to detecting gravitational waves. Pulsars can also define an external reference system suitable for autonomous space navigation, serving as natural navigation beacons, not unlike the use of GPS satellites for navigation on Earth. By comparing pulse arrival times measured on board a spacecraft with predicted pulse arrivals at a reference location (e.g. the solar system barycenter), the spacecraft position can be determined autonomously and with high accuracy everywhere in the solar system and beyond. We describe the unique properties of pulsars that suggest that such a navigation system will certainly have its application in future astronautics. We also describe the ongoing experiments to use the clock-like nature of pulsars to "construct" a galactic-sized gravitational wave detector for low-frequency (f_GW ~ 10^-9 to 10^-7 Hz) gravitational waves. We present the current status and provide an outlook for the future.
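A toy sketch of the geometric core of pulsar navigation described above follows: the difference between a pulse arrival time measured on board and the arrival time predicted at a reference location constrains the position offset along that pulsar's line of sight, and observations of three or more pulsars yield a three-dimensional fix. Pulsar directions and timing residuals below are invented; a real system must also handle phase ambiguity, clock error, and relativistic corrections.

```python
# Sketch: least-squares position offset from pulse time-of-arrival residuals.
import numpy as np

C = 299_792_458.0  # m/s

def position_offset(pulsar_unit_vectors, toa_residuals_s):
    """Solve c * dt_i = n_i . dr for the position offset dr (meters)."""
    n = np.asarray(pulsar_unit_vectors, dtype=float)
    rhs = C * np.asarray(toa_residuals_s, dtype=float)
    dr, *_ = np.linalg.lstsq(n, rhs, rcond=None)
    return dr

pulsars = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0],
                    [0.577, 0.577, 0.577]])       # unit vectors toward three pulsars (assumed)
residuals = [3.3e-6, -1.2e-6, 0.9e-6]             # measured-minus-predicted TOAs (s), assumed
print(position_offset(pulsars, residuals))        # offset from the assumed position (m)
```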
Results of NASA's First Autonomous Formation Flying Experiment: Earth Observing-1 (EO-1)
NASA Technical Reports Server (NTRS)
Folta, David C.; Hawkins, Albin; Bauer, Frank H. (Technical Monitor)
2001-01-01
NASA's first autonomous formation flying mission completed its primary goal of demonstrating an advanced technology called enhanced formation flying. To enable this technology, the Guidance, Navigation, and Control center at the Goddard Space Flight Center (GSFC) implemented a universal 3-axis formation flying algorithm in an autonomous executive flight code onboard the New Millennium Program's (NMP) Earth Observing-1 (EO-1) spacecraft. This paper describes the mathematical background of the autonomous formation flying algorithm and the onboard flight design and presents the validation results of this unique system. Results from functionality assessment through fully autonomous maneuver control are presented as comparisons between the onboard EO-1 operational autonomous control system called AutoCon(tm), its ground-based predecessor, and a standalone algorithm.
Terrain Navigation Concepts for Autonomous Vehicles,
1984-06-01
Leighty, R. D.; U.S. Army Engineer Topographic Laboratories, Fort Belvoir, VA; June 1984. The pacing problem for developing autonomous vehicles that can efficiently move to designated locations in the real world is addressed, and the autonomous functions discussed serve as general terrain navigation requirements for autonomous vehicles.
Navigation of military and space unmanned ground vehicles in unstructured terrains
NASA Technical Reports Server (NTRS)
Lescoe, Paul; Lavery, David; Bedard, Roger
1991-01-01
Development of unmanned vehicles for local navigation in terrains unstructured by humans is reviewed. Modes of navigation include teleoperation or remote control, computer assisted remote driving (CARD), and semiautonomous navigation (SAN). A first implementation of a CARD system was successfully tested using the Robotic Technology Test Vehicle developed by Jet Propulsion Laboratory. Stereo pictures were transmitted to a remotely located human operator, who performed the sensing, perception, and planning functions of navigation. A computer provided range and angle measurements and the path plan was transmitted to the vehicle which autonomously executed the path. This implementation is to be enhanced by providing passive stereo vision and a reflex control system for autonomously stopping the vehicle if blocked by an obstacle. SAN achievements include implementation of a navigation testbed on a six wheel, three-body articulated rover vehicle, development of SAN algorithms and code, integration of SAN software onto the vehicle, and a successful feasibility demonstration that represents a step forward towards the technology required for long-range exploration of the lunar or Martian surface. The vehicle includes a passive stereo vision system with real-time area-based stereo image correlation, a terrain matcher, a path planner, and a path execution planner.
A navigation and control system for an autonomous rescue vehicle in the space station environment
NASA Technical Reports Server (NTRS)
Merkel, Lawrence
1991-01-01
A navigation and control system was designed and implemented for an orbital autonomous rescue vehicle envisioned to retrieve astronauts or equipment in the case that they become disengaged from the space station. The rescue vehicle, termed the Extra-Vehicular Activity Retriever (EVAR), has an on-board inertial measurement unit and GPS receivers for self state estimation, a laser range imager (LRI) and cameras for object state estimation, and a data link for reception of space station state information. The states of the retriever and objects (obstacles and the target object) are estimated by inertial state propagation which is corrected via measurements from the GPS, the LRI system, or the camera system. Kalman filters are utilized to perform sensor fusion and estimate the state propagation errors. Control actuation is performed by a Manned Maneuvering Unit (MMU). Phase plane control techniques are used to control the rotational and translational state of the retriever. The translational controller provides station-keeping or motion along either Clohessy-Wiltshire trajectories or straight line trajectories in the LVLH frame of any sufficiently observed object or of the space station. The software was used to successfully control a prototype EVAR on an air bearing floor facility, and a simulated EVAR operating in a simulated orbital environment. The design of the navigation system and the control system are presented. Also discussed are the hardware systems and the overall software architecture.
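The sketch below shows Clohessy-Wiltshire relative-motion propagation, the kind of LVLH-frame trajectory referenced above. It uses the textbook closed-form solution rather than the EVAR flight software; the mean motion and initial state are assumed values.

```python
# Sketch: closed-form Clohessy-Wiltshire (Hill) relative-motion propagation.
import numpy as np

def cw_propagate(state0, n, t):
    """Propagate [x, y, z, vx, vy, vz] (x radial, y along-track, z cross-track)
    for time t under the CW equations with mean motion n (rad/s)."""
    x0, y0, z0, vx0, vy0, vz0 = state0
    s, c = np.sin(n * t), np.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = 6 * (s - n * t) * x0 + y0 - (2 / n) * (1 - c) * vx0 + (4 * s - 3 * n * t) / n * vy0
    z = c * z0 + (s / n) * vz0
    vx = 3 * n * s * x0 + c * vx0 + 2 * s * vy0
    vy = 6 * n * (c - 1) * x0 - 2 * s * vx0 + (4 * c - 3) * vy0
    vz = -n * s * z0 + c * vz0
    return np.array([x, y, z, vx, vy, vz])

n = 0.00113                                    # rad/s, roughly an ISS-altitude orbit (assumed)
state0 = [50.0, -200.0, 10.0, 0.0, 0.1, 0.0]   # meters and m/s relative to the target (assumed)
print(cw_propagate(state0, n, t=600.0))        # relative state ten minutes later
```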
NASA Technical Reports Server (NTRS)
Wood, L. J.; Jones, J. B.; Mease, K. D.; Kwok, J. H.; Goltz, G. L.; Kechichian, J. A.
1984-01-01
A conceptual design is outlined for the navigation subsystem of the Autonomous Redundancy and Maintenance Management Subsystem (ARMMS). The principal function of this navigation subsystem is to maintain the spacecraft over a specified equatorial longitude to within + or - 3 deg. In addition, the navigation subsystem must detect and correct internal faults. It comprises elements for a navigation executive and for orbit determination, trajectory, maneuver planning, and maneuver command. Each of these elements is described. The navigation subsystem is to be used in the DSCS III spacecraft.
MSR Fetch Rover Capability Development at the Canadian Space Agency
NASA Astrophysics Data System (ADS)
Picard, M.; Hipkin, V.; Gingras, D.; Allard, P.; Lamarche, T.; Rocheleau, S. G.; Gemme, S.
2018-04-01
This paper describes Fetch Rover technology testing during CSA's 2016 Mars Sample Return Analogue Deployment, which demonstrated autonomous navigation to 'cache depots' of M-2020-like sample tubes, acquisition of six such tubes, and transfer to a MAV mock-up.
Semi-autonomous parking for enhanced safety and efficiency.
DOT National Transportation Integrated Search
2017-06-01
This project focuses on the use of tools from a combination of computer vision and localization based navigation schemes to aid the process of efficient and safe parking of vehicles in high density parking spaces. The principles of collision avoidanc...
NASA Technical Reports Server (NTRS)
Bell, Jerome A.; Stephens, Elaine; Barton, Gregg
1991-01-01
An overview is provided of the Space Exploration Initiative (SEI) concepts for telecommunications, information systems, and navigation (TISN), and engineering and architecture issues are discussed. The SEI program data system is reviewed to identify mission TISN interfaces, and reference TISN concepts are described for nominal, degraded, and mission-critical data services. The infrastructures reviewed include telecommunications for robotics support, autonomous navigation without earth-based support, and information networks for tracking and data acquisition. Four options for TISN support architectures are examined which relate to unique SEI exploration strategies. Detailed support estimates are given for: (1) a manned stay on Mars; (2) permanent lunar and Martian settlements; (3) short-duration missions; and (4) systematic exploration of the Moon and Mars.
Baohua, Li; Wenjie, Lai; Yun, Chen; Zongming, Liu
2013-01-01
An autonomous navigation algorithm is presented for a sensor that integrates a star sensor (FOV1) and an ultraviolet earth sensor (FOV2). Star images are sampled by FOV1, and ultraviolet earth images are sampled by FOV2. The star identification and star tracking algorithms are executed for FOV1, and the direction of its optical axis in the J2000.0 coordinate system is calculated. The center vector of the earth in the FOV2 coordinate system is calculated from the coordinates of the ultraviolet earth image. The autonomous navigation data for the satellite are then computed by the integrated sensor from the FOV1 optical axis direction and the FOV2 earth center vector. The position accuracy of the autonomous navigation solution is improved from 1000 meters to 300 meters, and the velocity accuracy is improved from 100 m/s to 20 m/s. At the same time, the periodic sine errors of the autonomous navigation solution are eliminated. Autonomous satellite navigation with a sensor that integrates an ultraviolet earth sensor and a star sensor is shown to be robust.
Preliminary Results of NASA's First Autonomous Formation Flying Experiment: Earth Observing-1 (EO-1)
NASA Technical Reports Server (NTRS)
Folta, David; Hawkins, Albin
2001-01-01
NASA's first autonomous formation flying mission is completing a primary goal of demonstrating an advanced technology called enhanced formation flying. To enable this technology, the Guidance, Navigation, and Control center at the Goddard Space Flight Center has implemented an autonomous universal three-axis formation flying algorithm in executive flight code onboard the New Millennium Program's (NMP) Earth Observing-1 (EO-1) spacecraft. This paper describes the mathematical background of the autonomous formation flying algorithm and the onboard design and presents the preliminary validation results of this unique system. Results from functionality assessment and autonomous maneuver control are presented as comparisons between the onboard EO-1 operational autonomous control system called AutoCon(tm), its ground-based predecessor, and a stand-alone algorithm.
Learning for autonomous navigation : extrapolating from underfoot to the far field
NASA Technical Reports Server (NTRS)
Matthies, Larry; Turmon, Michael; Howard, Andrew; Angelova, Anelia; Tang, Benyang; Mjolsness, Eric
2005-01-01
Autonomous off-road navigation of robotic ground vehicles has important applications on Earth and in space exploration. Progress in this domain has been retarded by the limited lookahead range of 3-D sensors and by the difficulty of preprogramming systems to understand the traversability of the wide variety of terrain they can encounter. Enabling robots to learn from experience may alleviate both of these problems. We define two paradigms for this, learning from 3-D geometry and learning from proprioception, and describe initial instantiations of them we have developed under DARPA and NASA programs. Field test results show promise for learning traversability of vegetated terrain, learning to extend the lookahead range of the vision system, and learning how slip varies with slope.
Key Issues for Navigation and Time Dissemination in NASA's Space Exploration Program
NASA Technical Reports Server (NTRS)
Nelson, R. A.; Brodsky, B.; Oria, A. J.; Connolly, J. W.; Sands, O. S.; Welch, B. W.; Ely T.; Orr, R.; Schuchman, L.
2006-01-01
The renewed emphasis on robotic and human missions within NASA's space exploration program warrants a detailed consideration of how the positions of objects in space will be determined and tracked, whether they be spacecraft, human explorers, robots, surface vehicles, or science instrumentation. The Navigation Team within the NASA Space Communications Architecture Working Group (SCAWG) has addressed several key technical issues in this area, and the principal findings are reported here. For navigation in the vicinity of the Moon, a variety of satellite constellations have been investigated that provide global or regional surface position determination and timing services analogous to those offered by GPS at Earth. In the vicinity of Mars, there are options for satellite constellations not available at the Moon due to the gravitational perturbations from Earth, such as two satellites in an aerostationary orbit. Alternate methods of radiometric navigation are considered, including one-way and two-way signals, as well as autonomous navigation. The use of a software radio capable of receiving all available signal sources, such as GPS, pseudolites, and communication channels, is discussed. Methods of time transfer and dissemination are also considered in this paper.
Trajectory generation for an on-road autonomous vehicle
NASA Astrophysics Data System (ADS)
Horst, John; Barbera, Anthony
2006-05-01
We describe an algorithm that generates a smooth trajectory (position, velocity, and acceleration at uniformly sampled instants of time) for a car-like vehicle autonomously navigating within the constraints of lanes in a road. The technique models both vehicle paths and lane segments as straight line segments and circular arcs for mathematical simplicity and elegance, which we contrast with cubic spline approaches. We develop the path in an idealized space, warp the path into real space and compute path length, generate a one-dimensional trajectory along the path length that achieves target speeds and positions, and finally, warp, translate, and rotate the one-dimensional trajectory points onto the path in real space. The algorithm moves a vehicle in lane safely and efficiently within speed and acceleration maximums. The algorithm functions in the context of other autonomous driving functions within a carefully designed vehicle control hierarchy.
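As a small illustration of the "one-dimensional trajectory along the path length" step described above, the following sketch builds a trapezoidal speed profile that respects acceleration and speed limits, sampled at uniform time steps. The limits and distance are assumptions; the paper's lane and arc geometry and the warping steps are omitted.

```python
# Sketch: rest-to-rest trapezoidal speed profile along a path of given length.
def trapezoidal_profile(distance, v_max, a_max, dt=0.1):
    """Return (t, s, v) samples for rest-to-rest motion over `distance` meters."""
    samples, t, s, v = [], 0.0, 0.0, 0.0
    while s < distance:
        braking_distance = v * v / (2.0 * a_max)
        if distance - s <= braking_distance:
            v = max(v - a_max * dt, 0.0)       # decelerate so as to stop at the goal
        else:
            v = min(v + a_max * dt, v_max)     # accelerate up to the speed limit
        s = min(s + v * dt, distance)
        t += dt
        samples.append((t, s, v))
    return samples

for t, s, v in trapezoidal_profile(distance=30.0, v_max=5.0, a_max=1.5)[::20]:
    print(f"t={t:5.1f}s  s={s:6.2f}m  v={v:4.2f}m/s")
```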
EnEx-RANGE - Robust autonomous Acoustic Navigation in Glacial icE
NASA Astrophysics Data System (ADS)
Heinen, Dirk; Eliseev, Dmitry; Henke, Christoph; Jeschke, Sabina; Linder, Peter; Reuter, Sebastian; Schönitz, Sebastian; Scholz, Franziska; Weinstock, Lars Steffen; Wickmann, Stefan; Wiebusch, Christopher; Zierke, Simon
2017-03-01
Within the Enceladus Explorer Initiative of the DLR Space Administration, navigation technologies for a future space mission are in development. Those technologies are the basis for the search for extraterrestrial life on the Saturn moon Enceladus. An autonomous melting probe, the EnEx probe, aims to extract a liquid sample from a water reservoir below the icy crust. A first EnEx probe was developed and demonstrated in a terrestrial scenario at Blood Falls, Taylor Glacier, Antarctica in November 2014. To enable navigation in glacier ice, two acoustic systems were integrated into the probe in addition to conventional navigation technologies. The first acoustic system determines the position of the probe during the run based on propagation times of acoustic signals from emitters at reference positions at the glacier surface to receivers in the probe. The second system provides information about the forefield of the probe. It is based on sonographic principles, with phased array technology integrated in the probe's melting head. Information about obstacles or sampling regions in the probe's forefield can be acquired. The development of both systems is now continued in the project EnEx-RANGE. The emitters of the localization system are replaced by a network of intelligent acoustic enabled melting probes. These localize each other by means of acoustic signals and create the reference system for the EnEx probe. This presentation includes the discussion of the intelligent acoustic network, the acoustic navigation systems of the EnEx probe, and results of terrestrial tests.
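A rough sketch of the acoustic localization principle used here follows: one-way propagation times from emitters at known reference positions give ranges, and a Gauss-Newton least-squares fit recovers the probe position. The emitter layout, acoustic speed in ice, and propagation times below are invented for the example.

```python
# Sketch: time-of-flight trilateration of a melting probe from surface emitters.
import numpy as np

V_ICE = 3800.0  # m/s, assumed acoustic speed in glacial ice

def locate(emitters, travel_times_s, guess, iterations=10):
    """Gauss-Newton fit of the probe position from ranges r_i = v * t_i."""
    emitters = np.asarray(emitters, dtype=float)
    ranges = V_ICE * np.asarray(travel_times_s, dtype=float)
    x = np.asarray(guess, dtype=float)
    for _ in range(iterations):
        diffs = x - emitters
        dists = np.linalg.norm(diffs, axis=1)
        jacobian = diffs / dists[:, None]       # d(dist_i)/dx
        residuals = ranges - dists
        dx, *_ = np.linalg.lstsq(jacobian, residuals, rcond=None)
        x = x + dx
    return x

emitters = [[0.0, 0.0, 0.0], [100.0, 0.0, 0.0], [0.0, 100.0, 0.0], [100.0, 100.0, 0.0]]
true_pos = np.array([40.0, 60.0, -80.0])         # 80 m below the surface (assumed)
times = [np.linalg.norm(true_pos - e) / V_ICE for e in emitters]
print(locate(emitters, times, guess=[50.0, 50.0, -50.0]))
```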
Developing Autonomous Vehicles That Learn to Navigate by Mimicking Human Behavior
2006-09-28
Edwards, Dean B.; final report, September 28, 2006. The goal is to develop autonomous vehicles that learn to navigate in an unstructured environment to a specific target or location by mimicking human behavior, using LAGR (Learning Applied to Ground Robots). Subject terms: autonomous vehicles, fuzzy logic, learning behavior.
3D Reconfigurable MPSoC for Unmanned Spacecraft Navigation
NASA Astrophysics Data System (ADS)
Dekoulis, George
2016-07-01
This paper describes the design of a new lightweight spacecraft navigation system for unmanned space missions. The system addresses the demands for more efficient autonomous navigation in the near-Earth environment or deep space. The proposed instrumentation is directly suitable for unmanned systems operation and testing of new airborne prototypes for remote sensing applications. The system features a new sensor technology and significant improvements over existing solutions. Fluxgate type sensors have been traditionally used in unmanned defense systems such as target drones, guided missiles, rockets and satellites, however, the guidance sensors' configurations exhibit lower specifications than the presented solution. The current implementation is based on a recently developed material in a reengineered optimum sensor configuration for unprecedented low-power consumption. The new sensor's performance characteristics qualify it for spacecraft navigation applications. A major advantage of the system is the efficiency in redundancy reduction achieved in terms of both hardware and software requirements.
Modular Software for Spacecraft Navigation Using the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Truong, S. H.; Hartman, K. R.; Weidow, D. A.; Berry, D. L.; Oza, D. H.; Long, A. C.; Joyce, E.; Steger, W. L.
1996-01-01
The Goddard Space Flight Center Flight Dynamics and Mission Operations Divisions have jointly investigated the feasibility of engineering modular Global Positioning System (GPS) navigation software to support both real-time flight and ground postprocessing configurations. The goals of this effort are to define standard GPS data interfaces and to engineer standard, reusable navigation software components that can be used to build a broad range of GPS navigation support applications. The paper discusses the GPS modular software (GMOD) system and operations concepts, major requirements, candidate software architecture, feasibility assessment, and recommended software interface standards. In addition, ongoing efforts to broaden the scope of the initial study and to develop modular software to support autonomous navigation using GPS are addressed.
Deep Space 1: Testing New Technologies for Future Small Bodies Missions
NASA Technical Reports Server (NTRS)
Rayman, Marc D.
2001-01-01
Launched on October 24, 1998, Deep Space 1 (DS1) was the first mission of NASA's New Millennium Program, chartered to validate in space high-risk, new technologies important for future space science programs. The advanced technology payload that was tested on DS1 comprises solar electric propulsion, solar concentrator arrays, autonomous on-board navigation and other autonomous systems, several telecommunications and microelectronics devices, and two low-mass integrated science instrument packages. The mission met or exceeded all of its success criteria. The 12 technologies were rigorously exercised so that subsequent flight projects would not have to incur the cost and risk of being the first users of these new capabilities. Examples of the benefits to future small body missions from DS1's technologies will be described.
Sextant X-Ray Pulsar Navigation Demonstration: Initial On-Orbit Results
NASA Technical Reports Server (NTRS)
Mitchell, Jason W.; Winternitz, Luke M.; Hassouneh, Munther A.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wolff, Michael T.; Kerr, Matthew; Wood, Kent S.;
2018-01-01
The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a technology demonstration enhancement to the Neutron-star Interior Composition Explorer (NICER) mission. SEXTANT will be a first demonstration of in-space, autonomous, X-ray pulsar navigation (XNAV), which uses millisecond X-ray pulsars and could provide a GPS-like navigation capability available throughout our Solar System and beyond. NICER is a NASA Astrophysics Explorer Mission of Opportunity to the International Space Station that was launched and installed in June of 2017. During NICER's nominal 18-month base mission, SEXTANT will perform a number of experiments to demonstrate XNAV and advance the technology on a number of fronts. In this work, we review SEXTANT and its goals, and present early results from SEXTANT experiments conducted in the first six months of operation. With these results, SEXTANT has made significant progress toward meeting its primary and secondary mission goals. We also describe the SEXTANT flight operations, calibration activities, and initial results.
Autonomous Navigation, Dynamic Path and Work Flow Planning in Multi-Agent Robotic Swarms Project
NASA Technical Reports Server (NTRS)
Falker, John; Zeitlin, Nancy; Leucht, Kurt; Stolleis, Karl
2015-01-01
Kennedy Space Center has teamed up with the Biological Computation Lab at the University of New Mexico to create a swarm of small, low-cost, autonomous robots, called Swarmies, to be used as a ground-based research platform for in-situ resource utilization missions. The behavior of the robot swarm mimics the central-place foraging strategy of ants to find and collect resources in an unknown environment and return those resources to a central site.
A System for Fast Navigation of Autonomous Vehicles
1991-09-01
Singh, Sanjiv; Feng, Dai; Keller, Paul; Shaffer, Gary; Shi, Wen Fan; Shin, Dong Hun; West, J.
It is common in the control of autonomous vehicles to establish the necessary kinematic models but to ignore an explicit representation of the vehicle dynamics.
Terrain discovery and navigation of a multi-articulated linear robot using map-seeking circuits
NASA Astrophysics Data System (ADS)
Snider, Ross K.; Arathorn, David W.
2006-05-01
A significant challenge in robotics is providing a robot with the ability to sense its environment and then autonomously move while accommodating obstacles. The DARPA Grand Challenge, one of the most visible examples, set the goal of driving a vehicle autonomously for over a hundred miles avoiding obstacles along a predetermined path. Map-Seeking Circuits have shown their biomimetic capability in both vision and inverse kinematics and here we demonstrate their potential usefulness for intelligent exploration of unknown terrain using a multi-articulated linear robot. A robot that could handle any degree of terrain complexity would be useful for exploring inaccessible crowded spaces such as rubble piles in emergency situations, patrolling/intelligence gathering in tough terrain, tunnel exploration, and possibly even planetary exploration. Here we simulate autonomous exploratory navigation by an interaction of terrain discovery using the multi-articulated linear robot to build a local terrain map and exploitation of that growing terrain map to solve the propulsion problem of the robot.
Human-like robots for space and hazardous environments
NASA Technical Reports Server (NTRS)
1994-01-01
The three year goal for the Kansas State USRA/NASA Senior Design team is to design and build a walking autonomous robotic rover. The rover should be capable of crossing rough terrain, traversing human made obstacles (such as stairs and doors), and moving through human and robot occupied spaces without collision. The rover is also to evidence considerable decision making ability, navigation, and path planning skills.
Geometry-Based Observability Metric
NASA Technical Reports Server (NTRS)
Eaton, Colin; Naasz, Bo
2012-01-01
The Satellite Servicing Capabilities Office (SSCO) is currently developing and testing Goddard's Natural Feature Image Recognition (GNFIR) software for autonomous rendezvous and docking missions. GNFIR has flight heritage and is still being developed and tailored for future missions with non-cooperative targets: (1) the DEXTRE Pointing Package System on the International Space Station, and (2) the Relative Navigation System (RNS) on the Space Shuttle for the fourth Hubble Servicing Mission.
2001 Flight Mechanics Symposium
NASA Technical Reports Server (NTRS)
Lynch, John P. (Editor)
2001-01-01
This conference publication includes papers and abstracts presented at the Flight Mechanics Symposium held on June 19-21, 2001. Sponsored by the Guidance, Navigation and Control Center of Goddard Space Flight Center, this symposium featured technical papers on a wide range of issues related to attitude/orbit determination, prediction and control; attitude simulation; attitude sensor calibration; theoretical foundation of attitude computation; dynamics model improvements; autonomous navigation; constellation design and formation flying; estimation theory and computational techniques; Earth environment mission analysis and design; and, spacecraft re-entry mission design and operations.
2013-05-29
Passive visual-only navigation and real-time near-optimal guidance are used to maneuver autonomously and to achieve a successful orbit and three-dimensional imaging of a resident space object (RSO). The mission design was simulated using Systems Tool Kit (STK) in the Earth-centered Earth-fixed (ECF) coordinate system, loaded into Simulink, and transformed to the body-fixed frame (BFF) for calculation of the solar radiation pressure (SRP).
End-to-end information system concept for the Mars Telecommunications Orbiter
NASA Technical Reports Server (NTRS)
Breidenthal, Julian C.; Edwards, Charles D.; Greenberg, Edward; Kazz, Greg J.; Noreen, Gary K.
2006-01-01
The Mars Telecommunications Orbiter (MTO) was intended to provide high-performance deep space relay links to landers, orbiters, sample-return missions, and approaching spacecraft in the vicinity of Mars, to demonstrate interplanetary laser communications, to demonstrate autonomous navigation, and to carry out its own science investigations.
Conceptual Design of a Communication-Based Deep Space Navigation Network
NASA Technical Reports Server (NTRS)
Anzalone, Evan J.; Chuang, C. H.
2012-01-01
As the need grows for increased autonomy and position knowledge accuracy to support missions beyond Earth orbit, engineers must push and develop more advanced navigation sensors and systems that operate independently of Earth-based analysis and processing. Several spacecraft are approaching this problem using inter-spacecraft radiometric tracking and onboard autonomous optical navigation methods. This paper proposes an alternative implementation to aid in spacecraft position fixing. The proposed Network-Based Navigation technique takes advantage of the communication data being sent between spacecraft and between spacecraft and ground control to embed navigation information. The navigation system uses these packets to provide navigation estimates to an onboard navigation filter to augment traditional ground-based radiometric tracking techniques. As opposed to using digital signal measurements to capture inherent information of the transmitted signal itself, this method relies on the embedded navigation packet headers to calculate a navigation estimate. This method is heavily dependent on clock accuracy, and initial results show the promising performance of a notional system.
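A toy sketch of the idea summarized above follows: a navigation header embedded in a communication packet carries the sender's transmit time (and could carry its position), so the receiver can form a one-way range observable. The field names and values are assumptions, and, as the abstract notes, the usefulness of the observable hinges on clock accuracy.

```python
# Sketch: one-way pseudorange from an embedded packet transmit timestamp.
C = 299_792_458.0  # m/s

def one_way_range(t_transmit_s, t_receive_s, receiver_clock_bias_s=0.0):
    """Pseudorange (meters) from the embedded transmit time and the local receive time."""
    return C * ((t_receive_s - receiver_clock_bias_s) - t_transmit_s)

# Packet header carries the transmit epoch; the receiver timestamps arrival with its own clock.
header = {"tx_time_s": 1000.000000, "tx_position_m": (7.0e6, 0.0, 0.0)}   # illustrative
rx_time = 1000.001335            # ~400 km of light time plus a small clock bias (assumed)
print(one_way_range(header["tx_time_s"], rx_time, receiver_clock_bias_s=2.0e-7))
```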
NASA Technical Reports Server (NTRS)
Rutishauser, David K.; Epp, Chirold; Robertson, Ed
2012-01-01
The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project is chartered to develop and mature to a Technology Readiness Level (TRL) of six an autonomous system combining guidance, navigation and control with terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. The ALHAT System must be capable of identifying and avoiding surface hazards to enable a safe and accurate landing to within tens of meters of designated and certified landing sites anywhere on a planetary surface under any lighting conditions. Since its inception in 2006, the ALHAT Project has executed four field test campaigns to characterize and mature sensors and algorithms that support real-time hazard detection and global/local precision navigation for planetary landings. The driving objective for Government Fiscal Year 2012 (GFY2012) is to successfully demonstrate autonomous, real-time, closed loop operation of the ALHAT system in a realistic free flight scenario on Earth using the Morpheus lander developed at the Johnson Space Center (JSC). This goal represents an aggressive target consistent with a lean engineering culture of rapid prototyping and development. This culture is characterized by prioritizing early implementation to gain practical lessons learned and then building on this knowledge with subsequent prototyping design cycles of increasing complexity culminating in the implementation of the baseline design. This paper provides an overview of the ALHAT/Morpheus flight demonstration activities in GFY2012, including accomplishments, current status, results, and lessons learned. The ALHAT/Morpheus effort is also described in the context of a technology path in support of future crewed and robotic planetary exploration missions based upon the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN).
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Team KuuKulgur waits to begin the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
NASA Astrophysics Data System (ADS)
Jankovic, Marko; Paul, Jan; Kirchner, Frank
2016-04-01
Recent studies of the space debris population in low Earth orbit (LEO) have concluded that certain regions have already reached a critical density of objects. This will eventually lead to a cascading process called the Kessler syndrome. The time may have come to seriously consider active debris removal (ADR) missions as the only viable way of preserving the space environment for future generations. Among all objects in the current environment, the SL-8 (Kosmos 3M second stages) rocket bodies (R/Bs) are some of the most suitable targets for future robotic ADR missions. However, to date, an autonomous relative navigation to and capture of a non-cooperative target has never been performed. Therefore, there is a need for more advanced, autonomous and modular systems that can cope with uncontrolled, tumbling objects. The guidance, navigation and control (GNC) system is one of the most critical of these. The main objective of this paper is to present a preliminary concept of a modular GNC architecture that should enable a safe and fuel-efficient capture of a known but uncooperative target, such as a Kosmos 3M R/B. In particular, the concept was developed with the most critical part of an ADR mission in mind, i.e. close-range proximity operations, and with reference to state-of-the-art algorithms in the field of autonomous rendezvous and docking. Finally, the hardware-in-the-loop (HIL) testing facility foreseen for the practical evaluation of the developed architecture is briefly described.
Compact autonomous navigation system (CANS)
NASA Astrophysics Data System (ADS)
Hao, Y. C.; Ying, L.; Xiong, K.; Cheng, H. Y.; Qiao, G. D.
2017-11-01
Autonomous navigation of satellites and constellations offers a series of benefits: it reduces operation cost and ground station workload, avoids reliance on ground stations during crises such as war or natural disaster, and increases spacecraft autonomy. An autonomously navigating satellite is independent of ground station support. Many systems for autonomous satellite navigation have been developed over the past 20 years. Among them, the American MANS (Microcosm Autonomous Navigation System) [1] of Microcosm Inc. and ERADS (Earth Reference Attitude Determination System) [2][3] of Honeywell Inc. are well known. These systems anticipate a series of desirable features of autonomous navigation and aim at low cost, integrated structure, low power consumption, and compact layout. ERADS is an integrated, small, 3-axis attitude sensor system with low cost and small volume. Its Earth-center measurement accuracy is higher than that of common infrared (IR) sensors because the detected ultraviolet radiation zone of the atmosphere has a larger brightness gradient than the IR zone. ERADS is still a complex system, however, because it must overcome problems such as fabrication of the sapphire sphere lens, the birefringence of sapphire, the high-precision image-transfer optical fiber flattener, and ultraviolet intensifier noise. The marginal spherical field of view (FOV) of the ERADS sphere lens is used for star imaging, which may bring some disadvantages: the image energy and the attitude measurement accuracy may be reduced by the tilted image acceptance end of the fiber flattener within the FOV. In addition, Japan, Germany, and Russia have developed visible-light Earth sensors for GEO [4][5]. Is there a way to develop a cheaper, simpler, and more accurate autonomous navigation system usable on all LEO spacecraft, especially LEO small and micro satellites? To address this question we propose a new type of system, the Compact Autonomous Navigation System (CANS) [6].
COBALT CoOperative Blending of Autonomous Landing Technology
NASA Technical Reports Server (NTRS)
Carson, John M. III; Restrepo, Carolina I.; Robertson, Edward A.; Seubert, Carl R.; Amzajerdian, Farzin
2016-01-01
COBALT is a terrestrial test platform for development and maturation of GN&C (Guidance, Navigation and Control) technologies for PL&HA (Precision Landing and Hazard Avoidance). The project is developing a third generation, Langley Navigation Doppler Lidar (NDL) for ultra-precise velocity and range measurements, which will be integrated and tested with the JPL Lander Vision System (LVS) for Terrain Relative Navigation (TRN) position estimates. These technologies together provide navigation that enables controlled precision landing. The COBALT hardware will be integrated in 2017 into the GN&C subsystem of the Xodiac rocket-propulsive Vertical Test Bed (VTB) developed by Masten Space Systems (MSS), and two terrestrial flight campaigns will be conducted: one open-loop (i.e., passive) and one closed-loop (i.e., active).
COBALT: A GN&C Payload for Testing ALHAT Capabilities in Closed-Loop Terrestrial Rocket Flights
NASA Technical Reports Server (NTRS)
Carson, John M., III; Amzajerdian, Farzin; Hines, Glenn D.; O'Neal, Travis V.; Robertson, Edward A.; Seubert, Carl; Trawny, Nikolas
2016-01-01
The COBALT (CoOperative Blending of Autonomous Landing Technology) payload is being developed within NASA as a risk reduction activity to mature, integrate and test ALHAT (Autonomous precision Landing and Hazard Avoidance Technology) systems targeted for infusion into near-term robotic and future human space flight missions. The initial COBALT payload instantiation is integrating the third-generation ALHAT Navigation Doppler Lidar (NDL) sensor, for ultra high-precision velocity plus range measurements, with the passive-optical Lander Vision System (LVS) that provides Terrain Relative Navigation (TRN) global-position estimates. The COBALT payload will be integrated onboard a rocket-propulsive terrestrial testbed and will provide precise navigation estimates and guidance planning during two flight test campaigns in 2017 (one open-loop and one closed-loop). The NDL is targeting performance capabilities desired for future Mars and Moon Entry, Descent and Landing (EDL). The LVS is already baselined for TRN on the Mars 2020 robotic lander mission. The COBALT platform will provide NASA with a new risk-reduction capability to test integrated EDL Guidance, Navigation and Control (GN&C) components in closed-loop flight demonstrations prior to the actual mission EDL.
DOE Office of Scientific and Technical Information (OSTI.GOV)
EISLER, G. RICHARD
This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled ''Robust Planning for Autonomous Navigation of Mobile Robots In Unstructured, Dynamic Environments (AutoNav)''. The project goal was to develop an algorithmic-driven, multi-spectral approach to point-to-point navigation characterized by: segmented on-board trajectory planning, self-contained operation without human support for mission duration, and the development of appropriate sensors and algorithms to navigate unattended. The project was partially successful in achieving gains in sensing, path planning, navigation, and guidance. One of three experimental platforms, the Minimalist Autonomous Testbed, used a repetitive sense-and-re-plan combination to demonstrate the majority of elements necessary for autonomous navigation. However, a critical goal for overall success in arbitrary terrain, that of developing a sensor that is able to distinguish true obstacles that need to be avoided as a function of vehicle scale, still needs substantial research to bring to fruition.
Mobile Autonomous Humanoid Assistant
NASA Technical Reports Server (NTRS)
Diftler, M. A.; Ambrose, R. O.; Tyree, K. S.; Goza, S. M.; Huber, E. L.
2004-01-01
A mobile autonomous humanoid robot is assisting human co-workers at the Johnson Space Center with tool handling tasks. This robot combines the upper body of the National Aeronautics and Space Administration (NASA)/Defense Advanced Research Projects Agency (DARPA) Robonaut system with a Segway (TM) Robotic Mobility Platform, yielding a dexterous, maneuverable humanoid perfect for aiding human co-workers in a range of environments. This system uses stereo vision to locate human teammates and tools and a navigation system that uses laser range and vision data to follow humans while avoiding obstacles. Tactile sensors provide information to grasping algorithms for efficient tool exchanges. The autonomous architecture utilizes these pre-programmed skills to form human assistant behaviors. The initial behavior demonstrates a robust capability to assist a human by acquiring a tool from a remotely located individual and then following the human in a cluttered environment with the tool for future use.
Fuzzy Behavior Modulation with Threshold Activation for Autonomous Vehicle Navigation
NASA Technical Reports Server (NTRS)
Tunstel, Edward
2000-01-01
This paper describes fuzzy logic techniques used in a hierarchical behavior-based architecture for robot navigation. An architectural feature for threshold activation of fuzzy-behaviors is emphasized, which is potentially useful for tuning navigation performance in real world applications. The target application is autonomous local navigation of a small planetary rover. Threshold activation of low-level navigation behaviors is the primary focus. A preliminary assessment of its impact on local navigation performance is provided based on computer simulations.
Control of autonomous robot using neural networks
NASA Astrophysics Data System (ADS)
Barton, Adam; Volna, Eva
2017-07-01
The aim of the article is to design a method of control of an autonomous robot using artificial neural networks. The introductory part describes control issues from the perspective of autonomous robot navigation and the current mobile robots controlled by neural networks. The core of the article is the design of the controlling neural network, and generation and filtration of the training set using ART1 (Adaptive Resonance Theory). The outcome of the practical part is an assembled Lego Mindstorms EV3 robot solving the problem of avoiding obstacles in space. To verify models of an autonomous robot behavior, a set of experiments was created as well as evaluation criteria. The speed of each motor was adjusted by the controlling neural network with respect to the situation in which the robot was found.
Autonomous navigation system based on GPS and magnetometer data
NASA Technical Reports Server (NTRS)
Julie, Thienel K. (Inventor); Richard, Harman R. (Inventor); Bar-Itzhack, Itzhack Y. (Inventor)
2004-01-01
This invention is drawn to an autonomous navigation system using Global Positioning System (GPS) and magnetometers for low Earth orbit satellites. As a magnetometer is reliable and always provides information on spacecraft attitude, rate, and orbit, the magnetometer-GPS configuration solves the GPS initialization problem, decreasing the convergence time for the navigation estimate and improving the overall accuracy. Eventually the magnetometer-GPS configuration enables the system to avoid a costly and inherently less reliable gyro for rate estimation. Being autonomous, this invention would provide for black-box spacecraft navigation, producing attitude, orbit, and rate estimates without any ground input, with high accuracy and reliability.
Agent Based Software for the Autonomous Control of Formation Flying Spacecraft
NASA Technical Reports Server (NTRS)
How, Jonathan P.; Campbell, Mark; Dennehy, Neil (Technical Monitor)
2003-01-01
Distributed satellite systems are an enabling technology for many future NASA/DoD earth and space science missions, such as MMS, MAXIM, Leonardo, and LISA [1, 2, 3]. While formation flying offers significant science benefits, to reduce the operating costs for these missions it will be essential that these multiple vehicles effectively act as a single spacecraft by performing coordinated observations. Autonomous guidance, navigation, and control as part of coordinated fleet autonomy is a key technology that will help accomplish this complex goal. This is no small task, as most current space missions require significant input from the ground for even relatively simple decisions such as thruster burns. Work for the NMP DS1 mission focused on the development of the New Millennium Remote Agent (NMRA) architecture for autonomous spacecraft control systems. NMRA integrates traditional real-time monitoring and control with components for constraint-based planning, robust multi-threaded execution, and model-based diagnosis and reconfiguration. The complexity of using an autonomous approach for space flight software was evident when most of its capabilities were stripped off prior to launch (although more capability was uplinked subsequently, and the resulting demonstration was very successful).
Computer graphics testbed to simulate and test vision systems for space applications
NASA Technical Reports Server (NTRS)
Cheatham, John B.
1991-01-01
Artificial intelligence concepts are applied to robotics. Artificial neural networks, expert systems, and laser imaging techniques for autonomous space robots are being studied. A computer graphics laser range finder simulator developed by Wu has been used by Weiland and Norwood to study the use of artificial neural networks for path planning and obstacle avoidance. Interest is expressed in applications of CLIPS, NETS, and Fuzzy Control. These techniques are applied to robot navigation.
A Novel Robust H∞ Filter Based on Krein Space Theory in the SINS/CNS Attitude Reference System.
Yu, Fei; Lv, Chongyang; Dong, Qianhui
2016-03-18
Owing to their numerous merits, such as compactness, autonomy, and independence, the strapdown inertial navigation system (SINS) and celestial navigation system (CNS) can be used in marine applications. Moreover, because of the complementary navigation information obtained from two different kinds of sensors, the accuracy of the SINS/CNS integrated navigation system can be enhanced effectively. Thus, the SINS/CNS system is widely used in the marine navigation field. However, the CNS is easily interfered with by the surroundings, which leads to discontinuous output. The uncertainty caused by the lost measurements will therefore reduce the system accuracy. In this paper, a robust H∞ filter based on Krein space theory is proposed. Krein space theory is introduced first, and then the linear state and observation models of the SINS/CNS integrated navigation system are established. By taking the uncertainty problem into account, a new robust H∞ filter is proposed to improve the robustness of the integrated system. Finally, this new robust filter based on Krein space theory is evaluated by numerical simulations and actual experiments. The simulation and experiment results and analysis show that the attitude errors can be reduced effectively by utilizing the proposed robust filter when measurements are intermittently missing. Compared to the traditional Kalman filter (KF) method, the accuracy of the SINS/CNS integrated system is improved, verifying the robustness and availability of the proposed robust H∞ filter.
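As a rough illustration of the filtering idea discussed above, the sketch below implements a generic discrete-time, game-theoretic H∞ recursion in Python. It is not the paper's Krein-space formulation; the matrices F, H, Q, R, the bound theta, and the choice of estimating the full state are all illustrative assumptions.

```python
# Generic discrete-time H-infinity filter step (game-theoretic form), offered only
# as an illustration of the robustness idea; NOT the paper's Krein-space derivation.
import numpy as np

def hinf_step(x, P, y, F, H, Q, R, theta):
    """One a-priori H-infinity step for x_{k+1} = F x_k + w, y_k = H x_k + v.

    theta > 0 is the performance bound; theta -> 0 recovers the Kalman filter.
    """
    n = x.size
    Rinv = np.linalg.inv(R)
    # Common inverse term; using S-bar = I corresponds to estimating the full state.
    M = np.linalg.inv(np.eye(n) - theta * P + H.T @ Rinv @ H @ P)
    K = P @ M @ H.T @ Rinv                 # H-infinity gain
    x_next = F @ x + F @ K @ (y - H @ x)   # combined time/measurement update
    P_next = F @ P @ M @ F.T + Q           # Riccati-like recursion
    return x_next, P_next
```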
An Autonomous Control System for an Intra-Vehicular Spacecraft Mobile Monitor Prototype
NASA Technical Reports Server (NTRS)
Dorais, Gregory A.; Desiano, Salvatore D.; Gawdiak, Yuri; Nicewarner, Keith
2003-01-01
This paper presents an overview of an ongoing research and development effort at the NASA Ames Research Center to create an autonomous control system for an internal spacecraft autonomous mobile monitor. Its primary functions are to provide crew support and perform intra-vehicular sensing activities by autonomously navigating onboard the International Space Station. We describe the mission roles and high-level functional requirements for an autonomous mobile monitor. The mobile monitor prototypes, of which two are operational and one is actively being designed, the physical test facilities used to perform ground testing, including a 3D micro-gravity test facility, and the simulators are briefly described. We provide an overview of the autonomy framework and describe each of its components, including those used for automated planning, goal-oriented task execution, diagnosis, and fault recovery. A sample mission test scenario is also described.
Navigation, behaviors, and control modes in an autonomous vehicle
NASA Astrophysics Data System (ADS)
Byler, Eric A.
1995-01-01
An Intelligent Mobile Sensing System (IMSS) has been developed for the automated inspection of radioactive and hazardous waste storage containers in warehouse facilities at Department of Energy sites. A 2D space of control modes was used that provides a combined view of reactive and planning approaches, wherein a 2D situation space is defined by dimensions representing the predictability of the agent's task environment and the constraint imposed by its goals. In this sense, selection of appropriate systems for planning, navigation, and control depends on the problem at hand. The IMSS vehicle navigation system is based on a combination of feature-based motion, landmark sightings, and an a priori logical map of the mockup storage facility. Motions for the inspection activities are composed of different interactions of several available control modes, several obstacle avoidance modes, and several feature identification modes. Features used to drive these behaviors are both visual and acoustic.
NASA Technical Reports Server (NTRS)
Rush, John; Israel, David; Harlacher, Marc; Haas, Lin
2003-01-01
The Low Power Transceiver (LPT) is an advanced signal processing platform that offers a configurable and reprogrammable capability for supporting communications, navigation and sensor functions for mission applications ranging from spacecraft TT&C and autonomous orbit determination to sophisticated networks that use crosslinks to support communications and real-time relative navigation for formation flying. The LPT is the result of extensive collaborative research under NASA/GSFC's Advanced Technology Program and ITT Industries' internal research and development efforts. Its modular, multi-channel design currently enables transmitting and receiving communication signals on L- or S-band frequencies and processing GPS L-band signals for precision navigation. The LPT flew as a part of the GSFC Hitchhiker payload named Fast Reaction Experiments Enabling Science Technology And Research (FREESTAR) on board Space Shuttle Columbia's final mission. The experiment demonstrated functionality in GPS-based navigation and orbit determination, NASA STDN Ground Network communications, space relay communications via the NASA TDRSS, on-orbit reconfiguration of the software radio, the use of the Internet Protocol (IP) for TT&C, and communication concepts for space-based range safety. All data from the experiment were recovered and, as a result, all primary and secondary objectives of the experiment were met. This paper presents the results of the LPT's maiden space flight as a part of STS-107.
Synopsis of Precision Landing and Hazard Avoidance (PL&HA) Capabilities for Space Exploration
NASA Technical Reports Server (NTRS)
Robertson, Edward A.
2017-01-01
Until recently, robotic exploration missions to the Moon, Mars, and other solar system bodies relied upon controlled blind landings. Because terrestrial techniques for terrain relative navigation (TRN) had not yet evolved to support space exploration, landing dispersions were driven by the capabilities of inertial navigation systems combined with surface relative altimetry and velocimetry. Lacking tight control over the actual landing location, mission success depended on the statistical vetting of candidate landing areas within the predicted landing dispersion ellipse based on orbital reconnaissance data, combined with the ability of the spacecraft to execute a controlled landing in terms of touchdown attitude, attitude rates, and velocity. In addition, the sensors, algorithms, and processing technologies required to perform autonomous hazard detection and avoidance in real time during the landing sequence were not yet available. Over the past decade, NASA has invested substantial resources in the development, integration, and testing of autonomous precision landing and hazard avoidance (PL&HA) capabilities. In addition to substantially improving landing accuracy and safety, these autonomous PL&HA functions also offer access to targets of interest located within more rugged and hazardous terrain. Optical TRN systems are baselined on upcoming robotic landing missions to the Moon and Mars, and NASA JPL is investigating the development of a comprehensive PL&HA system for a Europa lander. These robotic missions will demonstrate and mature PL&HA technologies that are considered essential for future human exploration missions. PL&HA technologies also have applications to rendezvous and docking/berthing with other spacecraft, as well as proximity navigation, contact, and retrieval missions to smaller bodies with microgravity environments, such as asteroids.
Unified Simulation and Analysis Framework for Deep Space Navigation Design
NASA Technical Reports Server (NTRS)
Anzalone, Evan; Chuang, Jason; Olsen, Carrie
2013-01-01
As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine their effect on the sizing of the integrated vehicle. The development of such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and possible state measurements and observations to feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework, and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.
Further development and flight test of an autonomous precision landing system using a parafoil
NASA Technical Reports Server (NTRS)
Murray, James E.; Sim, Alex G.; Neufeld, David C.; Rennich, Patrick K.; Norris, Stephen R.; Hughes, Wesley S.
1994-01-01
NASA Dryden Flight Research Center and NASA Johnson Space Center are jointly conducting a phased program to determine the feasibility of the autonomous recovery of a spacecraft using a ram-air parafoil system for the final stages of entry from space to a precision landing. The feasibility is being studied using a flight model of a spacecraft in the generic shape of a flattened biconic that weighs approximately 120 lb and is flown under a commercially available ram-air parafoil. Key components of the vehicle include the global positioning system (GPS) guidance for navigation, a flight control computer, an electronic compass, a yaw rate gyro, and an onboard data recorder. A flight test program is being used to develop and refine the vehicle. The primary flight goal is to demonstrate autonomous flight from an altitude of 3,000 m (10,000 ft) with a lateral offset of 1.6 km (1.0 mi) to a precision soft landing. This paper summarizes the progress to date. Much of the navigation system has been tested, including a heading tracker that was developed using parameter estimation techniques and a complementary filter. The autoland portion of the autopilot is still in development. The feasibility of conducting the flare maneuver without servoactuators was investigated as a means of significantly reducing the servoactuator rate and load requirements.
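As a rough sketch of the kind of heading tracker mentioned above, the fragment below blends a yaw-rate gyro with an electronic compass in a simple complementary filter. The gain value and wrap handling are illustrative assumptions, not the flight code.

```python
# Minimal complementary-filter heading tracker: gyro propagation plus a
# low-gain compass correction. Gain k and units are assumptions.
import math

def wrap(angle_rad):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(angle_rad), math.cos(angle_rad))

def heading_update(heading, gyro_rate, compass_heading, dt, k=0.05):
    """Propagate heading with the gyro, then pull it toward the compass by gain k."""
    predicted = heading + gyro_rate * dt          # integrate yaw rate
    error = wrap(compass_heading - predicted)     # shortest angular error
    return wrap(predicted + k * error)            # low-frequency compass correction
```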
Li, Tianlong; Chang, Xiaocong; Wu, Zhiguang; Li, Jinxing; Shao, Guangbin; Deng, Xinghong; Qiu, Jianbin; Guo, Bin; Zhang, Guangyu; He, Qiang; Li, Longqiu; Wang, Joseph
2017-09-26
Self-propelled micro- and nanoscale robots represent a rapidly emerging and fascinating robotics research area. However, designing autonomous and adaptive control systems for operating micro/nanorobotics in complex and dynamically changing environments, which is a highly demanding feature, is still an unmet challenge. Here we describe a smart microvehicle for precise autonomous navigation in complicated environments and traffic scenarios. The fully autonomous navigation system of the smart microvehicle is composed of a microscope-coupled CCD camera, an artificial intelligence planner, and a magnetic field generator. The microscope-coupled CCD camera provides real-time localization of the chemically powered Janus microsphere vehicle and environmental detection for path planning to generate optimal collision-free routes, while the moving direction of the microrobot toward a reference position is determined by the external electromagnetic torque. Real-time object detection offers adaptive path planning in response to dynamically changing environments. We demonstrate that the autonomous navigation system can guide the vehicle movement in complex patterns, in the presence of dynamically changing obstacles, and in complex biological environments. Such a navigation system for micro/nanoscale vehicles, relying on vision-based closed-loop control and path planning, is highly promising for their autonomous operation in the complex dynamic settings and unpredictable scenarios expected in a variety of realistic nanoscale applications.
Integrated polarization-dependent sensor for autonomous navigation
NASA Astrophysics Data System (ADS)
Liu, Ze; Zhang, Ran; Wang, Zhiwen; Guan, Le; Li, Bin; Chu, Jinkui
2015-01-01
Based on the navigation strategy of insects utilizing the polarized skylight, an integrated polarization-dependent sensor for autonomous navigation is presented. The navigation sensor has the features of compact structure, high precision, strong robustness, and a simple manufacture technique. The sensor is composed by integrating a complementary-metal-oxide-semiconductor sensor with a multiorientation nanowire grid polarizer. By nanoimprint lithography, the multiorientation nanowire polarizer is fabricated in one step and the alignment error is eliminated. The statistical theory is added to the interval-division algorithm to calculate the polarization angle of the incident light. The laboratory and outdoor tests for the navigation sensor are implemented and the errors of the measured angle are ±0.02 deg and ±1.3 deg, respectively. The results show that the proposed sensor has potential for application in autonomous navigation.
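The sketch below illustrates one generic way to recover a polarization angle from intensities measured behind polarizers at several orientations, using a linear least-squares fit of the Malus-law model. It is not the paper's interval-division plus statistics algorithm; the function name and inputs are assumptions.

```python
# Fit I(theta) = a + b*cos(2*theta) + c*sin(2*theta) to multi-orientation
# polarizer intensities; the E-vector angle follows from b and c.
import numpy as np

def polarization_angle(thetas_rad, intensities):
    """Return the estimated polarization (E-vector) angle in radians."""
    thetas_rad = np.asarray(thetas_rad, dtype=float)
    intensities = np.asarray(intensities, dtype=float)
    A = np.column_stack([np.ones_like(thetas_rad),
                         np.cos(2 * thetas_rad),
                         np.sin(2 * thetas_rad)])
    a, b, c = np.linalg.lstsq(A, intensities, rcond=None)[0]
    return 0.5 * np.arctan2(c, b)   # orientation of maximum transmission
```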
New vision system and navigation algorithm for an autonomous ground vehicle
NASA Astrophysics Data System (ADS)
Tann, Hokchhay; Shakya, Bicky; Merchen, Alex C.; Williams, Benjamin C.; Khanal, Abhishek; Zhao, Jiajia; Ahlgren, David J.
2013-12-01
Improvements were made to the intelligence algorithms of an autonomously operating ground vehicle, Q, which competed in the 2013 Intelligent Ground Vehicle Competition (IGVC). The IGVC required the vehicle to first navigate between two white lines on a grassy obstacle course, then pass through eight GPS waypoints, and pass through a final obstacle field. Modifications to Q included a new vision system with a more effective image processing algorithm for white line extraction. The path-planning algorithm adopted the vision system, creating smoother, more reliable navigation. With these improvements, Q successfully completed the basic autonomous navigation challenge, finishing tenth out of over 50 teams.
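A minimal sketch of white-line extraction along the lines described above is given below, using standard OpenCV calls (HSV thresholding, Canny edges, probabilistic Hough transform). The threshold values are illustrative assumptions and would need tuning for real grass-course imagery; this is not the team's actual algorithm.

```python
# Threshold bright, weakly saturated pixels (painted lines), then fit segments.
import cv2
import numpy as np

def extract_white_lines(bgr_image):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 0, 180), (180, 60, 255))   # assumed thresholds
    mask = cv2.GaussianBlur(mask, (5, 5), 0)
    edges = cv2.Canny(mask, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                            minLineLength=40, maxLineGap=10)
    return [] if lines is None else [l[0] for l in lines]  # list of (x1, y1, x2, y2)
```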
Preliminary Operational Results of the TDRSS Onboard Navigation System (TONS) for the Terra Mission
NASA Technical Reports Server (NTRS)
Gramling, Cheryl; Lorah, John; Santoro, Ernest; Work, Kevin; Chambers, Robert; Bauer, Frank H. (Technical Monitor)
2000-01-01
The Earth Observing System Terra spacecraft was launched on December 18, 1999, to provide data for the characterization of the terrestrial and oceanic surfaces, clouds, radiation, aerosols, and radiative balance. The Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (ONS) (TONS) flying on Terra provides the spacecraft with an operational real-time navigation solution. TONS is a passive system that makes judicious use of Terra's communication and computer subsystems. An objective of the ONS developed by NASA's Goddard Space Flight Center (GSFC) Guidance, Navigation and Control Center is to provide autonomous navigation with minimal power, weight, and volume impact on the user spacecraft. TONS relies on extracting tracking measurements onboard from a TDRSS forward-link communication signal and processing these measurements in an onboard extended Kalman filter to estimate Terra's current state. Terra is the first NASA low Earth orbiting mission to fly autonomous navigation that produces accurate results. The science orbital accuracy requirements for Terra are 150 meters (3-sigma) per axis with a goal of 5 m (1-sigma) RSS, which TONS is expected to meet. The TONS solutions are telemetered in real time to the mission scientists along with their science data for immediate processing. Once set in the operational mode, TONS eliminates the need for ground orbit determination and allows for a smooth flow from the spacecraft telemetry to planning products for the mission team. This paper will present the preliminary results of the operational TONS solution available from Terra.
NASA Astrophysics Data System (ADS)
Lu, Shan; Zhang, Hanmo
2016-01-01
To meet the requirement of autonomous orbit determination, this paper proposes a fast curve fitting method based on Earth ultraviolet features to obtain an accurate Earth vector direction, in order to achieve high-precision autonomous navigation. First, drawing on the stable characteristics of Earth ultraviolet radiance and using atmospheric radiation transmission model software, the paper simulates the Earth ultraviolet radiation model at different times and chooses the proper observation band. Then a fast, improved edge-extraction method combining the Sobel operator and local binary patterns (LBP) is utilized, which can both eliminate noise efficiently and extract Earth ultraviolet limb features accurately. The Earth's centroid locations on the simulated images are estimated via least-squares fitting using part of the limb edges. Taking advantage of the estimated Earth vector direction and Earth distance, an Extended Kalman Filter (EKF) is applied to realize the autonomous navigation. Experiment results indicate that the proposed method can achieve sub-pixel Earth centroid location estimation and greatly enhance autonomous celestial navigation precision.
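As a simplified stand-in for the partial-limb fitting step described above, the sketch below performs a linear least-squares (Kasa) circle fit to extracted limb-edge pixel coordinates to estimate the Earth centroid on the focal plane. The inputs and function name are illustrative assumptions.

```python
# Fit x^2 + y^2 + a*x + b*y + c = 0 to limb points; recover circle center and radius.
import numpy as np

def fit_limb_circle(x, y):
    """x, y: arrays of limb-edge pixel coordinates; returns (xc, yc, radius)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    a, b, c = np.linalg.lstsq(A, rhs, rcond=None)[0]
    xc, yc = -a / 2.0, -b / 2.0
    radius = np.sqrt(xc**2 + yc**2 - c)
    return xc, yc, radius
```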
ERIC Educational Resources Information Center
Ardi, Priyatno
2017-01-01
The advent of mobile learning platforms and Web 2.0 technologies is believed to provide an autonomous learning space that minimizes the power structure between the teacher and students in Indonesian EFL classes, accommodating the students to display their capacity to navigate their own learning. "Schoology" m-learning platform, a social…
A Self-Tuning Kalman Filter for Autonomous Spacecraft Navigation
NASA Technical Reports Server (NTRS)
Truong, Son H.
1998-01-01
Most navigation systems currently operated by NASA are ground-based and require extensive support to produce accurate results. Recently developed systems that use Kalman filtering and Global Positioning System (GPS) data for orbit determination greatly reduce dependency on ground support and have the potential to provide significant economies for NASA spacecraft navigation. Current techniques of Kalman filtering, however, still rely on manual tuning by analysts, and cannot help in optimizing autonomy without compromising accuracy and performance. This paper presents an approach to produce a high-accuracy autonomous navigation system fully integrated with the flight system. The resulting system performs real-time state estimation by using an Extended Kalman Filter (EKF) implemented with a high-fidelity state dynamics model, as does the GPS Enhanced Orbit Determination Experiment (GEODE) system developed by the NASA Goddard Space Flight Center. Augmenting the EKF is a sophisticated neural-fuzzy system, which combines the explicit knowledge representation of fuzzy logic with the learning power of neural networks. The fuzzy-neural system performs most of the self-tuning and helps the navigation system recover from estimation errors. The core requirement is a method of state estimation that handles uncertainties robustly, is capable of identifying estimation problems, is flexible enough to make decisions and adjustments to recover from these problems, and is compact enough to run on flight hardware. The resulting system can be extended to support geosynchronous spacecraft and high-eccentricity orbits. Mathematical methodology, systems and operations concepts, and implementation of a system prototype are presented in this paper. Results from using the prototype to evaluate the implemented optimal control algorithms are discussed. Test data and major control issues (e.g., how to define specific roles for fuzzy logic to support the self-learning capability) are also discussed. In addition, the architecture of a complete end-to-end candidate flight system that provides navigation with highly autonomous control using data from GPS is presented.
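The sketch below shows one simple innovation-based self-tuning rule layered on a Kalman filter update, as a crude stand-in for the fuzzy-neural tuner described above: the normalized innovation squared is monitored and the process-noise scaling is inflated or relaxed. The thresholds and scale factors are assumptions.

```python
# Kalman step with a crude self-tuning rule on the process-noise scale factor.
import numpy as np

def tuned_kf_update(x, P, y, F, H, Q, R, q_scale):
    # Time update
    x = F @ x
    P = F @ P @ F.T + q_scale * Q
    # Measurement update
    nu = y - H @ x                              # innovation
    S = H @ P @ H.T + R
    nis = float(nu @ np.linalg.solve(S, nu))    # normalized innovation squared
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ nu
    P = (np.eye(x.size) - K @ H) @ P
    # Crude self-tuning: inflate Q when innovations are larger than expected.
    m = y.size
    if nis > 2.0 * m:
        q_scale = min(q_scale * 1.5, 100.0)
    elif nis < 0.5 * m:
        q_scale = max(q_scale / 1.5, 0.01)
    return x, P, q_scale
```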
Recursive Gradient Estimation Using Splines for Navigation of Autonomous Vehicles.
1985-07-01
Shen, C. N.; US Army Armament Research and Development Center, Large Caliber Weapon Systems Laboratory (final report, July 1985). ...which require autonomous vehicles. Essential to these robotic vehicles is an adequate and efficient computer vision system. A potentially more...
Analysis of key technologies in geomagnetic navigation
NASA Astrophysics Data System (ADS)
Zhang, Xiaoming; Zhao, Yan
2008-10-01
Because of the high cost and error accumulation of high-precision Inertial Navigation Systems (INS) and the vulnerability of Global Navigation Satellite Systems (GNSS), geomagnetic navigation, a passive autonomous navigation method, is receiving renewed attention. The geomagnetic field is a natural spatial physical field and is a function of position and time in near-Earth space. Navigation technology based on the geomagnetic field is being researched for a wide range of commercial and military applications. This paper presents the main features and the state of the art of the Geomagnetic Navigation System (GMNS). Geomagnetic field models and reference maps are described. Obtaining, modeling, and updating accurate magnetic anomaly field information is an important step toward high-precision geomagnetic navigation. In addition, the errors of geomagnetic measurement using strapdown magnetometers are analyzed. Precise geomagnetic data are obtained by means of magnetometer calibration and compensation of the vehicle's magnetic field. From the measurement data and a reference map or model of the geomagnetic field, the vehicle's position and attitude can be obtained using a matching algorithm or a state estimation method. The likely direction of geomagnetic navigation in the near future is outlined at the end of this paper.
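A minimal sketch of the matching idea is shown below: a measured magnetic-anomaly profile along the track is slid over a row of the reference map and the offset with the smallest mean-square mismatch is taken as the position fix. The one-dimensional search and data layout are illustrative assumptions, not a specific algorithm from the paper.

```python
# Slide the measured profile over a reference map row; pick the best-matching offset.
import numpy as np

def match_profile(measured, reference_row):
    """Return (best_offset, best_cost) for the measured profile within reference_row."""
    measured = np.asarray(measured, dtype=float)
    reference_row = np.asarray(reference_row, dtype=float)
    n, m = len(measured), len(reference_row)
    best_offset, best_cost = 0, np.inf
    for offset in range(m - n + 1):
        window = reference_row[offset:offset + n]
        cost = np.mean((window - measured) ** 2)   # mean-square difference
        if cost < best_cost:
            best_offset, best_cost = offset, cost
    return best_offset, best_cost
```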
High Speed Lunar Navigation for Crewed and Remotely Piloted Vehicles
NASA Technical Reports Server (NTRS)
Pedersen, L.; Allan, M.; To, V.; Utz, H.; Wojcikiewicz, W.; Chautems, C.
2010-01-01
Increased navigation speed is desirable for lunar rovers, whether autonomous, crewed or remotely operated, but is hampered by the low gravity, high-contrast lighting, and rough terrain. We describe a lidar-based navigation system deployed on NASA's K10 autonomous rover and used to increase the terrain hazard situational awareness of the Lunar Electric Rover crew.
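As a rough sketch of the lidar-based terrain hazard assessment such a navigation system performs, the fragment below fits a plane to a patch of lidar returns and reports slope and residual roughness against assumed safety thresholds; the thresholds and patch handling are illustrative assumptions.

```python
# Plane-fit hazard check for one terrain patch of lidar returns.
import numpy as np

def assess_patch(points, max_slope_deg=15.0, max_roughness_m=0.10):
    """points: (N, 3) array of x, y, z lidar returns for one terrain patch."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([x, y, np.ones_like(x)])
    (a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)   # z = a*x + b*y + c
    slope_deg = np.degrees(np.arctan(np.hypot(a, b)))   # tilt of the fitted plane
    roughness = np.std(z - A @ np.array([a, b, c]))     # residual scatter about the plane
    safe = slope_deg <= max_slope_deg and roughness <= max_roughness_m
    return safe, slope_deg, roughness
```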
Mobile Robot Designed with Autonomous Navigation System
NASA Astrophysics Data System (ADS)
An, Feng; Chen, Qiang; Zha, Yanfang; Tao, Wenyin
2017-10-01
With the rapid development of robot technology, robots appear more and more in all aspects of life and social production, and people place ever greater requirements on them. One such requirement is that a robot be capable of autonomous navigation and able to recognize the road. Take the common household sweeping robot as an example: it must avoid obstacles, clean the ground, and automatically find its charging station. Another example is the AGV tracking cart, which must follow its route and reach its destination successfully. This paper introduces a robot navigation scheme, SLAM, which can build a map of a completely unknown environment and, at the same time, determine the robot's own position within it, so as to achieve autonomous navigation.
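As a small illustration of one ingredient of such a SLAM scheme, the sketch below applies a log-odds occupancy-grid update for a single range beam, assuming the robot pose is already known (in full SLAM the pose is estimated simultaneously with the map). The resolution and log-odds increments are assumptions.

```python
# Log-odds occupancy-grid update along one lidar beam with a known pose.
import numpy as np

def update_beam(grid, pose, angle, rng, res=0.05, l_free=-0.4, l_occ=0.85):
    """grid: 2-D array of log-odds; pose: (x, y, heading) in metres/radians."""
    x, y, th = pose
    steps = int(rng / res)
    for i in range(steps + 1):
        d = i * res
        cx = int((x + d * np.cos(th + angle)) / res)
        cy = int((y + d * np.sin(th + angle)) / res)
        if not (0 <= cx < grid.shape[0] and 0 <= cy < grid.shape[1]):
            break
        grid[cx, cy] += l_occ if i == steps else l_free   # hit cell vs free space
    return grid
```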
A Feedforward Control Approach to the Local Navigation Problem for Autonomous Vehicles
1994-05-02
AD-A282 787. A Feedforward Control Approach to the Local Navigation Problem for Autonomous Vehicles. Alonzo Kelly, CMU-RI-TR-94-17, The Robotics Institute. ...follow, or a direction to prefer, it cannot generate its own strategic goals. Therefore, it solves the local planning problem for autonomous vehicles. ... It is intelligent because it uses range images that are generated from either a laser rangefinder or a stereo triangulation...
A Novel Robust H∞ Filter Based on Krein Space Theory in the SINS/CNS Attitude Reference System
Yu, Fei; Lv, Chongyang; Dong, Qianhui
2016-01-01
Owing to their numerous merits, such as compactness, autonomy, and independence, the strapdown inertial navigation system (SINS) and celestial navigation system (CNS) can be used in marine applications. Moreover, because of the complementary navigation information obtained from two different kinds of sensors, the accuracy of the SINS/CNS integrated navigation system can be enhanced effectively. Thus, the SINS/CNS system is widely used in the marine navigation field. However, the CNS is easily interfered with by the surroundings, which leads to discontinuous output. The uncertainty caused by the lost measurements will therefore reduce the system accuracy. In this paper, a robust H∞ filter based on Krein space theory is proposed. Krein space theory is introduced first, and then the linear state and observation models of the SINS/CNS integrated navigation system are established. By taking the uncertainty problem into account, a new robust H∞ filter is proposed to improve the robustness of the integrated system. Finally, this new robust filter based on Krein space theory is evaluated by numerical simulations and actual experiments. The simulation and experiment results and analysis show that the attitude errors can be reduced effectively by utilizing the proposed robust filter when measurements are intermittently missing. Compared to the traditional Kalman filter (KF) method, the accuracy of the SINS/CNS integrated system is improved, verifying the robustness and availability of the proposed robust H∞ filter. PMID:26999153
Cobalt: Development and Maturation of GN&C Technologies for Precision Landing
NASA Technical Reports Server (NTRS)
Carson, John M.; Restrepo, Carolina; Seubert, Carl; Amzajerdian, Farzin
2016-01-01
The CoOperative Blending of Autonomous Landing Technologies (COBALT) instrument is a terrestrial test platform for development and maturation of guidance, navigation and control (GN&C) technologies for precision landing. The project is developing a third-generation Langley Research Center (LaRC) navigation doppler lidar (NDL) for ultra-precise velocity and range measurements, which will be integrated and tested with the Jet Propulsion Laboratory (JPL) lander vision system (LVS) for terrain relative navigation (TRN) position estimates. These technologies together provide precise navigation knowledge that is critical for a controlled and precise touchdown. The COBALT hardware will be integrated in 2017 into the GN&C subsystem of the Xodiac rocket-propulsive vertical test bed (VTB) developed by Masten Space Systems, and two terrestrial flight campaigns will be conducted: one open-loop (i.e., passive) and one closed-loop (i.e., active).
NASA Astrophysics Data System (ADS)
Qin, M.; Wan, X.; Shao, Y. Y.; Li, S. Y.
2018-04-01
Vision-based navigation has become an attractive solution for autonomous navigation for planetary exploration. This paper presents our work of designing and building an autonomous vision-based GPS-denied unmanned vehicle and developing ARFM (Adaptive Robust Feature Matching) based VO (Visual Odometry) software for its autonomous navigation. The hardware system is mainly composed of a binocular stereo camera, a pan-and-tilt unit, a master machine, and a tracked chassis. The ARFM-based VO software system contains four modules: camera calibration, ARFM-based 3D reconstruction, position and attitude calculation, and BA (Bundle Adjustment). Two VO experiments were carried out using both outdoor images from an open dataset and indoor images captured by our vehicle; the results demonstrate that our vision-based unmanned vehicle is able to achieve autonomous localization and has the potential for future planetary exploration.
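For illustration, the sketch below shows a single visual-odometry step with standard OpenCV calls: ORB feature matching between consecutive frames, RANSAC essential-matrix estimation, and recovery of the relative rotation and (unit-scale) translation. This is a generic monocular pipeline, not the paper's ARFM-based stereo implementation, and K is an assumed camera intrinsic matrix.

```python
# One monocular visual-odometry step between two grayscale frames.
import cv2
import numpy as np

def vo_step(prev_gray, curr_gray, K):
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    return R, t   # rotation and unit-scale translation between the two frames
```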
Relative navigation for spacecraft formation flying
NASA Technical Reports Server (NTRS)
Hartman, Kate R.; Gramling, Cheryl J.; Lee, Taesul; Kelbel, David A.; Long, Anne C.
1998-01-01
The Goddard Space Flight Center Guidance, Navigation, and Control Center (GNCC) is currently developing and implementing advanced satellite systems to provide autonomous control of formation flyers. The initial formation maintenance capability will be flight-demonstrated on the Earth-Orbiter-1 (EO-1) satellite, which is planned under the National Aeronautics and Space Administration New Millennium Program to be a coflight with the Landsat-7 (L-7) satellite. Formation flying imposes relative navigation accuracy requirements in addition to the orbit accuracy requirements for the individual satellites. In the case of EO-1 and L-7, the two satellites are in nearly coplanar orbits, with a small difference in the longitude of the ascending node to compensate for the Earth's rotation. The GNCC has performed trajectory error analysis for the relative navigation of the EO-1/L-7 formation, as well as for a more advanced tracking configuration using cross-link satellite communications. This paper discusses the orbit determination and prediction accuracy achievable for EO-1 and L-7 under various tracking and orbit determination scenarios and discusses the expected relative separation errors in their formation flying configuration.
Relative Navigation for Spacecraft Formation Flying
NASA Technical Reports Server (NTRS)
Hartman, Kate R.; Gramling, Cheryl J.; Lee, Taesul; Kelbel, David A.; Long, Anne C.
1998-01-01
The Goddard Space Flight Center Guidance, Navigation, and Control Center (GNCC) is currently developing and implementing advanced satellite systems to provide autonomous control of formation flyers. The initial formation maintenance capability will be flight-demonstrated on the Earth-Orbiter-1 (EO-1) satellite, which is planned under the National Aeronautics and Space Administration New Millennium Program to be a coflight with the Landsat-7 (L-7) satellite. Formation flying imposes relative navigation accuracy requirements in addition to the orbit accuracy requirements for the individual satellites. In the case of EO-1 and L-7, the two satellites are in nearly coplanar orbits, with a small difference in the longitude of the ascending node to compensate for the Earth's rotation. The GNCC has performed trajectory error analysis for the relative navigation of the EO-1/L-7 formation, as well as for a more advanced tracking configuration using cross-link satellite communications. This paper discusses the orbit determination and prediction accuracy achievable for EO-1 and L-7 under various tracking and orbit determination scenarios and discusses the expected relative separation errors in their formation flying configuration.
Human-like robots for space and hazardous environments
NASA Technical Reports Server (NTRS)
Cogley, Allen; Gustafson, David; White, Warren; Dyer, Ruth; Hampton, Tom (Editor); Freise, Jon (Editor)
1990-01-01
The three year goal for this NASA Senior Design team is to design and build a walking autonomous robotic rover. The rover should be capable of rough terrain crossing, traversing human made obstacles (such as stairs and doors), and moving through human and robot occupied spaces without collision. The rover is also to evidence considerable decision making ability, navigation and path planning skills. These goals came from the concept that the robot should have the abilities of both a planetary rover and a hazardous waste site scout.
Human-like robots for space and hazardous environments
NASA Astrophysics Data System (ADS)
Cogley, Allen; Gustafson, David; White, Warren; Dyer, Ruth; Hampton, Tom; Freise, Jon
The three year goal for this NASA Senior Design team is to design and build a walking autonomous robotic rover. The rover should be capable of rough terrain crossing, traversing human made obstacles (such as stairs and doors), and moving through human and robot occupied spaces without collision. The rover is also to evidence considerable decision making ability, navigation and path planning skills. These goals came from the concept that the robot should have the abilities of both a planetary rover and a hazardous waste site scout.
Target Trailing With Safe Navigation With Colregs for Maritime Autonomous Surface Vehicles
NASA Technical Reports Server (NTRS)
Kuwata, Yoshiaki (Inventor); Aghazarian, Hrand (Inventor); Huntsberger, Terrance L. (Inventor); Howard, Andrew B. (Inventor); Wolf, Michael T. (Inventor); Zarzhitsky, Dimitri V. (Inventor)
2014-01-01
Systems and methods for operating autonomous waterborne vessels in a safe manner. The systems include hardware for identifying the locations and motions of other vessels, as well as the locations of stationary objects that represent navigation hazards. By applying a computational method that uses a maritime navigation algorithm for avoiding hazards and obeying COLREGS using Velocity Obstacles to the data obtained, the autonomous vessel computes a safe and effective path to be followed in order to accomplish a desired navigational end result, while operating in a manner so as to avoid hazards and to maintain compliance with standard navigational procedures defined by international agreement. The systems and methods have been successfully demonstrated on water with radar and stereo cameras as the perception sensors, and integrated with a higher level planner for trailing a maneuvering target.
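The core geometric test behind a Velocity Obstacles planner of this kind can be sketched as below: a candidate own-vessel velocity is rejected if the relative velocity points into the collision cone of another vessel. The COLREGS rule logic and safety radii sit on top of this test and are not shown; names and parameters are illustrative.

```python
# Velocity Obstacle membership test for one candidate own-ship velocity.
import numpy as np

def in_velocity_obstacle(own_pos, own_vel, other_pos, other_vel, combined_radius):
    rel_pos = np.asarray(other_pos, float) - np.asarray(own_pos, float)
    rel_vel = np.asarray(own_vel, float) - np.asarray(other_vel, float)
    dist = np.linalg.norm(rel_pos)
    if dist <= combined_radius:
        return True                       # already inside the safety radius
    half_cone = np.arcsin(combined_radius / dist)
    closing_speed = np.linalg.norm(rel_vel)
    if closing_speed < 1e-9:
        return False                      # no relative motion, no collision
    angle = np.arccos(np.clip(rel_pos @ rel_vel / (dist * closing_speed), -1.0, 1.0))
    return angle <= half_cone             # relative velocity lies inside the cone
```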
Autonomous Navigation of Small Uavs Based on Vehicle Dynamic Model
NASA Astrophysics Data System (ADS)
Khaghani, M.; Skaloud, J.
2016-03-01
This paper presents a novel approach to autonomous navigation for small UAVs, in which the vehicle dynamic model (VDM) serves as the main process model within the navigation filter. The proposed method significantly increases the accuracy and reliability of autonomous navigation, especially for small UAVs with low-cost IMUs on-board. This is achieved with no extra sensor added to the conventional INS/GNSS setup. This improvement is of special interest in case of GNSS outages, where inertial coasting drifts very quickly. In the proposed architecture, the solution to VDM equations provides the estimate of position, velocity, and attitude, which is updated within the navigation filter based on available observations, such as IMU data or GNSS measurements. The VDM is also fed with the control input to the UAV, which is available within the control/autopilot system. The filter is capable of estimating wind velocity and dynamic model parameters, in addition to navigation states and IMU sensor errors. Monte Carlo simulations reveal major improvements in navigation accuracy compared to conventional INS/GNSS navigation system during the autonomous phase, when satellite signals are not available due to physical obstruction or electromagnetic interference for example. In case of GNSS outages of a few minutes, position and attitude accuracy experiences improvements of orders of magnitude compared to inertial coasting. It means that during such scenario, the position-velocity-attitude (PVA) determination is sufficiently accurate to navigate the UAV to a home position without any signal that depends on vehicle environment.
Localization system for use in GPS denied environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trueblood, J. J.
The military uses autonomous platforms to complete missions and provide standoff for the warfighters. However, autonomous platforms rely on GPS to provide their global position. In many mission spaces the autonomous platforms may encounter GPS-denied environments, which limits where the platform can operate and requires the warfighters to take its place. GPS-denied environments can occur due to tall buildings, trees, or canyon walls blocking the GPS satellite signals, or due to a lack of coverage. An Inertial Navigation System (INS) uses sensors to detect the vehicle's movement and direction of travel in order to calculate the vehicle's position. One of the biggest challenges with an INS is the accuracy and accumulation of sensor errors over time. If these challenges can be overcome, the INS would provide accurate positioning information to the autonomous vehicle in GPS-denied environments and allow it to provide the desired standoff for the warfighters.
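As a small illustration of the dead-reckoning computation an INS performs when GPS is denied, the sketch below integrates a planar stream of IMU samples into heading, velocity, and position; uncorrected sensor biases make the resulting position error grow with time, which is the accuracy challenge noted above. The interfaces are assumptions.

```python
# Planar strapdown dead reckoning: gyro -> heading, accelerometer -> velocity/position.
import numpy as np

def dead_reckon(imu_samples, dt, x=0.0, y=0.0, heading=0.0, vx=0.0, vy=0.0):
    """imu_samples: iterable of (accel_forward, accel_lateral, yaw_rate)."""
    for ax_b, ay_b, wz in imu_samples:
        heading += wz * dt
        c, s = np.cos(heading), np.sin(heading)
        ax, ay = c * ax_b - s * ay_b, s * ax_b + c * ay_b   # body -> navigation frame
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x, y, heading
```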
NASA Astrophysics Data System (ADS)
van Hecke, Kevin; de Croon, Guido C. H. E.; Hennes, Daniel; Setterfield, Timothy P.; Saenz-Otero, Alvar; Izzo, Dario
2017-11-01
Although machine learning holds an enormous promise for autonomous space robots, it is currently not employed because of the inherent uncertain outcome of learning processes. In this article we investigate a learning mechanism, Self-Supervised Learning (SSL), which is very reliable and hence an important candidate for real-world deployment even on safety-critical systems such as space robots. To demonstrate this reliability, we introduce a novel SSL setup that allows a stereo vision equipped robot to cope with the failure of one of its cameras. The setup learns to estimate average depth using a monocular image, by using the stereo vision depths from the past as trusted ground truth. We present preliminary results from an experiment on the International Space Station (ISS) performed with the MIT/NASA SPHERES VERTIGO satellite. The presented experiments were performed on October 8th, 2015 on board the ISS. The main goals were (1) data gathering, and (2) navigation based on stereo vision. First the astronaut Kimiya Yui moved the satellite around the Japanese Experiment Module to gather stereo vision data for learning. Subsequently, the satellite freely explored the space in the module based on its (trusted) stereo vision system and a pre-programmed exploration behavior, while simultaneously performing the self-supervised learning of monocular depth estimation on board. The two main goals were successfully achieved, representing the first online learning robotic experiments in space. These results lay the groundwork for a follow-up experiment in which the satellite will use the learned single-camera depth estimation for autonomous exploration in the ISS, and are an advancement towards future space robots that continuously improve their navigation capabilities over time, even in harsh and completely unknown space environments.
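As a toy illustration of the self-supervised idea described above, the sketch below trains a ridge-regularized linear regressor that predicts average scene depth from crude monocular image statistics, using previously recorded stereo depths as the trusted ground truth. The feature choice and regressor are assumptions, not the SPHERES/VERTIGO flight implementation.

```python
# Self-supervised monocular depth regression: stereo depths act as training labels.
import numpy as np

def monocular_features(gray_image):
    """Very crude per-image features: mean intensity, contrast, gradient energy, bias."""
    gy, gx = np.gradient(gray_image.astype(float))
    return np.array([gray_image.mean(), gray_image.std(), np.mean(gx**2 + gy**2), 1.0])

def train_depth_regressor(images, stereo_mean_depths, ridge=1e-3):
    X = np.stack([monocular_features(im) for im in images])
    y = np.asarray(stereo_mean_depths, dtype=float)
    # Ridge-regularized least squares: w = (X^T X + lambda I)^-1 X^T y
    w = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ y)
    return w   # predict with monocular_features(new_image) @ w
```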
Wind-based navigation of a hot-air balloon on Titan: a feasibility study
NASA Astrophysics Data System (ADS)
Furfaro, Roberto; Lunine, Jonathan I.; Elfes, Alberto; Reh, Kim
2008-04-01
Current analysis of data streamed back to Earth by the Cassini spacecraft features Titan as one of the most exciting places in the solar system. NASA centers and universities around the US, as well as the European Space Agency, are studying the possibility of sending, as part of the next mission to this giant moon of Saturn, a hot-air balloon (Montgolfier-type) for further and more in-depth exploration. The basic idea would be to design a reliable, semi-autonomous, and yet cheap Montgolfier capable of using continuous flow of waste heat from a power source to lift the balloon and sustain its altitude in the Titan environment. In this paper we study the problem of locally navigating a hot-air balloon in the nitrogen-based Titan atmosphere. The basic idea is to define a strategy (i.e. design of a suitable guidance system) that allows autonomous and semi-autonomous navigation of the balloon using the available (and partial) knowledge of the wind structure blowing on the saturnian satellite surface. Starting from first principles we determined the appropriate thermal and dynamical models describing (a) the vertical dynamics of the balloon and (b) the dynamics of the balloon moving on a vertical plane (2-D motion). Next, various non-linear fuzzy-based control strategies have been evaluated, analyzed and implemented in MATLAB to numerically simulate the capability of the system to simultaneously maintain altitude, as well as a scientifically desirable trajectory. We also looked at the ability of the balloon to perform station keeping. The results of the simulation are encouraging and show the effectiveness of such a system to cheaply and effectively perform semi-autonomous exploration of Titan.
Autonomous docking ground demonstration (category 3)
NASA Technical Reports Server (NTRS)
Lamkin, Steve L.; Eick, Richard E.; Baxter, James M.; Boyd, M. G.; Clark, Fred D.; Lee, Thomas Q.; Othon, L. T.; Prather, Joseph L.; Spehar, Peter T.; Teders, Rebecca J.
1991-01-01
The NASA Johnson Space Center (JSC) is involved in the development of an autonomous docking ground demonstration. The demonstration combines the technologies, expertise and facilities of the JSC Tracking and Communications Division (EE), Structures and Mechanics Division (ES), and the Navigation, Guidance and Control Division (EG) and their supporting contractors. The autonomous docking ground demonstration is an evaluation of the capabilities of the laser sensor system to support the docking phase (12 ft to contact) when operated in conjunction with the Guidance, Navigation and Control software. The docking mechanism being used was developed for the Apollo Soyuz Test Program. This demonstration will be conducted using the Six-Degrees of Freedom (6-DOF) Dynamic Test System (DTS). The DTS environment simulates the Space Station Freedom as the stationary or target vehicle and the Orbiter as the active or chase vehicle. For this demonstration the laser sensor will be mounted on the target vehicle and the retroreflectors on the chase vehicle. This arrangement was used to prevent potential damage to the laser. The sensor system, GN&C, and 6-DOF DTS will be operated closed-loop. Initial conditions to simulate vehicle misalignments, translational and rotational, will be introduced within the constraints of the systems involved. A detailed description of each of the demonstration components (e.g., sensor system, GN&C, 6-DOF DTS and supporting computer configuration), including their capabilities and limitations, will be discussed. A demonstration architecture drawing and photographs of the test configuration will be presented.
Autonomous docking ground demonstration (category 3)
NASA Astrophysics Data System (ADS)
Lamkin, Steve L.; Eick, Richard E.; Baxter, James M.; Boyd, M. G.; Clark, Fred D.; Lee, Thomas Q.; Othon, L. T.; Prather, Joseph L.; Spehar, Peter T.; Teders, Rebecca J.
The NASA Johnson Space Center (JSC) is involved in the development of an autonomous docking ground demonstration. The demonstration combines the technologies, expertise and facilities of the JSC Tracking and Communications Division (EE), Structures and Mechanics Division (ES), and the Navigation, Guidance and Control Division (EG) and their supporting contractors. The autonomous docking ground demonstration is an evaluation of the capabilities of the laser sensor system to support the docking phase (12 ft to contact) when operated in conjunction with the Guidance, Navigation and Control software. The docking mechanism being used was developed for the Apollo Soyuz Test Program. This demonstration will be conducted using the Six-Degrees of Freedom (6-DOF) Dynamic Test System (DTS). The DTS environment simulates the Space Station Freedom as the stationary or target vehicle and the Orbiter as the active or chase vehicle. For this demonstration the laser sensor will be mounted on the target vehicle and the retroreflectors on the chase vehicle. This arrangement was used to prevent potential damage to the laser. The sensor system, GN&C, and 6-DOF DTS will be operated closed-loop. Initial conditions to simulate vehicle misalignments, translational and rotational, will be introduced within the constraints of the systems involved. A detailed description of each of the demonstration components (e.g., sensor system, GN&C, 6-DOF DTS and supporting computer configuration), including their capabilities and limitations, will be discussed. A demonstration architecture drawing and photographs of the test configuration will be presented.
A Robust Mechanical Sensing System for Unmanned Sea Surface Vehicles
NASA Technical Reports Server (NTRS)
Kulczycki, Eric A.; Magnone, Lee J.; Huntsberger, Terrance; Aghazarian, Hrand; Padgett, Curtis W.; Trotz, David C.; Garrett, Michael S.
2009-01-01
The need for autonomous navigation and intelligent control of unmanned sea surface vehicles requires a mechanically robust sensing architecture that is watertight, durable, and insensitive to vibration and shock loading. The sensing system developed here comprises four black-and-white cameras and a single color camera. The cameras are rigidly mounted to a camera bar that can be reconfigured to mount on multiple vehicles, and they act as both navigational cameras and application cameras. The cameras are housed in watertight casings to protect them and their electronics from moisture and wave splashes. Two of the black-and-white cameras are positioned to provide lateral vision. They are angled away from the front of the vehicle at horizontal angles to provide ideal fields of view for mapping and autonomous navigation. The other two black-and-white cameras are positioned at an angle into the color camera's field of view to support vehicle applications. These two cameras provide an overlap, as well as a backup to the front camera. The color camera is positioned directly in the middle of the bar, aimed straight ahead. This system is applicable to any sea-going vehicle, both on Earth and in space.
The Integration, Testing and Flight of the EO-1 GPS
NASA Technical Reports Server (NTRS)
Quinn, David A.; Sanneman, Paul A.; Shulman, Seth E.; Sager, Jennifer A.
2001-01-01
The Global Positioning System has long been hailed as the wave of the future for autonomous on-board navigation of low Earth orbiting spacecraft, despite the fact that relatively few spacecraft have actually employed it for this purpose. While several missions operated out of the Goddard Space Flight Center have flown GPS receivers on board, the New Millennium Program (NMP) Earth Orbiting-1 (EO-1) spacecraft is the first to employ GPS for active, autonomous on-board navigation. Since EO-1 was designed to employ GPS as its primary source of the navigation ephemeris, special care had to be taken during the integration phase of spacecraft construction to assure proper performance. This paper is a discussion of that process: a brief overview of how GPS works, how it fits into the design of the EO-1 Attitude Control System (ACS), the steps taken to integrate the system into the EO-1 spacecraft, the on-orbit performance during launch and early operations of the EO-1 mission, and the performance of the on-board GPS ephemeris versus the ground-based ephemeris. Conclusions include a discussion of the lessons learned.
The Self-Paced Graz Brain-Computer Interface: Methods and Applications
Scherer, Reinhold; Schloegl, Alois; Lee, Felix; Bischof, Horst; Janša, Janez; Pfurtscheller, Gert
2007-01-01
We present the self-paced 3-class Graz brain-computer interface (BCI), which is based on the detection of sensorimotor electroencephalogram (EEG) rhythms induced by motor imagery. Self-paced operation means that the BCI is able to determine whether the ongoing brain activity is intended as a control signal (intentional control) or not (non-control state). The presented system is able to automatically reduce electrooculogram (EOG) artifacts, to detect electromyographic (EMG) activity, and uses only three bipolar EEG channels. Two applications are presented: the freeSpace virtual environment (VE) and the Brainloop interface. The freeSpace is a computer-game-like application in which subjects have to navigate through the environment and collect coins by autonomously selecting navigation commands. Three subjects participated in these feedback experiments, and each learned to navigate through the VE and collect coins. Two out of the three succeeded in collecting all three coins. The Brainloop interface provides an interface between the Graz-BCI and Google Earth. PMID:18350133
Reactive navigation for autonomous guided vehicle using neuro-fuzzy techniques
NASA Astrophysics Data System (ADS)
Cao, Jin; Liao, Xiaoqun; Hall, Ernest L.
1999-08-01
A neuro-fuzzy control method for navigation of an Autonomous Guided Vehicle robot is described. Robot navigation is defined as the guiding of a mobile robot to a desired destination or along a desired path in an environment characterized by terrain and a set of distinct objects, such as obstacles and landmarks. The robot's autonomous navigation ability and road-following precision are mainly influenced by its control strategy and real-time control performance. Neural network and fuzzy logic control techniques can improve real-time control performance for mobile robots because of their robustness and error-tolerance ability. For a mobile robot to navigate automatically and rapidly, an important factor is to identify and classify the robot's current perceptual environment. In this paper, a new approach to perceptual environment feature identification and classification, based on a classifying neural network and a neuro-fuzzy algorithm, is presented. The significance of this work lies in the development of a new method for mobile robot navigation.
NASA Astrophysics Data System (ADS)
Lu, Jiazhen; Lei, Chaohua; Yang, Yanqiang; Liu, Ming
2017-06-01
Many countries have been paying great attention to space exploration, especially of the Moon and Mars. Autonomous, high-accuracy navigation systems are needed for probes and rovers to accomplish their missions. Integrated inertial navigation system (INS)/celestial navigation system (CNS) navigation has been used widely on lunar rovers. Initialization is a particularly important step for navigation. This paper presents an in-motion alignment and positioning method for lunar rovers based on INS/CNS/odometer integrated navigation. The method can estimate not only the position and attitude errors, but also the biases of the accelerometers and gyros, using the standard Kalman filter. The differences between the platform star azimuth and elevation angles and the computed star azimuth and elevation angles, and the difference between the velocity measured by the odometer and the velocity measured by the inertial sensors, are taken as measurements. Semi-physical experiments demonstrate that the position error can be reduced to 10 m and the attitude error is within 2″ within 5 min. The experimental results show that this is an effective and attractive initialization approach for lunar rovers.
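To make the measurement construction concrete, the sketch below shows, in Python, how such a measurement vector could be assembled from star-sensor azimuth/elevation residuals and the odometer-minus-INS velocity residual. The frame conventions, state ordering, and numerical values are illustrative assumptions, not the authors' implementation; the resulting vector would feed a standard Kalman measurement update.
```python
import numpy as np

def star_angles(u_body):
    """Azimuth and elevation of a star unit vector expressed in the platform frame."""
    az = np.arctan2(u_body[1], u_body[0])
    el = np.arctan2(u_body[2], np.hypot(u_body[0], u_body[1]))
    return az, el

def measurement_vector(u_platform, u_computed, v_odometer, v_ins):
    """Stack angular residuals (platform minus computed) and the velocity residual."""
    az_p, el_p = star_angles(u_platform)
    az_c, el_c = star_angles(u_computed)
    return np.hstack([az_p - az_c, el_p - el_c, v_odometer - v_ins])

# Hypothetical inputs for one filter step (unit vectors; velocities in m/s)
u_plat = np.array([0.5000, 0.5000, 0.7071])
u_comp = np.array([0.5002, 0.4998, 0.7071])
z = measurement_vector(u_plat, u_comp,
                       v_odometer=np.array([0.20, 0.00, 0.00]),
                       v_ins=np.array([0.21, 0.01, -0.005]))
print(z)  # this residual vector is what the standard Kalman update would process
```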
Autonomous satellite navigation using starlight refraction angle measurements
NASA Astrophysics Data System (ADS)
Ning, Xiaolin; Wang, Longhua; Bai, Xinbei; Fang, Jiancheng
2013-05-01
An on-board autonomous navigation capability is required to reduce the operation costs and enhance the navigation performance of future satellites. Autonomous navigation by stellar refraction is a type of autonomous celestial navigation method that uses high-accuracy star sensors instead of Earth sensors to provide information regarding Earth's horizon. In previous studies, the refraction apparent height has typically been used for such navigation. However, the apparent height cannot be measured directly by a star sensor and can only be calculated from the refraction angle and an atmospheric refraction model. Therefore, additional errors are introduced by the uncertainty and nonlinearity of atmospheric refraction models, which result in reduced navigation accuracy and reliability. A new navigation method based on the direct measurement of the refraction angle is proposed to solve this problem. Techniques for the determination of the refraction angle are introduced, and a measurement model for the refraction angle is established. The method is tested and validated by simulations. When the starlight refraction height ranges from 20 to 50 km, a positioning accuracy of better than 100 m can be achieved for a low-Earth-orbit (LEO) satellite using the refraction angle, while the positioning accuracy of the traditional method using the apparent height is worse than 500 m under the same conditions. Furthermore, an analysis of the factors that affect navigation accuracy, including the measurement accuracy of the refraction angle, the number of visible refracted stars per orbit, and the installation azimuth of the star sensor, is presented. This method is highly recommended for small satellites in particular, as no additional hardware besides two star sensors is required.
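As a rough illustration of the measurement this method relies on, the sketch below computes a refraction angle directly as the angle between a star's catalog direction and the refracted direction reported by the star sensor. The vectors and magnitudes are hypothetical; the paper's measurement model and filter are not reproduced here.
```python
import numpy as np

def refraction_angle(u_catalog, u_observed):
    """Angle (rad) between the catalog star direction and the observed, refracted direction."""
    c = np.dot(u_catalog, u_observed) / (np.linalg.norm(u_catalog) * np.linalg.norm(u_observed))
    return np.arccos(np.clip(c, -1.0, 1.0))

# Hypothetical unit vectors for one refracted-star sighting
u_cat = np.array([0.267, 0.535, 0.802])
u_obs = np.array([0.268, 0.536, 0.801])
angle_arcsec = np.degrees(refraction_angle(u_cat, u_obs)) * 3600.0
print(f"measured refraction angle: {angle_arcsec:.1f} arcsec")
```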
Meng, Zhijun; Yang, Jun; Guo, Xiye; Zhou, Yongbin
2017-01-01
Global Navigation Satellite System performance can be significantly enhanced by introducing inter-satellite links (ISLs) in navigation constellation. The improvement in position, velocity, and time accuracy as well as the realization of autonomous functions requires ISL distance measurement data as the original input. To build a high-performance ISL, the ranging consistency among navigation satellites is an urgent problem to be solved. In this study, we focus on the variation in the ranging delay caused by the sensitivity of the ISL payload equipment to the ambient temperature in space and propose a simple and low-power temperature-sensing ranging compensation sensor suitable for onboard equipment. The experimental results show that, after the temperature-sensing ranging compensation of the ISL payload equipment, the ranging consistency becomes less than 0.2 ns when the temperature change is 90 °C. PMID:28608809
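A minimal sketch of this kind of compensation is shown below, assuming the equipment delay can be modeled as a low-order polynomial in the sensed temperature; the calibration coefficients and delay values are invented for illustration and are not taken from the paper.
```python
def compensate_delay(raw_delay_ns, temp_c, coeffs=(12.0, 2.2e-3, 1.0e-5)):
    """Subtract a temperature-dependent equipment delay (ns) from a raw ranging delay.
    coeffs are hypothetical calibration terms: constant, linear, and quadratic."""
    a0, a1, a2 = coeffs
    equipment_delay_ns = a0 + a1 * temp_c + a2 * temp_c ** 2
    return raw_delay_ns - equipment_delay_ns

# With these made-up values the compensated delays agree closely across a large temperature swing
print(compensate_delay(66718.40, temp_c=-20.0))
print(compensate_delay(66718.60, temp_c=70.0))
```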
Perception, planning, and control for walking on rugged terrain
NASA Technical Reports Server (NTRS)
Simmons, Reid; Krotkov, Eric
1991-01-01
The CMU Planetary Rover project is developing a six-legged walking robot capable of autonomously navigating, exploring, and acquiring samples in rugged, unknown environments. To gain experience with the problems involved in walking on rugged terrain, a full-scale prototype leg was built and mounted on a carriage that rolls along overhead rails. Issues addressed in developing the software system to autonomously walk the leg through rugged terrain are described. In particular, the insights gained into perceiving and modeling rugged terrain, controlling the legged mechanism, interacting with the ground, choosing safe yet effective footfalls, and planning efficient leg moves through space are described.
Reactive Sequencing for Autonomous Navigation Evolving from Phoenix Entry, Descent, and Landing
NASA Technical Reports Server (NTRS)
Grasso, Christopher A.; Riedel, Joseph E.; Vaughan, Andrew T.
2010-01-01
Virtual Machine Language (VML) is an award-winning advanced procedural sequencing language in use on NASA deep-space missions since 1997, and was used for the successful entry, descent, and landing (EDL) of the Phoenix spacecraft onto the surface of Mars. Phoenix EDL utilized a state-oriented operations architecture which executed within the constraints of the existing VML 2.0 flight capability, compatible with the linear "land or die" nature of the mission. The intricacies of Phoenix EDL included the planned discarding of portions of the vehicle, the complex communications management for relay through on-orbit assets, the presence of temporally indeterminate physical events, and the need to rapidly catch up four days of sequencing should a reboot of the spacecraft flight computer occur shortly before atmospheric entry. These formidable operational challenges led to new techniques for packaging and coordinating reusable sequences called blocks using one-way synchronization via VML sequencing global variable events. The coordinated blocks acted as an ensemble to land the spacecraft, while individually managing various elements in as simple a fashion as possible. This paper outlines prototype VML 2.1 flight capabilities that have evolved from the one-way synchronization techniques in order to implement even more ambitious autonomous mission capabilities. Target missions for these new capabilities include autonomous touch-and-go sampling of cometary and asteroidal bodies, lunar landing of robotic missions, and ultimately landing of crewed lunar vehicles. Close proximity guidance, navigation, and control operations, on-orbit rendezvous, and descent and landing events featured in these missions require elaborate abort capability, manifesting highly non-linear scenarios that are so complex as to overtax traditional sequencing, or even the sort of one-way coordinated sequencing used during EDL. Foreseeing advanced command and control needs for small body and lunar landing guidance, navigation and control scenarios, work began three years ago on substantial upgrades to VML that are now being exercised in scenarios for lunar landing and comet/asteroid rendezvous. The advanced state-based approach includes coordinated state transition machines with distributed decision-making logic. These state machines are not merely sequences - they are reactive logic constructs capable of autonomous decision making within a well-defined domain. Combined with the JPL's AutoNav software used on Deep Space 1 and Deep Impact, the system allows spacecraft to autonomously navigate to an unmapped surface, soft-contact, and either land or ascend. The state machine architecture enabled by VML 2.1 has successfully performed sampling missions and lunar descent missions in a simulated environment, and is progressing toward flight capability. The authors are also investigating using the VML 2.1 flight director architecture to perform autonomous activities like rendezvous with a passive hypothetical Mars sample return capsule. The approach being pursued is similar to the touch-and-go sampling state machines, with the added complications associated with the search for, physical capture of, and securing of a separate spacecraft. Complications include optically finding and tracking the Orbiting Sample Capsule (OSC), keeping the OSC illuminated, making orbital adjustments, and physically capturing the OSC. Other applications could include autonomous science collection and fault compensation.
Angles-only navigation for autonomous orbital rendezvous
NASA Astrophysics Data System (ADS)
Woffinden, David C.
The proposed thesis of this dissertation has both a practical element and theoretical component which aim to answer key questions related to the use of angles-only navigation for autonomous orbital rendezvous. The first and fundamental principle to this work argues that an angles-only navigation filter can determine the relative position and orientation (pose) between two spacecraft to perform the necessary maneuvers and close proximity operations for autonomous orbital rendezvous. Second, the implementation of angles-only navigation for on-orbit applications is looked upon with skeptical eyes because of its perceived limitation of determining the relative range between two vehicles. This assumed, yet little understood subtlety can be formally characterized with a closed-form analytical observability criteria which specifies the necessary and sufficient conditions for determining the relative position and velocity with only angular measurements. With a mathematical expression of the observability criteria, it can be used to (1) identify the orbital rendezvous trajectories and maneuvers that ensure the relative position and velocity are observable for angles-only navigation, (2) quantify the degree or level of observability and (3) compute optimal maneuvers that maximize observability. In summary, the objective of this dissertation is to provide both a practical and theoretical foundation for the advancement of autonomous orbital rendezvous through the use of angles-only navigation.
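One way to see the observability issue numerically is sketched below: bearing-measurement Jacobians are stacked along a maneuver-free relative trajectory propagated with Clohessy-Wiltshire dynamics, and the rank of the resulting observability matrix is inspected. The dynamics model, state values, and sampling are assumptions chosen for illustration, not the dissertation's derivation; the rank deficit reflects the well-known range ambiguity of angles-only tracking without maneuvers.
```python
import numpy as np

def cw_stm(n, t):
    """Clohessy-Wiltshire state transition matrix for [x, y, z, vx, vy, vz]."""
    s, c = np.sin(n * t), np.cos(n * t)
    return np.array([
        [4 - 3 * c,        0, 0,       s / n,            2 * (1 - c) / n,         0],
        [6 * (s - n * t),  1, 0, -2 * (1 - c) / n, (4 * s - 3 * n * t) / n,       0],
        [0,                0, c,       0,                0,                   s / n],
        [3 * n * s,        0, 0,       c,                2 * s,                   0],
        [-6 * n * (1 - c), 0, 0,      -2 * s,            4 * c - 3,               0],
        [0,                0, -n * s,  0,                0,                       c]])

def bearing_jacobian(state):
    """Jacobian of azimuth/elevation line-of-sight angles w.r.t. the 6-dim relative state."""
    rx, ry, rz = state[:3]
    rho2 = rx ** 2 + ry ** 2
    r2 = rho2 + rz ** 2
    H = np.zeros((2, 6))
    H[0, 0], H[0, 1] = -ry / rho2, rx / rho2        # azimuth
    H[1, 0] = -rx * rz / (r2 * np.sqrt(rho2))       # elevation
    H[1, 1] = -ry * rz / (r2 * np.sqrt(rho2))
    H[1, 2] = np.sqrt(rho2) / r2
    return H

n = 0.0011                                          # mean motion (rad/s), LEO-like
x0 = np.array([1000.0, 200.0, 50.0, 0.0, -2 * n * 1000.0, 0.0])  # hypothetical relative state (m, m/s)
rows = []
for k in range(60):                                 # one bearing pair per minute, no maneuvers
    Phi = cw_stm(n, 60.0 * k)
    rows.append(bearing_jacobian(Phi @ x0) @ Phi)
O = np.vstack(rows)
print("observability matrix rank:", np.linalg.matrix_rank(O))  # typically 5 of 6: range is unobservable
```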
Evaluating ACLS Algorithms for the International Space Station (ISS) - A Paradigm Revisited
NASA Technical Reports Server (NTRS)
Alexander, Dave; Brandt, Keith; Locke, James; Hurst, Victor, IV; Mack, Michael D.; Pettys, Marianne; Smart, Kieran
2007-01-01
The ISS may have communication gaps of up to 45 minutes during each orbit and therefore it is imperative to have medical protocols, including an effective ACLS algorithm, that can be reliably autonomously executed during flight. The aim of this project was to compare the effectiveness of the current ACLS algorithm with an improved algorithm having a new navigation format.
Architecting Communication Network of Networks for Space System of Systems
NASA Technical Reports Server (NTRS)
Bhasin, Kul B.; Hayden, Jeffrey L.
2008-01-01
The National Aeronautics and Space Administration (NASA) and the Department of Defense (DoD) are planning Space System of Systems (SoS) to address the new challenges of space exploration, defense, communications, navigation, Earth observation, and science. In addition, these complex systems must provide interoperability, enhanced reliability, common interfaces, dynamic operations, and autonomy in system management. Both NASA and the DoD have chosen to meet the new demands with high data rate communication systems and space Internet technologies that bring Internet Protocols (IP), routers, servers, software, and interfaces to space networks to enable as much autonomous operation of those networks as possible. These technologies reduce the cost of operations and, with higher bandwidths, support the expected voice, video, and data needed to coordinate activities at each stage of an exploration mission. In this paper, we discuss, in a generic fashion, how the architectural approaches and processes are being developed and used for defining a hypothetical communication and navigation networks infrastructure to support lunar exploration. Examples are given of the products generated by the architecture development process.
A Long Distance Laser Altimeter for Terrain Relative Navigation and Spacecraft Landing
NASA Technical Reports Server (NTRS)
Pierrottet, Diego F.; Amzajerdian, Farzin; Barnes, Bruce W.
2014-01-01
A high precision laser altimeter was developed under the Autonomous Landing and Hazard Avoidance (ALHAT) project at NASA Langley Research Center. The laser altimeter provides slant-path range measurements from operational ranges exceeding 30 km that will be used to support surface-relative state estimation and navigation during planetary descent and precision landing. The altimeter uses an advanced time-of-arrival receiver, which produces multiple signal-return range measurements from tens of kilometers with 5 cm precision. The transmitter is eye-safe, simplifying operations and testing on Earth. The prototype is fully autonomous, and able to withstand the thermal and mechanical stresses experienced during test flights conducted aboard helicopters, fixed-wing aircraft, and Morpheus, a terrestrial rocket-powered vehicle developed by NASA Johnson Space Center. This paper provides an overview of the sensor and presents results obtained during recent field experiments, including a helicopter flight test conducted in December 2012 and Morpheus flight tests conducted during March of 2014.
Autonomous satellite navigation with the Global Positioning System
NASA Technical Reports Server (NTRS)
Fuchs, A. J.; Wooden, W. H., II; Long, A. C.
1977-01-01
This paper discusses the potential of using the Global Positioning System (GPS) to provide autonomous navigation capability to NASA satellites in the 1980 era. Some of the driving forces motivating autonomous navigation are presented. These include such factors as advances in attitude control systems, onboard science annotation, and onboard gridding of imaging data. Simulation results which demonstrate baseline orbit determination accuracies using GPS data on Seasat, Landsat-D, and the Solar Maximum Mission are presented. Emphasis is placed on identifying error sources such as GPS time, GPS ephemeris, user timing biases, and user orbit dynamics, and in a parametric sense on evaluating their contribution to the orbit determination accuracies.
NASA Technical Reports Server (NTRS)
Crain, Timothy P.; Bishop, Robert H.; Carson, John M., III; Trawny, Nikolas; Hanak, Chad; Sullivan, Jacob; Christian, John; DeMars, Kyle; Campbell, Tom; Getchius, Joel
2016-01-01
The Morpheus Project began in late 2009 as an ambitious effort code-named Project M to integrate three ongoing multi-center NASA technology developments: humanoid robotics, liquid oxygen/liquid methane (LOX/LCH4) propulsion, and Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) into a single engineering demonstration mission to be flown to the Moon by 2013. The humanoid robot effort was redirected to a deployment of Robonaut 2 on the International Space Station in February of 2011, while Morpheus continued as a terrestrial field test project integrating the existing ALHAT Project's technologies into a sub-orbital flight system using the world's first LOX/LCH4 main propulsion and reaction control system fed from the same blowdown tanks. A series of 33 tethered tests with the Morpheus 1.0 vehicle and Morpheus 1.5 vehicle were conducted from April 2011 - December 2013 before successful, sustained free flights with the primary Vertical Testbed (VTB) navigation configuration began with Free Flight 3 on December 10, 2013. Over the course of the following 12 free flights and 3 tethered flights, components of the ALHAT navigation system were integrated into the Morpheus vehicle, operations, and flight control loop. The ALHAT navigation system was integrated and run concurrently with the VTB navigation system as a reference and fail-safe option in flight (see touchdown position estimate comparisons in Fig. 1). Flight testing completed with Free Flight 15 on December 15, 2014 with a completely autonomous Hazard Detection and Avoidance (HDA), integration of surface relative and Hazard Relative Navigation (HRN) measurements into the onboard dual-state inertial estimator Kalman filter software, and landing within 2 meters of the VTB GPS-based navigation solution at the safe landing site target. This paper describes the Morpheus joint VTB/ALHAT navigation architecture, the sensors utilized during the terrestrial flight campaign, issues resolved during testing, and the navigation results from the flight tests.
Towards high-speed autonomous navigation of unknown environments
NASA Astrophysics Data System (ADS)
Richter, Charles; Roy, Nicholas
2015-05-01
In this paper, we summarize recent research enabling high-speed navigation in unknown environments for dynamic robots that perceive the world through onboard sensors. Many existing solutions to this problem guarantee safety by making the conservative assumption that any unknown portion of the map may contain an obstacle, and therefore constrain planned motions to lie entirely within known free space. In this work, we observe that safety constraints may significantly limit performance and that faster navigation is possible if the planner reasons about collision with unobserved obstacles probabilistically. Our overall approach is to use machine learning to approximate the expected costs of collision using the current state of the map and the planned trajectory. Our contribution is to demonstrate fast but safe planning using a learned function to predict future collision probabilities.
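The decision rule described above can be sketched very simply: each candidate trajectory is scored by an expected cost that trades traversal time against a learned probability of colliding with unobserved obstacles. The stub predictor, features, and penalty below are placeholders, not the authors' learned model.
```python
def expected_cost(traversal_time_s, p_collision, collision_penalty_s=60.0):
    """Expected cost (s) of a candidate trajectory: nominal time vs. collision penalty."""
    return (1.0 - p_collision) * traversal_time_s + p_collision * collision_penalty_s

def pick_trajectory(candidates, collision_model):
    """candidates: list of (features, traversal_time_s); collision_model: learned predictor stub."""
    return min(candidates, key=lambda c: expected_cost(c[1], collision_model(c[0])))

# Hypothetical learned predictor and two candidates: a fast path through unknown space
# versus a slower path that stays entirely inside observed free space.
learned_p_collision = lambda feats: min(0.9, 0.02 + 0.4 * feats["frac_unknown"])
fast_risky = ({"frac_unknown": 0.6}, 8.0)
slow_safe = ({"frac_unknown": 0.0}, 14.0)
print(pick_trajectory([fast_risky, slow_safe], learned_p_collision))
```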
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Dorothy Rasco, NASA Deputy Associate Administrator for the Space Technology Mission Directorate, speaks at the TouchTomorrow Festival, held in conjunction with the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
LABRADOR: a learning autonomous behavior-based robot for adaptive detection and object retrieval
NASA Astrophysics Data System (ADS)
Yamauchi, Brian; Moseley, Mark; Brookshire, Jonathan
2013-01-01
As part of the TARDEC-funded CANINE (Cooperative Autonomous Navigation in a Networked Environment) Program, iRobot developed LABRADOR (Learning Autonomous Behavior-based Robot for Adaptive Detection and Object Retrieval). LABRADOR was based on the rugged, man-portable, iRobot PackBot unmanned ground vehicle (UGV) equipped with an explosives ordnance disposal (EOD) manipulator arm and a custom gripper. For LABRADOR, we developed a vision-based object learning and recognition system that combined a TLD (track-learn-detect) filter based on object shape features with a color-histogram-based object detector. Our vision system was able to learn in real-time to recognize objects presented to the robot. We also implemented a waypoint navigation system based on fused GPS, IMU (inertial measurement unit), and odometry data. We used this navigation capability to implement autonomous behaviors capable of searching a specified area using a variety of robust coverage strategies - including outward spiral, random bounce, random waypoint, and perimeter following behaviors. While the full system was not integrated in time to compete in the CANINE competition event, we developed useful perception, navigation, and behavior capabilities that may be applied to future autonomous robot systems.
NASA Technical Reports Server (NTRS)
Mardirossian, H.; Beri, A. C.; Doll, C. E.
1990-01-01
The Flight Dynamics Facility (FDF) at Goddard Space Flight Center (GSFC) provides spacecraft trajectory determination for a wide variety of National Aeronautics and Space Administration (NASA)-supported satellite missions, using the Tracking Data Relay Satellite System (TDRSS) and Ground Spaceflight and Tracking Data Network (GSTDN). To take advantage of computerized decision making processes that can be used in spacecraft navigation, the Orbit Determination Automation System (ODAS) was designed, developed, and implemented as a prototype system to automate orbit determination (OD) and orbit quality assurance (QA) functions performed by orbit operations. Based on a machine-resident generic schedule and predetermined mission-dependent QA criteria, ODAS autonomously activates an interface with the existing trajectory determination system using a batch least-squares differential correction algorithm to perform the basic OD functions. The computational parameters determined during the OD are processed to make computerized decisions regarding QA, and a controlled recovery process is activated when the criteria are not satisfied. The complete cycle is autonomous and continuous. ODAS was extensively tested for performance under conditions resembling actual operational conditions and found to be effective and reliable for extended autonomous OD. Details of the system structure and function are discussed, and test results are presented.
NASA Technical Reports Server (NTRS)
Mardirossian, H.; Heuerman, K.; Beri, A.; Samii, M. V.; Doll, C. E.
1989-01-01
The Flight Dynamics Facility (FDF) at Goddard Space Flight Center (GSFC) provides spacecraft trajectory determination for a wide variety of National Aeronautics and Space Administration (NASA)-supported satellite missions, using the Tracking Data Relay Satellite System (TDRSS) and Ground Spaceflight and Tracking Data Network (GSTDN). To take advantage of computerized decision making processes that can be used in spacecraft navigation, the Orbit Determination Automation System (ODAS) was designed, developed, and implemented as a prototype system to automate orbit determination (OD) and orbit quality assurance (QA) functions performed by orbit operations. Based on a machine-resident generic schedule and predetermined mission-dependent QA criteria, ODAS autonomously activates an interface with the existing trajectory determination system using a batch least-squares differential correction algorithm to perform the basic OD functions. The computational parameters determined during the OD are processed to make computerized decisions regarding QA, and a controlled recovery process is activated when the criteria are not satisfied. The complete cycle is autonomous and continuous. ODAS was extensively tested for performance under conditions resembling actual operational conditions and found to be effective and reliable for extended autonomous OD. Details of the system structure and function are discussed, and test results are presented.
He, Bo; Zhang, Hongjin; Li, Chao; Zhang, Shujing; Liang, Yan; Yan, Tianhong
2011-01-01
This paper addresses an autonomous navigation method for the autonomous underwater vehicle (AUV) C-Ranger applying information-filter-based simultaneous localization and mapping (SLAM), and its sea trial experiments in Tuandao Bay (Shangdong Province, P.R. China). Weak links in the information matrix in an extended information filter (EIF) can be pruned to achieve an efficient approach-sparse EIF algorithm (SEIF-SLAM). All the basic update formulae can be implemented in constant time irrespective of the size of the map; hence the computational complexity is significantly reduced. The mechanical scanning imaging sonar is chosen as the active sensing device for the underwater vehicle, and a compensation method based on feedback of the AUV pose is presented to overcome distortion of the acoustic images due to the vehicle motion. In order to verify the feasibility of the navigation methods proposed for the C-Ranger, a sea trial was conducted in Tuandao Bay. Experimental results and analysis show that the proposed navigation approach based on SEIF-SLAM improves the accuracy of the navigation compared with conventional method; moreover the algorithm has a low computational cost when compared with EKF-SLAM. PMID:22346682
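For context, the sketch below shows the generic information-form (extended information filter) measurement update that SEIF-style SLAM keeps sparse; only the entries touched by the measurement Jacobian change, which is what enables near constant-time updates. The state layout, noise values, and measurement are illustrative assumptions rather than the C-Ranger implementation.
```python
import numpy as np

def eif_measurement_update(Omega, xi, H, R, z, z_pred, mu):
    """Information-form measurement update: only entries touched by H change,
    so a sparse H gives (near) constant-time updates."""
    Rinv = np.linalg.inv(R)
    Omega_new = Omega + H.T @ Rinv @ H
    xi_new = xi + H.T @ Rinv @ (z - z_pred + H @ mu)
    return Omega_new, xi_new

# Hypothetical state: 3-DOF vehicle pose plus one 2-D landmark (5 states)
Omega = np.eye(5) * 0.1          # prior information matrix
xi = np.zeros(5)                 # prior information vector
mu = np.zeros(5)                 # linearization point
H = np.zeros((2, 5))             # measurement couples the pose x,y with the landmark x,y
H[0, 0] = H[1, 1] = -1.0
H[0, 3] = H[1, 4] = 1.0
z, z_pred = np.array([2.0, 1.0]), np.array([1.8, 1.1])
Omega, xi = eif_measurement_update(Omega, xi, H, np.eye(2) * 0.05, z, z_pred, mu)
print(np.count_nonzero(Omega))   # information added only where the Jacobian is non-zero
```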
NASA Technical Reports Server (NTRS)
Epp, Chirold D.; Robertson, Edward A.; Ruthishauser, David K.
2013-01-01
The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project was chartered to develop and mature to a Technology Readiness Level (TRL) of six an autonomous system combining guidance, navigation and control with real-time terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. The ALHAT System must be capable of identifying and avoiding surface hazards to enable a safe and accurate landing to within tens of meters of designated and certified landing sites anywhere on a planetary surface under any lighting conditions. This is accomplished with the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN). The NASA plan for the ALHAT technology is to perform the TRL6 closed loop demonstration on the Morpheus Vertical Test Bed (VTB). The first Morpheus vehicle was lost in August of 2012 during free-flight testing at Kennedy Space Center (KSC), so the decision was made to perform a helicopter test of the integrated ALHAT System with the Morpheus avionics over the ALHAT planetary hazard field at KSC. The KSC helicopter tests included flight profiles approximating planetary approaches, with the entire ALHAT system interfaced with all appropriate Morpheus subsystems and operated in real-time. During these helicopter flights, the ALHAT system imaged the simulated lunar terrain constructed in FY2012 to support ALHAT/Morpheus testing at KSC. To the best of our knowledge, this represents the highest fidelity testing of a system of this kind to date. During this helicopter testing, two new Morpheus landers were under construction at the Johnson Space Center to support the objective of an integrated ALHAT/Morpheus free-flight demonstration. This paper provides an overview of this helicopter flight test activity, including results and lessons learned, and also provides an overview of recent integrated testing of ALHAT on the second Morpheus vehicle.
Autonomous satellite navigation by stellar refraction
NASA Technical Reports Server (NTRS)
Gounley, R.; White, R.; Gai, E.
1983-01-01
This paper describes an error analysis of an autonomous navigator using refraction measurements of starlight passing through the upper atmosphere. The analysis is based on a discrete linear Kalman filter. The filter generated steady-state values of navigator performance for a variety of test cases. Results of these simulations show that in low-Earth orbit, position-error standard deviations of less than 0.100 km may be obtained using only 40 star sightings per orbit.
Guidance, navigation, and control trades for an Electric Orbit Transfer Vehicle
NASA Astrophysics Data System (ADS)
Zondervan, K. P.; Bauer, T. A.; Jenkin, A. B.; Metzler, R. A.; Shieh, R. A.
The USAF Space Division initiated the Electric Insertion Transfer Experiment (ELITE) in the fall of 1988. The ELITE space mission is planned for the mid 1990s and will demonstrate technological readiness for the development of operational solar-powered electric orbit transfer vehicles (EOTVs). To minimize the cost of ground operations, autonomous flight is desirable. Thus, the guidance, navigation, and control (GNC) functions of an EOTV should reside on board. In order to define GNC requirements for ELITE, parametric trades must be performed for an operational solar-powered EOTV so that a clearer understanding of the performance aspects is obtained. Parametric trades for the GNC subsystems have provided insight into the relationship between pointing accuracy, transfer time, and propellant utilization. Additional trades need to be performed, taking into account weight, cost, and degree of autonomy.
NASA Technical Reports Server (NTRS)
Teles, Jerome (Editor); Samii, Mina V. (Editor)
1993-01-01
A conference on spaceflight dynamics produced papers in the areas of orbit determination, spacecraft tracking, autonomous navigation, the Deep Space Program Science Experiment Mission (DSPSE), the Global Positioning System, attitude control, geostationary satellites, interplanetary missions and trajectories, applications of estimation theory, flight dynamics systems, low-Earth orbit missions, orbital mechanics, mission experience in attitude dynamics, mission experience in sensor studies, attitude dynamics theory and simulations, and orbit-related experience. These papers covered NASA, European, Russian, Japanese, Chinese, and Brazilian space programs and hardware.
Neural Network Based Sensory Fusion for Landmark Detection
NASA Technical Reports Server (NTRS)
Kumbla, Kishan -K.; Akbarzadeh, Mohammad R.
1997-01-01
NASA is planning to send numerous unmanned planetary missions to explore space. This requires autonomous robotic vehicles that can navigate in an unstructured, unknown, and uncertain environment. Landmark-based navigation is a new area of research that differs from traditional goal-oriented navigation, in which a mobile robot starts from an initial point and reaches a destination in accordance with a pre-planned path. Landmark-based navigation has the advantage of allowing the robot to find its way without communication with the mission control station and without exact knowledge of its coordinates. Current algorithms based on landmark navigation, however, pose several constraints. First, they require large memories to store the images. Second, the task of comparing the images using traditional methods is computationally intensive, and consequently real-time implementation is difficult. The method proposed here consists of three stages. The first stage utilizes a heuristic-based algorithm to identify significant objects. The second stage utilizes a neural network (NN) to efficiently classify images of the identified objects. The third stage combines distance information with the classification results of the neural network for efficient and intelligent navigation.
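A schematic of the three-stage pipeline might look like the Python sketch below; the thresholding heuristic, the stand-in "network", and the steering rules are placeholders used only to show how the stages hand off to one another, not the authors' system.
```python
import numpy as np

def detect_objects(image, threshold=200.0, min_area=50):
    """Stage 1 (heuristic): flag bright pixel clusters as candidate significant objects.
    A single crude brightness pass stands in for real segmentation here."""
    mask = image > threshold
    return [{"pixels": image[mask], "area": int(mask.sum())}] if mask.sum() >= min_area else []

def classify(region, net):
    """Stage 2 (neural network): map a fixed-size feature vector to a landmark label."""
    features = np.resize(region["pixels"], 64) / 255.0
    return net(features)

def navigation_command(label, distance_m):
    """Stage 3: combine the landmark class with range information to pick an action."""
    if label == "beacon" and distance_m > 2.0:
        return "approach"
    return "avoid" if distance_m < 1.0 else "continue"

net = lambda feats: "beacon" if feats.mean() > 0.8 else "rock"   # stand-in for a trained NN
image = np.full((32, 32), 230.0)
for region in detect_objects(image):
    print(navigation_command(classify(region, net), distance_m=3.5))
```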
NASA Technical Reports Server (NTRS)
Parish, David W.; Grabbe, Robert D.; Marzwell, Neville I.
1994-01-01
A Modular Autonomous Robotic System (MARS), consisting of a modular autonomous vehicle control system that can be retrofitted onto any vehicle to convert it to autonomous control and support a modular payload for multiple applications, is being developed. The MARS design is scalable, reconfigurable, and cost effective due to the use of modern open-system architecture design methodologies, including serial control bus technology to simplify system wiring and enhance scalability. The design is augmented with modular, object-oriented (C++) software implementing a hierarchy of five levels of control: teleoperated, continuous guidepath following, periodic guidepath following, absolute-position autonomous navigation, and relative-position autonomous navigation. The present effort is focused on producing a system that is commercially viable for routine autonomous patrolling of known, semi-structured environments, such as environmental monitoring of chemical and petroleum refineries, exterior physical security and surveillance, perimeter patrolling, and intrafacility transport applications.
Navigation through unknown and dynamic open spaces using topological notions
NASA Astrophysics Data System (ADS)
Miguel-Tomé, Sergio
2018-04-01
Until now, most algorithms used for navigation have had the purpose of directing a system towards one point in space. However, humans communicate tasks by specifying spatial relations among elements or places. In addition, the environments in which humans develop their activities are extremely dynamic. The only option that allows for successful navigation in dynamic and unknown environments is making real-time decisions. Therefore, robots capable of collaborating closely with human beings must be able to make decisions based on the local information registered by their sensors and must be able to interpret and express spatial relations. Furthermore, when a person is asked to perform a task in an environment, the task is communicated in terms of a category of goals, so the person does not need to be supervised. Thus, two problems appear when one wants to create multifunctional robots: how to navigate in dynamic and unknown environments using spatial relations, and how to accomplish this without supervision. In this article, a new architecture to address these two problems is presented, called the topological qualitative navigation architecture. In previous works, a qualitative heuristic called the heuristic of topological qualitative semantics (HTQS) was developed to establish and identify spatial relations. However, that heuristic only allows for establishing one spatial relation with a specific object. In contrast, navigation requires a temporal sequence of goals with different objects. The new architecture attains continuous generation of goals and resolves them using HTQS. Thus, the new architecture achieves autonomous navigation in dynamic or unknown open environments.
The Mathematics of Navigating the Solar System
NASA Technical Reports Server (NTRS)
Hintz, Gerald
2000-01-01
In navigating spacecraft throughout the solar system, the space navigator relies on three academic disciplines - optimization, estimation, and control - that work on mathematical models of the real world. Thus, the navigator determines the flight path that will consume propellant and other resources in an efficient manner, determines where the craft is and predicts where it will go, and transfers it onto the optimal trajectory that meets operational and mission constraints. Mission requirements, for example, demand that observational measurements be made with sufficient precision that relativity must be modeled in collecting and fitting (the estimation process) the data, and propagating the trajectory. Thousands of parameters are now determined in near real-time to model the gravitational forces acting on a spacecraft in the vicinity of an irregularly shaped body. Completing these tasks requires mathematical models, analyses, and processing techniques. Newton, Gauss, Lambert, Legendre, and others are justly famous for their contributions to the mathematics of these tasks. More recently, graduate students participated in research to update the gravity model of the Saturnian system, including higher order gravity harmonics, tidal effects, and the influence of the rings. This investigation was conducted for the Cassini project to incorporate new trajectory modeling features in the navigation software. The resulting trajectory model will be used in navigating the 4-year tour of the Saturnian satellites. Also, undergraduate students are determining the ephemerides (locations versus time) of asteroids that will be used as reference objects in navigating the New Millennium's Deep Space 1 spacecraft autonomously.
COBALT Flight Demonstrations Fuse Technologies
2017-06-07
This 5-minute, 50-second video shows how the CoOperative Blending of Autonomous Landing Technologies (COBALT) system pairs new landing sensor technologies that promise to yield the highest precision navigation solution ever tested for NASA space landing applications. The technologies included a navigation doppler lidar (NDL), which provides ultra-precise velocity and line-of-sight range measurements, and the Lander Vision System (LVS), which provides terrain-relative navigation. Through flight campaigns conducted in March and April 2017 aboard Masten Space Systems' Xodiac, a rocket-powered vertical takeoff, vertical landing (VTVL) platform, the COBALT system was flight tested to collect sensor performance data for NDL and LVS and to check the integration and communication between COBALT and the rocket. The flight tests provided excellent performance data for both sensors, as well as valuable information on the integrated performance with the rocket that will be used for subsequent COBALT modifications prior to follow-on flight tests. Based at NASA’s Armstrong Flight Research Center in Edwards, CA, the Flight Opportunities program funds technology development flight tests on commercial suborbital space providers of which Masten is a vendor. The program has previously tested the LVS on the Masten rocket and validated the technology for the Mars 2020 rover.
Precise laser gyroscope for autonomous inertial navigation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuznetsov, A G; Molchanov, A V; Izmailov, E A
2015-01-31
Requirements for gyroscopes of strapdown inertial navigation systems for aircraft applications are formulated. The construction of a ring helium-neon laser designed for autonomous navigation is described. The processes that determine the laser service life and the relation between the random error of the angular velocity measurement and the surface relief features of the cavity mirrors are analysed. The results of modelling one of the promising approaches to processing the laser gyroscope signals are presented.
Study of the Navigation Method for a Snake Robot Based on the Kinematics Model with MEMS IMU.
Zhao, Xu; Dou, Lihua; Su, Zhong; Liu, Ning
2018-03-16
A snake robot is a type of highly redundant mobile robot that differs significantly from tracked, wheeled, and legged robots. To address the issue of a snake robot performing self-localization in its application environment without orientation assistance, an autonomous navigation method is proposed based on the snake robot's motion-characteristic constraints. The method realizes autonomous navigation of the snake robot without external nodes or assistance, using only its own Micro-Electromechanical-Systems (MEMS) Inertial-Measurement-Unit (IMU). First, it studies the snake robot's motion characteristics, builds the kinematics model, and then analyses the motion constraint characteristics and motion error propagation properties. Second, it explores the snake robot's navigation layout, proposes a constraint criterion and the fixed relationship, and applies zero-state constraints based on the motion features and control modes of a snake robot. Finally, it realizes autonomous navigation positioning based on the Extended-Kalman-Filter (EKF) position estimation method under the constraints of its motion characteristics. Tests with the self-developed snake robot verify the proposed method, and the position error is less than 5% of the Total-Traveled-Distance (TDD). In a short-distance environment, this method is able to meet the requirements for a snake robot to perform autonomous navigation and positioning in traditional applications and can be extended to other similar multi-link robots.
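The zero-state constraint idea can be illustrated as a pseudo-measurement update: during a gait phase in which certain body-frame velocity components should be zero, those components are "measured" as zero with a small noise value. The state layout and numbers below are assumptions for demonstration, not the paper's filter.
```python
import numpy as np

def constraint_update(x, P, idx, sigma):
    """Apply zero-valued pseudo-measurements to the state components listed in idx."""
    H = np.zeros((len(idx), len(x)))
    for row, i in enumerate(idx):
        H[row, i] = 1.0
    R = np.eye(len(idx)) * sigma ** 2
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (np.zeros(len(idx)) - H @ x)
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Hypothetical 9-state vector [position(3), velocity(3), attitude(3)]; during this
# gait phase the lateral (index 4) and vertical (index 5) velocities should be zero.
x = np.array([0.0, 0.0, 0.0, 0.30, 0.05, -0.02, 0.0, 0.0, 0.0])
P = np.eye(9) * 0.1
x, P = constraint_update(x, P, idx=[4, 5], sigma=0.01)
print(np.round(x[3:6], 4))   # lateral/vertical velocity estimates are pulled toward zero
```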
Open-Loop Flight Testing of COBALT GN&C Technologies for Precise Soft Landing
NASA Technical Reports Server (NTRS)
Carson, John M., III; Amzajerdian, Farzin; Seubert, Carl R.; Restrepo, Carolina I.
2017-01-01
A terrestrial, open-loop (OL) flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) platform was conducted onboard the Masten Xodiac suborbital rocket testbed, with support through the NASA Advanced Exploration Systems (AES), Game Changing Development (GCD), and Flight Opportunities (FO) Programs. The COBALT platform integrates NASA Guidance, Navigation and Control (GN&C) sensing technologies for autonomous, precise soft landing, including the Navigation Doppler Lidar (NDL) velocity and range sensor and the Lander Vision System (LVS) Terrain Relative Navigation (TRN) system. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a precise navigation solution that is independent of the Global Positioning System (GPS) and suitable for future, autonomous planetary landing systems. The OL campaign tested COBALT as a passive payload, with COBALT data collection and filter execution, but with the Xodiac vehicle Guidance and Control (G&C) loops closed on a Masten GPS-based navigation solution. The OL test was performed as a risk reduction activity in preparation for an upcoming 2017 closed-loop (CL) flight campaign in which Xodiac G&C will act on the COBALT navigation solution and the GPS-based navigation will serve only as a backup monitor.
Development and flight test of a deployable precision landing system
NASA Technical Reports Server (NTRS)
Sim, Alex G.; Murray, James E.; Neufeld, David C.; Reed, R. Dale
1994-01-01
A joint NASA Dryden Flight Research Facility and Johnson Space Center program was conducted to determine the feasibility of the autonomous recovery of a spacecraft using a ram-air parafoil system for the final stages of entry from space that included a precision landing. The feasibility of this system was studied using a flight model of a spacecraft in the generic shape of a flattened biconic that weighed approximately 150 lb and was flown under a commercially available, ram-air parachute. Key elements of the vehicle included the Global Positioning System guidance for navigation, flight control computer, ultrasonic sensing for terminal altitude, electronic compass, and onboard data recording. A flight test program was used to develop and refine the vehicle. This vehicle completed an autonomous flight from an altitude of 10,000 ft and a lateral offset of 1.7 miles that resulted in a precision flare and landing into the wind at a predetermined location. At times, the autonomous flight was conducted in the presence of winds approximately equal to vehicle airspeed. Several novel techniques for computing the winds postflight were evaluated. Future program objectives are also presented.
Linked Autonomous Interplanetary Satellite Orbit Navigation
NASA Technical Reports Server (NTRS)
Parker, Jeffrey S.; Anderson, Rodney L.; Born, George H.; Leonard, Jason M.; McGranaghan, Ryan M.; Fujimoto, Kohei
2013-01-01
A navigation technology known as LiAISON (Linked Autonomous Interplanetary Satellite Orbit Navigation) has been known to produce very impressive navigation results for scenarios involving two or more cooperative satellites near the Moon, such that at least one satellite must be in an orbit significantly perturbed by the Earth, such as a lunar halo orbit. The two (or more) satellites track each other using satellite-to-satellite range and/or range-rate measurements. These relative measurements yield absolute orbit navigation when one of the satellites is in a lunar halo orbit, or the like. The geometry between a lunar halo orbiter and a GEO satellite continuously changes, which dramatically improves the information content of a satellite-to-satellite tracking signal. The geometrical variations include significant out-of-plane shifts, as well as in-plane shifts. Further, the GEO satellite is almost continuously in view of a lunar halo orbiter. High-fidelity simulations demonstrate that LiAISON technology improves the navigation of GEO orbiters by an order of magnitude, relative to standard ground tracking. If a GEO satellite is navigated using LiAISON-only tracking measurements, its position is typically known to better than 10 meters. If LiAISON measurements are combined with simple radiometric ground observations, then the satellite's position is typically known to better than 3 meters, which is substantially better than the current state of GEO navigation. There are two features of LiAISON that are novel and advantageous compared with conventional satellite navigation. First, ordinary satellite-to-satellite tracking data only provides relative navigation of each satellite. The novelty is the placement of one navigation satellite in an orbit that is significantly perturbed by both the Earth and the Moon. A navigation satellite can track other satellites elsewhere in the Earth-Moon system and acquire knowledge about both satellites' absolute positions and velocities, as well as their relative positions and velocities in space. The second novelty is that ordinarily one requires many satellites in order to achieve full navigation of any given customer's position and velocity over time. With LiAISON navigation, only a single navigation satellite is needed, provided that the satellite is significantly affected by the gravity of both the Earth and the Moon. That single satellite can track another satellite elsewhere in the Earth-Moon system and obtain absolute knowledge of both satellites' states.
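For illustration, the sketch below writes out the satellite-to-satellite range measurement and its Jacobian with respect to the stacked absolute states of the two spacecraft, which is the quantity a LiAISON-style filter would process; the positions are hypothetical. A single range measurement constrains only the relative geometry, and it is the halo orbiter's strongly perturbed dynamics, propagated between measurements, that makes the absolute states observable.
```python
import numpy as np

def range_measurement(r_halo, r_geo):
    """Satellite-to-satellite range (km) between the halo orbiter and the GEO satellite."""
    return np.linalg.norm(r_halo - r_geo)

def range_jacobian(r_halo, r_geo):
    """Partial of range w.r.t. the stacked state [r_halo, v_halo, r_geo, v_geo] (12 states)."""
    u = (r_halo - r_geo) / np.linalg.norm(r_halo - r_geo)   # line-of-sight unit vector
    H = np.zeros((1, 12))
    H[0, 0:3] = u        # sensitivity to the halo orbiter's position
    H[0, 6:9] = -u       # equal and opposite sensitivity to the GEO satellite's position
    return H

# Hypothetical positions (km): an Earth-Moon halo orbiter and a GEO satellite
r_halo = np.array([380000.0, 60000.0, 30000.0])
r_geo = np.array([42164.0, 0.0, 0.0])
print(range_measurement(r_halo, r_geo))
print(range_jacobian(r_halo, r_geo)[0, :3])
```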
A GPS Receiver for Lunar Missions
NASA Technical Reports Server (NTRS)
Bamford, William A.; Heckler, Gregory W.; Holt, Greg N.; Moreau, Michael C.
2008-01-01
Beginning with the launch of the Lunar Reconnaissance Orbiter (LRO) in October of 2008, NASA will once again begin its quest to land humans on the Moon. This effort will require the development of new spacecraft which will safely transport people from the Earth to the Moon and back again, as well as robotic probes tagged with science, re-supply, and communication duties. In addition to the next-generation spacecraft currently under construction, including the Orion capsule, NASA is also investigating and developing cutting edge navigation sensors which will allow for autonomous state estimation in low Earth orbit (LEO) and cislunar space. Such instruments could provide an extra layer of redundancy in avionics systems and reduce the reliance on support and on the Deep Space Network (DSN). One such sensor is the weak-signal Global Positioning System (GPS) receiver "Navigator" being developed at NASA's Goddard Space Flight Center (GSFC). At the heart of the Navigator is a Field Programmable Gate Array (FPGA) based acquisition engine. This engine allows for the rapid acquisition/reacquisition of strong GPS signals, enabling the receiver to quickly recover from outages due to blocked satellites or atmospheric entry. Additionally, the acquisition algorithm provides significantly lower sensitivities than a conventional space-based GPS receiver, permitting it to acquire satellites well above the GPS constellation. This paper assesses the performance of the Navigator receiver based upon three of the major flight regimes of a manned lunar mission: Earth ascent, cislunar navigation, and entry. Representative trajectories for each of these segments were provided by NASA. The Navigator receiver was connected to a Spirent GPS signal generator, to allow for the collection of real-time, hardware-in-the-loop results for each phase of the flight. For each of the flight segments, the Navigator was tested on its ability to acquire and track GPS satellites under the dynamical environment unique to that trajectory.
Mamdani Fuzzy System for Indoor Autonomous Mobile Robot
NASA Astrophysics Data System (ADS)
Khan, M. K. A. Ahamed; Rashid, Razif; Elamvazuthi, I.
2011-06-01
Several control algorithms for autonomous mobile robot navigation have been proposed in the literature. Recently, the employment of non-analytical methods of computing such as fuzzy logic, evolutionary computation, and neural networks has demonstrated the utility and potential of these paradigms for intelligent control of mobile robot navigation. In this paper, Mamdani fuzzy system for an autonomous mobile robot is developed. The paper begins with the discussion on the conventional controller and then followed by the description of fuzzy logic controller in detail.
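As a toy illustration of a Mamdani controller of this kind, the sketch below maps an obstacle distance to a turn-rate command using triangular membership functions, min for rule firing, max aggregation, and centroid defuzzification; the membership functions and rules are invented for the example and are not the paper's rule base.
```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    rise = (x - a) / (b - a) if b != a else 1.0
    fall = (c - x) / (c - b) if c != b else 1.0
    return max(min(rise, fall), 0.0)

def steer(obstacle_dist_m):
    """Map obstacle distance to a turn-rate command (deg/s) using two Mamdani rules."""
    near = tri(obstacle_dist_m, 0.0, 0.0, 1.0)
    far = tri(obstacle_dist_m, 0.5, 2.0, 2.0)
    out = np.linspace(0.0, 40.0, 401)                                        # candidate turn rates
    turn_hard = np.minimum(near, [tri(u, 20.0, 30.0, 40.0) for u in out])    # IF near THEN turn hard
    go_straight = np.minimum(far, [tri(u, 0.0, 0.0, 10.0) for u in out])     # IF far THEN go straight
    agg = np.maximum(turn_hard, go_straight)                                 # max aggregation
    return float((agg * out).sum() / agg.sum()) if agg.sum() > 0.0 else 0.0  # centroid defuzzification

print(steer(0.3), steer(1.8))   # a close obstacle yields a much larger turn rate
```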
Draper Laboratory small autonomous aerial vehicle
NASA Astrophysics Data System (ADS)
DeBitetto, Paul A.; Johnson, Eric N.; Bosse, Michael C.; Trott, Christian A.
1997-06-01
The Charles Stark Draper Laboratory, Inc. and students from Massachusetts Institute of Technology and Boston University have cooperated to develop an autonomous aerial vehicle that won the 1996 International Aerial Robotics Competition. This paper describes the approach, system architecture and subsystem designs for the entry. This entry represents a combination of many technology areas: navigation, guidance, control, vision processing, human factors, packaging, power, real-time software, and others. The aerial vehicle, an autonomous helicopter, performs navigation and control functions using multiple sensors: differential GPS, inertial measurement unit, sonar altimeter, and a flux compass. The aerial vehicle transmits video imagery to the ground. A ground based vision processor converts the image data into target position and classification estimates. The system was designed, built, and flown in less than one year and has provided many lessons about autonomous vehicle systems, several of which are discussed. In an appendix, our current research in augmenting the navigation system with vision-based estimates is presented.
Advanced Communication Architectures and Technologies for Missions to the Outer Planets
NASA Technical Reports Server (NTRS)
Bhasin, K.; Hayden, J. L.
2001-01-01
Missions to the outer planets would be considerably enhanced by the implementation of a future space communication infrastructure that utilizes relay stations placed at strategic locations in the solar system. These relay stations would operate autonomously and handle remote mission command and data traffic on a prioritized demand access basis. Such a system would enhance communications from that of the current direct communications between the planet and Earth. The system would also provide high rate data communications to outer planet missions, clear communications paths during times when the sun occults the mission spacecraft as viewed from Earth, and navigational "lighthouses" for missions utilizing onboard autonomous operations. Additional information is contained in the original extended abstract.
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
Lunar Navigation Determination System - LaNDS
NASA Technical Reports Server (NTRS)
Quinn, David; Talabac, Stephen
2012-01-01
A portable comprehensive navigational system has been developed that both robotic and human explorers can use to determine their location, attitude, and heading anywhere on the lunar surface independent of external infrastructure (needs no Lunar satellite network, line of sight to the Sun or Earth, etc.). The system combines robust processing power with an extensive topographical database to create a real-time atlas (GIS Geospatial Information System) that is able to autonomously control and monitor both single unmanned rovers and fleets of rovers, as well as science payload stations. The system includes provisions for teleoperation and tele-presence. The system accepts (but does not require) inputs from a wide range of sensors. A means was needed to establish a location when the search is taken deep in a crater (looking for water ice) and out of view of Earth or any other references. A star camera can be employed to determine the user's attitude by comparing the observed star field in body space with a stellar map in inertial space. A local nadir reference (e.g., an accelerometer that orients the nadir vector in body space) can be used in conjunction with a digital ephemeris and gravity model of the Moon to isolate the latitude, longitude, and azimuth of the user on the surface. That information can be used in conjunction with a Lunar GIS and advanced navigation planning algorithms to aid astronauts (or other assets) to navigate on the Lunar surface.
NASA Astrophysics Data System (ADS)
Fink, Wolfgang; Brooks, Alexander J.-W.; Tarbell, Mark A.; Dohm, James M.
2017-05-01
Autonomous reconnaissance missions are called for in extreme environments, as well as in potentially hazardous (e.g., the theatre, disaster-stricken areas, etc.) or inaccessible operational areas (e.g., planetary surfaces, space). Such future missions will require increasing degrees of operational autonomy, especially when following up on transient events. Operational autonomy encompasses: (1) Automatic characterization of operational areas from different vantages (i.e., spaceborne, airborne, surface, subsurface); (2) automatic sensor deployment and data gathering; (3) automatic feature extraction including anomaly detection and region-of-interest identification; (4) automatic target prediction and prioritization; (5) and subsequent automatic (re-)deployment and navigation of robotic agents. This paper reports on progress towards several aspects of autonomous C4ISR systems, including: Caltech-patented and NASA award-winning multi-tiered mission paradigm, robotic platform development (air, ground, water-based), robotic behavior motifs as the building blocks for autonomous tele-commanding, and autonomous decision making based on a Caltech-patented framework comprising sensor-data-fusion (feature-vectors), anomaly detection (clustering and principal component analysis), and target prioritization (hypothetical probing).
A positional estimation technique for an autonomous land vehicle in an unstructured environment
NASA Technical Reports Server (NTRS)
Talluri, Raj; Aggarwal, J. K.
1990-01-01
This paper presents a solution to the positional estimation problem of an autonomous land vehicle navigating in an unstructured mountainous terrain. A Digital Elevation Map (DEM) of the area in which the robot is to navigate is assumed to be given. It is also assumed that the robot is equipped with a camera that can be panned and tilted, and a device to measure the elevation of the robot above the ground surface. No recognizable landmarks are assumed to be present in the environment in which the robot is to navigate. The solution presented makes use of the DEM information, and structures the problem as a heuristic search in the DEM for the possible robot location. The shape and position of the horizon line in the image plane and the known camera geometry of the perspective projection are used as parameters to search the DEM. Various heuristics drawn from the geometric constraints are used to prune the search space significantly. The algorithm is made robust to errors in the imaging process by accounting for the worst case errors. The approach is tested using DEM data of areas in Colorado and Texas. The method is suitable for use in outdoor mobile robots and planetary rovers.
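The core of the approach can be illustrated with a brute-force sketch: predict the horizon elevation-angle profile from the DEM for each candidate cell and score it against the profile observed in imagery. The version below (assumed data layout and parameters) omits the geometric-constraint heuristics the paper uses to prune the search.

# Illustrative horizon-matching sketch (assumptions throughout): score candidate
# robot locations in a DEM by comparing the predicted horizon elevation-angle
# profile with the profile extracted from imagery. Brute force; no pruning.
import numpy as np

def predicted_horizon(dem, cell_size, ix, iy, eye_height, azimuths, max_range=200):
    """Max elevation angle (rad) seen from DEM cell (ix, iy) along each azimuth."""
    z0 = dem[iy, ix] + eye_height
    profile = np.full(len(azimuths), -np.pi / 2)
    for k, az in enumerate(azimuths):
        for r in range(1, max_range):
            jx = int(round(ix + r * np.sin(az)))
            jy = int(round(iy + r * np.cos(az)))
            if not (0 <= jx < dem.shape[1] and 0 <= jy < dem.shape[0]):
                break
            angle = np.arctan2(dem[jy, jx] - z0, r * cell_size)
            profile[k] = max(profile[k], angle)
    return profile

def locate(dem, cell_size, observed_profile, azimuths, eye_height):
    """Return the DEM cell whose predicted horizon best matches the observation."""
    best, best_cell = np.inf, None
    for iy in range(dem.shape[0]):
        for ix in range(dem.shape[1]):
            pred = predicted_horizon(dem, cell_size, ix, iy, eye_height, azimuths)
            score = np.sum((pred - observed_profile) ** 2)
            if score < best:
                best, best_cell = score, (ix, iy)
    return best_cell, best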
NASA Technical Reports Server (NTRS)
Gramling, C. J.; Long, A. C.; Lee, T.; Ottenstein, N. A.; Samii, M. V.
1991-01-01
A Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) is currently being developed by NASA to provide a high accuracy autonomous navigation capability for users of TDRSS and its successor, the Advanced TDRSS (ATDRSS). The fully autonomous user onboard navigation system will support orbit determination, time determination, and frequency determination, based on observation of a continuously available, unscheduled navigation beacon signal. A TONS experiment will be performed in conjunction with the Explorer Platform (EP) Extreme Ultraviolet Explorer (EUVE) mission to flight qualify TONS Block 1. An overview is presented of TONS and a preliminary analysis of the navigation accuracy anticipated for the TONS experiment. Descriptions of the TONS experiment and the associated navigation objectives, as well as a description of the onboard navigation algorithms, are provided. The accuracy of the selected algorithms is evaluated based on the processing of realistic simulated TDRSS one way forward link Doppler measurements. The analysis process is discussed and the associated navigation accuracy results are presented.
An autonomous organic reaction search engine for chemical reactivity.
Dragone, Vincenza; Sans, Victor; Henson, Alon B; Granda, Jaroslaw M; Cronin, Leroy
2017-06-09
The exploration of chemical space for new reactivity, reactions and molecules is limited by the need for separate work-up-separation steps searching for molecules rather than reactivity. Herein we present a system that can autonomously evaluate chemical reactivity within a network of 64 possible reaction combinations and aims for new reactivity, rather than a predefined set of targets. The robotic system combines chemical handling, in-line spectroscopy and real-time feedback and analysis with an algorithm that is able to distinguish and select the most reactive pathways, generating a reaction selection index (RSI) without need for separate work-up or purification steps. This allows the automatic navigation of a chemical network, leading to previously unreported molecules while needing only to do a fraction of the total possible reactions without any prior knowledge of the chemistry. We show the RSI correlates with reactivity and is able to search chemical space using the most reactive pathways.
An autonomous organic reaction search engine for chemical reactivity
NASA Astrophysics Data System (ADS)
Dragone, Vincenza; Sans, Victor; Henson, Alon B.; Granda, Jaroslaw M.; Cronin, Leroy
2017-06-01
The exploration of chemical space for new reactivity, reactions and molecules is limited by the need for separate work-up-separation steps searching for molecules rather than reactivity. Herein we present a system that can autonomously evaluate chemical reactivity within a network of 64 possible reaction combinations and aims for new reactivity, rather than a predefined set of targets. The robotic system combines chemical handling, in-line spectroscopy and real-time feedback and analysis with an algorithm that is able to distinguish and select the most reactive pathways, generating a reaction selection index (RSI) without need for separate work-up or purification steps. This allows the automatic navigation of a chemical network, leading to previously unreported molecules while needing only to do a fraction of the total possible reactions without any prior knowledge of the chemistry. We show the RSI correlates with reactivity and is able to search chemical space using the most reactive pathways.
An autonomous organic reaction search engine for chemical reactivity
Dragone, Vincenza; Sans, Victor; Henson, Alon B.; Granda, Jaroslaw M.; Cronin, Leroy
2017-01-01
The exploration of chemical space for new reactivity, reactions and molecules is limited by the need for separate work-up-separation steps searching for molecules rather than reactivity. Herein we present a system that can autonomously evaluate chemical reactivity within a network of 64 possible reaction combinations and aims for new reactivity, rather than a predefined set of targets. The robotic system combines chemical handling, in-line spectroscopy and real-time feedback and analysis with an algorithm that is able to distinguish and select the most reactive pathways, generating a reaction selection index (RSI) without need for separate work-up or purification steps. This allows the automatic navigation of a chemical network, leading to previously unreported molecules while needing only to do a fraction of the total possible reactions without any prior knowledge of the chemistry. We show the RSI correlates with reactivity and is able to search chemical space using the most reactive pathways. PMID:28598440
Stanford Aerospace Research Laboratory research overview
NASA Technical Reports Server (NTRS)
Ballhaus, W. L.; Alder, L. J.; Chen, V. W.; Dickson, W. C.; Ullman, M. A.
1993-01-01
Over the last ten years, the Stanford Aerospace Robotics Laboratory (ARL) has developed a hardware facility in which a number of space robotics issues have been, and continue to be, addressed. This paper reviews two of the current ARL research areas: navigation and control of free flying space robots, and modelling and control of extremely flexible space structures. The ARL has designed and built several semi-autonomous free-flying robots that perform numerous tasks in a zero-gravity, drag-free, two-dimensional environment. It is envisioned that future generations of these robots will be part of a human-robot team, in which the robots will operate under the task-level commands of astronauts. To make this possible, the ARL has developed a graphical user interface (GUI) with an intuitive object-level motion-direction capability. Using this interface, the ARL has demonstrated autonomous navigation, intercept and capture of moving and spinning objects, object transport, multiple-robot cooperative manipulation, and simple assemblies from both free-flying and fixed bases. The ARL has also built a number of experimental test beds on which the modelling and control of flexible manipulators has been studied. Early ARL experiments in this arena demonstrated for the first time the capability to control the end-point position of both single-link and multi-link flexible manipulators using end-point sensing. Building on these accomplishments, the ARL has been able to control payloads with unknown dynamics at the end of a flexible manipulator, and to achieve high-performance control of a multi-link flexible manipulator.
Experiment D005: Star occultation navigation
NASA Technical Reports Server (NTRS)
Silva, R. M.; Jorris, T. R.; Vallerie, E. M., III
1971-01-01
The usefulness of star occultation measurements for space navigation and the determination of a horizon density profile which could be used to update atmospheric models for horizon-based measurement systems were studied. The time of occultation of a known star by a celestial body, as seen by an orbiting observer, determines a cylinder of position, the axis of which is the line through the star and the body center, and the radius of which is equal to the occulting-body radius. The dimming percentage, with respect to the altitude of this grazing ray from the star to the observer, is a percentage altitude for occultation. That is, the star can be assumed to be occulted when it reaches a predetermined percentage of its unattenuated value. The procedure used was to measure this attenuation with respect to time to determine the usefulness of the measurements for autonomous space navigation. In this experiment, the crewmembers had to accomplish star acquisition, identification, calibration, and tracking. Instrumentation was required only for measurement of the relative intensity of the star as it set into the atmosphere.
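The cylinder-of-position geometry can be written compactly; the notation below is ours, not the report's. With $\hat{\mathbf{s}}$ the unit vector from the observer toward the star, $\mathbf{r}$ the observer position relative to the occulting body's center, and $R$ the body radius,

\[ d = \left\| \mathbf{r} - (\mathbf{r}\cdot\hat{\mathbf{s}})\,\hat{\mathbf{s}} \right\|, \qquad h = d - R , \]

where $d$ is the perpendicular distance from the body center to the line of sight and $h$ is the grazing-ray altitude. Declaring occultation when the star intensity falls to a fixed fraction of its unattenuated value corresponds to a fixed $h$, so the locus of observer positions is a cylinder of radius $R + h$ about the axis through the star and the body center.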
Navigation system for autonomous mapper robots
NASA Astrophysics Data System (ADS)
Halbach, Marc; Baudoin, Yvan
1993-05-01
This paper describes the conception and realization of a fast, robust, and general navigation system for a mobile (wheeled or legged) robot. A database, representing a high level map of the environment, is generated and continuously updated. The first part describes the legged target vehicle and the hexapod robot being developed. The second section deals with spatial and temporal sensor fusion for dynamic environment modeling within an obstacle/free space probabilistic classification grid. Ultrasonic sensors are used, others are expected to be integrated, and a-priori knowledge is treated. US sensors are controlled by the path planning module. The third part concerns path planning, and a simulation of a wheeled robot is also presented.
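A common way to realize an obstacle/free-space probabilistic grid is a log-odds occupancy update per ultrasonic return; the sketch below illustrates that generic technique under assumed parameters and is not the fusion rule used in the paper.

# Illustrative log-odds occupancy-grid update for one ultrasonic range return.
# Cells near the measured range are marked more likely occupied; cells along the
# beam before it are marked more likely free. Increments and cell size are assumed.
import numpy as np

L_OCC, L_FREE = 0.85, -0.4          # assumed log-odds increments

def update_grid(logodds, pose, bearing, rng, cell=0.1, beam_max=4.0):
    """In-place log-odds update; pose = (x, y, heading) in metres/radians."""
    x0, y0, heading = pose
    ang = heading + bearing
    steps = int(min(rng, beam_max) / cell)
    for i in range(steps + 1):
        d = i * cell
        cx = int((x0 + d * np.cos(ang)) / cell)
        cy = int((y0 + d * np.sin(ang)) / cell)
        if not (0 <= cy < logodds.shape[0] and 0 <= cx < logodds.shape[1]):
            return logodds
        if rng < beam_max and abs(d - rng) <= cell:
            logodds[cy, cx] += L_OCC        # cell at the measured range: occupied
        elif d < rng:
            logodds[cy, cx] += L_FREE       # cells before the return: free
    return logodds

# Occupancy probability for display or planning: p = 1 / (1 + exp(-logodds)).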
Spatial abstraction for autonomous robot navigation.
Epstein, Susan L; Aroor, Anoop; Evanusa, Matthew; Sklar, Elizabeth I; Parsons, Simon
2015-09-01
Optimal navigation for a simulated robot relies on a detailed map and explicit path planning, an approach problematic for real-world robots that are subject to noise and error. This paper reports on autonomous robots that rely on local spatial perception, learning, and commonsense rationales instead. Despite realistic actuator error, learned spatial abstractions form a model that supports effective travel.
SLAM algorithm applied to robotics assistance for navigation in unknown environments.
Cheein, Fernando A Auat; Lopez, Natalia; Soria, Carlos M; di Sciascio, Fernando A; Pereira, Fernando Lobo; Carelli, Ricardo
2010-02-17
The combination of robotic tools with assistance technology determines a slightly explored area of applications and advantages for disability or elder people in their daily tasks. Autonomous motorized wheelchair navigation inside an environment, behaviour based control of orthopaedic arms or user's preference learning from a friendly interface are some examples of this new field. In this paper, a Simultaneous Localization and Mapping (SLAM) algorithm is implemented to allow the environmental learning by a mobile robot while its navigation is governed by electromyographic signals. The entire system is part autonomous and part user-decision dependent (semi-autonomous). The environmental learning executed by the SLAM algorithm and the low level behaviour-based reactions of the mobile robot are robotic autonomous tasks, whereas the mobile robot navigation inside an environment is commanded by a Muscle-Computer Interface (MCI). In this paper, a sequential Extended Kalman Filter (EKF) feature-based SLAM algorithm is implemented. The features correspond to lines and corners -concave and convex- of the environment. From the SLAM architecture, a global metric map of the environment is derived. The electromyographic signals that command the robot's movements can be adapted to the patient's disabilities. For mobile robot navigation purposes, five commands were obtained from the MCI: turn to the left, turn to the right, stop, start and exit. A kinematic controller to control the mobile robot was implemented. A low level behavior strategy was also implemented to avoid robot's collisions with the environment and moving agents. The entire system was tested in a population of seven volunteers: three elder, two below-elbow amputees and two young normally limbed patients. The experiments were performed within a closed low dynamic environment. Subjects took an average time of 35 minutes to navigate the environment and to learn how to use the MCI. The SLAM results have shown a consistent reconstruction of the environment. The obtained map was stored inside the Muscle-Computer Interface. The integration of a highly demanding processing algorithm (SLAM) with a MCI and the communication between both in real time have shown to be consistent and successful. The metric map generated by the mobile robot would allow possible future autonomous navigation without direct control of the user, whose function could be relegated to choose robot destinations. Also, the mobile robot shares the same kinematic model of a motorized wheelchair. This advantage can be exploited for wheelchair autonomous navigation.
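For readers unfamiliar with the filter structure, the following is a minimal EKF-SLAM range-bearing update for point (corner-like) features; it is a generic sketch with an assumed state layout, not the paper's sequential line-and-corner implementation.

# Minimal EKF-SLAM range-bearing update sketch for point features. The paper's
# implementation also uses line features and a sequential formulation, which are
# not reproduced here.
import numpy as np

def slam_update(x, P, z, landmark_idx, R):
    """
    x : state [xr, yr, theta, l1x, l1y, l2x, l2y, ...]
    z : measurement [range, bearing] to landmark landmark_idx
    R : 2x2 measurement noise covariance
    """
    xr, yr, th = x[0], x[1], x[2]
    j = 3 + 2 * landmark_idx
    dx, dy = x[j] - xr, x[j + 1] - yr
    q = dx * dx + dy * dy
    r = np.sqrt(q)
    z_hat = np.array([r, np.arctan2(dy, dx) - th])       # predicted measurement
    # Jacobian w.r.t. [xr, yr, theta] and [lx, ly], embedded in the full state
    H = np.zeros((2, len(x)))
    H[:, 0:3] = np.array([[-dx / r, -dy / r, 0.0],
                          [ dy / q, -dx / q, -1.0]])
    H[:, j:j + 2] = np.array([[dx / r, dy / r],
                              [-dy / q, dx / q]])
    nu = z - z_hat
    nu[1] = (nu[1] + np.pi) % (2 * np.pi) - np.pi         # wrap bearing innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ nu
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P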
An autonomous rendezvous and docking system using cruise missile technologies
NASA Technical Reports Server (NTRS)
Jones, Ruel Edwin
1991-01-01
In November 1990 the Autonomous Rendezvous & Docking (AR&D) system was first demonstrated for members of NASA's Strategic Avionics Technology Working Group. This simulation utilized prototype hardware from the Cruise Missile and Advanced Centaur Avionics systems. The object was to show that all the accuracy, reliability and operational requirements established for a spacecraft to dock with Space Station Freedom could be met by the proposed system. The rapid prototyping capabilities of the Advanced Avionics Systems Development Laboratory were used to evaluate the proposed system in a real time, hardware in the loop simulation of the rendezvous and docking reference mission. The simulation permits manual, supervised automatic and fully autonomous operations to be evaluated. It is also being upgraded to be able to test an Autonomous Approach and Landing (AA&L) system. The AA&L and AR&D systems are very similar. Both use inertial guidance and control systems supplemented by GPS. Both use an Image Processing System (IPS) for target recognition and tracking. The IPS includes a general purpose multiprocessor computer and a selected suite of sensors that will provide the required relative position and orientation data. Graphic displays can also be generated by the computer, providing the astronaut / operator with real-time guidance and navigation data with enhanced video or sensor imagery.
Celestial Pattern Recognition Allowing Autonomous Earth-Surface or Deep-Space Positioning.
1983-12-01
I. INTRODUCTION AND BACKGROUND. Background of the Project. This research project is conceptual and opportunistic. It is conceptual in that... alternative approaches and the usefulness of a device in different navigation regimes. Both the research and this report have tried to follow these... Packard, Potter, and Viglione (Ref 2; 7; 15; 17; 23) have published papers (most of these 20 years ago) suggesting that some form of star
Study of the Navigation Method for a Snake Robot Based on the Kinematics Model with MEMS IMU
Dou, Lihua; Su, Zhong; Liu, Ning
2018-01-01
A snake robot is a type of highly redundant mobile robot that significantly differs from a tracked robot, wheeled robot and legged robot. To address the issue of a snake robot performing self-localization in the application environment without assistant orientation, an autonomous navigation method is proposed based on the snake robot's motion characteristic constraints. The method realized the autonomous navigation of the snake robot without external nodes or assistance, using its own Micro-Electromechanical-Systems (MEMS) Inertial-Measurement-Unit (IMU). First, it studies the snake robot's motion characteristics, builds the kinematics model, and then analyses the motion constraint characteristics and motion error propagation properties. Second, it explores the snake robot's navigation layout, proposes a constraint criterion and the fixed relationship, and makes zero-state constraints based on the motion features and control modes of a snake robot. Finally, it realizes autonomous navigation positioning based on the Extended-Kalman-Filter (EKF) position estimation method under the constraints of its motion characteristics. With the self-developed snake robot, the test verifies the proposed method, and the position error is less than 5% of Total-Traveled-Distance (TDD). In a short-distance environment, this method is able to meet the requirements of a snake robot in order to perform autonomous navigation and positioning in traditional applications and can be extended to other similar multi-link robots. PMID:29547515
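The flavor of a motion-characteristic constraint can be illustrated as a pseudo-measurement update in an EKF: during gait phases in which a link cannot slip sideways or move vertically, the corresponding body-frame velocities are treated as zero-valued measurements. The sketch below is an assumption about the general technique, not the paper's formulation.

# Illustrative pseudo-measurement update: body-frame lateral and vertical velocity
# are treated as zero to bound inertial drift. State layout and noise are assumed;
# the coupling of the constraint to yaw error is neglected in this Jacobian.
import numpy as np

def yaw_dcm(psi):
    """Navigation-to-body rotation for a yaw-only attitude (simplified)."""
    c, s = np.cos(psi), np.sin(psi)
    return np.array([[ c,  s, 0.0],
                     [-s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def constraint_update(x, P, R=np.diag([0.01, 0.01])):
    """
    x : [px, py, pz, vx, vy, vz, psi] navigation state (position, velocity, yaw)
    Pseudo-measurement: body-frame y (lateral) and z (vertical) velocity = 0.
    """
    C = yaw_dcm(x[6])
    Hv = C[1:3, :]                       # rows picking body y and z velocity
    H = np.zeros((2, 7))
    H[:, 3:6] = Hv
    nu = np.zeros(2) - Hv @ x[3:6]       # "measured" value is zero by assumption
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ nu
    P = (np.eye(7) - K @ H) @ P
    return x, P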
New Vectorial Propulsion System and Trajectory Control Designs for Improved AUV Mission Autonomy.
Masmitja, Ivan; Gonzalez, Julian; Galarza, Cesar; Gomariz, Spartacus; Aguzzi, Jacopo; Del Rio, Joaquin
2018-04-17
Autonomous Underwater Vehicles (AUV) are proving to be a promising platform design for multidisciplinary autonomous operability with a wide range of applications in marine ecology and geoscience. Here, two novel contributions towards increasing the autonomous navigation capability of a new AUV prototype (the Guanay II) as a mix between a propelled vehicle and a glider are presented. Firstly, a vectorial propulsion system has been designed to provide full vehicle maneuverability in both horizontal and vertical planes. Furthermore, two controllers have been designed, based on fuzzy controls, to provide the vehicle with autonomous navigation capabilities. Due to the decoupled system property, the controllers for the horizontal plane have been designed separately from those for the vertical plane. This class of non-linear controllers has been used to interpret linguistic laws into different zones of functionality. This method provided good performance, used as interpolation between different rules or linear controls. Both improvements have been validated through simulations and field tests, displaying good performance results. Finally, the conclusion of this work is that the Guanay II AUV has a solid controller to perform autonomous navigation and carry out vertical immersions.
A Survey of LIDAR Technology and Its Use in Spacecraft Relative Navigation
NASA Technical Reports Server (NTRS)
Christian, John A.; Cryan, Scott P.
2013-01-01
This paper provides a survey of modern LIght Detection And Ranging (LIDAR) sensors from a perspective of how they can be used for spacecraft relative navigation. In addition to LIDAR technology commonly used in space applications today (e.g. scanning, flash), this paper reviews emerging LIDAR technologies gaining traction in other non-aerospace fields. The discussion will include an overview of sensor operating principles and specific pros/cons for each type of LIDAR. This paper provides a comprehensive review of LIDAR technology as applied specifically to spacecraft relative navigation. The problem of orbital rendezvous and docking has been a consistent challenge for complex space missions since before the Gemini 8 spacecraft performed the first successful on-orbit docking of two spacecraft in 1966. Over the years, a great deal of effort has been devoted to advancing technology associated with all aspects of the rendezvous, proximity operations, and docking (RPOD) flight phase. After years of perfecting the art of crewed rendezvous with the Gemini, Apollo, and Space Shuttle programs, NASA began investigating the problem of autonomous rendezvous and docking (AR&D) to support a host of different mission applications. Some of these applications include autonomous resupply of the International Space Station (ISS), robotic servicing/refueling of existing orbital assets, and on-orbit assembly.1 The push towards a robust AR&D capability has led to an intensified interest in a number of different sensors capable of providing insight into the relative state of two spacecraft. The present work focuses on exploring the state-of-the-art in one of these sensors - LIght Detection And Ranging (LIDAR) sensors. It should be noted that the military community frequently uses the acronym LADAR (LAser Detection And Ranging) to refer to what this paper calls LIDARs. A LIDAR is an active remote sensing device that is typically used in space applications to obtain the range to one or more points on a target spacecraft. As the name suggests, LIDAR sensors use light (typically a laser) to illuminate the target and measure the time it takes for the emitted signal to return to the sensor. Because the light must travel from the source, to
Kikutis, Ramūnas; Stankūnas, Jonas; Rudinskas, Darius; Masiulionis, Tadas
2017-09-28
Current research on Unmanned Aerial Vehicles (UAVs) shows a lot of interest in autonomous UAV navigation. This interest is mainly driven by the necessity to meet the rules and restrictions for small UAV flights that are issued by various international and national legal organizations. In order to lower these restrictions, new levels of automation and flight safety must be reached. In this paper, a new method for ground obstacle avoidance derived by using UAV navigation based on the Dubins paths algorithm is presented. The accuracy of the proposed method has been tested, and research results have been obtained by using Software-in-the-Loop (SITL) simulation and real UAV flights, with the measurements done with a low cost Global Navigation Satellite System (GNSS) sensor. All tests were carried out in a three-dimensional space, but the height accuracy was not assessed. The GNSS navigation data for the ground obstacle avoidance algorithm is evaluated statistically.
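For context, a Dubins path joins two poses with circular arcs of a fixed minimum turning radius and straight segments; the sketch below evaluates only the LSL and RSR words (a full planner also checks RSL, LSR, RLR, LRL) and uses an assumed waypoint format, so it illustrates the constraint rather than reproducing the paper's avoidance algorithm.

# Dubins-path length sketch (LSL and RSR words only). Waypoint format and turning
# radius are assumptions; used purely to illustrate how a fixed turning radius
# constrains the avoidance trajectories the paper builds on.
import numpy as np

def mod2pi(a):
    return a % (2.0 * np.pi)

def dubins_lsl_rsr(start, goal, radius):
    """start/goal = (x, y, heading); returns the shorter of the LSL/RSR lengths."""
    dx, dy = goal[0] - start[0], goal[1] - start[1]
    d = np.hypot(dx, dy) / radius                  # normalized separation
    th = np.arctan2(dy, dx)
    a, b = mod2pi(start[2] - th), mod2pi(goal[2] - th)

    lengths = []
    # LSL word
    p_sq = 2 + d * d - 2 * np.cos(a - b) + 2 * d * (np.sin(a) - np.sin(b))
    if p_sq >= 0:
        tmp = np.arctan2(np.cos(b) - np.cos(a), d + np.sin(a) - np.sin(b))
        lengths.append(mod2pi(-a + tmp) + np.sqrt(p_sq) + mod2pi(b - tmp))
    # RSR word
    p_sq = 2 + d * d - 2 * np.cos(a - b) + 2 * d * (np.sin(b) - np.sin(a))
    if p_sq >= 0:
        tmp = np.arctan2(np.cos(a) - np.cos(b), d - np.sin(a) + np.sin(b))
        lengths.append(mod2pi(a - tmp) + np.sqrt(p_sq) + mod2pi(-b + tmp))

    return min(lengths) * radius if lengths else None

# e.g. dubins_lsl_rsr((0, 0, 0), (100, 40, np.pi / 2), radius=30.0)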
Kikutis, Ramūnas; Stankūnas, Jonas; Rudinskas, Darius; Masiulionis, Tadas
2017-01-01
Current research on Unmanned Aerial Vehicles (UAVs) shows a lot of interest in autonomous UAV navigation. This interest is mainly driven by the necessity to meet the rules and restrictions for small UAV flights that are issued by various international and national legal organizations. In order to lower these restrictions, new levels of automation and flight safety must be reached. In this paper, a new method for ground obstacle avoidance derived by using UAV navigation based on the Dubins paths algorithm is presented. The accuracy of the proposed method has been tested, and research results have been obtained by using Software-in-the-Loop (SITL) simulation and real UAV flights, with the measurements done with a low cost Global Navigation Satellite System (GNSS) sensor. All tests were carried out in a three-dimensional space, but the height accuracy was not assessed. The GNSS navigation data for the ground obstacle avoidance algorithm is evaluated statistically. PMID:28956839
An Effective Terrain Aided Navigation for Low-Cost Autonomous Underwater Vehicles.
Zhou, Ling; Cheng, Xianghong; Zhu, Yixian; Dai, Chenxi; Fu, Jinbo
2017-03-25
Terrain-aided navigation is a potentially powerful solution for obtaining submerged position fixes for autonomous underwater vehicles. The application of terrain-aided navigation with high-accuracy inertial navigation systems has demonstrated meter-level navigation accuracy in sea trials. However, available sensors may be limited depending on the type of the mission. Such limitations, especially for low-grade navigation sensors, not only degrade the accuracy of traditional navigation systems, but further impact the ability to successfully employ terrain-aided navigation. To address this problem, a tightly-coupled navigation is presented to successfully estimate the critical sensor errors by incorporating raw sensor data directly into an augmented navigation system. Furthermore, three-dimensional distance errors are calculated, providing measurement updates through the particle filter for absolute and bounded position error. The development of the terrain aided navigation system is elaborated for a vehicle equipped with a non-inertial-grade strapdown inertial navigation system, a 4-beam Doppler Velocity Log range sensor and a sonar altimeter. Using experimental data for navigation performance evaluation in areas with different terrain characteristics, the experiment results further show that the proposed method can be successfully applied to the low-cost AUVs and significantly improves navigation performance.
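The measurement-update idea can be sketched generically: each particle's predicted water depth from the terrain map is compared with the observed depth and used to reweight the particle set. The code below is an illustrative, simplified version (assumed interfaces and noise level), not the tightly-coupled formulation described above.

# Illustrative particle-filter measurement update for terrain-aided navigation.
# The paper's tightly-coupled formulation and three-dimensional distance errors
# are not reproduced here.
import numpy as np

def terrain_update(particles, weights, measured_depth, terrain_depth, sigma=2.0):
    """
    particles      : (N, 2) array of hypothesized horizontal positions [x, y]
    measured_depth : vehicle depth plus sonar-altimeter range (total water depth)
    terrain_depth  : callable terrain_depth(x, y) returning map water depth
    """
    pred = np.array([terrain_depth(x, y) for x, y in particles])
    innov = measured_depth - pred
    weights = weights * np.exp(-0.5 * (innov / sigma) ** 2)
    weights = weights / max(weights.sum(), 1e-300)
    # Systematic resampling when the effective sample size collapses
    n_eff = 1.0 / np.sum(weights ** 2)
    if n_eff < 0.5 * len(weights):
        positions = (np.arange(len(weights)) + np.random.rand()) / len(weights)
        idx = np.searchsorted(np.cumsum(weights), positions)
        particles = particles[idx]
        weights = np.full(len(weights), 1.0 / len(weights))
    return particles, weights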
An Effective Terrain Aided Navigation for Low-Cost Autonomous Underwater Vehicles
Zhou, Ling; Cheng, Xianghong; Zhu, Yixian; Dai, Chenxi; Fu, Jinbo
2017-01-01
Terrain-aided navigation is a potentially powerful solution for obtaining submerged position fixes for autonomous underwater vehicles. The application of terrain-aided navigation with high-accuracy inertial navigation systems has demonstrated meter-level navigation accuracy in sea trials. However, available sensors may be limited depending on the type of the mission. Such limitations, especially for low-grade navigation sensors, not only degrade the accuracy of traditional navigation systems, but further impact the ability to successfully employ terrain-aided navigation. To address this problem, a tightly-coupled navigation is presented to successfully estimate the critical sensor errors by incorporating raw sensor data directly into an augmented navigation system. Furthermore, three-dimensional distance errors are calculated, providing measurement updates through the particle filter for absolute and bounded position error. The development of the terrain aided navigation system is elaborated for a vehicle equipped with a non-inertial-grade strapdown inertial navigation system, a 4-beam Doppler Velocity Log range sensor and a sonar altimeter. Using experimental data for navigation performance evaluation in areas with different terrain characteristics, the experiment results further show that the proposed method can be successfully applied to the low-cost AUVs and significantly improves navigation performance. PMID:28346346
Autonomous vehicle navigation utilizing fuzzy controls concepts for a next generation wheelchair.
Hansen, J D; Barrett, S F; Wright, C H G; Wilcox, M
2008-01-01
Three different positioning techniques were investigated to create an autonomous vehicle that could accurately navigate towards a goal: Global Positioning System (GPS), compass dead reckoning, and Ackerman steering. Each technique utilized a fuzzy logic controller that maneuvered a four-wheel car towards a target. The reliability and the accuracy of the navigation methods were investigated by modeling the algorithms in software and implementing them in hardware. To implement the techniques in hardware, positioning sensors were interfaced to a remote control car and a microprocessor. The microprocessor utilized the sensor measurements to orient the car with respect to the target. Next, a fuzzy logic control algorithm adjusted the front wheel steering angle to minimize the difference between the heading and bearing. After minimizing the heading error, the car maintained a straight steering angle along its path to the final destination. The results of this research can be used to develop applications that require precise navigation. The design techniques can also be implemented on alternate platforms such as a wheelchair to assist with autonomous navigation.
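The guidance geometry described above (bearing from the current fix to the target, compared against the compass heading) can be sketched as follows; the three-rule fuzzy mapping with singleton outputs is an assumption for illustration, not the authors' rule base.

# Sketch of the guidance geometry: bearing to the target from the current position,
# wrapped heading error, and a simple three-rule fuzzy mapping to a steering angle.
import math

def wrap(angle):
    return (angle + math.pi) % (2 * math.pi) - math.pi

def steering_command(pos, target, heading, max_steer=math.radians(25)):
    bearing = math.atan2(target[1] - pos[1], target[0] - pos[0])
    err = wrap(bearing - heading)                       # positive = target to the left
    # Memberships for NEG, ZERO, POS heading error (triangular, assumed width)
    w = math.radians(30)
    neg  = max(min(1.0, -err / w), 0.0)
    pos_ = max(min(1.0,  err / w), 0.0)
    zero = max(1.0 - abs(err) / w, 0.0)
    # Singleton outputs: full right, straight, full left
    num = neg * (-max_steer) + zero * 0.0 + pos_ * max_steer
    den = neg + zero + pos_
    return num / den if den > 0 else 0.0

# e.g. steering_command((0.0, 0.0), (10.0, 5.0), heading=0.0) -> positive (steer left)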
An Overview of Flight Test Results for a Formation Flight Autopilot
NASA Technical Reports Server (NTRS)
Hanson, Curtis E.; Ryan, Jack; Allen, Michael J.; Jacobson, Steven R.
2002-01-01
The first flight test phase of the NASA Dryden Flight Research Center Autonomous Formation Flight project has successfully demonstrated precision autonomous station-keeping of an F/A-18 research airplane with a second F/A-18 airplane. Blended inertial navigation system (INS) and global positioning system (GPS) measurements have been communicated across an air-to-air telemetry link and used to compute relative-position estimates. A precision research formation autopilot onboard the trailing airplane controls lateral and vertical spacing while the leading airplane operates under production autopilot control. Four research autopilot gain sets have been designed and flight-tested, and each exceeds the project design requirement of steady-state tracking accuracy within 1 standard deviation of 10 ft. Performance also has been demonstrated using single- and multiple-axis inputs such as step commands and frequency sweeps. This report briefly describes the experimental formation flight systems employed and discusses the navigation, guidance, and control algorithms that have been flight-tested. An overview of the flight test results of the formation autopilot during steady-state tracking and maneuvering flight is presented.
Angles-only relative orbit determination in low earth orbit
NASA Astrophysics Data System (ADS)
Ardaens, Jean-Sébastien; Gaias, Gabriella
2018-06-01
The paper provides an overview of the angles-only relative orbit determination activities conducted to support the Autonomous Vision Approach Navigation and Target Identification (AVANTI) experiment. This in-orbit endeavor was carried out by the German Space Operations Center (DLR/GSOC) in autumn 2016 to demonstrate the capability to perform spaceborne autonomous close-proximity operations using solely line-of-sight measurements. The images collected onboard have been reprocessed by an independent on-ground facility for precise relative orbit determination, which served as ultimate instance to monitor the formation safety and to characterize the onboard navigation and control performances. During two months, several rendezvous have been executed, generating a valuable collection of images taken at distances ranging from 50 km to only 50 m. Despite challenging experimental conditions characterized by a poor visibility and strong orbit perturbations, angles-only relative positioning products could be continuously derived throughout the whole experiment timeline, promising accuracy at the meter level during the close approaches. The results presented in the paper are complemented with former angles-only experience gained with the PRISMA satellites to better highlight the specificities induced by different orbits and satellite designs.
76 FR 21772 - Navigation Safety Advisory Council
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-18
..., routing measures, marine information, diving safety, and aids to navigation systems. Agenda The NAVSAC... discussion of autonomous unmanned vessels and discuss their implications for the Inland Navigation Rules. A... DEPARTMENT OF HOMELAND SECURITY Coast Guard [Docket No. USCG-2011-0204] Navigation Safety Advisory...
NASA Technical Reports Server (NTRS)
Fuchs, A. J. (Editor)
1979-01-01
Onboard and real time image processing to enhance geometric correction of the data is discussed with application to autonomous navigation and attitude and orbit determination. Specific topics covered include: (1) LANDSAT landmark data; (2) star sensing and pattern recognition; (3) filtering algorithms for Global Positioning System; and (4) determining orbital elements for geostationary satellites.
Developments in Acoustic Navigation and Communication for High-Latitude Ocean Research
NASA Astrophysics Data System (ADS)
Gobat, J.; Lee, C.
2006-12-01
Developments in autonomous platforms (profiling floats, drifters, long-range gliders and propeller-driven vehicles) offer the possibility of unprecedented access to logistically difficult polar regions that challenge conventional techniques. Currently, however, navigation and telemetry for these platforms rely on satellite positioning and communications poorly suited for high-latitude applications where ice cover restricts access to the sea surface. A similar infrastructure offering basin-wide acoustic geolocation and telemetry would allow the community to employ autonomous platforms to address previously intractable problems in Arctic oceanography. Two recent efforts toward the development of such an infrastructure are reported here. As part of an observational array monitoring fluxes through Davis Strait, development of real-time RAFOS acoustic navigation for gliders has been ongoing since autumn 2004. To date, test deployments have been conducted in a 260 Hz field in the Pacific and 780 Hz fields off Norway and in Davis Strait. Real-time navigation accuracy of ~1 km is achievable. Autonomously navigating gliders will operate under ice cover beginning in autumn 2006. In addition to glider navigation development, the Davis Strait array moorings carry fixed RAFOS recorders to study propagation over a range of distances under seasonally varying ice cover. Results from the under-ice propagation and glider navigation experiments are presented. Motivated by the need to coordinate these types of development efforts, an international group of acousticians, autonomous platform developers, high-latitude oceanographers and marine mammal researchers gathered in Seattle, U.S.A. from 27 February - 1 March 2006 for an NSF Office of Polar Programs sponsored Acoustic Navigation and Communication for High-latitude Ocean Research (ANCHOR) workshop. Workshop participants focused on summarizing the current state of knowledge concerning Arctic acoustics, navigation and communications, developing an overarching system specification to guide community-wide engineering efforts and establishing an active community and steering group to guide long-term engineering efforts and ensure interoperability. This presentation will summarize ANCHOR workshop findings.
INS integrated motion analysis for autonomous vehicle navigation
NASA Technical Reports Server (NTRS)
Roberts, Barry; Bazakos, Mike
1991-01-01
The use of inertial navigation system (INS) measurements to enhance the quality and robustness of motion analysis techniques used for obstacle detection is discussed with particular reference to autonomous vehicle navigation. The approach to obstacle detection used here employs motion analysis of imagery generated by a passive sensor. Motion analysis of imagery obtained during vehicle travel is used to generate range measurements to points within the field of view of the sensor, which can then be used to provide obstacle detection. Results obtained with an INS integrated motion analysis approach are reviewed.
NASA Technical Reports Server (NTRS)
Reinhart, Richard; Schier, James; Israel, David; Tai, Wallace; Liebrecht, Philip; Townes, Stephen
2017-01-01
The National Aeronautics and Space Administration (NASA) is studying alternatives for the United States space communications architecture through the 2040 timeframe. This architecture provides communication and navigation services to both human exploration and science missions throughout the solar system. Several of NASA's key space assets are approaching their end of design life and major systems are in need of replacement. The changes envisioned in the relay satellite architecture and capabilities around both Earth and Mars are significant undertakings and occur only once or twice each generation, and are therefore referred to as NASA's next generation space communications architecture. NASA's next generation architecture will benefit from technology and services developed over recent years. These innovations will provide missions with new operations concepts, increased performance, and new business and operating models. Advancements in optical communications will enable high-speed data channels and the use of new and more complex science instruments. Modern multiple beam/multiple access technologies such as those employed on commercial high throughput satellites will enable enhanced capabilities for on-demand service, and with new protocols will help provide Internet-like connectivity for cooperative spacecraft to improve data return and coordinate joint mission objectives. On-board processing with autonomous and cognitive networking will play larger roles to help manage system complexity. Spacecraft and ground systems will coordinate among themselves to establish communications, negotiate link connectivity, and learn to share spectrum to optimize resource allocation. Spacecraft will autonomously navigate, plan trajectories, and handle off-nominal events. NASA intends to leverage the ever-expanding capabilities of the satellite communications industry and foster its continued growth. NASA's technology development will complement and extend commercial capabilities to meet unique space environment requirements and to provide capabilities that are beyond the commercial marketplace. The progress of the communications industry, including the emerging global space internet segment and its planned constellations of 100's of satellites, offers additional opportunities for new capability and mission concepts. The opportunities and challenges of a future space architecture require an optimal solution encompassing a global perspective. The concepts and technologies intentionally define an architecture that applies not only to NASA, but to other U.S. government agencies, international space and government agencies, and domestic and international industries to advance the openness, interoperability, and affordability of space communications. Cooperation among the world's space agencies, their capabilities, standards, operations, and interoperability are key to advancing humankind's understanding of the universe and extending human presence into the solar system.
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.; Schier, James S.; Israel, David J.; Tai, Wallace; Liebrecht, Philip E.; Townes, Stephen A.
2017-01-01
The National Aeronautics and Space Administration (NASA) is studying alternatives for the United States space communications architecture through the 2040 timeframe. This architecture provides communication and navigation services to both human exploration and science missions throughout the solar system. Several of NASA's key space assets are approaching their end of design life and major systems are in need of replacement. The changes envisioned in the relay satellite architecture and capabilities around both Earth and Mars are significant undertakings and occur only once or twice each generation, and are therefore referred to as NASA's next generation space communications architecture. NASA's next generation architecture will benefit from technology and services developed over recent years. These innovations will provide missions with new operations concepts, increased performance, and new business and operating models. Advancements in optical communications will enable high-speed data channels and the use of new and more complex science instruments. Modern multiple beam/multiple access technologies such as those employed on commercial high throughput satellites will enable enhanced capabilities for on-demand service, and with new protocols will help provide Internet-like connectivity for cooperative spacecraft to improve data return and coordinate joint mission objectives. On-board processing with autonomous and cognitive networking will play larger roles to help manage system complexity. Spacecraft and ground systems will coordinate among themselves to establish communications, negotiate link connectivity, and learn to share spectrum to optimize resource allocation. Spacecraft will autonomously navigate, plan trajectories, and handle off-nominal events. NASA intends to leverage the ever-expanding capabilities of the satellite communications industry and foster its continued growth. NASA's technology development will complement and extend commercial capabilities to meet unique space environment requirements and to provide capabilities that are beyond the commercial marketplace. The progress of the communications industry, including the emerging global space internet segment and its planned constellations of 100's of satellites, offers additional opportunities for new capability and mission concepts. The opportunities and challenges of a future space architecture require an optimal solution encompassing a global perspective. The concepts and technologies intentionally define an architecture that applies not only to NASA, but to other U.S. government agencies, international space and government agencies, and domestic and international industries to advance the openness, interoperability, and affordability of space communications. Cooperation among the world's space agencies, their capabilities, standards, operations, and interoperability are key to advancing humankind's understanding of the universe and extending human presence into the solar system.
An Efficient Model-Based Image Understanding Method for an Autonomous Vehicle.
1997-09-01
The problem discussed in this dissertation is the development of an efficient method for visual navigation of autonomous vehicles. The approach is to... autonomous vehicles. Thus the new method is implemented as a component of the image-understanding system in the autonomous mobile robot Yamabico-11 at
Sandia National Laboratories proof-of-concept robotic security vehicle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrington, J.J.; Jones, D.P.; Klarer, P.R.
1989-01-01
Several years ago Sandia National Laboratories developed a prototype interior robot that could navigate autonomously inside a large complex building to aid and test interior intrusion detection systems. Recently the Department of Energy Office of Safeguards and Security has supported the development of a vehicle that will perform limited security functions autonomously in a structured exterior environment. The goal of the first phase of this project was to demonstrate the feasibility of an exterior robotic vehicle for security applications by using converted interior robot technology, if applicable. An existing teleoperational test bed vehicle with remote driving controls was modified and integrated with a newly developed command driving station and navigation system hardware and software to form the Robotic Security Vehicle (RSV) system. The RSV, also called the Sandia Mobile Autonomous Navigator (SANDMAN), has been successfully used to demonstrate that teleoperated security vehicles which can perform limited autonomous functions are viable and have the potential to decrease security manpower requirements and improve system capabilities. 2 refs., 3 figs.
Solar Thermal Utility-Scale Joint Venture Program (USJVP) Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
MANCINI,THOMAS R.
2001-04-01
Several years ago Sandia National Laboratories developed a prototype interior robot [1] that could navigate autonomously inside a large complex building to aid and test interior intrusion detection systems. Recently the Department of Energy Office of Safeguards and Security has supported the development of a vehicle that will perform limited security functions autonomously in a structured exterior environment. The goal of the first phase of this project was to demonstrate the feasibility of an exterior robotic vehicle for security applications by using converted interior robot technology, if applicable. An existing teleoperational test bed vehicle with remote driving controls was modified and integrated with a newly developed command driving station and navigation system hardware and software to form the Robotic Security Vehicle (RSV) system. The RSV, also called the Sandia Mobile Autonomous Navigator (SANDMAN), has been successfully used to demonstrate that teleoperated security vehicles which can perform limited autonomous functions are viable and have the potential to decrease security manpower requirements and improve system capabilities.
Control technique for planetary rover
NASA Technical Reports Server (NTRS)
Nakatani, Ichiro; Kubota, Takashi; Adachi, Tadashi; Saitou, Hiroaki; Okamoto, Sinya
1994-01-01
Beginning next century, several schemes for sending a planetary rover to the moon or Mars are being planned. As part of the development program, autonomous navigation technology is being studied to allow the rover the ability to move autonomously over a long range of unknown planetary surface. In the previous study, we ran the autonomous navigation experiment on an outdoor test terrain by using a rover test-bed that was controlled by a conventional sense-plan-act method. In some cases during the experiment, a problem occurred with the rover moving into untraversable areas. To improve this situation, a new control technique has been developed that gives the rover the ability of reacting to the outputs of the proximity sensors, a reaction behavior if you will. We have developed a new rover test-bed system on which an autonomous navigation experiment was performed using the newly developed control technique. In this outdoor experiment, the new control technique effectively produced the control command for the rover to avoid obstacles and be guided to the goal point safely.
GPS World, Innovation: Autonomous Navigation at High Earth Orbits
NASA Technical Reports Server (NTRS)
Bamford, William; Winternitz, Luke; Hay, Curtis
2005-01-01
Calculating a spacecraft's precise location at high orbital altitudes-22,000 miles (35,800 km) and beyond-is an important and challenging problem. New and exciting opportunities become possible if satellites are able to autonomously determine their own orbits. First, the repetitive task of periodically collecting range measurements from terrestrial antennas to high altitude spacecraft becomes less important-this lessens competition for control facilities and saves money by reducing operational costs. Also, autonomous navigation at high orbital altitudes introduces the possibility of autonomous station keeping. For example, if a geostationary satellite begins to drift outside of its designated slot it can make orbit adjustments without requiring commands from the ground. Finally, precise onboard orbit determination opens the door to satellites flying in formation-an emerging concept for many scientific space applications. The realization of these benefits is not a trivial task. While the navigation signals broadcast by GPS satellites are well suited for orbit and attitude determination at lower altitudes, acquiring and using these signals at geostationary (GEO) and highly elliptical orbits is much more difficult. GPS satellites orbit at approximately 12,550 miles (20,200 km) altitude and were designed to provide navigation signals to terrestrial users-consequently the antenna array points directly toward the earth. GEO and HEO orbits, however, are well above the operational GPS constellation, making signal reception at these altitudes more challenging. The nominal beamwidth of a Block II/IIA GPS satellite antenna array is approximately 42.6 degrees. At GEO and HEO altitudes, most of these primary beam transmissions are blocked by the Earth, leaving only a narrow region of nominal signal visibility near opposing limbs of the earth. If GPS receivers at GEO and HEO orbits were designed to use these higher power signals only, precise orbit determination would not be practical. Fortunately, the GPS satellite antenna array also produces side lobe signals at much lower power levels. NASA has designed and tested the Navigator, a new GPS receiver that can acquire and track these weaker signals, thereby dramatically increasing the signal visibility at these altitudes. While using much weaker signals is a fundamental requirement for a high orbital altitude GPS receiver, it is certainly not the only challenge. There are other unique characteristics of this application that must also be considered. For example, Position Dilution of Precision (PDOP) figures are much higher at GEO and HEO altitudes because visible GPS satellites are concentrated in a much smaller area with respect to the spacecraft antenna. These poor PDOP values contribute considerable error to the point solutions calculated by the spacecraft GPS receiver. Finally, spacecraft GPS receivers must be designed to withstand a variety of extreme environmental conditions. Variations in acceleration between launch and booster separation are extreme. Temperature gradients in the space environment are also severe. Furthermore, radiation effects are a major concern-spacecraft-borne GPS receivers must be designed with radiation-hardened electronics to guard against this phenomenon, otherwise they simply will not work. Perhaps most importantly, there are no opportunities to repair or modify any space-borne GPS receiver after it has been launched. Great care must be taken to ensure all performance characteristics have been analyzed prior to liftoff.
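The PDOP effect mentioned above follows directly from the measurement geometry; the sketch below computes PDOP from line-of-sight unit vectors and shows why near-parallel lines of sight at GEO/HEO inflate it. Function and variable names are ours.

# Illustrative PDOP computation (standard GNSS geometry, not specific to Navigator):
# with all visible GPS satellites squeezed into a narrow cone near the Earth's limb,
# the line-of-sight unit vectors are nearly parallel and (H^T H)^{-1} grows large,
# which is the poor-PDOP effect described above. Needs at least four satellites.
import numpy as np

def pdop(sat_positions, receiver_position):
    """sat_positions: (N, 3) ECEF metres; returns position dilution of precision."""
    rows = []
    for s in sat_positions:
        u = s - receiver_position
        u = u / np.linalg.norm(u)
        rows.append(np.hstack([-u, 1.0]))       # geometry row: [-LOS, clock]
    H = np.array(rows)
    Q = np.linalg.inv(H.T @ H)
    return float(np.sqrt(np.trace(Q[:3, :3])))

# Widely spread satellites give PDOP of a few; near-collinear lines of sight, as seen
# from GEO through the narrow visibility region near the Earth's limbs, give values
# that are one to two orders of magnitude larger.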
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Team KuuKulgur watches as their robots attempt the level one competition during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Sam Ortega, NASA program manager for Centennial Challenges, is seen during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The Retrievers team robot is seen as it attempts the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
Wind-Based Navigation of a Hot-air Balloon on Titan: A Feasibility Study
NASA Technical Reports Server (NTRS)
Furfaro, Roberto; Lunine, Jonathan I.; Elfes, Alberto; Reh, Kim
2008-01-01
Current analysis of data streamed back to Earth by the Cassini spacecraft features Titan as one of the most exciting places in the solar system. NASA centers and universities around the US, as well as the European Space Agency, are studying the possibility of sending, as part of the next mission to this giant moon of Saturn, a hot-air balloon (Montgolfier-type) for further and more in-depth exploration. The basic idea would be to design a reliable, semi-autonomous, and yet cheap Montgolfier capable of using continuous flow of waste heat from a power source to lift the balloon and sustain its altitude in the Titan environment. In this paper we study the problem of locally navigating a hot-air balloon in the nitrogen-based Titan atmosphere. The basic idea is to define a strategy (i.e. design of a suitable guidance system) that allows autonomous and semi-autonomous navigation of the balloon using the available (and partial) knowledge of the wind structure blowing on the saturnian satellite surface. Starting from first principles we determined the appropriate thermal and dynamical models describing (a) the vertical dynamics of the balloon and (b) the dynamics of the balloon moving on a vertical plane (2-D motion). Next, various non-linear fuzzy-based control strategies have been evaluated, analyzed and implemented in MATLAB to numerically simulate the capability of the system to simultaneously maintain altitude, as well as a scientifically desirable trajectory. We also looked at the ability of the balloon to perform station keeping. The results of the simulation are encouraging and show the effectiveness of such a system to cheaply and effectively perform semiautonomous exploration of Titan.
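The abstract above does not give the controller details; as a rough, hypothetical sketch of how a fuzzy altitude controller of that general style can be structured, the fragment below maps fuzzy memberships on altitude error and vertical rate through a small rule table to a heater-power command. All ranges, labels, and rule consequents are invented for illustration and are not taken from the paper.

    def tri(x, a, b, c):
        """Triangular membership function rising from a, peaking at b, falling to c."""
        return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

    def heater_command(alt_error_m, climb_rate_mps):
        """Sugeno-style fuzzy rule evaluation returning a heater power fraction in [0, 1].
        alt_error_m = current altitude minus target altitude (m); all ranges are invented."""
        e = {"low":  tri(alt_error_m, -2000.0, -1000.0, 0.0),   # well below target
             "ok":   tri(alt_error_m, -500.0, 0.0, 500.0),
             "high": tri(alt_error_m, 0.0, 1000.0, 2000.0)}     # well above target
        r = {"down":   tri(climb_rate_mps, -4.0, -2.0, 0.0),
             "steady": tri(climb_rate_mps, -1.0, 0.0, 1.0),
             "up":     tri(climb_rate_mps, 0.0, 2.0, 4.0)}
        # Rule table: consequent heater power for each (error, rate) combination.
        rules = {("low", "down"): 1.0, ("low", "steady"): 0.9, ("low", "up"): 0.6,
                 ("ok", "down"): 0.8,  ("ok", "steady"): 0.5,  ("ok", "up"): 0.2,
                 ("high", "down"): 0.4, ("high", "steady"): 0.1, ("high", "up"): 0.0}
        num = den = 0.0
        for (ei, ri), power in rules.items():
            w = min(e[ei], r[ri])      # rule firing strength (min t-norm)
            num += w * power
            den += w
        return num / den if den > 0.0 else 0.5

    # Example: balloon 800 m below its target altitude and descending slowly.
    print("heater power fraction:", round(heater_command(-800.0, -1.5), 2))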
NASA Astrophysics Data System (ADS)
Hesar, Siamak G.; Parker, Jeffrey S.; Leonard, Jason M.; McGranaghan, Ryan M.; Born, George H.
2015-12-01
We study the application of Linked Autonomous Interplanetary Satellite Orbit Navigation (LiAISON) to track vehicles on the far side of the lunar surface. The LiAISON architecture is demonstrated to achieve accurate orbit determination solutions for various mission scenarios in the Earth-Moon system. Given the proper description of the force field, LiAISON is capable of producing absolute orbit determination solutions using relative satellite-to-satellite tracking observations alone. The lack of direct communication between Earth-based tracking stations and the far side of the Moon provides an ideal opportunity for implementing LiAISON. This paper presents a novel approach to use the LiAISON architecture to perform autonomous navigation of assets on the lunar far side surface. Relative measurements between a spacecraft placed in an EML-2 halo orbit and lunar surface asset(s) are simulated and processed. Comprehensive simulation results show that absolute states of the surface assets are observable with an achieved accuracy of the position estimate on the order of tens of meters.
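As a toy illustration of the crosslink measurement type that LiAISON-style estimation builds on (not the paper's filter), the sketch below computes satellite-to-satellite range and range-rate from two assumed state vectors and checks, by finite differences, how the range responds to a shift in the surface asset's absolute position. All numbers are invented.

    import numpy as np

    def crosslink(r1, v1, r2, v2):
        """Satellite-to-satellite range (km) and range-rate (km/s) between two states."""
        dr, dv = r2 - r1, v2 - v1
        rng = np.linalg.norm(dr)
        return rng, float(dr @ dv) / rng

    # Illustrative states (km, km/s): a halo-orbit relay and a static lunar far-side asset.
    r_halo = np.array([60000.0, 45000.0, 10000.0]); v_halo = np.array([0.05, -0.12, 0.02])
    r_asset = np.array([-1700.0, 300.0, -200.0]);   v_asset = np.zeros(3)

    rng, rr = crosslink(r_halo, v_halo, r_asset, v_asset)
    print("range %.1f km, range-rate %.4f km/s" % (rng, rr))

    # Finite-difference partials of the range with respect to the asset's absolute position.
    # Instantaneously these are just the line-of-sight direction; it is the time evolution of
    # such measurements under the asymmetric Earth-Moon force field that makes the absolute
    # states observable, as argued in the abstract.
    eps = 1.0  # km
    for axis, name in enumerate("xyz"):
        step = np.zeros(3); step[axis] = eps
        d_rng = crosslink(r_halo, v_halo, r_asset + step, v_asset)[0] - rng
        print("d(range)/d%s ~ %+.4f" % (name, d_rng / eps))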
Fully autonomous navigation for the NASA cargo transfer vehicle
NASA Technical Reports Server (NTRS)
Wertz, James R.; Skulsky, E. David
1991-01-01
A great deal of attention has been paid to navigation during the close approach (less than or equal to 1 km) phase of spacecraft rendezvous. However, most spacecraft also require a navigation system which provides the necessary accuracy for placing both satellites within the range of the docking sensors. The Microcosm Autonomous Navigation System (MANS) is an on-board system which uses Earth-referenced attitude sensing hardware to provide precision orbit and attitude determination. The system is capable of functioning from LEO to GEO and beyond. Performance depends on the number of available sensors as well as mission geometry; however, extensive simulations have shown that MANS will provide 100 m to 400 m (3σ) position accuracy and 0.03 to 0.07 deg (3σ) attitude accuracy in low Earth orbit. The system is independent of any external source, including GPS. MANS is expected to have a significant impact on ground operations costs, mission definition and design, survivability, and the potential development of very low-cost, fully autonomous spacecraft.
NASA Technical Reports Server (NTRS)
Lavery, David; Bedard, Roger J., Jr.
1991-01-01
The NASA Planetary Rover Project was initiated in 1989. The emphasis of the work to date has been on development of autonomous navigation technology within the context of a high mobility wheeled vehicle at the JPL and an innovative legged locomotion concept at Carnegie Mellon University. The status and accomplishments of these two efforts are discussed. First, however, background information is given on the three rover types required for the Space Exploration Initiative (SEI) whose objective is a manned mission to Mars.
Control - Demands mushroom as station grows
NASA Technical Reports Server (NTRS)
Szirmay, S. Z.; Blair, J.
1983-01-01
The NASA space station, which is presently in the planning stage, is to be composed of both rigid and nonrigid modules, rotating elements, and flexible appendages subjected to environmental disturbances from the earth's atmosphere, gravity gradient, and magnetic field, as well as solar radiation and self-generated disturbances. Control functions, which will originally include attitude control, docking and berthing control, and system monitoring and management, will, with evolving mission objectives, come to encompass such control functions as articulation control, autonomous navigation, space traffic control, and large space structure control. Attention is given to the advancements in modular, distributed, and adaptive control methods, as well as system identification and hardware fault tolerance techniques, which will be required.
NASA Astrophysics Data System (ADS)
Theil, S.; Ammann, N.; Andert, F.; Franz, T.; Krüger, H.; Lehner, H.; Lingenauber, M.; Lüdtke, D.; Maass, B.; Paproth, C.; Wohlfeil, J.
2018-03-01
Since 2010, the German Aerospace Center has been working on the project Autonomous Terrain-based Optical Navigation (ATON). Its objective is the development of technologies which allow autonomous navigation of spacecraft in orbit around, and during landing on, celestial bodies like the Moon, planets, asteroids and comets. The project developed different image processing techniques and optical navigation methods as well as sensor data fusion. The setup, which is applicable to many exploration missions, consists of an inertial measurement unit, a laser altimeter, a star tracker and one or multiple navigation cameras. In the past years, several milestones have been achieved. It started with the setup of a simulation environment including the detailed simulation of camera images. This was continued by hardware-in-the-loop tests in the Testbed for Robotic Optical Navigation (TRON) where images were generated by real cameras in a simulated, downscaled lunar landing scene. Data were recorded in helicopter flight tests and post-processed in real time to increase the maturity of the algorithms and to optimize the software. Recently, two more milestones have been achieved. In late 2016, the whole navigation system setup was flown on an unmanned helicopter while processing all sensor information onboard in real time. For the latest milestone, the navigation system was tested in closed loop on the unmanned helicopter. For that purpose the ATON navigation system provided the navigation state for the guidance and control of the unmanned helicopter, replacing the GPS-based standard navigation system. The paper will give an introduction to the ATON project and its concept. The methods and algorithms of ATON are briefly described. The flight test results of the latest two milestones are presented and discussed.
First Results from a Hardware-in-the-Loop Demonstration of Closed-Loop Autonomous Formation Flying
NASA Technical Reports Server (NTRS)
Gill, E.; Naasz, Bo; Ebinuma, T.
2003-01-01
A closed-loop system for the demonstration of autonomous satellite formation flying technologies using hardware-in-the-loop has been developed. Making use of a GPS signal simulator with a dual radio frequency outlet, the system includes two GPS space receivers as well as a powerful onboard navigation processor dedicated to the GPS-based guidance, navigation, and control of a satellite formation in real-time. The closed-loop system allows realistic simulations of autonomous formation flying scenarios, enabling research in the fields of tracking and orbit control strategies for a wide range of applications. The autonomous closed-loop formation acquisition and keeping strategy is based on Lyapunov's direct control method as applied to the standard set of Keplerian elements. This approach not only assures global and asymptotic stability of the control but also maintains valuable physical insight into the applied control vectors. Furthermore, the approach can account for system uncertainties and effectively avoids a computationally expensive solution of the two point boundary problem, which renders the concept particularly attractive for implementation in onboard processors. A guidance law has been developed which strictly separates the relative from the absolute motion, thus avoiding the numerical integration of a target trajectory in the onboard processor. Moreover, upon using precise kinematic relative GPS solutions, a dynamical modeling or filtering is avoided which provides for an efficient implementation of the process on an onboard processor. A sample formation flying scenario has been created aiming at the autonomous transition of a Low Earth Orbit satellite formation from an initial along-track separation of 800 m to a target distance of 100 m. Assuming a low-thrust actuator which may be accommodated on a small satellite, a typical control accuracy of less than 5 m has been achieved which proves the applicability of autonomous formation flying techniques to formations of satellites as close as 50 m.
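The paper applies Lyapunov's direct method to the full set of Keplerian elements; the fragment below is only a single-element sketch under simplified assumptions, showing the structure of such a law for the semi-major axis alone: a quadratic Lyapunov function of the element error and a thrust command chosen along the Gauss variational partials so that the function is non-increasing. The gain, thrust limit, and orbit values are invented for illustration.

    import numpy as np

    MU = 398600.4418  # Earth gravitational parameter, km^3/s^2

    def da_du(a, e, nu):
        """Gauss variational partials of da/dt with respect to (radial, along-track) accel."""
        p = a * (1.0 - e**2)
        h = np.sqrt(MU * p)
        r = p / (1.0 + e * np.cos(nu))
        coeff = 2.0 * a**2 / h
        return np.array([coeff * e * np.sin(nu),   # sensitivity to radial acceleration
                         coeff * p / r])           # sensitivity to along-track acceleration

    def lyapunov_thrust(a, e, nu, a_target, k=1.0, u_max=1e-6):
        """Acceleration (km/s^2) that makes V = 0.5*k*(a - a_target)^2 non-increasing."""
        grad = da_du(a, e, nu)
        u = -k * (a - a_target) * grad             # steepest-descent direction for V
        norm = np.linalg.norm(u)
        return u if norm <= u_max else u * (u_max / norm)   # saturate to available thrust

    # Example: raise the semi-major axis by 1 km; the command is mostly along-track.
    print(lyapunov_thrust(a=6878.0, e=0.001, nu=0.3, a_target=6879.0))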
Visual Odometry for Autonomous Deep-Space Navigation Project
NASA Technical Reports Server (NTRS)
Robinson, Shane; Pedrotty, Sam
2016-01-01
Autonomous rendezvous and docking (AR&D) is a critical need for manned spaceflight, especially in deep space where communication delays essentially leave crews on their own for critical operations like docking. Previously developed AR&D sensors have been large, heavy, power-hungry, and may still require further development (e.g. Flash LiDAR). Other approaches to vision-based navigation are not computationally efficient enough to operate quickly on slower, flight-like computers. The key technical challenge for visual odometry is to adapt it from the current terrestrial applications it was designed for to function in the harsh lighting conditions of space. This effort leveraged Draper Laboratory's considerable prior development and expertise, benefitting both parties. The algorithm Draper has created is unique from other pose estimation efforts as it has a comparatively small computational footprint (suitable for use onboard a spacecraft, unlike alternatives) and potentially offers accuracy and precision needed for docking. This presents a solution to the AR&D problem that only requires a camera, which is much smaller, lighter, and requires far less power than competing AR&D sensors. We have demonstrated the algorithm's performance and ability to process 'flight-like' imagery formats with a 'flight-like' trajectory, positioning ourselves to easily process flight data from the upcoming 'ISS Selfie' activity and then compare the algorithm's quantified performance to the simulated imagery. This will bring visual odometry beyond TRL 5, proving its readiness to be demonstrated as part of an integrated system. Once beyond TRL 5, visual odometry will be poised to be demonstrated as part of a system in an in-space demo where relative pose is critical, like Orion AR&D, ISS robotic operations, asteroid proximity operations, and more.
The development and flight test of a deployable precision landing system for spacecraft recovery
NASA Technical Reports Server (NTRS)
Sim, Alex G.; Murray, James E.; Neufeld, David C.; Reed, R. Dale
1993-01-01
A joint NASA Dryden Flight Research Facility and Johnson Space Center program was conducted to determine the feasibility of the autonomous recovery of a spacecraft using a ram-air parafoil system for the final stages of entry from space that included a precision landing. The feasibility of this system was studied using a flight model of a spacecraft in the generic shape of a flattened biconic which weighed approximately 150 lb and was flown under a commercially available, ram-air parachute. Key elements of the vehicle included the Global Positioning System guidance for navigation, flight control computer, ultrasonic sensing for terminal altitude, electronic compass, and onboard data recording. A flight test program was used to develop and refine the vehicle. This vehicle completed an autonomous flight from an altitude of 10,000 ft and a lateral offset of 1.7 miles which resulted in a precision flare and landing into the wind at a predetermined location. At times, the autonomous flight was conducted in the presence of winds approximately equal to vehicle airspeed. Several techniques for computing the winds postflight were evaluated. Future program objectives are also presented.
SLAM algorithm applied to robotics assistance for navigation in unknown environments
2010-01-01
Background The combination of robotic tools with assistance technology determines a slightly explored area of applications and advantages for disability or elder people in their daily tasks. Autonomous motorized wheelchair navigation inside an environment, behaviour based control of orthopaedic arms or user's preference learning from a friendly interface are some examples of this new field. In this paper, a Simultaneous Localization and Mapping (SLAM) algorithm is implemented to allow the environmental learning by a mobile robot while its navigation is governed by electromyographic signals. The entire system is part autonomous and part user-decision dependent (semi-autonomous). The environmental learning executed by the SLAM algorithm and the low level behaviour-based reactions of the mobile robot are robotic autonomous tasks, whereas the mobile robot navigation inside an environment is commanded by a Muscle-Computer Interface (MCI). Methods In this paper, a sequential Extended Kalman Filter (EKF) feature-based SLAM algorithm is implemented. The features correspond to lines and corners -concave and convex- of the environment. From the SLAM architecture, a global metric map of the environment is derived. The electromyographic signals that command the robot's movements can be adapted to the patient's disabilities. For mobile robot navigation purposes, five commands were obtained from the MCI: turn to the left, turn to the right, stop, start and exit. A kinematic controller to control the mobile robot was implemented. A low level behavior strategy was also implemented to avoid robot's collisions with the environment and moving agents. Results The entire system was tested in a population of seven volunteers: three elder, two below-elbow amputees and two young normally limbed patients. The experiments were performed within a closed low dynamic environment. Subjects took an average time of 35 minutes to navigate the environment and to learn how to use the MCI. The SLAM results have shown a consistent reconstruction of the environment. The obtained map was stored inside the Muscle-Computer Interface. Conclusions The integration of a highly demanding processing algorithm (SLAM) with a MCI and the communication between both in real time have shown to be consistent and successful. The metric map generated by the mobile robot would allow possible future autonomous navigation without direct control of the user, whose function could be relegated to choose robot destinations. Also, the mobile robot shares the same kinematic model of a motorized wheelchair. This advantage can be exploited for wheelchair autonomous navigation. PMID:20163735
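The paper's SLAM features are line and corner segments; as a simplified, hypothetical stand-in, the sketch below shows one EKF correction step for a joint robot-pose/point-landmark state given a range and bearing observation, which is the same update structure a feature-based EKF-SLAM filter repeats for every matched feature. All numbers are invented.

    import numpy as np

    def wrap(angle):
        """Wrap an angle to (-pi, pi]."""
        return (angle + np.pi) % (2.0 * np.pi) - np.pi

    def ekf_slam_update(x, P, z, R):
        """One EKF correction for the joint state [xr, yr, theta, lx, ly] given a
        range/bearing measurement z = [r, b] of the landmark (lx, ly)."""
        dx, dy = x[3] - x[0], x[4] - x[1]
        q = dx * dx + dy * dy
        r = np.sqrt(q)
        z_hat = np.array([r, wrap(np.arctan2(dy, dx) - x[2])])
        # Measurement Jacobian with respect to the joint robot/landmark state.
        H = np.array([[-dx / r, -dy / r,  0.0,  dx / r,  dy / r],
                      [ dy / q, -dx / q, -1.0, -dy / q,  dx / q]])
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        innov = np.array([z[0] - z_hat[0], wrap(z[1] - z_hat[1])])
        x_new = x + K @ innov
        P_new = (np.eye(len(x)) - K @ H) @ P
        return x_new, P_new

    # Toy example: landmark believed at (4, 2) m; sensor reports range 4.3 m, bearing 0.45 rad.
    x0 = np.array([0.0, 0.0, 0.0, 4.0, 2.0])
    P0 = np.diag([0.5, 0.5, 0.1, 1.0, 1.0])
    R = np.diag([0.05, 0.02])
    x1, P1 = ekf_slam_update(x0, P0, np.array([4.3, 0.45]), R)
    print(np.round(x1, 3))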
New Vectorial Propulsion System and Trajectory Control Designs for Improved AUV Mission Autonomy
Gonzalez, Julian; Galarza, Cesar; Aguzzi, Jacopo; del Rio, Joaquin
2018-01-01
Autonomous Underwater Vehicles (AUV) are proving to be a promising platform design for multidisciplinary autonomous operability with a wide range of applications in marine ecology and geoscience. Here, two novel contributions towards increasing the autonomous navigation capability of a new AUV prototype (the Guanay II) as a mix between a propelled vehicle and a glider are presented. Firstly, a vectorial propulsion system has been designed to provide full vehicle maneuverability in both horizontal and vertical planes. Furthermore, two controllers have been designed, based on fuzzy controls, to provide the vehicle with autonomous navigation capabilities. Due to the decoupled system property, the controllers for the horizontal plane have been designed separately from those for the vertical plane. This class of non-linear controllers has been used to interpret linguistic laws into different zones of functionality, acting as an interpolation between different rules or linear controls, and provided good performance. Both improvements have been validated through simulations and field tests, displaying good performance results. Finally, the conclusion of this work is that the Guanay II AUV has a solid controller to perform autonomous navigation and carry out vertical immersions. PMID:29673224
2004-07-14
KENNEDY SPACE CENTER, FLA. - At Vandenberg Air Force Base in California, an Orbital Sciences technician works with wiring on the DART (Demonstration for Autonomous Rendezvous Technology) flight demonstrator, a spacecraft developed to prove technologies for locating and maneuvering near an orbiting satellite. Future applications of technologies developed by the DART project will benefit the nation in future space-vehicle systems development requiring in-space assembly, services or other autonomous rendezvous operations. Designed and developed for NASA by Orbital Sciences Corporation in Dulles, Va., the DART spacecraft will be launched on a Pegasus launch vehicle. At about 40,000 feet over the Pacific Ocean, the Pegasus will be released from Orbital’s Stargazer L-1011 aircraft, fire its rocket motors and boost DART into a polar orbit approximately 472 miles by 479 miles. Once in orbit, DART will rendezvous with a target satellite, the Multiple Paths, Beyond-Line-of-Site Communications satellite, also built by Orbital Sciences. DART will then perform several close proximity operations, such as moving toward and away from the satellite using navigation data provided by onboard sensors. DART is scheduled for launch no earlier than Oct. 18.
2004-07-14
KENNEDY SPACE CENTER, FLA. - At Vandenberg Air Force Base in California, Orbital Sciences workers remove the canister from the DART (Demonstration for Autonomous Rendezvous Technology) flight demonstrator, a spacecraft developed to prove technologies for locating and maneuvering near an orbiting satellite. Future applications of technologies developed by the DART project will benefit the nation in future space-vehicle systems development requiring in-space assembly, services or other autonomous rendezvous operations. Designed and developed for NASA by Orbital Sciences Corporation in Dulles, Va., the DART spacecraft will be launched on a Pegasus launch vehicle. At about 40,000 feet over the Pacific Ocean, the Pegasus will be released from Orbital’s Stargazer L-1011 aircraft, fire its rocket motors and boost DART into a polar orbit approximately 472 miles by 479 miles. Once in orbit, DART will rendezvous with a target satellite, the Multiple Paths, Beyond-Line-of-Site Communications satellite, also built by Orbital Sciences. DART will then perform several close proximity operations, such as moving toward and away from the satellite using navigation data provided by onboard sensors. DART is scheduled for launch no earlier than Oct. 18.
2004-07-14
KENNEDY SPACE CENTER, FLA. - At Vandenberg Air Force Base in California, Orbital Sciences technicians watch closely as the DART (Demonstration for Autonomous Rendezvous Technology) flight demonstrator is lowered onto a stand. The spacecraft was developed to prove technologies for locating and maneuvering near an orbiting satellite. Future applications of technologies developed by the DART project will benefit the nation in future space-vehicle systems development requiring in-space assembly, services or other autonomous rendezvous operations. Designed and developed for NASA by Orbital Sciences Corporation in Dulles, Va., the DART spacecraft will be launched on a Pegasus launch vehicle. At about 40,000 feet over the Pacific Ocean, the Pegasus will be released from Orbital’s Stargazer L-1011 aircraft, fire its rocket motors and boost DART into a polar orbit approximately 472 miles by 479 miles. Once in orbit, DART will rendezvous with a target satellite, the Multiple Paths, Beyond-Line-of-Site Communications satellite, also built by Orbital Sciences. DART will then perform several close proximity operations, such as moving toward and away from the satellite using navigation data provided by onboard sensors. DART is scheduled for launch no earlier than Oct. 18.
2004-07-14
KENNEDY SPACE CENTER, FLA. - At Vandenberg Air Force Base in California, the DART (Demonstration for Autonomous Rendezvous Technology) flight demonstrator is revealed after its protective cover has been removed. The spacecraft was developed to prove technologies for locating and maneuvering near an orbiting satellite. Future applications of technologies developed by the DART project will benefit the nation in future space-vehicle systems development requiring in-space assembly, services or other autonomous rendezvous operations. Designed and developed for NASA by Orbital Sciences Corporation in Dulles, Va., the DART spacecraft will be launched on a Pegasus launch vehicle. At about 40,000 feet over the Pacific Ocean, the Pegasus will be released from Orbital’s Stargazer L-1011 aircraft, fire its rocket motors and boost DART into a polar orbit approximately 472 miles by 479 miles. Once in orbit, DART will rendezvous with a target satellite, the Multiple Paths, Beyond-Line-of-Site Communications satellite, also built by Orbital Sciences. DART will then perform several close proximity operations, such as moving toward and away from the satellite using navigation data provided by onboard sensors. DART is scheduled for launch no earlier than Oct. 18.
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. - At Vandenberg Air Force Base in California, the DART (Demonstration for Autonomous Rendezvous Technology) flight demonstrator is revealed after its protective cover has been removed. The spacecraft was developed to prove technologies for locating and maneuvering near an orbiting satellite. Future applications of technologies developed by the DART project will benefit the nation in future space-vehicle systems development requiring in-space assembly, services or other autonomous rendezvous operations. Designed and developed for NASA by Orbital Sciences Corporation in Dulles, Va., the DART spacecraft will be launched on a Pegasus launch vehicle. At about 40,000 feet over the Pacific Ocean, the Pegasus will be released from Orbital's Stargazer L-1011 aircraft, fire its rocket motors and boost DART into a polar orbit approximately 472 miles by 479 miles. Once in orbit, DART will rendezvous with a target satellite, the Multiple Paths, Beyond-Line-of-Site Communications satellite, also built by Orbital Sciences. DART will then perform several close proximity operations, such as moving toward and away from the satellite using navigation data provided by onboard sensors. DART is scheduled for launch no earlier than Oct. 18.
The IXV experience, from the mission conception to the flight results
NASA Astrophysics Data System (ADS)
Tumino, G.; Mancuso, S.; Gallego, J.-M.; Dussy, S.; Preaud, J.-P.; Di Vita, G.; Brunner, P.
2016-07-01
The atmospheric re-entry domain is a cornerstone of a wide range of space applications, ranging from reusable launcher stages developments, robotic planetary exploration, human space flight, to innovative applications such as reusable research platforms for in orbit validation of multiple space applications technologies. The Intermediate experimental Vehicle (IXV) is an advanced demonstrator which has performed in-flight experimentation of atmospheric re-entry enabling systems and technologies aspects, with significant advancements on Europe's previous flight experiences, consolidating Europe's autonomous position in the strategic field of atmospheric re-entry. The IXV mission objectives were the design, development, manufacturing, assembling and on-ground to in-flight verification of an autonomous European lifting and aerodynamically controlled reentry system, integrating critical re-entry technologies at system level. Among such critical technologies of interest, special attention was paid to aerodynamic and aerothermodynamics experimentation, including advanced instrumentation for aerothermodynamics phenomena investigations, thermal protections and hot-structures, guidance, navigation and flight control through combined jets and aerodynamic surfaces (i.e. flaps), in particular focusing on the technologies integration at system level for flight, successfully performed on February 11th, 2015.
Autonomous RPRV Navigation, Guidance and Control
NASA Technical Reports Server (NTRS)
Johnston, Donald E.; Myers, Thomas T.; Zellner, John W.
1983-01-01
Dryden Flight Research Center has the responsibility for flight testing of advanced remotely piloted research vehicles (RPRV) to explore highly maneuverable aircraft technology, and to test advanced structural concepts, and related aeronautical technologies which can yield important research results with significant cost benefits. The primary purpose is to provide the preliminary design of an upgraded automatic approach and landing control system and flight director display to improve landing performance and reduce pilot workload. A secondary purpose is to determine the feasibility of an onboard autonomous navigation, orbit, and landing capability for safe vehicle recovery in the event of loss of telemetry uplink communication with the vehicles. The current RPRV approach and landing method, the proposed automatic and manual approach and autoland system, and an autonomous navigation, orbit, and landing system concept which is based on existing operational technology are described.
The Personal Satellite Assistant: An Internal Spacecraft Autonomous Mobile Monitor
NASA Technical Reports Server (NTRS)
Dorais, Gregory A.; Gawdiak, Yuri; Clancy, Daniel (Technical Monitor)
2002-01-01
This paper presents an overview of the research and development effort at the NASA Ames Research Center to create an internal spacecraft autonomous mobile monitor capable of performing intra-vehicular sensing activities by autonomously navigating onboard the International Space Station. We describe the capabilities, mission roles, rationale, high-level functional requirements, and design challenges for an autonomous mobile monitor. The rapid prototyping design methodology used, in which five prototypes of increasing fidelity are designed, is described as well as the status of these prototypes, of which two are operational and being tested, and one is actively being designed. The physical test facilities used to perform ground testing are briefly described, including a micro-gravity test facility that permits a prototype to propel itself in 3 dimensions with 6 degrees-of-freedom as if it were in a micro-gravity environment. We also give an overview of the autonomy framework and its components, including the software simulators used in the development process. Sample mission test scenarios are also described. The paper concludes with a discussion of future and related work, followed by a summary.
ALHAT COBALT: CoOperative Blending of Autonomous Landing Technology
NASA Technical Reports Server (NTRS)
Carson, John M.
2015-01-01
The COBALT project is a flight demonstration of two NASA ALHAT (Autonomous precision Landing and Hazard Avoidance Technology) capabilities that are key for future robotic or human landing GN&C (Guidance, Navigation and Control) systems. The COBALT payload integrates the Navigation Doppler Lidar (NDL) for ultraprecise velocity and range measurements with the Lander Vision System (LVS) for Terrain Relative Navigation (TRN) position estimates. Terrestrial flight tests of the COBALT payload in an open-loop and closed-loop GN&C configuration will be conducted onboard a commercial, rocket-propulsive Vertical Test Bed (VTB) at a test range in Mojave, CA.
A Single RF Emitter-Based Indoor Navigation Method for Autonomous Service Robots.
Sherwin, Tyrone; Easte, Mikala; Chen, Andrew Tzer-Yeu; Wang, Kevin I-Kai; Dai, Wenbin
2018-02-14
Location-aware services are one of the key elements of modern intelligent applications. Numerous real-world applications such as factory automation, indoor delivery, and even search and rescue scenarios require autonomous robots to have the ability to navigate in an unknown environment and reach mobile targets with minimal or no prior infrastructure deployment. This research investigates and proposes a novel approach of dynamic target localisation using a single RF emitter, which will be used as the basis of allowing autonomous robots to navigate towards and reach a target. Through the use of multiple directional antennae, Received Signal Strength (RSS) is compared to determine the most probable direction of the targeted emitter, which is combined with the distance estimates to improve the localisation performance. The accuracy of the position estimate is further improved using a particle filter to mitigate the fluctuating nature of real-time RSS data. Based on the direction information, a motion control algorithm is proposed, using Simultaneous Localisation and Mapping (SLAM) and A* path planning to enable navigation through unknown complex environments. A number of navigation scenarios were developed in the context of factory automation applications to demonstrate and evaluate the functionality and performance of the proposed system.
A Single RF Emitter-Based Indoor Navigation Method for Autonomous Service Robots
Sherwin, Tyrone; Easte, Mikala; Wang, Kevin I-Kai; Dai, Wenbin
2018-01-01
Location-aware services are one of the key elements of modern intelligent applications. Numerous real-world applications such as factory automation, indoor delivery, and even search and rescue scenarios require autonomous robots to have the ability to navigate in an unknown environment and reach mobile targets with minimal or no prior infrastructure deployment. This research investigates and proposes a novel approach of dynamic target localisation using a single RF emitter, which will be used as the basis of allowing autonomous robots to navigate towards and reach a target. Through the use of multiple directional antennae, Received Signal Strength (RSS) is compared to determine the most probable direction of the targeted emitter, which is combined with the distance estimates to improve the localisation performance. The accuracy of the position estimate is further improved using a particle filter to mitigate the fluctuating nature of real-time RSS data. Based on the direction information, a motion control algorithm is proposed, using Simultaneous Localisation and Mapping (SLAM) and A* path planning to enable navigation through unknown complex environments. A number of navigation scenarios were developed in the context of factory automation applications to demonstrate and evaluate the functionality and performance of the proposed system. PMID:29443906
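Neither record above gives implementation detail; as a minimal sketch of the particle-filter idea described in the abstracts (with made-up path-loss constants and geometry, not the authors' code), the fragment below weights candidate emitter positions against RSS readings taken from a few robot poses under an assumed log-distance path-loss model and then resamples.

    import numpy as np

    rng = np.random.default_rng(0)

    def rss_model(robot_xy, emitter_xy, p0=-40.0, path_loss_exp=2.5):
        """Assumed log-distance path-loss model: RSS in dBm, equal to p0 at 1 m range."""
        d = max(np.linalg.norm(emitter_xy - robot_xy), 0.1)
        return p0 - 10.0 * path_loss_exp * np.log10(d)

    def pf_update(particles, weights, robot_xy, rss_meas, sigma_db=4.0):
        """Re-weight emitter-position particles by the RSS likelihood, then resample."""
        pred = np.array([rss_model(robot_xy, p) for p in particles])
        weights = weights * np.exp(-0.5 * ((rss_meas - pred) / sigma_db) ** 2)
        weights = weights + 1e-300                 # guard against all-zero weights
        weights = weights / weights.sum()
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx] + rng.normal(0.0, 0.2, particles.shape)  # roughening
        return particles, np.full(len(particles), 1.0 / len(particles))

    # Toy run: emitter at (8, 3) m; the robot takes noisy RSS readings from three positions.
    true_emitter = np.array([8.0, 3.0])
    particles = rng.uniform([0.0, 0.0], [15.0, 15.0], size=(1000, 2))
    weights = np.full(len(particles), 1.0 / len(particles))
    for robot_xy in [np.array([0.0, 0.0]), np.array([5.0, 0.0]), np.array([5.0, 6.0])]:
        meas = rss_model(robot_xy, true_emitter) + rng.normal(0.0, 2.0)
        particles, weights = pf_update(particles, weights, robot_xy, meas)
    print("estimated emitter position:", np.round(particles.mean(axis=0), 2))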
PointCom: semi-autonomous UGV control with intuitive interface
NASA Astrophysics Data System (ADS)
Rohde, Mitchell M.; Perlin, Victor E.; Iagnemma, Karl D.; Lupa, Robert M.; Rohde, Steven M.; Overholt, James; Fiorani, Graham
2008-04-01
Unmanned ground vehicles (UGVs) will play an important role in the nation's next-generation ground force. Advances in sensing, control, and computing have enabled a new generation of technologies that bridge the gap between manual UGV teleoperation and full autonomy. In this paper, we present current research on a unique command and control system for UGVs named PointCom (Point-and-Go Command). PointCom is a semi-autonomous command system for one or multiple UGVs. The system, when complete, will be easy to operate and will enable significant reduction in operator workload by utilizing an intuitive image-based control framework for UGV navigation and allowing a single operator to command multiple UGVs. The project leverages new image processing algorithms for monocular visual servoing and odometry to yield a unique, high-performance fused navigation system. Human Computer Interface (HCI) techniques from the entertainment software industry are being used to develop video-game style interfaces that require little training and build upon the navigation capabilities. By combining an advanced navigation system with an intuitive interface, a semi-autonomous control and navigation system is being created that is robust, user friendly, and less burdensome than many current generation systems.
Flight Testing ALHAT Precision Landing Technologies Integrated Onboard the Morpheus Rocket Vehicle
NASA Technical Reports Server (NTRS)
Carson, John M. III; Robertson, Edward A.; Trawny, Nikolas; Amzajerdian, Farzin
2015-01-01
A suite of prototype sensors, software, and avionics developed within the NASA Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) project were terrestrially demonstrated onboard the NASA Morpheus rocket-propelled Vertical Testbed (VTB) in 2014. The sensors included a LIDAR-based Hazard Detection System (HDS), a Navigation Doppler LIDAR (NDL) velocimeter, and a long-range Laser Altimeter (LAlt) that enable autonomous and safe precision landing of robotic or human vehicles on solid solar system bodies under varying terrain lighting conditions. The flight test campaign with the Morpheus vehicle involved a detailed integration and functional verification process, followed by tether testing and six successful free flights, including one night flight. The ALHAT sensor measurements were integrated into a common navigation solution through a specialized ALHAT Navigation filter that was employed in closed-loop flight testing within the Morpheus Guidance, Navigation and Control (GN&C) subsystem. Flight testing on Morpheus utilized ALHAT for safe landing site identification and ranking, followed by precise surface-relative navigation to the selected landing site. The successful autonomous, closed-loop flight demonstrations of the prototype ALHAT system have laid the foundation for the infusion of safe, precision landing capabilities into future planetary exploration missions.
Vision Based Navigation for Autonomous Cooperative Docking of CubeSats
NASA Astrophysics Data System (ADS)
Pirat, Camille; Ankersen, Finn; Walker, Roger; Gass, Volker
2018-05-01
A realistic rendezvous and docking navigation solution applicable to CubeSats is investigated. The scalability analysis of the ESA Autonomous Transfer Vehicle Guidance, Navigation & Control (GNC) performances and the Russian docking system shows that the docking of two CubeSats would require a lateral control performance of the order of 1 cm. Line-of-sight constraints and multipath effects affecting Global Navigation Satellite System (GNSS) measurements in close proximity prevent the use of this sensor for the final approach. This consideration and the high control accuracy requirement led to the use of vision sensors for the final 10 m of the rendezvous and docking sequence. A single monocular camera on the chaser satellite and various sets of Light-Emitting Diodes (LEDs) on the target vehicle ensure the observability of the system throughout the approach trajectory. The simple and novel formulation of the measurement equations allows rotations to be differentiated unambiguously from translations between the target and chaser docking ports and yields a navigation performance better than 1 mm at docking. Furthermore, the non-linear measurement equations can be solved in order to provide an analytic navigation solution. This solution can be used to monitor the navigation filter solution and ensure its stability, adding an extra layer of robustness for autonomous rendezvous and docking. The navigation filter initialization is addressed in detail. The proposed method is able to differentiate LED signals from Sun reflections, as demonstrated by experimental data. The navigation filter uses comprehensive linearised coupled rotation/translation dynamics, describing the chaser-to-target docking port motion. The handover between GNSS and vision sensor measurements is assessed. The performance of the navigation function along the approach trajectory is discussed.
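The paper derives its own measurement equations with an analytic solution; the sketch below instead uses the generic perspective-n-point building block (OpenCV's solvePnP) to recover the relative pose of a docking port from a hypothetical LED layout and made-up detected image centroids. It conveys the basic camera-to-LED geometry but is not the authors' formulation, and all values are assumptions.

    import numpy as np
    import cv2

    # Hypothetical LED pattern on the target docking port, in the target body frame (m).
    led_body = np.array([[ 0.05,  0.05, 0.00],
                         [-0.05,  0.05, 0.00],
                         [-0.05, -0.05, 0.00],
                         [ 0.05, -0.05, 0.00],
                         [ 0.00,  0.00, 0.02]], dtype=np.float64)

    # Made-up detected LED centroids in the chaser camera image (pixels); in practice these
    # would come from thresholding and blob detection on the camera frame.
    led_image = np.array([[352.1, 238.4],
                          [287.9, 238.6],
                          [288.2, 302.1],
                          [351.8, 301.9],
                          [320.0, 270.3]], dtype=np.float64)

    # Assumed pinhole intrinsics (focal length and principal point in pixels), no distortion.
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    dist = np.zeros(5)

    ok, rvec, tvec = cv2.solvePnP(led_body, led_image, K, dist)
    R, _ = cv2.Rodrigues(rvec)
    print("docking-port position in camera frame (m):", tvec.ravel())
    print("relative attitude (rotation matrix):")
    print(np.round(R, 3))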
2013-09-30
underwater acoustic communication technologies for autonomous distributed underwater networks, through innovative signal processing, coding, and navigation...in real environments, an offshore testbed has been developed to conduct field experiments. The testbed consists of four nodes and has been deployed...Leadership by the Connecticut Technology Council. Dr. Zhaohui Wang joined the faculty of the Department of Electrical and Computer Engineering at
In-Space Networking On NASA's SCaN Testbed
NASA Technical Reports Server (NTRS)
Brooks, David; Eddy, Wesley M.; Clark, Gilbert J., III; Johnson, Sandra K.
2016-01-01
The NASA Space Communications and Navigation (SCaN) Testbed, an external payload onboard the International Space Station, is equipped with three software defined radios (SDRs) and a programmable flight computer. The purpose of the Testbed is to conduct in-space research in the areas of communication, navigation, and networking in support of NASA missions and communication infrastructure. Multiple reprogrammable elements in the end-to-end system, along with several communication paths and a semi-operational environment, provide a unique opportunity to explore networking concepts and protocols envisioned for the future Solar System Internet (SSI). This paper will provide a general description of the system's design and the networking protocols implemented and characterized on the testbed, including Encapsulation, IP over CCSDS, and Delay-Tolerant Networking (DTN). Due to the research nature of the implementation, flexibility and robustness are considered in the design to enable expansion for future adaptive and cognitive techniques. Following a detailed design discussion, lessons learned and suggestions for future missions and communication infrastructure elements will be provided. Plans for the evolving research on SCaN Testbed as it moves towards a more adaptive, autonomous system will be discussed.
NASA Technical Reports Server (NTRS)
Srivastava, Deepak; Meyyappan, Meyya; Yan, Jerry (Technical Monitor)
2000-01-01
Advanced miniaturization, a key thrust area to enable new science and exploration missions, provides ultrasmall sensors, power sources, communication, navigation, and propulsion systems with very low mass, volume, and power consumption. Revolutions in electronics and computing will allow reconfigurable, autonomous, 'thinking' spacecraft. Nanotechnology presents a whole new spectrum of opportunities to build device components and systems for entirely new space architectures: (1) networks of ultrasmall probes on planetary surfaces; (2) micro-rovers that drive, hop, fly, and burrow; and (3) collections of microspacecraft making a variety of measurements.
Mobile robot exploration and navigation of indoor spaces using sonar and vision
NASA Technical Reports Server (NTRS)
Kortenkamp, David; Huber, Marcus; Koss, Frank; Belding, William; Lee, Jaeho; Wu, Annie; Bidlack, Clint; Rodgers, Seth
1994-01-01
Integration of skills into an autonomous robot that performs a complex task is described. Time constraints prevented complete integration of all the described skills. The biggest problem was tuning the sensor-based region-finding algorithm to the environment involved. Since localization depended on matching regions found with the a priori map, the robot became lost very quickly. If the low level sensing of the world is not working, then high level reasoning or map making will be unsuccessful.
Autonomous rendezvous and docking: A commercial approach to on-orbit technology validation
NASA Technical Reports Server (NTRS)
Tchoryk, Peter, Jr.; Dobbs, Michael E.; Conrad, David J.; Apley, Dale J.; Whitten, Raymond P.
1991-01-01
The Space Automation and Robotics Center (SpARC), a NASA-sponsored Center for the Commercial Development of Space (CCDS), in conjunction with its corporate affiliates, is planning an on-orbit validation of autonomous rendezvous and docking (ARD) technology. The emphasis in this program is to utilize existing technology and commercially available components whenever possible. The primary subsystems that will be validated by this demonstration include GPS receivers for navigation, a video-based sensor for proximity operations, a fluid connector mechanism to demonstrate fluid resupply capability, and a compliant, single-point docking mechanism. The focus for this initial experiment will be expendable launch vehicle (ELV) based and will make use of two residual Commercial Experiment Transporter (COMET) service modules. The first COMET spacecraft will be launched in late 1992 and will serve as the target vehicle. The ARD demonstration will take place in late 1994, after the second COMET spacecraft has been launched. The service module from the second COMET will serve as the chase vehicle.
Open-Loop Flight Testing of COBALT Navigation and Sensor Technologies for Precise Soft Landing
NASA Technical Reports Server (NTRS)
Carson, John M., III; Restrepo, Caroline I.; Seubert, Carl R.; Amzajerdian, Farzin; Pierrottet, Diego F.; Collins, Steven M.; O'Neal, Travis V.; Stelling, Richard
2017-01-01
An open-loop flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) payload was conducted onboard the Masten Xodiac suborbital rocket testbed. The payload integrates two complementary sensor technologies that together provide a spacecraft with knowledge during planetary descent and landing to precisely navigate and softly touch down in close proximity to targeted surface locations. The two technologies are the Navigation Doppler Lidar (NDL), for high-precision velocity and range measurements, and the Lander Vision System (LVS) for map-relative state estimates. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a very precise Terrain Relative Navigation (TRN) solution that is suitable for future, autonomous planetary landing systems that require precise and soft landing capabilities. During the open-loop flight campaign, the COBALT payload acquired measurements and generated a precise navigation solution, but the Xodiac vehicle planned and executed its maneuvers based on an independent, GPS-based navigation solution. This minimized the risk to the vehicle during the integration and testing of the new navigation sensing technologies within the COBALT payload.
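As a loose illustration of the blending idea (not the COBALT navigation filter itself), the sketch below runs two standard Kalman measurement updates on a shared position/velocity state: one with an LVS-like map-relative position fix and one with an NDL-like velocity measurement. The noise levels and state values are assumed, and the real NDL measures line-of-sight velocities along its beams rather than a Cartesian velocity vector.

    import numpy as np

    def kf_update(x, P, z, H, R):
        """Standard Kalman measurement update."""
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        return x, P

    # State: position (m) and velocity (m/s) in a surface-fixed frame, with assumed uncertainty.
    x = np.array([1200.0, -300.0, 2500.0, -5.0, 1.0, -40.0])
    P = np.diag([50.0**2] * 3 + [2.0**2] * 3)

    # LVS-like map-relative position fix (assumed 10 m, 1-sigma per axis).
    H_pos = np.hstack([np.eye(3), np.zeros((3, 3))])
    x, P = kf_update(x, P, np.array([1185.0, -292.0, 2510.0]), H_pos, np.eye(3) * 10.0**2)

    # NDL-like velocity measurement, treated here as a direct Cartesian velocity for simplicity.
    H_vel = np.hstack([np.zeros((3, 3)), np.eye(3)])
    x, P = kf_update(x, P, np.array([-4.8, 1.1, -41.2]), H_vel, np.eye(3) * 0.05**2)

    print("fused state:", np.round(x, 2))
    print("position 1-sigma (m):", np.round(np.sqrt(np.diag(P)[:3]), 2))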
X-Ray Detection and Processing Models for Spacecraft Navigation and Timing
NASA Technical Reports Server (NTRS)
Sheikh, Suneel; Hanson, John
2013-01-01
The current primary method of deep-space navigation is the NASA Deep Space Network (DSN). High-performance navigation is achieved using Delta Differential One-Way Range techniques that utilize simultaneous observations from multiple DSN sites, and incorporate observations of quasars near the line-of-sight to a spacecraft in order to improve the range and angle measurement accuracies. Over the past four decades, x-ray astronomers have identified a number of x-ray pulsars with pulsed emissions having stabilities comparable to atomic clocks. The x-ray pulsar-based navigation and time determination (XNAV) system uses phase measurements from these sources to establish autonomously the position of the detector, and thus the spacecraft, relative to a known reference frame, much as the Global Positioning System (GPS) uses phase measurements from radio signals from several satellites to establish the position of the user relative to an Earth-centered fixed frame of reference. While a GPS receiver uses an antenna to detect the radio signals, XNAV uses a detector array to capture the individual x-ray photons from the x-ray pulsars. The navigation solution relies on detailed x-ray source models, signal processing, navigation and timing algorithms, and analytical tools that form the basis of an autonomous XNAV system. Through previous XNAV development efforts, some techniques have been established to utilize a pulsar pulse time-of-arrival (TOA) measurement to correct a position estimate. One well-studied approach, based upon Kalman filter methods, optimally adjusts a dynamic orbit propagation solution based upon the offset in measured and predicted pulse TOA. In this delta position estimator scheme, previously estimated values of spacecraft position and velocity are utilized from an onboard orbit propagator. Using these estimated values, the detected arrival times at the spacecraft of pulses from a pulsar are compared to the predicted arrival times defined by the pulsar's pulse timing model. A discrepancy provides an estimate of the spacecraft position offset, since an error in position will relate to the measured time offset of a pulse along the line of sight to the pulsar. XNAV researchers have been developing additional enhanced approaches to process the photon TOAs to arrive at an estimate of spacecraft position, including those using maximum-likelihood estimation, digital phase locked loops, and "single photon processing" schemes that utilize all available time data associated with each photon. Using pulsars from separate, non-coplanar locations provides range and range-rate measurements in each pulsar's direction. Combining these different pulsar measurements solves for offsets in position and velocity in three dimensions, and provides accurate overall navigation for deep space vehicles.
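To illustrate the delta-position idea above with a toy calculation (directions, offsets, and the sign convention are invented for the example), each pulsar's measured-minus-predicted pulse arrival time, scaled by the speed of light, gives the position offset along that pulsar's line of sight, and stacking three or more non-coplanar pulsars lets a least-squares solve recover the full three-dimensional offset.

    import numpy as np

    C = 299792.458  # speed of light, km/s

    def delta_position(pulsar_dirs, toa_residuals_s):
        """Least-squares 3-D position offset (km) from per-pulsar TOA residuals (s).
        Convention used here: a positive residual (measured minus predicted arrival time)
        corresponds to a displacement along +n_hat relative to the predicted orbit."""
        N = np.asarray(pulsar_dirs, dtype=float)            # unit vectors, one per row
        rho = C * np.asarray(toa_residuals_s, dtype=float)  # line-of-sight offsets, km
        offset, *_ = np.linalg.lstsq(N, rho, rcond=None)
        return offset

    # Hypothetical pulsar directions (unit vectors) and a true offset to recover.
    dirs = np.array([[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [0.577, 0.577, 0.577]])
    true_offset = np.array([12.0, -7.0, 3.0])               # km
    residuals = dirs @ true_offset / C                      # ideal, noise-free residuals (s)
    print("recovered offset (km):", np.round(delta_position(dirs, residuals), 3))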
Tracked robot controllers for climbing obstacles autonomously
NASA Astrophysics Data System (ADS)
Vincent, Isabelle
2009-05-01
Research in mobile robot navigation has demonstrated some success in navigating flat indoor environments while avoiding obstacles. However, the challenge of analyzing complex environments to climb obstacles autonomously has had very little success due to the complexity of the task. Unmanned ground vehicles currently exhibit simple autonomous behaviours compared to the human ability to move in the world. This paper presents the control algorithms designed for a tracked mobile robot to autonomously climb obstacles by varying its track configuration. Two control algorithms are proposed to solve the autonomous locomotion problem for climbing obstacles. First, a reactive controller evaluates the appropriate geometric configuration based on terrain and vehicle geometric considerations. Then, a reinforcement learning algorithm finds alternative solutions when the reactive controller gets stuck while climbing an obstacle. The methodology combines reactivity with learning. The controllers have been demonstrated in box and stair climbing simulations. The experiments illustrate the effectiveness of the proposed approach for crossing obstacles.
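The abstract does not spell out the learning component; the fragment below is a generic tabular Q-learning update of the kind that could be layered on a reactive climber to try alternative track-configuration actions when it stalls. The state encoding, action names, and rewards are invented for illustration and are not taken from the paper.

    import random
    from collections import defaultdict

    ACTIONS = ["front_tracks_up", "front_tracks_down", "rear_tracks_up", "drive_forward"]

    # Q-table over discretised, hypothetical states (pitch bucket, front contact, rear contact).
    Q = defaultdict(lambda: {a: 0.0 for a in ACTIONS})
    ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2

    def choose_action(state):
        """Epsilon-greedy action selection over the Q-table."""
        if random.random() < EPSILON:
            return random.choice(ACTIONS)
        return max(Q[state], key=Q[state].get)

    def q_update(state, action, reward, next_state):
        """One tabular Q-learning backup: Q <- Q + alpha * (TD target - Q)."""
        best_next = max(Q[next_state].values())
        Q[state][action] += ALPHA * (reward + GAMMA * best_next - Q[state][action])

    # Toy interaction: the robot is stuck nose-up against a step, tries an action, and is
    # rewarded for forward progress (all values invented for illustration).
    state = ("pitch_high", "front_contact", "no_rear_contact")
    action = choose_action(state)
    reward = 1.0 if action == "front_tracks_up" else -0.1
    next_state = ("pitch_high", "front_on_step", "no_rear_contact")
    q_update(state, action, reward, next_state)
    print(action, "->", round(Q[state][action], 3))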
Nature-Inspired Acoustic Sensor Projects
1999-08-24
The pager motors are worn on the wrists. Autonomous vehicle navigation: Yago – Yale Autonomous Go-Cart. Yago is used...proximity sensor determined the presence of close-by objects missed by the sonars. Yago operated autonomously by avoiding obstacles. Problems being
Improved obstacle avoidance and navigation for an autonomous ground vehicle
NASA Astrophysics Data System (ADS)
Giri, Binod; Cho, Hyunsu; Williams, Benjamin C.; Tann, Hokchhay; Shakya, Bicky; Bharam, Vishal; Ahlgren, David J.
2015-01-01
This paper presents improvements made to the intelligence algorithms employed on Q, an autonomous ground vehicle, for the 2014 Intelligent Ground Vehicle Competition (IGVC). In 2012, the IGVC committee combined the formerly separate autonomous and navigation challenges into a single AUT-NAV challenge. In this new challenge, the vehicle is required to navigate through a grassy obstacle course and stay within the course boundaries (a lane of two white painted lines) that guide it toward a given GPS waypoint. Once the vehicle reaches this waypoint, it enters an open course where it is required to navigate to another GPS waypoint while avoiding obstacles. After reaching the final waypoint, the vehicle is required to traverse another obstacle course before completing the run. Q uses modular parallel software architecture in which image processing, navigation, and sensor control algorithms run concurrently. A tuned navigation algorithm allows Q to smoothly maneuver through obstacle fields. For the 2014 competition, most revisions occurred in the vision system, which detects white lines and informs the navigation component. Barrel obstacles of various colors presented a new challenge for image processing: the previous color plane extraction algorithm would not suffice. To overcome this difficulty, laser range sensor data were overlaid on visual data. Q also participates in the Joint Architecture for Unmanned Systems (JAUS) challenge at IGVC. For 2014, significant updates were implemented: the JAUS component accepted a greater variety of messages and showed better compliance to the JAUS technical standard. With these improvements, Q secured second place in the JAUS competition.
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-10
A team KuuKulgur Robot from Estonia is seen on the practice field during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Tuesday, June 10, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Team KuuKulgur is one of eighteen teams competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Sam Ortega, NASA program manager of Centennial Challenges, watches as robots attempt the rerun of the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
The team Survey robot retrieves a sample during a demonstration of the level two challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
The University of California Santa Cruz Rover Team prepares their rover for the rerun of the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Sample Return Robot Challenge staff members confer before the team Survey robot makes its attempt at the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Worcester Polytechnic Institute (WPI) President Laurie Leshin speaks at a breakfast opening the TouchTomorrow Festival, held in conjunction with the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The team AERO robot drives off the starting platform during the level one competition at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Team Cephal's robot is seen on the starting platform during a rerun of the level one challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
David Miller, NASA Chief Technologist, speaks at a breakfast opening the TouchTomorrow Festival, held in conjunction with the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The Oregon State University Mars Rover Team's robot is seen during level one competition at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-10
Jerry Waechter of team Middleman from Dunedin, Florida, works on their robot named Ro-Bear during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Tuesday, June 10, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Team Middleman is one of eighteen teams competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
A robot from the Intrepid Systems team is seen during the rerun of the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
A team KuuKulgur robot is seen as it begins the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The team Mountaineers robot is seen as it attempts the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Members of the Oregon State University Mars Rover Team prepare their robot to attempt the level one competition at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The Stellar Automation Systems team poses for a picture with their robot after attempting the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
The team Survey robot is seen as it conducts a demonstration of the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
All four of team KuuKulgur's robots are seen as they attempt the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Spectators watch as the team Survey robot conducts a demonstration of the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Team Middleman's robot, Ro-Bear, is seen as it starts the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
The team Mountaineers robot is seen after picking up the sample during a rerun of the level one challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Two of team KuuKulgur's robots are seen as they attempt a rerun of the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
A robot from the University of Waterloo Robotics Team is seen during the rerun of the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Members of team Survey follow their robot as it conducts a demonstration of the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The entrance to Institute Park is seen during the level one challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Sam Ortega, NASA Centennial Challenges Program Manager, speaks at a breakfast opening the TouchTomorrow Festival, held in conjunction with the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-10
James Leopore, of team Fetch, from Alexandria, Virginia, speaks with judges as he prepares for the NASA 2014 Sample Return Robot Challenge, Tuesday, June 10, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Team Fetch is one of eighteen teams competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
A team KuuKulgur robot approaches the sample as it attempts the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
The team Survey robot is seen on the starting platform before beginning its attempt at the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The Mountaineers team from West Virginia University watches as their robot attempts the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
The team Survey robot is seen as it conducts a demonstration of the level two challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Team Survey's robot is seen as it conducts a demonstration of the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
Development of autonomous grasping and navigating robot
NASA Astrophysics Data System (ADS)
Kudoh, Hiroyuki; Fujimoto, Keisuke; Nakayama, Yasuichi
2015-01-01
The ability to find and grasp target items in an unknown environment is important for working robots. We developed an autonomous navigating and grasping robot. The operations are locating a requested item, moving to where the item is placed, finding the item on a shelf or table, and picking the item up from the shelf or table. To achieve these operations, we designed the robot with three functions: an autonomous navigating function that generates a map and a route in an unknown environment, an item position recognizing function, and a grasping function. We tested this robot in an unknown environment. It achieved a series of operations: moving to a destination, recognizing the positions of items on a shelf, picking up an item, placing it on a cart with its hand, and returning to the starting location. The results of this experiment demonstrate the potential of such robots to reduce manual workloads.
BIRDY - Interplanetary CubeSat for planetary geodesy of Small Solar System Bodies (SSSB).
NASA Astrophysics Data System (ADS)
Hestroffer, D.; Agnan, M.; Segret, B.; Quinsac, G.; Vannitsen, J.; Rosenblatt, P.; Miau, J. J.
2017-12-01
We are developing the Birdy concept of a scientific interplanetary CubeSat, for cruise or proximity operations around a small body of the Solar System (asteroid, comet, irregular satellite). The scientific aim is to characterise the body's shape, gravity field, and internal structure through imaging and radio-science techniques. Radio-science is now in common use in planetary science (flybys or orbiters) to derive the mass of the scientific target and possibly higher order terms of its gravity field. Its application to a nano-satellite brings the advantage of enabling low orbits that can get closer to the body's surface, hence increasing the SNR for precise orbit determination (POD), with a fully dedicated instrument. Additionally, it can be applied to two or more satellites, on a leading-trailing trajectory, to improve the gravity field determination. However, the application of this technique to CubeSats in deep space, and the inter-satellite link, has yet to be proven. Interplanetary CubeSats need to overcome a few challenges before successfully reaching their deep-space objectives: link to the ground segment, energy supply, protection against radiation, etc. In addition, the Birdy CubeSat, our baseline concept, is designed to accompany a mothercraft, and relies partly on the main mission for reaching the target, as well as on the data link with the Earth. However, constraints on the mothercraft need to be reduced by making the CubeSat as autonomous as possible. In this respect, propulsion and auto-navigation are key aspects that we are studying in a Birdy-T engineering model. We envisage a 3U size CubeSat with radio link, object-tracker and imaging function, and an autonomous ionic propulsion system. We are considering two case studies for autonomous guidance, navigation and control, with autonomous propulsion: in cruise and in proximity, necessitating ΔV up to 2 m/s for a total budget of about 50 m/s. In addition to the propulsion, in-flight orbit determination (IFOD) and maintenance are studied, through analysis of images by an object-tracker and astrometry of solar system objects in front of background stars. Before going to deep space, our project will start with BIRDY-1 orbiting the Earth, to validate the concepts of adopted propulsion, IFOD and orbit maintenance, as well as the radio-science and POD.
On exploration of geometrically constrained space by medicinal leeches Hirudo verbana.
Adamatzky, Andrew
2015-04-01
Leeches are fascinating creatures: they have simple modular nervous circuitry yet exhibit a rich spectrum of behavioural modes. Leeches could be ideal blueprints for designing flexible soft robots which are modular, multi-functional, fault-tolerant, easy to control, capable of navigating using optical, mechanical and chemical sensory inputs, and which have autonomous inter-segmental coordination and adaptive decision-making. With future designs of leech-robots in mind we study how leeches behave in geometrically constrained spaces. The core results of the paper deal with leeches exploring a row of rooms arranged along a narrow corridor. In laboratory experiments we find that rooms closer to the ends of the corridor are explored by leeches more often than rooms in the middle of the corridor. Also, in a series of scoping experiments, we evaluate the leeches' ability to navigate in mazes towards sources of vibration and chemo-attraction. We believe our results lay the foundation for future developments of robots mimicking the behaviour of leeches. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Vector Pursuit Path Tracking for Autonomous Ground Vehicles
2000-08-01
[Extraction fragments from the report's front matter and introduction: a table of contents; the statement that an autonomous vehicle is one capable of automatic navigation; a comparison of vector pursuit with other geometric path-tracking techniques; a reference to a Joint Architecture for Unmanned Ground Vehicles (JAUGS) working group meeting held at the University of Florida; and the caption of Figure 1.5.]
Light Detection and Ranging-Based Terrain Navigation: A Concept Exploration
NASA Technical Reports Server (NTRS)
Campbell, Jacob; UijtdeHaag, Maarten; vanGraas, Frank; Young, Steve
2003-01-01
This paper discusses the use of Airborne Light Detection And Ranging (LiDAR) equipment for terrain navigation. Airborne LiDAR is a relatively new technology used primarily by the geo-spatial mapping community to produce highly accurate and dense terrain elevation maps. In this paper, the term LiDAR refers to a scanning laser ranger rigidly mounted to an aircraft, as opposed to an integrated sensor system that consists of a scanning laser ranger integrated with Global Positioning System (GPS) and Inertial Measurement Unit (IMU) data. Data from the laser range scanner and IMU will be integrated with a terrain database to estimate the aircraft position and data from the laser range scanner will be integrated with GPS to estimate the aircraft attitude. LiDAR data was collected using NASA Dryden's DC-8 flying laboratory in Reno, NV and was used to test the proposed terrain navigation system. The results of LiDAR-based terrain navigation shown in this paper indicate that airborne LiDAR is a viable technology enabler for fully autonomous aircraft navigation. The navigation performance is highly dependent on the quality of the terrain databases used for positioning and therefore high-resolution (2 m post-spacing) data was used as the terrain reference.
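The estimator details of the paper are not reproduced in this abstract; as a rough illustration of the terrain-referenced positioning idea (not the authors' actual algorithm), one can score candidate horizontal positions by comparing LiDAR-derived ground elevations against a terrain database and keep the best match. Everything below is synthetic and for illustration only:

```python
import numpy as np

# Sketch of terrain-referenced horizontal position estimation by elevation matching.
# Synthetic database and measurements; illustrative only, not the paper's algorithm.

rng = np.random.default_rng(0)
post = 2.0                                   # database post spacing (m), hypothetical
terrain = rng.normal(0.0, 5.0, (200, 200))   # synthetic gridded terrain elevations

def database_elevation(north, east):
    """Nearest-post lookup into the gridded terrain database."""
    i = np.clip((north / post).astype(int), 0, terrain.shape[0] - 1)
    j = np.clip((east / post).astype(int), 0, terrain.shape[1] - 1)
    return terrain[i, j]

# Simulate LiDAR ground returns from a true aircraft position (north, east, altitude).
true_ne = np.array([210.0, 180.0])
altitude = 500.0
footprint = rng.uniform(-30.0, 30.0, (50, 2))           # ground-point offsets from nadir (m)
meas_elev = database_elevation(*(true_ne + footprint).T) + rng.normal(0, 0.2, 50)
ranges = altitude - meas_elev                            # idealized down-looking ranges

def position_fix(search_center, half_width=20.0, step=1.0):
    """Grid search for the horizontal position that best matches database elevations."""
    best, best_cost = None, np.inf
    for dn in np.arange(-half_width, half_width + step, step):
        for de in np.arange(-half_width, half_width + step, step):
            cand = search_center + np.array([dn, de])
            pred = database_elevation(*(cand + footprint).T)
            cost = np.mean((altitude - ranges - pred) ** 2)
            if cost < best_cost:
                best, best_cost = cand, cost
    return best

print(position_fix(true_ne + np.array([7.0, -5.0])))     # recovers roughly true_ne
```

The sketch also makes the paper's concluding point concrete: the quality of the fix is bounded by the fidelity and post spacing of the terrain database being matched against.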
Autonomous navigation using lunar beacons
NASA Technical Reports Server (NTRS)
Khatib, A. R.; Ellis, J.; French, J.; Null, G.; Yunck, T.; Wu, S.
1983-01-01
The concept of using lunar beacon signal transmission for on-board navigation for earth satellites and near-earth spacecraft is described. The system would require powerful transmitters on the earth-side of the moon's surface and black box receivers with antennae and microprocessors placed on board spacecraft for autonomous navigation. Spacecraft navigation requires three position and three velocity elements to establish location coordinates. Two beacons could be soft-landed on the lunar surface at the limits of allowable separation and each would transmit a wide-beam signal with cones reaching GEO heights and be strong enough to be received by small antennae in near-earth orbit. The black box processor would perform on-board computation with one-way Doppler/range data and dynamical models. Alternatively, GEO satellites such as the GPS or TDRSS spacecraft can be used with interferometric techniques to provide decimeter-level accuracy for aircraft navigation.
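The abstract describes the one-way Doppler/range processing only at a high level. As a toy illustration under strong simplifying assumptions (two beacons treated as fixed points, a single receiver clock and oscillator bias, placeholder coordinates), the measurement model the on-board processor would evaluate might look like:

```python
import numpy as np

# Toy one-way range and Doppler (range-rate) measurement model for two lunar beacons.
# Beacon coordinates and the spacecraft state are illustrative placeholders.

C = 299_792_458.0                                # speed of light (m/s)
beacons = np.array([[1.737e6, 0.0, 0.0],         # two beacon positions in a Moon-centered
                    [0.0, 1.737e6, 0.0]])        # frame (m), hypothetical placements

def one_way_measurements(r_sc, v_sc, clock_bias, freq_bias):
    """Range (biased by the receiver clock) and range-rate (biased by the oscillator)."""
    meas = []
    for b in beacons:
        los = r_sc - b
        rho = np.linalg.norm(los)
        rho_dot = np.dot(los, v_sc) / rho        # beacons treated as fixed in this sketch
        meas.append((rho + C * clock_bias, rho_dot + C * freq_bias))
    return meas

# Example spacecraft state expressed in the same frame (placeholder numbers)
r_sc = np.array([3.8e8, 1.0e7, 2.0e6])
v_sc = np.array([-0.5e3, 3.0e3, 0.0])
for rho, rho_dot in one_way_measurements(r_sc, v_sc, clock_bias=1e-6, freq_bias=1e-10):
    print(f"range {rho:.1f} m, range-rate {rho_dot:.3f} m/s")
```

In an on-board filter these predicted measurements would be differenced against the received one-way data and combined with the dynamical models the abstract mentions to update the six position and velocity elements.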
NASA Technical Reports Server (NTRS)
Strube, Matthew; Henry, Ross; Skeleton, Eugene; Eepoel, John Van; Gill, Nat; McKenna, Reed
2015-01-01
Since the last Hubble Servicing Mission five years ago, the Satellite Servicing Capabilities Office (SSCO) at the NASA Goddard Space Flight Center (GSFC) has been focusing on maturing the technologies necessary to robotically service orbiting legacy assets-spacecraft not necessarily designed for in-flight service. Raven, SSCO's next orbital experiment to the International Space Station (ISS), is a real-time autonomous non-cooperative relative navigation system that will mature the estimation algorithms required for rendezvous and proximity operations for a satellite-servicing mission. Raven will fly as a hosted payload as part of the Space Test Program's STP-H5 mission, which will be mounted on an external ExPRESS Logistics Carrier (ELC) and will image the many visiting vehicles arriving and departing from the ISS as targets for observation. Raven will host multiple sensors: a visible camera with a variable field of view lens, a long-wave infrared camera, and a short-wave flash lidar. This sensor suite can be pointed via a two-axis gimbal to provide a wide field of regard to track the visiting vehicles as they make their approach. Various real-time vision processing algorithms will produce range, bearing, and six degree of freedom pose measurements that will be processed in a relative navigation filter to produce an optimal relative state estimate. In this overview paper, we will cover top-level requirements, experimental concept of operations, system design, and the status of Raven integration and test activities.
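Raven's filter design is not detailed in this overview; as a generic illustration of how range and bearing observations of a visiting vehicle can be folded into a relative state estimate, here is a minimal planar EKF measurement update with hypothetical tuning values (not Raven's actual filter):

```python
import numpy as np

# Generic EKF measurement update fusing a range/bearing observation of a target vehicle
# into a planar relative position/velocity state. Illustrative only; not Raven's filter.

def ekf_update(x, P, z, R_meas):
    """x = [rel_x, rel_y, rel_vx, rel_vy]; z = [range, bearing]."""
    rx, ry = x[0], x[1]
    rng = np.hypot(rx, ry)
    h = np.array([rng, np.arctan2(ry, rx)])                 # predicted measurement
    H = np.array([[rx / rng,      ry / rng,      0.0, 0.0],
                  [-ry / rng**2,  rx / rng**2,   0.0, 0.0]])
    y = z - h
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi             # wrap the bearing residual
    S = H @ P @ H.T + R_meas
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(4) - K @ H) @ P

# Example: prior says the visiting vehicle is ~100 m ahead; one measurement refines it.
x0 = np.array([100.0, 5.0, -0.2, 0.0])
P0 = np.diag([25.0, 25.0, 0.1, 0.1])
z = np.array([103.0, np.radians(4.0)])                      # measured range (m), bearing (rad)
R_meas = np.diag([1.0, np.radians(0.5) ** 2])
x1, P1 = ekf_update(x0, P0, z, R_meas)
print(x1)
```

In a full system like the one described, equivalent updates would also ingest the six degree of freedom pose measurements produced by the vision processing, with the propagation step carrying the relative state between sensor frames.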
Control of free-flying space robot manipulator systems
NASA Technical Reports Server (NTRS)
Cannon, Robert H., Jr.
1990-01-01
New control techniques for self-contained, autonomous free-flying space robots were developed and tested experimentally. Free-flying robots are envisioned as a key element of any successful long-term presence in space. These robots must be capable of performing the assembly, maintenance, inspection, and repair tasks that currently require human extravehicular activity (EVA). A set of research projects were developed and carried out using lab models of satellite robots and a flexible manipulator. The second generation space robot models use air cushion vehicle (ACV) technology to simulate in 2-D the drag-free, zero-g conditions of space. The current work is divided into 5 major projects: Global Navigation and Control of a Free Floating Robot, Cooperative Manipulation from a Free Flying Robot, Multiple Robot Cooperation, Thrusterless Robotic Locomotion, and Dynamic Payload Manipulation. These projects are examined in detail.
An Algorithm for Autonomous Formation Obstacle Avoidance
NASA Astrophysics Data System (ADS)
Cruz, Yunior I.
The level of human interaction with Unmanned Aerial Systems varies greatly, from remotely piloted aircraft to fully autonomous systems. At the latter end of the spectrum, the challenge lies in designing effective algorithms to dictate the behavior of the autonomous agents. A swarm of autonomous Unmanned Aerial Vehicles requires collision avoidance and formation flight algorithms to negotiate environmental challenges it may encounter during the execution of its mission, which may include obstacles and chokepoints. In this work, a simple algorithm is developed to allow a formation of autonomous vehicles to perform point-to-point navigation while avoiding obstacles and navigating through chokepoints. Emphasis is placed on maintaining formation structures. Rather than breaking formation and individually navigating around the obstacle or through the chokepoint, vehicles are required to assemble into appropriately sized/shaped sub-formations, bifurcate around the obstacle or negotiate the chokepoint, and reassemble into the original formation at the far side of the obstruction. The algorithm receives vehicle and environmental properties as inputs and outputs trajectories for each vehicle from the start to the desired ending location. Simulation results show that the algorithm safely routes all vehicles past the obstruction while adhering to the aforementioned requirements. The formation adapts and successfully negotiates the obstacles and chokepoints in its path while maintaining proper vehicle separation.
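The thesis algorithm itself is not reproduced in this abstract; the toy sketch below only illustrates the split/rejoin idea it describes: vehicles in a line-abreast formation divide into two sub-formations, pass on either side of a circular obstacle, and reassemble at the goal. Geometry, spacing factors, and margins are all arbitrary.

```python
import numpy as np

# Toy illustration of the split/rejoin idea for formation obstacle avoidance.
# Not the thesis algorithm; all geometry and margins are arbitrary.

def plan_waypoints(start_positions, goal_center, obstacle_center, obstacle_radius, margin=2.0):
    offsets = start_positions - start_positions.mean(axis=0)
    left = offsets[:, 1] >= 0                               # split by side of the formation centerline
    waypoints = {}
    for i, off in enumerate(offsets):
        side = 1.0 if left[i] else -1.0
        # Detour point abeam the obstacle, laterally offset beyond its radius plus a margin.
        detour = obstacle_center + np.array([0.0, side * (obstacle_radius + margin)])
        waypoints[i] = [start_positions[i],
                        detour + off * 0.5,                  # compressed sub-formation spacing
                        goal_center + off]                   # original offsets restored at the goal
    return waypoints

starts = np.array([[0.0, -6.0], [0.0, -2.0], [0.0, 2.0], [0.0, 6.0]])
plan = plan_waypoints(starts, goal_center=np.array([60.0, 0.0]),
                      obstacle_center=np.array([30.0, 0.0]), obstacle_radius=5.0)
for i, wps in plan.items():
    print(i, [tuple(np.round(w, 1)) for w in wps])
```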
Flight Testing of Terrain-Relative Navigation and Large-Divert Guidance on a VTVL Rocket
NASA Technical Reports Server (NTRS)
Trawny, Nikolas; Benito, Joel; Tweddle, Brent; Bergh, Charles F.; Khanoyan, Garen; Vaughan, Geoffrey M.; Zheng, Jason X.; Villalpando, Carlos Y.; Cheng, Yang; Scharf, Daniel P.;
2015-01-01
Since 2011, the Autonomous Descent and Ascent Powered-Flight Testbed (ADAPT) has been used to demonstrate advanced descent and landing technologies onboard the Masten Space Systems (MSS) Xombie vertical-takeoff, vertical-landing suborbital rocket. The current instantiation of ADAPT is a stand-alone payload comprising sensing and avionics for terrain-relative navigation and fuel-optimal onboard planning of large divert trajectories, thus providing complete pin-point landing capabilities needed for planetary landers. To this end, ADAPT combines two technologies developed at JPL, the Lander Vision System (LVS), and the Guidance for Fuel Optimal Large Diverts (G-FOLD) software. This paper describes the integration and testing of LVS and G-FOLD in the ADAPT payload, culminating in two successful free flight demonstrations on the Xombie vehicle conducted in December 2014.
Navigation of robotic system using cricket motes
NASA Astrophysics Data System (ADS)
Patil, Yogendra J.; Baine, Nicholas A.; Rattan, Kuldip S.
2011-06-01
This paper presents a novel algorithm for self-mapping of the cricket motes that can be used for indoor navigation of autonomous robotic systems. The cricket system is a wireless sensor network that can provide indoor localization service to its user via acoustic ranging techniques. The behavior of the ultrasonic transducer on the cricket mote is studied, and the regions where satisfactory distance measurements can be obtained are recorded. Placing the motes in these regions results in fine-grained mapping of the cricket motes. Trilateration is used to obtain a rigid coordinate system, but is insufficient if the network is to be used for navigation. A modified SLAM algorithm is applied to overcome the shortcomings of trilateration. Finally, the self-mapped cricket motes can be used for navigation of autonomous robotic systems in an indoor location.
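As background for the trilateration step, the classic linearized least-squares solution from ranges to beacons at known positions looks like the sketch below (beacon layout and ranges are made up; this is not the paper's modified SLAM algorithm):

```python
import numpy as np

# Standard linearized trilateration from ranges to beacons at known positions.
# Beacon layout and range values are made up for illustration.

def trilaterate(beacons, ranges):
    """Solve for a 2-D position from >= 3 range measurements via the linearized system."""
    b0, r0 = beacons[0], ranges[0]
    A, y = [], []
    for b, r in zip(beacons[1:], ranges[1:]):
        # |p - b|^2 = r^2 minus the same equation for b0 removes the quadratic term in p.
        A.append(2.0 * (b - b0))
        y.append(r0**2 - r**2 + np.dot(b, b) - np.dot(b0, b0))
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(y), rcond=None)
    return pos

beacons = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0], [4.0, 3.0]])
truth = np.array([1.5, 1.0])
ranges = np.linalg.norm(beacons - truth, axis=1) + np.random.default_rng(1).normal(0, 0.01, 4)
print(trilaterate(beacons, ranges))   # close to (1.5, 1.0)
```

Trilateration of this kind fixes a rigid coordinate frame but, as the abstract notes, says nothing about the consistency of the mote map over time, which is where the SLAM-style refinement comes in.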
An Analysis of Navigation Algorithms for Smartphones Using J2ME
NASA Astrophysics Data System (ADS)
Santos, André C.; Tarrataca, Luís; Cardoso, João M. P.
Embedded systems are considered one of the areas with the greatest potential for future innovation. Two embedded fields that will almost certainly take a primary role in future innovations are mobile robotics and mobile computing. Mobile robots and smartphones are growing in number and functionality, becoming a presence in our daily life. In this paper, we study the current feasibility of executing navigation algorithms on a smartphone. As a test case, we use a smartphone to control an autonomous mobile robot. We tested three navigation problems: Mapping, Localization and Path Planning. For each of these problems, an algorithm was chosen, developed in J2ME, and tested in the field. Results show the capacity of current mobile Java platforms to execute computationally demanding algorithms and demonstrate the real possibility of using smartphones for autonomous navigation.
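The abstract does not name the specific algorithms used; as an example of the kind of path-planning workload being benchmarked, the sketch below is a compact grid-based A* planner. It is written in Python purely for illustration, whereas the study itself targeted J2ME:

```python
import heapq

# Compact grid A* path planner, representative of the path-planning workload described.
# Python is used here for illustration; the study itself targeted J2ME.

def astar(grid, start, goal):
    """grid: list of strings, '#' = obstacle. Returns a list of (row, col) cells or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
    open_set = [(h(start), 0, start, None)]
    came_from, g_best = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:                                  # already expanded
            continue
        came_from[cur] = parent
        if cur == goal:                                       # reconstruct the path
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] != '#':
                ng = g + 1
                if ng < g_best.get(nxt, float('inf')):
                    g_best[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt, cur))
    return None

world = ["....#....",
         "....#....",
         ".........",
         "....#....",
         "....#...."]
print(astar(world, (0, 0), (4, 8)))
```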
Autonomous optical navigation using nanosatellite-class instruments: a Mars approach case study
NASA Astrophysics Data System (ADS)
Enright, John; Jovanovic, Ilija; Kazemi, Laila; Zhang, Harry; Dzamba, Tom
2018-02-01
This paper examines the effectiveness of small star trackers for orbital estimation. Autonomous optical navigation has been used for some time to provide local estimates of orbital parameters during close approach to celestial bodies. These techniques have been used extensively on spacecraft dating back to the Voyager missions, but often rely on long exposures and large instrument apertures. Using a hyperbolic Mars approach as a reference mission, we present an EKF-based navigation filter suitable for nanosatellite missions. Observations of Mars and its moons allow the estimator to correct initial errors in both position and velocity. Our results show that nanosatellite-class star trackers can produce good quality navigation solutions with low position (<300 m) and velocity (<0.15 m/s) errors as the spacecraft approaches periapse.
Autonomous Locator of Thermals (ALOFT) Autonomous Soaring Algorithm
2015-04-03
[Extraction fragments: the report describes an extended Kalman filter (EKF) designed to estimate airspeed sensor bias, based on the estimator used on the NRL CICADA Mk 3 micro air vehicle [13] (A.D. Kahn and D.J. Edwards, "Navigation, Guidance and Control for the CICADA Expendable...").]
Enabling Autonomous Navigation for Affordable Scooters.
Liu, Kaikai; Mulky, Rajathswaroop
2018-06-05
Despite the technical success of existing assistive technologies, for example, electric wheelchairs and scooters, they are still far from effective enough in helping those in need navigate to their destinations in a hassle-free manner. In this paper, we propose to improve the safety and autonomy of navigation by designing a cutting-edge autonomous scooter, thus allowing people with mobility challenges to ambulate independently and safely in possibly unfamiliar surroundings. We focus on indoor navigation scenarios for the autonomous scooter where the current location, maps, and nearby obstacles are unknown. To achieve semi-LiDAR functionality, we leverage the gyro-based pose data to compensate for the laser motion in real time and create synthetic mapping of simple environments with regular shapes and deep hallways. Laser range finders are suitable for long ranges with limited resolution. Stereo vision, on the other hand, provides 3D structural data of nearby complex objects. To achieve simultaneous fine-grained resolution and long-range coverage in the mapping of cluttered and complex environments, we dynamically fuse the measurements from the stereo vision camera system, the synthetic laser scanner, and the LiDAR. We propose solutions to self-correct errors in data fusion and create a hybrid map to assist the scooter in achieving collision-free navigation in an indoor environment.
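The paper's fusion rule is not given in this abstract; a common way to combine a long-range laser layer with a short-range stereo layer is a shared log-odds occupancy grid in which each sensor contributes evidence with its own weight. The sketch below uses arbitrary weights and cell sizes purely as an illustration:

```python
import numpy as np

# Sketch of fusing two range sensors into one log-odds occupancy grid: a long-range,
# coarse "laser" layer and a short-range, fine "stereo" layer. Weights, ranges and
# cell sizes are arbitrary illustration values, not the paper's fusion rule.

GRID = np.zeros((100, 100))        # log-odds, 0 = unknown
CELL = 0.1                         # cell size in meters (hypothetical)

def update_cell(point_xy, log_odds_hit):
    """Add occupancy evidence at the grid cell containing a sensed obstacle point."""
    i = int(point_xy[1] / CELL) + 50
    j = int(point_xy[0] / CELL) + 50
    if 0 <= i < GRID.shape[0] and 0 <= j < GRID.shape[1]:
        GRID[i, j] = np.clip(GRID[i, j] + log_odds_hit, -4.0, 4.0)

def fuse(laser_points, stereo_points):
    # Laser: trusted at long range but coarse -> moderate evidence per return.
    for p in laser_points:
        update_cell(p, log_odds_hit=0.6)
    # Stereo: dense and fine near the scooter -> stronger evidence, but only when close.
    for p in stereo_points:
        if np.hypot(*p) < 3.0:
            update_cell(p, log_odds_hit=1.2)

fuse(laser_points=[(2.0, 0.0), (4.5, 1.0)], stereo_points=[(2.0, 0.05), (0.8, -0.3)])
occupied = np.argwhere(GRID > 1.0)
print(occupied, GRID[occupied[:, 0], occupied[:, 1]])
```

A hybrid map of this kind lets the planner treat distant structure (hallway walls) and nearby clutter (chairs, feet) consistently even though they come from different sensors.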
Guidance and control for unmanned ground vehicles
NASA Astrophysics Data System (ADS)
Bateman, Peter J.
1994-06-01
Techniques for the guidance, control, and navigation of unmanned ground vehicles are described in terms of the communication bandwidth requirements for driving and control of a vehicle remote from the human operator. Modes of operation are conveniently classified as conventional teleoperation, supervisory control, and fully autonomous control. The fundamental problem of maintaining a robust non-line-of-sight communications link between the human controller and the remote vehicle is discussed, as this provides the impetus for greater autonomy in the control system and the greatest scope for innovation. While supervisory control still requires the human operator to provide the primary navigational intelligence, fully autonomous operation requires that mission navigation is provided solely by on-board machine intelligence. Methods directed at achieving this performance are described using various active and passive sensing of the terrain for route navigation and obstacle detection. Emphasis is given to TV imagery and signal processing techniques for image understanding. Reference is made to the limitations of current microprocessor technology and suitable computer architectures. Some of the more recent control techniques involve the use of neural networks, fuzzy logic, and data fusion, and these are discussed in the context of road following and cross-country navigation. Examples of autonomous vehicle testbeds operated at various laboratories around the world are given.
Global navigation satellite systems performance analysis and augmentation strategies in aviation
NASA Astrophysics Data System (ADS)
Sabatini, Roberto; Moore, Terry; Ramasamy, Subramanian
2017-11-01
In an era of significant air traffic expansion characterized by a rising congestion of the radiofrequency spectrum and a widespread introduction of Unmanned Aircraft Systems (UAS), Global Navigation Satellite Systems (GNSS) are being exposed to a variety of threats including signal interferences, adverse propagation effects and challenging platform-satellite relative dynamics. Thus, there is a need to characterize GNSS signal degradations and assess the effects of interfering sources on the performance of avionics GNSS receivers and augmentation systems used for an increasing number of mission-essential and safety-critical aviation tasks (e.g., experimental flight testing, flight inspection/certification of ground-based radio navigation aids, wide area navigation and precision approach). GNSS signal deteriorations typically occur due to antenna obscuration caused by natural and man-made obstructions present in the environment (e.g., elevated terrain and tall buildings when flying at low altitude) or by the aircraft itself during manoeuvring (e.g., aircraft wings and empennage masking the on-board GNSS antenna), ionospheric scintillation, Doppler shift, multipath, jamming and spurious satellite transmissions. Any one of these phenomena can result in partial to total loss of tracking and possible tracking errors, depending on the severity of the effect and the receiver characteristics. After characterizing these GNSS performance threats, the various augmentation strategies adopted in the Communication, Navigation, Surveillance/Air Traffic Management and Avionics (CNS + A) context are addressed in detail. GNSS augmentation can take many forms but all strategies share the same fundamental principle of providing supplementary information whose objective is improving the performance and/or trustworthiness of the system. Hence it is of paramount importance to consider the synergies offered by different augmentation strategies including Space Based Augmentation System (SBAS), Ground Based Augmentation System (GBAS), Aircraft Based Augmentation System (ABAS) and Receiver Autonomous Integrity Monitoring (RAIM). Furthermore, by employing multi-GNSS constellations and multi-sensor data fusion techniques, improvements in availability and continuity can be obtained. SBAS is designed to improve GNSS system integrity and accuracy for aircraft navigation and landing, while an alternative approach to GNSS augmentation is to transmit integrity and differential correction messages from ground-based augmentation systems (GBAS). In addition to existing space and ground based augmentation systems, GNSS augmentation may take the form of additional information being provided by other on-board avionics systems, such as in ABAS. As these on-board systems normally operate on principles separate from GNSS, they are not subject to the same sources of error or interference. Using suitable data link and data processing technologies on the ground, a certified ABAS capability could be a core element of a future GNSS Space-Ground-Aircraft Augmentation Network (SGAAN). Although current augmentation systems can provide significant improvement of GNSS navigation performance, a properly designed and flight-certified SGAAN could play a key role in trusted autonomous system and cyber-physical system applications such as UAS Sense-and-Avoid (SAA).
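RAIM implementations vary; the simplest receiver-autonomous check is a chi-square consistency test on the least-squares pseudorange residuals. The sketch below uses made-up satellite geometry, error levels and thresholds and is only meant to show the shape of such a test:

```python
import numpy as np
from scipy.stats import chi2   # used only to look up the chi-square detection threshold

# Simplest form of RAIM: a chi-square consistency test on least-squares pseudorange
# residuals. Satellite geometry, noise levels and thresholds are made up for illustration.

def raim_check(G, measurement_errors, sigma=5.0, p_false_alarm=1e-3):
    """G: n x 4 geometry matrix (unit LOS vectors plus clock column); returns (passed, statistic)."""
    n = G.shape[0]
    # Project the errors onto the parity space: the part not explained by position/clock.
    S = np.eye(n) - G @ np.linalg.inv(G.T @ G) @ G.T
    r = S @ measurement_errors
    stat = float(r @ r) / sigma**2
    threshold = chi2.ppf(1 - p_false_alarm, df=n - 4)
    return stat <= threshold, stat

rng = np.random.default_rng(2)
los = rng.normal(size=(6, 3))
los /= np.linalg.norm(los, axis=1, keepdims=True)
G = np.hstack([los, np.ones((6, 1))])
clean = rng.normal(0.0, 5.0, 6)                   # nominal pseudorange errors (m)
faulty = clean.copy()
faulty[0] += 150.0                                # one satellite with a large range fault
print(raim_check(G, clean))    # expected to pass
print(raim_check(G, faulty))   # expected to fail
```

The same parity-space quantity also underlies protection-level computations, which is why RAIM availability depends so strongly on the number of satellites in view and hence benefits from the multi-constellation measures discussed above.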
ERIC Educational Resources Information Center
Doty, Keith L.
1999-01-01
Research on neural networks and hippocampal function demonstrating how mammals construct mental maps and develop navigation strategies is being used to create Intelligent Autonomous Mobile Robots (IAMRs). Such robots are able to recognize landmarks and navigate without "vision." (SK)
ODYSSEUS autonomous walking robot: The leg/arm design
NASA Technical Reports Server (NTRS)
Bourbakis, N. G.; Maas, M.; Tascillo, A.; Vandewinckel, C.
1994-01-01
ODYSSEUS is an autonomous walking robot, which makes use of three wheels and three legs for its movement in the free navigation space. More specifically, it makes use of its autonomous wheels to move around in an environment where the surface is smooth and not uneven. However, in the case that there are small height obstacles, stairs, or small height unevenness in the navigation environment, the robot makes use of both wheels and legs to travel efficiently. In this paper we present the detailed hardware design and the simulated behavior of the extended leg/arm part of the robot, since it plays a very significant role in the robot actions (movements, selection of objects, etc.). In particular, the leg/arm consists of three major parts: The first part is a pipe attached to the robot base with a flexible 3-D joint. This pipe has a rotated bar as an extended part, which terminates in a 3-D flexible joint. The second part of the leg/arm is also a pipe similar to the first. The extended bar of the second part ends at a 2-D joint. The last part of the leg/arm is a clip-hand. It is used for selecting several small weight and size objects, and when it is in a 'closed' mode, it is used as a supporting part of the robot leg. The entire leg/arm part is controlled and synchronized by a microcontroller (68CH11) attached to the robot base.
Development Of Autonomous Systems
NASA Astrophysics Data System (ADS)
Kanade, Takeo
1989-03-01
In the last several years at the Robotics Institute of Carnegie Mellon University, we have been working on two projects for developing autonomous systems: Navlab for Autonomous Land Vehicle and Ambler for Mars Rover. These two systems are for different purposes: the Navlab is a four-wheeled vehicle (van) for road and open terrain navigation, and the Ambler is a six-legged locomotor for Mars exploration. The two projects, however, share many common aspects. Both are large-scale integrated systems for navigation. In addition to the development of individual components (e.g., construction and control of the vehicle, vision and perception, and planning), integration of those component technologies into a system by means of an appropriate architecture is a major issue.
Autonomous formation flying based on GPS — PRISMA flight results
NASA Astrophysics Data System (ADS)
D'Amico, Simone; Ardaens, Jean-Sebastien; De Florio, Sergio
2013-01-01
This paper presents flight results from the early harvest of the Spaceborne Autonomous Formation Flying Experiment (SAFE) conducted in the frame of the Swedish PRISMA technology demonstration mission. SAFE represents one of the first demonstrations in low Earth orbit of an advanced guidance, navigation and control system for dual-spacecraft formations. Innovative techniques based on differential GPS-based navigation and relative orbital elements control are validated and tuned in orbit to fulfill the typical requirements of future distributed scientific instruments for remote sensing.
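The exact relative orbital elements used by SAFE are defined in the PRISMA literature; one commonly used quasi-nonsingular set, formed by differencing deputy and chief elements, can be computed along the following lines (element values are arbitrary near-circular LEO examples, not PRISMA data):

```python
import numpy as np

# One common quasi-nonsingular definition of relative orbital elements (ROE) between a
# "chief" and a "deputy" spacecraft, of the kind used for relative-element-based control.
# Element values below are arbitrary near-circular LEO examples.

def relative_orbital_elements(chief, deputy):
    """Each input: dict with a [m], e, i, raan, argp, u (mean argument of latitude), angles in rad."""
    draan = deputy["raan"] - chief["raan"]
    return {
        "da":      (deputy["a"] - chief["a"]) / chief["a"],
        "dlambda": (deputy["u"] - chief["u"]) + draan * np.cos(chief["i"]),
        "dex":     deputy["e"] * np.cos(deputy["argp"]) - chief["e"] * np.cos(chief["argp"]),
        "dey":     deputy["e"] * np.sin(deputy["argp"]) - chief["e"] * np.sin(chief["argp"]),
        "dix":     deputy["i"] - chief["i"],
        "diy":     draan * np.sin(chief["i"]),
    }

chief = dict(a=7078e3, e=0.001, i=np.radians(98.0), raan=np.radians(30.0),
             argp=np.radians(90.0), u=np.radians(10.0))
deputy = dict(a=7078e3 + 20.0, e=0.0012, i=np.radians(98.001), raan=np.radians(30.002),
              argp=np.radians(91.0), u=np.radians(10.01))
roe = relative_orbital_elements(chief, deputy)
print({k: round(v, 8) for k, v in roe.items()})
```

Controlling such differenced elements, rather than Cartesian relative states, is what the abstract refers to as relative orbital elements control; the differential GPS navigation supplies the estimates of these quantities.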
2013-05-01
[Extraction fragments; keywords: saliency, natural scene statistics. Introduction: research into autonomous navigation for unmanned ground vehicles (UGV) has accelerated in recent years, partly due to the success of programs such as the DARPA Grand Challenge and the prospect of driverless cars. Abstract: there have been several major advances in autonomous navigation for unmanned ground vehicles in controlled urban environments.]
Perception system and functions for autonomous navigation in a natural environment
NASA Technical Reports Server (NTRS)
Chatila, Raja; Devy, Michel; Lacroix, Simon; Herrb, Matthieu
1994-01-01
This paper presents the approach, algorithms, and processes we developed for the perception system of a cross-country autonomous robot. After a presentation of the tele-programming context we favor for intervention robots, we introduce an adaptive navigation approach, well suited to the characteristics of complex natural environments. This approach led us to develop a heterogeneous perception system that manages several different terrain representations. The perception functionalities required during navigation are listed, along with the corresponding representations we consider. The main perception processes we developed are presented. They are integrated within an on-board control architecture we developed. First results of an ambitious experiment currently underway at LAAS are then presented.
High accuracy autonomous navigation using the global positioning system (GPS)
NASA Technical Reports Server (NTRS)
Truong, Son H.; Hart, Roger C.; Shoan, Wendy C.; Wood, Terri; Long, Anne C.; Oza, Dipak H.; Lee, Taesul
1997-01-01
The application of global positioning system (GPS) technology to improve the accuracy and economy of spacecraft navigation is reported. High-accuracy autonomous navigation algorithms are currently being qualified in conjunction with the GPS attitude determination flyer (GADFLY) experiment for the Small Satellite Technology Initiative Lewis spacecraft. Preflight performance assessments indicated that these algorithms are able to provide a real-time total position accuracy of better than 10 m and a velocity accuracy of better than 0.01 m/s, with selective availability at typical levels. It is expected that the position accuracy will be improved to 2 m if corrections are provided by the GPS wide-area augmentation system.
Mobile robots IV; Proceedings of the Meeting, Philadelphia, PA, Nov. 6, 7, 1989
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolfe, W.J.; Chun, W.H.
1990-01-01
The present conference on mobile robot systems discusses high-speed machine perception based on passive sensing, wide-angle optical ranging, three-dimensional path planning for flying/crawling robots, navigation of autonomous mobile intelligence in an unstructured natural environment, mechanical models for the locomotion of a four-articulated-track robot, a rule-based command language for a semiautonomous Mars rover, and a computer model of the structured light vision system for a Mars rover. Also discussed are optical flow and three-dimensional information for navigation, feature-based reasoning trail detection, a symbolic neural-net production system for obstacle avoidance and navigation, intelligent path planning for robot navigation in an unknown environment, behaviors from a hierarchical control system, stereoscopic TV systems, the REACT language for autonomous robots, and a man-amplifying exoskeleton.
2012-08-01
CAPE CANAVERAL, Fla. - At the Autonomous Landing and Hazard Avoidance Technology, or ALHAT, field at the north end of the Shuttle Landing Facility, or SLF, at NASA’s Kennedy Space Center in Florida, members of the media view the hazard field and speak with Morpheus managers. At left, in the blue shirt is Gregory Gaddis, Kennedy Project Morpheus/ALHAT site manager. Testing of the prototype lander had been ongoing at NASA’s Johnson Space Center in Houston in preparation for its first free-flight test at Kennedy Space Center. The SLF will provide the lander with the kind of field necessary for realistic testing, complete with rocks, craters and hazards to avoid. Morpheus utilizes an autonomous landing and hazard avoidance technology, or ALHAT, payload that will allow it to navigate to clear landing sites amidst rocks, craters and other hazards during its descent. Project Morpheus is one of 20 small projects comprising the Advanced Exploration Systems, or AES, program in NASA’s Human Exploration and Operations Mission Directorate. AES projects pioneer new approaches for rapidly developing prototype systems, demonstrating key capabilities and validating operational concepts for future human missions beyond Earth orbit. For more information on Project Morpheus, visit http://morpheuslander.jsc.nasa.gov/. Photo credit: NASA/Kim Shiflett
2004-07-14
KENNEDY SPACE CENTER, FLA. - At Vandenberg Air Force Base in California, Orbital Sciences technicians check the bottom of the DART (Demonstration for Autonomous Rendezvous Technology) flight demonstrator as it is raised off its platform. The spacecraft was developed to prove technologies for locating and maneuvering near an orbiting satellite. Future applications of technologies developed by the DART project will benefit the nation in future space-vehicle systems development requiring in-space assembly, services or other autonomous rendezvous operations. Designed and developed for NASA by Orbital Sciences Corporation in Dulles, Va., the DART spacecraft will be launched on a Pegasus launch vehicle. At about 40,000 feet over the Pacific Ocean, the Pegasus will be released from Orbital’s Stargazer L-1011 aircraft, fire its rocket motors and boost DART into a polar orbit approximately 472 miles by 479 miles. Once in orbit, DART will rendezvous with a target satellite, the Multiple Paths, Beyond-Line-of-Sight Communications satellite, also built by Orbital Sciences. DART will then perform several close proximity operations, such as moving toward and away from the satellite using navigation data provided by onboard sensors. DART is scheduled for launch no earlier than Oct. 18.
2004-07-14
KENNEDY SPACE CENTER, FLA. - At Vandenberg Air Force Base in California, Orbital Sciences technicians check the bottom of the DART (Demonstration for Autonomous Rendezvous Technology) flight demonstrator as it is raised off its platform. The spacecraft was developed to prove technologies for locating and maneuvering near an orbiting satellite. Future applications of technologies developed by the DART project will benefit the nation in future space-vehicle systems development requiring in-space assembly, services or other autonomous rendezvous operations. Designed and developed for NASA by Orbital Sciences Corporation in Dulles, Va., the DART spacecraft will be launched on a Pegasus launch vehicle. At about 40,000 feet over the Pacific Ocean, the Pegasus will be released from Orbital’s Stargazer L-1011 aircraft, fire its rocket motors and boost DART into a polar orbit approximately 472 miles by 479 miles. Once in orbit, DART will rendezvous with a target satellite, the Multiple Paths, Beyond-Line-of-Sight Communications satellite, also built by Orbital Sciences. DART will then perform several close proximity operations, such as moving toward and away from the satellite using navigation data provided by onboard sensors. DART is scheduled for launch no earlier than Oct. 18.
2004-07-14
KENNEDY SPACE CENTER, FLA. - At Vandenberg Air Force Base in California, Orbital Sciences technicians closely observe the movement of the DART (Demonstration for Autonomous Rendezvous Technology) flight demonstrator as it is lowered onto a stand. The spacecraft was developed to prove technologies for locating and maneuvering near an orbiting satellite. Future applications of technologies developed by the DART project will benefit the nation in future space-vehicle systems development requiring in-space assembly, services or other autonomous rendezvous operations. Designed and developed for NASA by Orbital Sciences Corporation in Dulles, Va., the DART spacecraft will be launched on a Pegasus launch vehicle. At about 40,000 feet over the Pacific Ocean, the Pegasus will be released from Orbital’s Stargazer L-1011 aircraft, fire its rocket motors and boost DART into a polar orbit approximately 472 miles by 479 miles. Once in orbit, DART will rendezvous with a target satellite, the Multiple Paths, Beyond-Line-of-Sight Communications satellite, also built by Orbital Sciences. DART will then perform several close proximity operations, such as moving toward and away from the satellite using navigation data provided by onboard sensors. DART is scheduled for launch no earlier than Oct. 18.
Mini AERCam Inspection Robot for Human Space Missions
NASA Technical Reports Server (NTRS)
Fredrickson, Steven E.; Duran, Steve; Mitchell, Jennifer D.
2004-01-01
The Engineering Directorate of NASA Johnson Space Center has developed a nanosatellite-class free-flyer intended for future external inspection and remote viewing of human spacecraft. The Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam) technology demonstration unit has been integrated into the approximate form and function of a flight system. The spherical Mini AERCam free flyer is 7.5 inches in diameter and weighs approximately 10 pounds, yet it incorporates significant additional capabilities compared to the 35 pound, 14 inch AERCam Sprint that flew as a Shuttle flight experiment in 1997. Mini AERCam hosts a full suite of miniaturized avionics, instrumentation, communications, navigation, imaging, power, and propulsion subsystems, including digital video cameras and a high resolution still image camera. The vehicle is designed for either remotely piloted operations or supervised autonomous operations including automatic stationkeeping and point-to-point maneuvering. Mini AERCam is designed to fulfill the unique requirements and constraints associated with using a free flyer to perform external inspections and remote viewing of human spacecraft operations. This paper describes the application of Mini AERCam for stand-alone spacecraft inspection, as well as for roles on teams of humans and robots conducting future space exploration missions.
Global Precipitation Measurement (GPM) Orbit Design and Autonomous Maneuvers
NASA Technical Reports Server (NTRS)
Folta, David; Mendelsohn, Chad
2003-01-01
The NASA Goddard Space Flight Center's Global Precipitation Measurement (GPM) mission will meet the challenge of measuring worldwide precipitation every three hours. The GPM spacecraft, part of a constellation, will be required to maintain a circular orbit in a high-drag environment to accomplish this challenge. Analysis by the Flight Dynamics Analysis Branch has shown that the prime orbit altitude of approximately 400 km is necessary to prevent ground track repeating. Combined with goals to minimize maneuver impacts to science data collection and to enable reasonable long-term orbit predictions, the GPM project has decided to fly an autonomous maneuver system. This system is a derivative of the successful New Millennium Program technology flown onboard the Earth Observing-1 mission. This paper presents the driving science requirements and goals of the mission and shows how they will be met. Analysis of the orbit optimization and the delta-V requirements for several ballistic properties is presented. The architecture of the autonomous maneuvering system to meet the goals and requirements is presented along with simulations using a GPM prototype. Additionally, the use of the GPM autonomous system to mitigate possible collisions and to aid other spacecraft systems during navigation outages is explored.
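As a generic illustration of the kind of autonomous drag-makeup logic such a system might implement (this is not the GPM flight software), the sketch below fires a small tangential burn when orbital decay drops the mean semi-major axis below an assumed control band, sizing the burn with the near-circular-orbit relation dv ≈ (n/2)·da. The target altitude, deadband and numerical values are placeholders.

    # Illustrative drag-makeup trigger (not the GPM flight software): when decay drops
    # the mean semi-major axis below the lower control bound, size a tangential burn
    # that restores the upper bound, using dv ~ (n/2) * da for a near-circular orbit.
    import math

    MU = 3.986004418e14          # Earth's gravitational parameter, m^3/s^2
    A_TARGET = 6778.0e3          # assumed ~400 km mean-altitude control target, m
    DEADBAND = 300.0             # assumed +/- control band about the target, m

    def drag_makeup_dv(a_current_m):
        """Return the delta-v (m/s) to command, or 0.0 if inside the control band."""
        if a_current_m > A_TARGET - DEADBAND:
            return 0.0
        n = math.sqrt(MU / a_current_m ** 3)          # mean motion, rad/s
        da = (A_TARGET + DEADBAND) - a_current_m      # raise to the top of the band
        return 0.5 * n * da

    dv = drag_makeup_dv(6777.6e3)   # e.g. 400 m below target -> ~0.4 m/s prograde burn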
An Architecture for Autonomous Rovers on Future Planetary Missions
NASA Astrophysics Data System (ADS)
Ocon, J.; Avilés, M.; Graziano, M.
2018-04-01
This paper proposes an architecture for autonomous planetary rovers. This architecture combines a set of characteristics required in this type of system: a high level of abstraction, reactive event-based activity execution, and autonomous navigation.
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Team AERO, from the Worcester Polytechnic Institute (WPI) transports their robot to the competition field for the level one of the competition during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Robots that will be competing in the level one competition are seen as they sit in impound prior to the start of competition at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Ahti Heinla, left, and Sulo Kallas, right, from Estonia, prepare team KuuKulgur's robot for the rerun of the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
A sample can be seen on the competition field as the team Survey robot conducts a demonstration of the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Jascha Little of team Survey is seen as he follows the team's robot as it conducts a demonstration of the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The University of California Santa Cruz Rover Team poses for a picture with their robot after attempting the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. The team is one of eighteen competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
The University of California Santa Cruz Rover Team's robot is seen prior to starting its second attempt at the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The Oregon State University Mars Rover Team poses for a picture with their robot following their attempt at the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. The team is one of eighteen competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The University of Waterloo Robotics Team, from Canada, prepares to place their robot on the start platform during the level one challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-10
The University of Waterloo Robotics Team, from Ontario, Canada, prepares their robot for the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Tuesday, June 10, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. The team from the University of Waterloo is one of eighteen teams competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Sam Ortega, NASA program manager for Centennial Challenges, is interviewed by a member of the media before the start of level two competition at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Jim Rothrock, left, and Carrie Johnson, right, of the Wunderkammer Laboratory team pose for a picture with their robot after attempting the level one competition during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-10
The Oregon State University Mars Rover Team follows their robot on the practice field during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Tuesday, June 10, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. The Oregon State University Mars Rover Team is one of eighteen teams competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Jerry Waechter of team Middleman from Dunedin, Florida, speaks about his team's robot, Ro-Bear, as it makes its attempt at the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-10
The Oregon State University Mars Rover Team, from Corvallis, Oregon, follows their robot on the practice field during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Tuesday, June 10, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. The Oregon State University Mars Rover Team is one of eighteen teams competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2012-08-01
CAPE CANAVERAL, Fla. - At the Autonomous Landing and Hazard Avoidance Technology, or ALHAT, field at the north end of the Shuttle Landing Facility, or SLF, at NASA’s Kennedy Space Center in Florida, members of the media view the hazard field and speak with Morpheus managers. At far left, in the white shirt is Jon Olansen, Johnson Space Center Project Morpheus Manager. At left, in the blue shirt is Chirold Epp, JSC project manager for ALHAT. Testing of the prototype lander had been ongoing at NASA’s Johnson Space Center in Houston in preparation for its first free-flight test at Kennedy Space Center. The SLF will provide the lander with the kind of field necessary for realistic testing, complete with rocks, craters and hazards to avoid. Morpheus utilizes an autonomous landing and hazard avoidance technology, or ALHAT, payload that will allow it to navigate to clear landing sites amidst rocks, craters and other hazards during its descent. Project Morpheus is one of 20 small projects comprising the Advanced Exploration Systems, or AES, program in NASA’s Human Exploration and Operations Mission Directorate. AES projects pioneer new approaches for rapidly developing prototype systems, demonstrating key capabilities and validating operational concepts for future human missions beyond Earth orbit. For more information on Project Morpheus, visit http://morpheuslander.jsc.nasa.gov/. Photo credit: NASA/Kim Shiflett
2012-08-01
CAPE CANAVERAL, Fla. - At the Autonomous Landing and Hazard Avoidance Technology, or ALHAT, field at the north end of the Shuttle Landing Facility, or SLF, at NASA’s Kennedy Space Center in Florida, members of the media view the hazard field and speak with Morpheus managers. In the white shirt is Jon Olansen, Johnson Space Center Project Morpheus Manager. Behind Olansen is Gregory Gaddis, Kennedy Project Morpheus/ALHAT site manager. Testing of the prototype lander had been ongoing at NASA’s Johnson Space Center in Houston in preparation for its first free-flight test at Kennedy Space Center. The SLF will provide the lander with the kind of field necessary for realistic testing, complete with rocks, craters and hazards to avoid. Morpheus utilizes an autonomous landing and hazard avoidance technology, or ALHAT, payload that will allow it to navigate to clear landing sites amidst rocks, craters and other hazards during its descent. Project Morpheus is one of 20 small projects comprising the Advanced Exploration Systems, or AES, program in NASA’s Human Exploration and Operations Mission Directorate. AES projects pioneer new approaches for rapidly developing prototype systems, demonstrating key capabilities and validating operational concepts for future human missions beyond Earth orbit. For more information on Project Morpheus, visit http://morpheuslander.jsc.nasa.gov/. Photo credit: NASA/Kim Shiflett
NASA Precision Landing Technologies Completes Initial Flight Tests on Vertical Testbed Rocket
2017-04-19
This 2-minute, 40-second video shows how over the past 5 weeks, NASA and Masten Space Systems teams have prepared for and conducted sub-orbital rocket flight tests of next-generation lander navigation technology through the CoOperative Blending of Autonomous Landing Technologies (COBALT) project. The COBALT payload was integrated onto Masten’s rocket, Xodiac. The Xodiac vehicle used the Global Positioning System (GPS) for navigation during this first campaign, which was intentional to verify and refine COBALT system performance. The joint teams conducted numerous ground verification tests, made modifications in the process, practiced and refined operations’ procedures, conducted three tether tests, and have now flown two successful free flights. This successful, collaborative campaign has provided the COBALT and Xodiac teams with the valuable performance data needed to refine the systems and prepare them for the second flight test campaign this summer when the COBALT system will navigate the Xodiac rocket to a precision landing. The technologies within COBALT provide a spacecraft with knowledge during entry, descent, and landing that enables it to precisely navigate and softly land close to surface locations that have been previously too risky to target with current capabilities. The technologies will enable future exploration destinations on Mars, the moon, Europa, and other planets and moons. The two primary navigation components within COBALT include the Langley Research Center’s Navigation Doppler Lidar, which provides ultra-precise velocity and line-of-sight range measurements, and Jet Propulsion Laboratory’s Lander Vision System (LVS), which provides navigation estimates relative to an existing surface map. The integrated system is being flight tested onboard a Masten suborbital rocket vehicle called Xodiac. The COBALT project is led by the Johnson Space Center, with funding provided through the Game Changing Development, Flight Opportunities program, and Advanced Exploration Systems programs. Based at NASA’s Armstrong Flight Research Center in Edwards, CA, the Flight Opportunities program funds technology development flight tests on commercial suborbital space providers of which Masten is a vendor. The program has previously tested the LVS on the Masten rocket and validated the technology for the Mars 2020 rover.
Communication and Control for Fleets of Autonomous Underwater Vehicles
2006-10-30
... Washington State University (WSU) on fuzzy logic control systems [2-4] and autonomous vehicles [5-10]. The ALWSE-MC program developed at NAVSEA CSS was ... rotating head sonar on crawlers as an additional sensor for navigation. We have previously investigated the use of video cameras on autonomous vehicles for ... simulates autonomous vehicles performing mine reconnaissance/mapping, clearance, and surveillance in a littoral region. Three simulations were performed
Acoustic Communications and Navigation for Mobile Under-Ice Sensors
2017-02-04
Final report (04/02/2017). ... development and fielding of a new acoustic communications and navigation system for use on autonomous platforms (gliders and profiling floats) under the ... contact below the ice. Subject terms: Arctic Ocean, Undersea Workstations & Vehicles, Signal Processing, Navigation, Underwater Acoustics.
Cooper, Andrew James; Redman, Chelsea Anne; Stoneham, David Mark; Gonzalez, Luis Felipe; Etse, Victor Kwesi
2015-08-28
This paper presents an unmanned aircraft system (UAS) that uses a probabilistic model for autonomous front-on environmental sensing or photography of a target. The system is based on low-cost and readily-available sensor systems in dynamic environments and with the general intent of improving the capabilities of dynamic waypoint-based navigation systems for a low-cost UAS. The behavioural dynamics of target movement for the design of a Kalman filter and Markov model-based prediction algorithm are included. Geometrical concepts and the Haversine formula are applied to the maximum likelihood case in order to make a prediction regarding a future state of a target, thus delivering a new waypoint for autonomous navigation. The results of the application to aerial filming with low-cost UAS are presented, achieving the desired goal of maintained front-on perspective without significant constraint to the route or pace of target movement.
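A minimal sketch of the geometric step described above, assuming a spherical-Earth model: the haversine distance between two successive target fixes yields a speed estimate, and a fixed-bearing destination computation extrapolates a lookahead waypoint. The coordinates, bearing and lookahead time below are placeholders, and the paper's Kalman/Markov prediction step is not reproduced here.

    # Minimal sketch (not the authors' implementation): predict a short-horizon
    # waypoint ahead of a moving target using the haversine distance and a
    # constant-speed, constant-bearing extrapolation on a spherical Earth.
    import math

    R_EARTH = 6371000.0  # mean Earth radius, metres

    def haversine(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two lat/lon points (degrees)."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * R_EARTH * math.asin(math.sqrt(a))

    def destination(lat, lon, bearing_deg, dist_m):
        """Point reached from (lat, lon) after dist_m metres on a fixed bearing."""
        d = dist_m / R_EARTH
        b = math.radians(bearing_deg)
        p1, l1 = math.radians(lat), math.radians(lon)
        p2 = math.asin(math.sin(p1) * math.cos(d) + math.cos(p1) * math.sin(d) * math.cos(b))
        l2 = l1 + math.atan2(math.sin(b) * math.sin(d) * math.cos(p1),
                             math.cos(d) - math.sin(p1) * math.sin(p2))
        return math.degrees(p2), math.degrees(l2)

    # Two successive target fixes 1 s apart -> speed estimate -> 5 s lookahead waypoint.
    lat0, lon0, lat1, lon1 = -27.4700, 153.0200, -27.4699, 153.0202  # placeholder fixes
    speed = haversine(lat0, lon0, lat1, lon1) / 1.0
    bearing = 60.0            # assumed target heading, degrees
    lookahead_s = 5.0
    wp_lat, wp_lon = destination(lat1, lon1, bearing, speed * lookahead_s)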
Navigation of autonomous vehicles for oil spill cleaning in dynamic and uncertain environments
NASA Astrophysics Data System (ADS)
Jin, Xin; Ray, Asok
2014-04-01
In the context of oil spill cleaning by autonomous vehicles in dynamic and uncertain environments, this paper presents a multi-resolution algorithm that seamlessly integrates the concepts of local navigation and global navigation based on the sensory information; the objective here is to enable adaptive decision making and online replanning of vehicle paths. The proposed algorithm provides a complete coverage of the search area for clean-up of the oil spills and does not suffer from the problem of having local minima, which is commonly encountered in potential-field-based methods. The efficacy of the algorithm is tested on a high-fidelity player/stage simulator for oil spill cleaning in a harbour, where the underlying oil weathering process is modelled as 2D random-walk particle tracking. A preliminary version of this paper was presented by X. Jin and A. Ray as 'Coverage Control of Autonomous Vehicles for Oil Spill Cleaning in Dynamic and Uncertain Environments', Proceedings of the American Control Conference, Washington, DC, June 2013, pp. 2600-2605.
Cooper, Andrew James; Redman, Chelsea Anne; Stoneham, David Mark; Gonzalez, Luis Felipe; Etse, Victor Kwesi
2015-01-01
This paper presents an unmanned aircraft system (UAS) that uses a probabilistic model for autonomous front-on environmental sensing or photography of a target. The system is based on low-cost and readily-available sensor systems in dynamic environments and with the general intent of improving the capabilities of dynamic waypoint-based navigation systems for a low-cost UAS. The behavioural dynamics of target movement for the design of a Kalman filter and Markov model-based prediction algorithm are included. Geometrical concepts and the Haversine formula are applied to the maximum likelihood case in order to make a prediction regarding a future state of a target, thus delivering a new waypoint for autonomous navigation. The results of the application to aerial filming with low-cost UAS are presented, achieving the desired goal of maintained front-on perspective without significant constraint to the route or pace of target movement. PMID:26343680
Survey of computer vision technology for UAV navigation
NASA Astrophysics Data System (ADS)
Xie, Bo; Fan, Xiang; Li, Sijian
2017-11-01
Navigation based on computer vision technology, which has the characteristics of strong independence and high precision and is not susceptible to electrical interference, has attracted more and more attention in the field of UAV navigation research. Early navigation projects based on computer vision technology were mainly applied to autonomous ground robots. In recent years, visual navigation systems have been widely applied to unmanned aircraft, deep space probes and underwater robots, which further stimulates research on integrated navigation algorithms based on computer vision technology. In China, with the development of many types of UAVs and the start of the third phase of the lunar exploration project, there has been significant progress in the study of visual navigation. The paper reviews the development of computer-vision-based navigation in the field of UAV navigation research and concludes that visual navigation is mainly applied to three aspects. (1) Acquisition of UAV navigation parameters: the parameters, including UAV attitude, position and velocity information, can be obtained from the relationship between the sensor images and the carrier's attitude, the relationship between instant matching images and the reference images, and the relationship between the carrier's velocity and the characteristics of sequential images. (2) Autonomous obstacle avoidance: there are many ways to achieve obstacle avoidance in UAV navigation; the methods based on computer vision technology, including feature matching, template matching and image frames, are mainly introduced. (3) Target tracking and positioning: using the obtained images, UAV position is calculated with the optical flow method, the MeanShift algorithm, the CamShift algorithm, Kalman filtering and particle filter algorithms. The paper also describes three kinds of mainstream visual systems. (1) The high-speed visual system: it uses a parallel structure, with which image detection and processing are carried out at high speed; the system is applied to rapid-response systems. (2) The distributed-network visual system: several discrete image acquisition sensors at different locations transmit image data to a node processor to increase the sampling rate. (3) The visual system combined with observers: image sensors are combined with external observers to make up for the shortcomings of the visual equipment. To some degree, these systems overcome the limitations of the early visual systems, including low frequency, low processing efficiency and strong noise. In the end, the difficulties of navigation based on computer vision technology in practical applications are briefly discussed: (1) due to the huge workload of image operations, the real-time performance of the system is poor; (2) due to large environmental influences, the anti-interference ability of the system is poor; (3) because it works only in particular environments, the system has poor adaptability.
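As an illustration of one of the surveyed techniques (optical-flow-based velocity estimation), the sketch below computes a mean image-plane shift between consecutive frames with OpenCV's Farneback dense optical flow and scales it to a rough ground-velocity estimate. The altitude, focal length, frame rate and file names are assumed values, and the code is not drawn from the survey itself.

    # Illustrative sketch of one surveyed technique: dense optical flow between two
    # consecutive frames as a crude image-plane translation cue. Converting the
    # pixel shift to metric velocity needs altitude and camera intrinsics, which
    # are assumed known here.
    import cv2
    import numpy as np

    def mean_image_shift(prev_gray, curr_gray):
        """Return the mean (dx, dy) pixel displacement via Farneback optical flow."""
        flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        return float(np.mean(flow[..., 0])), float(np.mean(flow[..., 1]))

    # Example use with assumed camera parameters and placeholder frame files:
    # pixel shift * (altitude / focal_length_px) / dt  ->  ground velocity estimate
    altitude_m, focal_px, dt = 50.0, 800.0, 1.0 / 30.0
    prev = cv2.cvtColor(cv2.imread("frame_0.png"), cv2.COLOR_BGR2GRAY)
    curr = cv2.cvtColor(cv2.imread("frame_1.png"), cv2.COLOR_BGR2GRAY)
    dx, dy = mean_image_shift(prev, curr)
    vx = dx * altitude_m / focal_px / dt
    vy = dy * altitude_m / focal_px / dt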
77 FR 27202 - 36(b)(1) Arms Sales Notification
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-09
... includes: Electronic Warfare Systems, Command, Control, Communication, Computers and Intelligence/Communication, Navigational and Identifications (C4I/CNI), Autonomic Logistics Global Support System (ALGS) ...
RAIM availability for supplemental GPS navigation
DOT National Transportation Integrated Search
1992-06-29
This paper examines GPS receiver autonomous integrity monitoring (RAIM) availability for supplemental navigation based on the approximate radial-error protection (ARP) method. This method applies ceiling levels for the ARP figure of merit to screen o...
Li, Hong; Liu, Mingyong; Zhang, Feihu
2017-01-01
This paper presents a multi-objective evolutionary algorithm of bio-inspired geomagnetic navigation for an Autonomous Underwater Vehicle (AUV). Inspired by biological navigation behavior, the solution was proposed without using a priori information, simply by magnetotaxis searching. However, the existence of geomagnetic anomalies has a significant influence on the geomagnetic navigation system, since they often disrupt the distribution of the geomagnetic field. An extreme-value region may easily appear in abnormal regions, which can make the AUV become lost during the navigation phase. This paper proposes an improved bio-inspired algorithm with behavior constraints, for the sake of making the AUV escape from the abnormal region. First, the navigation problem is considered as an optimization problem. Second, an environmental monitoring operator is introduced to determine whether the algorithm has fallen into a geomagnetic anomaly region. Then, a behavior constraint operator is employed to get out of the abnormal region. Finally, the termination condition is triggered. Compared to the state-of-the-art, the proposed approach effectively overcomes the disturbance of geomagnetic anomalies. The simulation results demonstrate the reliability and feasibility of the proposed approach in complex environments.
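The abstract names the ingredients (magnetotaxis search, an environmental-monitoring operator and a behavior-constraint operator) without giving their equations, so the following is only a schematic sketch under stated assumptions: the anomaly test is a simple field-gradient threshold and the escape move is a fixed straight displacement, neither taken from the paper.

    # Schematic sketch only: the anomaly test (gradient threshold) and the escape
    # move (fixed straight displacement) are illustrative assumptions, not the
    # authors' operators.
    import numpy as np

    def magnetotaxis_step(pos, field_fn, target_field, step=5.0, n_dirs=8):
        """Greedy magnetotaxis: try n_dirs headings, keep the one whose measured
        field components move closest to the target field values."""
        best_pos, best_err = pos, np.linalg.norm(field_fn(pos) - target_field)
        for k in range(n_dirs):
            ang = 2 * np.pi * k / n_dirs
            cand = pos + step * np.array([np.cos(ang), np.sin(ang)])
            err = np.linalg.norm(field_fn(cand) - target_field)
            if err < best_err:
                best_pos, best_err = cand, err
        return best_pos

    def in_anomaly(pos, field_fn, grad_threshold=50.0, eps=1.0):
        """Environmental-monitoring operator (assumed form): flag regions where
        the local field gradient magnitude exceeds a threshold."""
        g = (field_fn(pos + np.array([eps, 0.0])) - field_fn(pos - np.array([eps, 0.0]))) / (2 * eps)
        return np.linalg.norm(g) > grad_threshold

    def navigate(pos, field_fn, target_field, max_iter=500):
        """Drive the 2-D position toward the target field values, escaping anomalies."""
        for _ in range(max_iter):
            if in_anomaly(pos, field_fn):
                # Behavior-constraint operator (assumed): take a long straight move
                # away from the anomalous region before resuming magnetotaxis.
                pos = pos + np.array([50.0, 0.0])
                continue
            pos = magnetotaxis_step(pos, field_fn, target_field)
            if np.linalg.norm(field_fn(pos) - target_field) < 1.0:
                break
        return pos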
Li, Hong; Liu, Mingyong; Zhang, Feihu
2017-01-01
This paper presents a multi-objective evolutionary algorithm of bio-inspired geomagnetic navigation for an Autonomous Underwater Vehicle (AUV). Inspired by biological navigation behavior, the solution was proposed without using a priori information, simply by magnetotaxis searching. However, the existence of geomagnetic anomalies has a significant influence on the geomagnetic navigation system, since they often disrupt the distribution of the geomagnetic field. An extreme-value region may easily appear in abnormal regions, which can make the AUV become lost during the navigation phase. This paper proposes an improved bio-inspired algorithm with behavior constraints, for the sake of making the AUV escape from the abnormal region. First, the navigation problem is considered as an optimization problem. Second, an environmental monitoring operator is introduced to determine whether the algorithm has fallen into a geomagnetic anomaly region. Then, a behavior constraint operator is employed to get out of the abnormal region. Finally, the termination condition is triggered. Compared to the state-of-the-art, the proposed approach effectively overcomes the disturbance of geomagnetic anomalies. The simulation results demonstrate the reliability and feasibility of the proposed approach in complex environments. PMID:28747884
Research of autonomous celestial navigation based on new measurement model of stellar refraction
NASA Astrophysics Data System (ADS)
Yu, Cong; Tian, Hong; Zhang, Hui; Xu, Bo
2014-09-01
Autonomous celestial navigation based on stellar refraction has attracted widespread attention for its high accuracy and full autonomy. In this navigation method, the establishment of an accurate stellar refraction measurement model is the foundation and the key issue for achieving high-accuracy navigation. However, the existing measurement models are limited by the uncertainty of atmospheric parameters. Temperature, pressure and other factors which affect stellar refraction within the Earth's stratosphere are investigated, and a model of atmospheric variation with altitude is derived on the basis of standard atmospheric data. Furthermore, a novel measurement model of stellar refraction over a continuous range of altitudes from 20 km to 50 km is produced by modifying the fixed-altitude (25 km) measurement model, and the equation of state with orbit perturbations is established; a simulation is then performed using the improved Extended Kalman Filter. The results show that the new model improves the navigation accuracy and has practical application value.
Navigation Aiding by a Hybrid Laser-Camera Motion Estimator for Micro Aerial Vehicles.
Atman, Jamal; Popp, Manuel; Ruppelt, Jan; Trommer, Gert F
2016-09-16
Micro Air Vehicles (MAVs) equipped with various sensors are able to carry out autonomous flights. However, the self-localization of autonomous agents is mostly dependent on Global Navigation Satellite Systems (GNSS). In order to provide an accurate navigation solution in the absence of GNSS signals, this article presents a hybrid sensor. The hybrid sensor is a deep integration of a monocular camera and a 2D laser rangefinder, so that the motion of the MAV can be estimated. This realization is expected to be more flexible in terms of environments compared to laser-scan-matching approaches. The estimated ego-motion is then integrated into the MAV's navigation system. First, however, the relative pose between the two sensors is obtained with a proposed improved calibration method. For both calibration and ego-motion estimation, 3D-to-2D correspondences are used and the Perspective-3-Point (P3P) problem is solved. Moreover, the covariance estimation of the relative motion is presented. The experiments show very accurate calibration and navigation results.
Project Longshot: A mission to Alpha Centauri
NASA Technical Reports Server (NTRS)
West, Curtis; Chamberlain, Sally; Pagan, Neftali; Stevens, Robert
1989-01-01
Project Longshot, an exercise in the Advanced Design Program for Space, had as its destination Alpha Centauri, the closest star system to our own solar system. Alpha Centauri, a trinary star system, is 4.34 light years from Earth. Although Project Longshot is not feasible with existing technologies, areas that require further investigation in order to make this feat possible are identified. Three areas where advances in technology are needed are propulsion, data processing for autonomous command and control functions, and reliability. Propulsion, possibly by antimatter annihilation; navigation and navigation aids; reliable hardware and instruments; artificial intelligence to eliminate the need for command telemetry; laser communication; and a reliable, compact, and lightweight power system that converts energy efficiently and reliably, present major challenges. Project Longshot promises exciting advances in science and technology and new information concerning the universe.
DEMONSTRATION OF AUTONOMOUS AIR MONITORING THROUGH ROBOTICS
This project included modifying an existing teleoperated robot to include autonomous navigation, large object avoidance, and air monitoring and demonstrating that prototype robot system in indoor and outdoor environments. An existing teleoperated "Surveyor" robot developed by ARD...
NASA Astrophysics Data System (ADS)
Liu, Yahui; Fan, Xiaoqian; Lv, Chen; Wu, Jian; Li, Liang; Ding, Dawei
2018-02-01
Information fusion for INS/GPS navigation systems based on filtering technology is a current research focus. In order to improve the precision of the navigation information, a navigation technique based on an Adaptive Kalman Filter with an attenuation factor is proposed in this paper to suppress noise. The algorithm continuously updates the measurement noise variance and the process noise variance of the system from the estimated and measured values, and this method can suppress white noise. Because a measured value closer to the current time more accurately reflects the characteristics of the noise, an attenuation factor is introduced to increase the weight of the current value, in order to deal with the noise variance caused by environmental disturbances. To validate the effectiveness of the proposed algorithm, a series of road tests was carried out in an urban environment. The GPS and IMU data from the experiments were collected and processed with dSPACE and MATLAB/Simulink. Based on the test results, the accuracy of the proposed algorithm is 20% higher than that of a traditional Adaptive Kalman Filter. The results also show that the precision of the integrated navigation can be improved by reducing the influence of environmental noise.
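The abstract names the mechanism (innovation-driven updates of the noise variances, with an attenuation factor favouring recent data) but not the exact equations. The scalar sketch below uses a standard Sage-Husa-style fading-memory update of the measurement-noise variance as an assumed stand-in; all tuning values are illustrative.

    # Minimal scalar sketch (assumed form, not the paper's exact equations): a Kalman
    # filter whose measurement-noise variance R is re-estimated from the innovation
    # sequence, with an attenuation (forgetting) factor b < 1 giving recent
    # innovations more weight than old ones.
    class AdaptiveKF1D:
        def __init__(self, x0, p0, q, r0, b=0.95):
            self.x, self.p, self.q, self.r, self.b = x0, p0, q, r0, b
            self.k = 0

        def step(self, z):
            # Predict (random-walk state model, H = 1).
            x_pred, p_pred = self.x, self.p + self.q
            innov = z - x_pred
            # Fading-memory weight: recent innovations count more because b < 1.
            self.k += 1
            d = (1.0 - self.b) / (1.0 - self.b ** self.k)
            # Innovation-based estimate of the measurement-noise variance R.
            self.r = max((1.0 - d) * self.r + d * (innov * innov - p_pred), 1e-9)
            # Standard Kalman update.
            gain = p_pred / (p_pred + self.r)
            self.x = x_pred + gain * innov
            self.p = (1.0 - gain) * p_pred
            return self.x

    # Usage on noisy position samples (metres); the 6.0 sample acts as an outlier.
    kf = AdaptiveKF1D(x0=0.0, p0=10.0, q=0.01, r0=4.0, b=0.95)
    for z in [0.3, -0.1, 0.4, 0.2, 6.0, 0.1]:
        est = kf.step(z)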
Autonomous Flight Safety System
NASA Technical Reports Server (NTRS)
Simpson, James
2010-01-01
The Autonomous Flight Safety System (AFSS) is an independent, self-contained subsystem mounted onboard a launch vehicle. AFSS has been developed by, and is owned by, the US Government. It autonomously makes flight termination/destruct decisions using configurable software-based rules implemented on redundant flight processors, using data from redundant GPS/IMU navigation sensors. AFSS implements rules determined by the appropriate Range Safety officials.
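A toy sketch of the rule-evaluation idea described above. The actual AFSS rule format, limits and redundancy-management logic are not given in the abstract, so the geofence and speed limits and the two-out-of-N voting below are purely illustrative.

    # Toy sketch only (not the real AFSS rule format or voting logic): evaluate
    # configurable flight-termination rules against redundant GPS/IMU navigation
    # solutions and require agreement between sources before acting.
    from dataclasses import dataclass

    @dataclass
    class NavSolution:
        lat: float          # degrees
        lon: float          # degrees
        alt_m: float
        speed_mps: float

    def violates_rules(nav, rules):
        """Return True if any configured rule limit is exceeded for this solution."""
        if abs(nav.lat - rules["pad_lat"]) > rules["max_lat_dev_deg"]:
            return True
        if abs(nav.lon - rules["pad_lon"]) > rules["max_lon_dev_deg"]:
            return True
        if nav.alt_m > rules["max_alt_m"] or nav.speed_mps > rules["max_speed_mps"]:
            return True
        return False

    def terminate_flight(nav_solutions, rules, votes_needed=2):
        """Terminate only if at least votes_needed independent sources agree."""
        votes = sum(violates_rules(n, rules) for n in nav_solutions)
        return votes >= votes_needed

    # Illustrative rule set (all values are placeholders).
    rules = {"pad_lat": 28.60, "pad_lon": -80.60, "max_lat_dev_deg": 0.5,
             "max_lon_dev_deg": 0.5, "max_alt_m": 150000.0, "max_speed_mps": 2500.0}
    ok_flight = terminate_flight([NavSolution(28.61, -80.59, 1200.0, 300.0)] * 3, rules)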
NASA Technical Reports Server (NTRS)
Milenkovic, Zoran; DSouza, Christopher; Huish, David; Bendle, John; Kibler, Angela
2012-01-01
The exploration goals of the Orion/MPCV Project will require a mature Rendezvous, Proximity Operations and Docking (RPOD) capability. Ground testing of autonomous docking with a next-generation sensor such as the Vision Navigation Sensor (VNS) is a critical step along the path of ensuring successful execution of autonomous RPOD for Orion. This paper discusses the testing rationale, the test configuration, the test limitations and the results obtained from tests performed at the Lockheed Martin Space Operations Simulation Center (SOSC) to evaluate and mature the Orion RPOD system. We show that these tests have greatly increased confidence in the maturity of the Orion RPOD design, reduced some of the latent risks and, in doing so, validated the design philosophy of the Orion RPOD system. This paper is organized as follows: first, the objectives of the test are given. Descriptions of the SOSC facility and the Orion RPOD system and associated components follow. The details of the test configuration of the components in question are presented prior to discussing preliminary results of the tests. The paper concludes with closing comments.
Development of a GPS/INS/MAG navigation system and waypoint navigator for a VTOL UAV
NASA Astrophysics Data System (ADS)
Meister, Oliver; Mönikes, Ralf; Wendel, Jan; Frietsch, Natalie; Schlaile, Christian; Trommer, Gert F.
2007-04-01
Unmanned aerial vehicles (UAV) can be used for versatile surveillance and reconnaissance missions. If a UAV is capable of flying automatically on a predefined path, the range of possible applications is widened significantly. This paper addresses the development of an integrated GPS/INS/MAG navigation system and a waypoint navigator for a small vertical take-off and landing (VTOL) unmanned four-rotor helicopter with a take-off weight below 1 kg. The core of the navigation system consists of low-cost inertial sensors which are continuously aided with GPS, magnetometer compass, and barometric height information. Because the yaw angle becomes unobservable during hovering flight, integration with a magnetic compass is mandatory. This integration must be robust with respect to errors caused by the terrestrial magnetic field deviation and interference from surrounding electronic devices as well as ferrous metals. The described integration concept with a Kalman filter overcomes the problem that erroneous magnetic measurements would otherwise yield an attitude error in the roll and pitch axes. The algorithm provides long-term stable navigation information even during GPS outages, which is mandatory for the flight control of the UAV. In the second part of the paper the guidance algorithms are discussed in detail. These algorithms allow the UAV to operate in a semi-autonomous position-hold mode as well as a completely autonomous waypoint mode. In the position-hold mode the helicopter maintains its position regardless of wind disturbances, which eases the pilot's job during hold-and-stare missions. The autonomous waypoint navigator enables flight outside the range of vision and beyond the range of the radio link. Flight test results of the implemented modes of operation are shown.
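As one concrete piece of the magnetometer aiding described above, the sketch below shows the standard tilt-compensated magnetic heading computation that can serve as the yaw measurement in such a GPS/INS/MAG filter; the body-axis convention and the zero default declination are assumptions, not details taken from the paper.

    # Sketch of the standard tilt-compensated magnetic heading computation that a
    # GPS/INS/MAG filter can use as a yaw measurement; the declination value is a
    # placeholder for the local value from a geomagnetic model.
    import math

    def magnetic_heading(mx, my, mz, roll, pitch, declination_rad=0.0):
        """Yaw (rad, 0..2*pi) from body-frame magnetometer readings and the
        roll/pitch estimated by the INS. Assumed axes: x forward, y right, z down."""
        # De-rotate the measured field into the local horizontal plane.
        xh = mx * math.cos(pitch) + my * math.sin(roll) * math.sin(pitch) \
             + mz * math.cos(roll) * math.sin(pitch)
        yh = my * math.cos(roll) - mz * math.sin(roll)
        heading = math.atan2(-yh, xh) + declination_rad
        return heading % (2.0 * math.pi)

    # Example: level flight, field pointing roughly north-east in the body frame.
    yaw = magnetic_heading(mx=0.3, my=-0.3, mz=0.4, roll=0.0, pitch=0.0)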
Cheap or Robust? The practical realization of self-driving wheelchair technology.
Burhanpurkar, Maya; Labbe, Mathieu; Guan, Charlie; Michaud, Francois; Kelly, Jonathan
2017-07-01
To date, self-driving experimental wheelchair technologies have been either inexpensive or robust, but not both. Yet, in order to achieve real-world acceptance, both qualities are fundamentally essential. We present a unique approach to achieving inexpensive and robust autonomous and semi-autonomous assistive navigation for existing fielded wheelchairs, of which there are approximately 5 million units in Canada and the United States alone. Our prototype wheelchair platform is capable of localization and mapping, as well as robust obstacle avoidance, using only a commodity RGB-D sensor and wheel odometry. As a specific example of the navigation capabilities, we focus on the single most common navigation problem: the traversal of narrow doorways in arbitrary environments. The software we have developed is generalizable to corridor following, desk docking, and other navigation tasks that are either extremely difficult or impossible for people with upper-body mobility impairments.
Local navigation and fuzzy control realization for autonomous guided vehicle
NASA Astrophysics Data System (ADS)
El-Konyaly, El-Sayed H.; Saraya, Sabry F.; Shehata, Raef S.
1996-10-01
This paper addresses the problem of local navigation for an autonomous guided vehicle (AGV) in a structured environment that contains static and dynamic obstacles. Information about the environment is obtained via a CCD camera. The problem is formulated as a dynamic feedback control problem in which speed and steering decisions are made on the fly while the AGV is moving. A decision element (DE) that uses local information is proposed. The DE guides the vehicle in the environment by producing appropriate navigation decisions. Dynamic models of a three-wheeled vehicle for driving and steering mechanisms are derived. The interaction between them is performed via the local feedback DE. A controller, based on fuzzy logic, is designed to drive the vehicle safely in an intelligent and human-like manner. The effectiveness of the navigation and control strategies in driving the AGV is illustrated and evaluated.
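The decision element's rule base is not specified in the abstract, so the following is a generic Mamdani-style sketch under assumptions: triangular memberships over the lateral path offset and singleton steering consequents combined by centroid defuzzification, with all breakpoints illustrative.

    # Generic fuzzy-steering sketch (illustrative, not the paper's rule base): map the
    # vehicle's lateral offset from the desired path to a steering command using
    # triangular membership functions and centroid defuzzification.
    def tri(x, a, b, c):
        """Triangular membership with feet at a and c and peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fuzzy_steering(offset_m):
        """offset_m > 0 means the AGV is right of the path; returns steering in
        degrees, positive = steer left."""
        offset_m = max(-1.9, min(1.9, offset_m))   # keep input inside the defined universe
        # Antecedent memberships over the offset (metres).
        left   = tri(offset_m, -2.0, -1.0,  0.0)
        centre = tri(offset_m, -1.0,  0.0,  1.0)
        right  = tri(offset_m,  0.0,  1.0,  2.0)
        # Rule consequents (singleton steering angles, degrees).
        rules = [(left, -15.0), (centre, 0.0), (right, 15.0)]
        num = sum(w * angle for w, angle in rules)
        den = sum(w for w, _ in rules)
        return num / den if den > 0 else 0.0

    steer_deg = fuzzy_steering(0.6)   # AGV 0.6 m right of the path -> steer left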
Development and Evaluation of Positioning Systems for Autonomous Vehicle Navigation
2001-12-01
... generation of autonomous vehicles to utilize NTV technology is built on a commercially-available vehicle built by ASV. The All-Purpose Remote Transport ... larger scale, AFRL and CIMAR are involved in the development of a standard approach to the design and specification of autonomous vehicles being ... Shin, D.H., Singh, S., and Lee, J.J., "Explicit Path Tracking by Autonomous Vehicles," Robotica, 10, (1992), 69-87.
Visual Requirements for Human Drivers and Autonomous Vehicles
DOT National Transportation Integrated Search
2016-03-01
Identification of published literature between 1995 and 2013, focusing on determining the quantity and quality of visual information needed under both driving modes (i.e., human and autonomous) to navigate the road safely, especially as it pertains t...
Doppler lidar sensor for precision navigation in GPS-deprived environment
NASA Astrophysics Data System (ADS)
Amzajerdian, F.; Pierrottet, D. F.; Hines, G. D.; Petway, L. B.; Barnes, B. W.
2013-05-01
Landing mission concepts that are being developed for exploration of solar system bodies are increasingly ambitious in their implementations and objectives. Most of these missions require accurate position and velocity data during their descent phase in order to ensure safe, soft landing at the pre-designated sites. Data from the vehicle's Inertial Measurement Unit will not be sufficient due to significant drift error after extended travel time in space. Therefore, an onboard sensor is required to provide the necessary data for landing in the GPS-deprived environment of space. For this reason, NASA Langley Research Center has been developing an advanced Doppler lidar sensor capable of providing accurate and reliable data suitable for operation in the highly constrained environment of space. The Doppler lidar transmits three laser beams in different directions toward the ground. The signal from each beam provides the platform velocity and range to the ground along the laser line-of-sight (LOS). The six LOS measurements are then combined in order to determine the three components of the vehicle velocity vector, and to accurately measure altitude and attitude angles relative to the local ground. These measurements are used by an autonomous Guidance, Navigation, and Control system to accurately navigate the vehicle from a few kilometers above the ground to the designated location and to execute a gentle touchdown. A prototype version of our lidar sensor has been completed for a closed-loop demonstration onboard a rocket-powered terrestrial free-flyer vehicle.
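A minimal sketch of the velocity reconstruction described above: each beam contributes one line-of-sight velocity component, and with three known, non-coplanar beam directions the body-frame velocity vector follows from a 3x3 linear solve. The cant and azimuth angles below are assumed for illustration, not the sensor's actual geometry.

    # Sketch (assumed beam geometry): each lidar beam measures the velocity component
    # along its line of sight, v_los_i = u_i . v. With three non-coplanar beam unit
    # vectors, the 3x3 linear system recovers the full velocity vector; ranges along
    # the same beams similarly support altitude and attitude relative to the ground.
    import numpy as np

    # Assumed beam directions in the vehicle body frame (unit vectors, z down):
    # three beams canted 22.5 degrees off nadir and spaced 120 degrees in azimuth.
    cant = np.radians(22.5)
    az = np.radians([0.0, 120.0, 240.0])
    U = np.column_stack([np.sin(cant) * np.cos(az),
                         np.sin(cant) * np.sin(az),
                         np.cos(cant) * np.ones(3)])   # row i = unit vector of beam i

    def velocity_from_los(v_los):
        """Solve U v = v_los for the 3-component body-frame velocity (m/s)."""
        return np.linalg.solve(U, np.asarray(v_los, dtype=float))

    # Example: line-of-sight speeds measured on the three beams (m/s).
    v_body = velocity_from_los([-1.8, -2.3, -2.1])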
Doppler Lidar Sensor for Precision Navigation in GPS-Deprived Environment
NASA Technical Reports Server (NTRS)
Amzajerdian, F.; Pierrottet, D. F.; Hines, G. D.; Petway, L. B.; Barnes, B. W.
2013-01-01
Landing mission concepts that are being developed for exploration of solar system bodies are increasingly ambitious in their implementations and objectives. Most of these missions require accurate position and velocity data during their descent phase in order to ensure safe, soft landing at the pre-designated sites. Data from the vehicle's Inertial Measurement Unit will not be sufficient due to significant drift error after extended travel time in space. Therefore, an onboard sensor is required to provide the necessary data for landing in the GPS-deprived environment of space. For this reason, NASA Langley Research Center has been developing an advanced Doppler lidar sensor capable of providing accurate and reliable data suitable for operation in the highly constrained environment of space. The Doppler lidar transmits three laser beams in different directions toward the ground. The signal from each beam provides the platform velocity and range to the ground along the laser line-of-sight (LOS). The six LOS measurements are then combined in order to determine the three components of the vehicle velocity vector, and to accurately measure altitude and attitude angles relative to the local ground. These measurements are used by an autonomous Guidance, Navigation, and Control system to accurately navigate the vehicle from a few kilometers above the ground to the designated location and to execute a gentle touchdown. A prototype version of our lidar sensor has been completed for a closed-loop demonstration onboard a rocket-powered terrestrial free-flyer vehicle.
A Multi-Function Guidance, Navigation and Control System for Future Earth and Space Missions
NASA Technical Reports Server (NTRS)
Gambino, Joel; Dennehy, Neil; Bauer, Frank H. (Technical Monitor)
2002-01-01
Over the past several years the Guidance, Navigation and Control Center (GNCC) at NASA's Goddard Space Flight Center (GSFC) has actively engaged in the development of advanced GN&C technology to enable future Earth and Space science missions. The Multi-Function GN&C System (MFGS) design presented in this paper represents the successful coalescence of several discrete GNCC hardware and software technology innovations into a single, highly integrated, compact, low-power and low-cost unit that simultaneously provides autonomous real-time on-board attitude determination and navigation solutions with accuracies that satisfy many future GSFC mission requirements. The MFGS is intended to operate as a single self-contained multifunction unit combining the functions now typically performed by a number of hardware units on a spacecraft. However, recognizing the need to satisfy a variety of future mission requirements, design provisions have been included to permit the unit to interface with a number of external remotely mounted sensors and actuators such as magnetometers, sun sensors, star cameras, reaction wheels and thrusters. The result is a highly versatile MFGS that can be configured in multiple ways to suit a range of mission-specific GN&C requirements. It is envisioned that the MFGS will perform a mission-enabling role by filling the microsat GN&C technology gap. In addition, GSFC believes that the MFGS could be employed to significantly reduce volume, power and mass requirements on conventional satellites.
GPS navigation algorithms for Autonomous Airborne Refueling of Unmanned Air Vehicles
NASA Astrophysics Data System (ADS)
Khanafseh, Samer Mahmoud
Unmanned Air Vehicles (UAVs) have recently generated great interest because of their potential to perform hazardous missions without risking loss of life. If autonomous airborne refueling is possible for UAVs, mission range and endurance will be greatly enhanced. However, concerns about UAV-tanker proximity, dynamic mobility and safety demand that the relative navigation system meets stringent requirements on accuracy, integrity, and continuity. In response, this research focuses on developing high-performance GPS-based navigation architectures for Autonomous Airborne Refueling (AAR) of UAVs. The AAR mission is unique because of the potentially severe sky blockage introduced by the tanker. To address this issue, a high-fidelity dynamic sky blockage model was developed and experimentally validated. In addition, robust carrier phase differential GPS navigation algorithms were derived, including a new method for high-integrity reacquisition of carrier cycle ambiguities for recently-blocked satellites. In order to evaluate navigation performance, world-wide global availability and sensitivity covariance analyses were conducted. The new navigation algorithms were shown to be sufficient for turn-free scenarios, but improvement in performance was necessary to meet the difficult requirements for a general refueling mission with banked turns. Therefore, several innovative methods were pursued to enhance navigation performance. First, a new theoretical approach was developed to quantify the position-domain integrity risk in cycle ambiguity resolution problems. A mechanism to implement this method with partially-fixed cycle ambiguity vectors was derived, and it was used to define tight upper bounds on AAR navigation integrity risk. A second method, where a new algorithm for optimal fusion of measurements from multiple antennas was developed, was used to improve satellite coverage in poor visibility environments such as in AAR. Finally, methods for using data-link extracted measurements as an additional inter-vehicle ranging measurement were also introduced. The algorithms and methods developed in this work are generally applicable to realize high-performance GPS-based navigation in partially obstructed environments. Navigation performance for AAR was quantified through covariance analysis, and it was shown that the stringent navigation requirements for this application are achievable. Finally, a real-time implementation of the algorithms was developed and successfully validated in autopiloted flight tests.
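As context for the carrier-phase relative navigation discussed above, the sketch below shows the baseline solve that follows cycle-ambiguity resolution: once the double-differenced integer ambiguities are known, the tanker-to-UAV baseline falls out of a small least-squares problem. The satellite geometry, sign convention and L1 wavelength use are illustrative assumptions, not the thesis algorithms.

```python
# Hedged sketch of a carrier-phase differential GPS baseline solution given
# resolved (fixed) double-difference cycle ambiguities.  Geometry is simulated.
import numpy as np

LAMBDA_L1 = 0.1903  # m, GPS L1 carrier wavelength

def baseline_from_fixed_dd(e_sats, dd_phase_cycles, dd_int_ambiguities, ref_idx=0):
    """e_sats: (m,3) unit line-of-sight vectors to m common satellites.
    dd_phase_cycles: (m-1,) double-differenced carrier phases w.r.t. satellite ref_idx.
    Returns the relative position (baseline) in metres."""
    e_sats = np.asarray(e_sats, float)
    others = [i for i in range(len(e_sats)) if i != ref_idx]
    G = e_sats[others] - e_sats[ref_idx]                      # geometry matrix
    rho = LAMBDA_L1 * (np.asarray(dd_phase_cycles, float)
                       - np.asarray(dd_int_ambiguities, float))
    b, *_ = np.linalg.lstsq(G, rho, rcond=None)                # least-squares baseline
    return b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    e = rng.normal(size=(6, 3)); e /= np.linalg.norm(e, axis=1, keepdims=True)
    b_true = np.array([12.0, -3.0, 1.5])                       # m, UAV relative to tanker
    N_true = rng.integers(-50, 50, size=5)
    dd = (e[1:] - e[0]) @ b_true / LAMBDA_L1 + N_true          # simulated measurements
    print(baseline_from_fixed_dd(e, dd, N_true))               # recovers b_true
```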
A reactive system for open terrain navigation: Performance and limitations
NASA Technical Reports Server (NTRS)
Langer, D.; Rosenblatt, J.; Hebert, M.
1994-01-01
We describe a core system for autonomous navigation in outdoor natural terrain. The system consists of three parts: a perception module which processes range images to identify untraversable regions of the terrain, a local map management module which maintains a representation of the environment in the vicinity of the vehicle, and a planning module which issues commands to the vehicle controller. Our approach is based on the concept of 'early traversability evaluation' and on the use of reactive planning for generating commands to drive the vehicle. We argue that our approach leads to a robust and efficient navigation system. We illustrate our approach by an experiment in which a vehicle travelled autonomously for one kilometer through unmapped cross-country terrain.
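The sketch below illustrates the flavor of early traversability evaluation followed by a simple reactive arc vote; the cell size, step threshold and candidate arc set are assumptions for illustration and are not the authors' parameters.

```python
# Illustrative sketch: mark untraversable cells from an elevation grid, then pick
# the most open steering arc reactively.  Parameters are assumed values.
import numpy as np

CELL = 0.4          # m per grid cell (assumed)
MAX_STEP = 0.3      # m of height change tolerated between neighbouring cells

def traversability(elev):
    """True where the local height change stays below MAX_STEP."""
    gy, gx = np.gradient(elev, CELL)
    step = np.hypot(gx, gy) * CELL
    return step <= MAX_STEP

def pick_arc(traversable, n_arcs=7, lookahead=20):
    """Vote over candidate steering headings and return the most open one (rad)."""
    h, w = traversable.shape
    x0 = w // 2                                       # vehicle at bottom-centre
    headings = np.linspace(-0.6, 0.6, n_arcs)
    scores = []
    for th in headings:
        free = 0
        for r in range(1, lookahead):
            col = int(round(x0 + np.tan(th) * r))
            row = h - 1 - r
            if 0 <= col < w and 0 <= row < h and traversable[row, col]:
                free += 1
            else:
                break
        scores.append(free)
    return headings[int(np.argmax(scores))]
```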
NASA Astrophysics Data System (ADS)
Cheng, Xiang-Qin; Qu, Jing-Yuan; Yan, Zhe-Ping; Bian, Xin-Qian
2010-03-01
In order to improve the security and reliability for autonomous underwater vehicle (AUV) navigation, an H∞ robust fault-tolerant controller was designed after analyzing variations in state-feedback gain. Operating conditions and the design method were then analyzed so that the control problem could be expressed as a mathematical optimization problem. This permitted the use of linear matrix inequalities (LMI) to solve for the H∞ controller for the system. When considering different actuator failures, these conditions were then also mathematically expressed, allowing the H∞ robust controller to solve for these events and thus be fault-tolerant. Finally, simulation results showed that the H∞ robust fault-tolerant controller could provide precise AUV navigation control with strong robustness.
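Since the abstract frames the controller synthesis as an LMI optimization, the following is a minimal sketch of continuous-time H∞ state-feedback synthesis posed as an LMI and solved with cvxpy; the toy plant matrices are placeholders and not an AUV model, and the fault-tolerant actuator cases discussed in the paper are not included.

```python
# Hedged sketch: H-infinity state-feedback synthesis via the standard LMI,
# minimizing gamma with u = K x, K = W Y^{-1}.  Plant matrices are placeholders.
import cvxpy as cp
import numpy as np

A   = np.array([[0.0, 1.0], [-1.0, -0.5]])   # x' = A x + B1 w + B2 u
B1  = np.array([[0.0], [1.0]])
B2  = np.array([[0.0], [1.0]])
C1  = np.array([[1.0, 0.0], [0.0, 0.0]])     # z = C1 x + D12 u  (D11 = 0)
D12 = np.array([[0.0], [1.0]])

n, m = B2.shape
p, q = B1.shape[1], C1.shape[0]

Y = cp.Variable((n, n), symmetric=True)
W = cp.Variable((m, n))
gamma = cp.Variable(nonneg=True)

lmi = cp.bmat([
    [A @ Y + Y @ A.T + B2 @ W + W.T @ B2.T, B1,                 (C1 @ Y + D12 @ W).T],
    [B1.T,                                  -gamma * np.eye(p), np.zeros((p, q))],
    [C1 @ Y + D12 @ W,                      np.zeros((q, p)),   -gamma * np.eye(q)],
])
eps = 1e-6
prob = cp.Problem(cp.Minimize(gamma),
                  [Y >> eps * np.eye(n), lmi << -eps * np.eye(n + p + q)])
prob.solve(solver=cp.SCS)

K = W.value @ np.linalg.inv(Y.value)         # state-feedback gain
print("gamma ~", gamma.value, "\nK =", K)
```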
Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) Project Status as of May 2010
NASA Technical Reports Server (NTRS)
Striepe, Scott A.; Epp, Chirold D.; Robertson, Edward A.
2010-01-01
This paper includes the current status of NASA's Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) Project. The ALHAT team has completed several flight tests and two major design analysis cycles. These tests and analyses examine terrain relative navigation sensors, hazard detection and avoidance sensors and algorithms, hazard relative navigation algorithms, and the guidance and navigation system using these ALHAT functions. The next flight test is scheduled for July 2010. The paper contains results from completed flight tests and analysis cycles. ALHAT system status and upcoming tests and analyses are also addressed, and the current ALHAT plans as of May 2010 are discussed. Application of the ALHAT system to landing on bodies other than the Moon is included.
Ramos, A G; García-Garrido, V J; Mancho, A M; Wiggins, S; Coca, J; Glenn, S; Schofield, O; Kohut, J; Aragon, D; Kerfoot, J; Haskins, T; Miles, T; Haldeman, C; Strandskov, N; Allsup, B; Jones, C; Shapiro, J
2018-03-15
Transoceanic Gliders are Autonomous Underwater Vehicles (AUVs) for which there is a developing and expanding range of applications in open-seas research, technology and underwater clean transport. Mature glider autonomy, operating depth (0-1000 meters) and low energy consumption without a CO2 footprint enable evolutionary access across ocean basins. Pursuant to the first successful transatlantic glider crossing in December 2009, the Challenger Mission has opened the door to long-term, long-distance routine transoceanic AUV missions. These vehicles, which glide through the water column between 0 and 1000 meters depth, are highly sensitive to the ocean current field. Consequently, it is essential to exploit the complex space-time structure of the ocean current field in order to plan a path that optimizes scientific payoff and navigation efficiency. This letter demonstrates the capability of dynamical systems theory for achieving this goal by realizing the real-time navigation strategy for the transoceanic AUV named Silbo, a Slocum deep glider (0-1000 m) that crossed the North Atlantic from April 2016 to March 2017. Path planning in real time based on this approach has facilitated an impressive speed-up of the AUV to unprecedented velocities, resulting in major battery savings on the mission and offering the potential for routine transoceanic long-duration missions.
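A toy illustration of why the current field dominates glider routing (not the dynamical-systems planner used on Silbo): given a through-water speed and a local current vector, the heading needed to hold a desired ground track and the resulting speed made good can be computed directly. The speeds used are assumed, representative values only.

```python
# Hedged sketch: crab-angle heading selection for a slow vehicle in a current.
import numpy as np

def heading_for_track(track_dir, current, water_speed):
    """track_dir: desired unit ground-track direction (2-vector), current in m/s.
    Returns (heading unit vector, speed made good) or None if the cross-track
    current exceeds the vehicle's through-water speed."""
    d = np.asarray(track_dir, float); d = d / np.linalg.norm(d)
    perp = np.array([-d[1], d[0]])
    c_along, c_cross = float(np.dot(current, d)), float(np.dot(current, perp))
    if abs(c_cross) > water_speed:
        return None                                   # cannot cancel the cross current
    sin_crab = -c_cross / water_speed
    cos_crab = np.sqrt(1.0 - sin_crab ** 2)
    heading = cos_crab * d + sin_crab * perp
    return heading, water_speed * cos_crab + c_along  # speed made good along track

# Example: a 0.35 m/s glider in a 0.25 m/s cross current.
print(heading_for_track([1.0, 0.0], np.array([0.0, 0.25]), 0.35))
```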
Experiments in autonomous robotics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamel, W.R.
1987-01-01
The Center for Engineering Systems Advanced Research (CESAR) is performing basic research in autonomous robotics for energy-related applications in hazardous environments. The CESAR research agenda includes a strong experimental component to assure practical evaluation of new concepts and theories. An evolutionary sequence of mobile research robots has been planned to support research in robot navigation, world sensing, and object manipulation. A number of experiments have been performed in studying robot navigation and path planning with planar sonar sensing. Future experiments will address more complex tasks involving three-dimensional sensing, dexterous manipulation, and human-scale operations.
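For the planar sonar sensing mentioned above, a common way to build the robot's world model is a log-odds occupancy grid updated along each beam. The sketch below is a generic illustration with assumed resolution and sensor-model values, not the CESAR implementation.

```python
# Hedged sketch: log-odds occupancy grid updated from planar sonar returns.
import numpy as np

RES = 0.1                      # m per grid cell (assumed)
L_OCC, L_FREE = 0.85, -0.4     # log-odds increments for an echo / free space

class SonarGrid:
    def __init__(self, size=200):
        self.size = size
        self.logodds = np.zeros((size, size))
        self.origin = size // 2

    def _bump(self, x, y, delta):
        r = self.origin + int(round(y / RES))
        c = self.origin + int(round(x / RES))
        if 0 <= r < self.size and 0 <= c < self.size:
            self.logodds[r, c] += delta

    def update_beam(self, pose_xy, bearing, rng, max_range=5.0):
        """Integrate one sonar return: free cells along the beam, occupied at the echo."""
        hit = rng < max_range
        for k in range(int(min(rng, max_range) / RES)):
            self._bump(pose_xy[0] + k * RES * np.cos(bearing),
                       pose_xy[1] + k * RES * np.sin(bearing), L_FREE)
        if hit:
            self._bump(pose_xy[0] + rng * np.cos(bearing),
                       pose_xy[1] + rng * np.sin(bearing), L_OCC)

    def probability(self):
        return 1.0 / (1.0 + np.exp(-self.logodds))
```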
Development and demonstration of autonomous behaviors for urban environment exploration
NASA Astrophysics Data System (ADS)
Ahuja, Gaurav; Fellars, Donald; Kogut, Gregory; Pacis Rius, Estrellina; Schoolov, Misha; Xydes, Alexander
2012-06-01
Under the Urban Environment Exploration project, the Space and Naval Warfare Systems Center Pacific (SSC-PAC) is maturing technologies and sensor payloads that enable man-portable robots to operate autonomously within the challenging conditions of urban environments. Previously, SSC-PAC has demonstrated robotic capabilities to navigate and localize without GPS and map the ground floors of various building sizes. SSC-PAC has since extended those capabilities to localize and map multiple multi-story buildings within a specified area. To facilitate these capabilities, SSC-PAC developed technologies that enable the robot to detect stairs/stairwells, maintain localization across multiple environments (e.g. in a 3D world, on stairs, with/without GPS), visualize data in 3D, plan paths between any two points within the specified area, and avoid 3D obstacles. These technologies have been developed as independent behaviors under the Autonomous Capabilities Suite, a behavior architecture, and demonstrated at a MOUT site at Camp Pendleton. This paper describes the perceptions and behaviors used to produce these capabilities, as well as an example demonstration scenario.
Crew/Robot Coordinated Planetary EVA Operations at a Lunar Base Analog Site
NASA Technical Reports Server (NTRS)
Diftler, M. A.; Ambrose, R. O.; Bluethmann, W. J.; Delgado, F. J.; Herrera, E.; Kosmo, J. J.; Janoiko, B. A.; Wilcox, B. H.; Townsend, J. A.; Matthews, J. B.;
2007-01-01
Under the direction of NASA's Exploration Technology Development Program, robots and space suited subjects from several NASA centers recently completed a very successful demonstration of coordinated activities indicative of base camp operations on the lunar surface. For these activities, NASA chose a site near Meteor Crater, Arizona close to where Apollo Astronauts previously trained. The main scenario demonstrated crew returning from a planetary EVA (extra-vehicular activity) to a temporary base camp and entering a pressurized rover compartment while robots performed tasks in preparation for the next EVA. Scenario tasks included: rover operations under direct human control and autonomous modes, crew ingress and egress activities, autonomous robotic payload removal and stowage operations under both local control and remote control from Houston, and autonomous robotic navigation and inspection. In addition to the main scenario, participants had an opportunity to explore additional robotic operations: hill climbing, maneuvering heavy loads, gathering geological samples, drilling, and tether operations. In this analog environment, the suited subjects and robots experienced high levels of dust, rough terrain, and harsh lighting.
Advancing Underwater Acoustic Communication for Autonomous Distributed Networks via Sparse Channel Sensing, Coding, and...
2014-09-30
This effort advances underwater acoustic communication technologies for autonomous distributed underwater networks through innovative signal processing and coding, including OFDM modulated dynamic coded cooperation in underwater acoustic channels, along with work on localization, networking, and testbeds.
Bioinspired engineering of exploration systems for NASA and DoD
NASA Technical Reports Server (NTRS)
Thakoor, Sarita; Chahl, Javaan; Srinivasan, M. V.; Young, L.; Werblin, Frank; Hine, Butler; Zornetzer, Steven
2002-01-01
A new approach called bioinspired engineering of exploration systems (BEES) and its value for solving pressing NASA and DoD needs are described. Insects (for example honeybees and dragonflies) cope remarkably well with their world, despite possessing a brain containing less than 0.01% as many neurons as the human brain. Although most insects have immobile eyes with fixed focus optics and lack stereo vision, they use a number of ingenious, computationally simple strategies for perceiving their world in three dimensions and navigating successfully within it. We are distilling selected insect-inspired strategies to obtain novel solutions for navigation, hazard avoidance, altitude hold, stable flight, terrain following, and gentle deployment of payload. Such functionality provides potential solutions for future autonomous robotic space and planetary explorers. A BEES approach to developing lightweight low-power autonomous flight systems should be useful for flight control of such biomorphic flyers for both NASA and DoD needs. Recent biological studies of mammalian retinas confirm that representations of multiple features of the visual world are systematically parsed and processed in parallel. Features are mapped to a stack of cellular strata within the retina. Each of these representations can be efficiently modeled in semiconductor cellular nonlinear network (CNN) chips. We describe recent breakthroughs in exploring the feasibility of the unique blending of insect strategies of navigation with mammalian visual search, pattern recognition, and image understanding into hybrid biomorphic flyers for future planetary and terrestrial applications. We describe a few future mission scenarios for Mars exploration, uniquely enabled by these newly developed biomorphic flyers.
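Two of the computationally simple insect strategies alluded to above can be stated as one-line control laws: corridor centering by balancing left and right translational optic flow, and terrain following by regulating ventral optic flow. The gains and set-points below are illustrative assumptions, not values from the BEES work.

```python
# Hedged sketch of insect-inspired optic-flow control laws.
def lateral_command(flow_left, flow_right, k_center=0.8):
    """Steer away from the side with the larger optic flow (yaw-rate command)."""
    return -k_center * (flow_left - flow_right) / max(flow_left + flow_right, 1e-6)

def climb_command(ventral_flow, flow_setpoint=1.2, k_alt=0.5):
    """Regulate ventral optic flow (roughly forward speed / height): if the ground
    'flows' too fast the vehicle is too low, so climb; too slow, descend."""
    return k_alt * (ventral_flow - flow_setpoint)
```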
Autonomous docking ground demonstration
NASA Technical Reports Server (NTRS)
Lamkin, Steve L.; Le, Thomas Quan; Othon, L. T.; Prather, Joseph L.; Eick, Richard E.; Baxter, Jim M.; Boyd, M. G.; Clark, Fred D.; Spehar, Peter T.; Teters, Rebecca T.
1991-01-01
The Autonomous Docking Ground Demonstration is an evaluation of the laser sensor system to support the docking phase (12 ft to contact) when operated in conjunction with the guidance, navigation, and control (GN&C) software. The docking mechanism being used was developed for the Apollo/Soyuz Test Program. This demonstration will be conducted using the 6-DOF Dynamic Test System (DTS). The DTS simulates the Space Station Freedom as the stationary or target vehicle and the Orbiter as the active or chase vehicle. For this demonstration, the laser sensor will be mounted on the target vehicle and the retroreflectors will be on the chase vehicle. This arrangement was chosen to prevent potential damage to the laser. The laser sensor system, GN&C, and 6-DOF DTS will be operated closed-loop. Initial conditions to simulate vehicle misalignments, translational and rotational, will be introduced within the constraints of the systems involved.
Autonomous learning based on cost assumptions: theoretical studies and experiments in robot control.
Ribeiro, C H; Hemerly, E M
2000-02-01
Autonomous learning techniques are based on experience acquisition. In most realistic applications, experience is time-consuming: it implies sensor reading, actuator control and algorithmic update, constrained by the learning system dynamics. The crudeness of the information upon which classical learning algorithms operate makes such problems too difficult and unrealistic. Nonetheless, additional information for facilitating the learning process ideally should be embedded in such a way that the structural, well-studied characteristics of these fundamental algorithms are maintained. We investigate in this article a more general formulation of the Q-learning method that allows for a spreading of information derived from single updates towards a neighbourhood of the instantly visited state and converges to optimality. We show how this new formulation can be used as a mechanism to safely embed prior knowledge about the structure of the state space, and demonstrate it in a modified implementation of a reinforcement learning algorithm in a real robot navigation task.
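The sketch below shows one way such a spreading update can look: the temporal-difference correction from a single experience is weighted by a kernel over neighbouring states. The Gaussian kernel over a one-dimensional state index is an assumption for illustration, not the authors' exact formulation.

```python
# Hedged sketch: Q-learning whose updates are spread to a neighbourhood of the
# visited state via a distance-based kernel.
import numpy as np

class SpreadingQLearner:
    def __init__(self, n_states, n_actions, alpha=0.1, gamma=0.95, sigma=1.0):
        self.Q = np.zeros((n_states, n_actions))
        self.alpha, self.gamma, self.sigma = alpha, gamma, sigma
        self.states = np.arange(n_states)

    def update(self, s, a, reward, s_next):
        td = reward + self.gamma * self.Q[s_next].max() - self.Q[s, a]
        # Kernel over states assumed to lie on a line; a robot task would use a
        # metric distance between states instead of the index difference.
        w = np.exp(-((self.states - s) ** 2) / (2.0 * self.sigma ** 2))
        self.Q[:, a] += self.alpha * w * td   # reduces to standard Q-learning as sigma -> 0
```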
Coordinating teams of autonomous vehicles: an architectural perspective
NASA Astrophysics Data System (ADS)
Czichon, Cary; Peterson, Robert W.; Mettala, Erik G.; Vondrak, Ivo
2005-05-01
In defense-related robotics research, a mission level integration gap exists between mission tasks (tactical) performed by ground, sea, or air applications and elementary behaviors enacted by processing, communications, sensors, and weaponry resources (platform specific). The gap spans ensemble (heterogeneous team) behaviors, automatic MOE/MOP tracking, and tactical task modeling/simulation for virtual and mixed teams comprised of robotic and human combatants. This study surveys robotic system architectures, compares approaches for navigating problem/state spaces by autonomous systems, describes an architecture for an integrated, repository-based modeling, simulation, and execution environment, and outlines a multi-tiered scheme for robotic behavior components that is agent-based, platform-independent, and extendable via plug-ins. Tools for this integrated environment, along with a distributed agent framework for collaborative task performance are being developed by a U.S. Army funded SBIR project (RDECOM Contract N61339-04-C-0005).
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Russel Howe of team Survey, center, works on a laptop to prepare the team's robot for a demonstration run after the team's robot failed to leave the starting platform during its attempt at the level two challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Russel Howe of team Survey speaks with Sample Return Robot Challenge staff members after the team's robot failed to leave the starting platform during its attempt at the level two challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Kenneth Stafford, Assistant Director of Robotics Engineering and Director of the Robotics Resource Center at the Worcester Polytechnic Institute (WPI), verifies the location of the target sample during the level one challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Members of the Mountaineers team from West Virginia University celebrate after their robot returned to the starting platform after picking up the sample during a rerun of the level one challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-10
A pair of Worcester Polytechnic Institute (WPI) students walk past a pair of team KuuKulgur's robots on the campus quad, during a final tuneup before the start of competition at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Tuesday, June 10, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Team KuuKulgur is one of eighteen teams competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
1987-06-01
The study of human driving of automotive vehicles is an important aid to the development of viable autonomous vehicle navigation... of human driving, which could provide some different insights into possible approaches to autonomous vehicle control. At the start of this work, it was... advanced work in the behavioral aspects of human driving. Research of this nature can have a significant impact on the development of autonomous vehicles.
NASA Technical Reports Server (NTRS)
Mitchell, Jennifer D.; Cryan, Scott P.; Baker, Kenneth; Martin, Toby; Goode, Robert; Key, Kevin W.; Manning, Thomas; Chien, Chiun-Hong
2008-01-01
The Exploration Systems Architecture defines missions that require rendezvous, proximity operations, and docking (RPOD) of two spacecraft both in Low Earth Orbit (LEO) and in Low Lunar Orbit (LLO). Uncrewed spacecraft must perform automated and/or autonomous rendezvous, proximity operations and docking operations (commonly known as Automated Rendezvous and Docking, AR&D). The crewed versions may also perform AR&D, possibly with a different level of automation and/or autonomy, and must also provide the crew with relative navigation information for manual piloting. The capabilities of the RPOD sensors are critical to the success of the Constellation Program; this is carried as one of the CEV Project top risks. The Exploration Technology Development Program (ETDP) AR&D Sensor Technology Project seeks to reduce this risk by increasing technology maturation of selected relative navigation sensor technologies through testing and simulation. One of the project activities is a series of "pathfinder" testing and simulation activities to integrate relative navigation sensors with the Johnson Space Center Six-Degree-of-Freedom Test System (SDTS). The SDTS will be the primary testing location for the Orion spacecraft's Low Impact Docking System (LIDS). Project team members have integrated the Orion simulation with the SDTS computer system so that real-time closed loop testing can be performed with relative navigation sensors and the docking system in the loop during docking and undocking scenarios. Two relative navigation sensors are being used as part of a "pathfinder" activity in order to pave the way for future testing with the actual Orion sensors. This paper describes the test configuration and test results.
Considerations for an Integrated UAS CNS Architecture
NASA Technical Reports Server (NTRS)
Templin, Fred L.; Jain, Raj; Sheffield, Greg; Taboso-Bellesteros, Pedro; Ponchak, Denise
2017-01-01
The National Aeronautics and Space Administration (NASA) Glenn Research Center (GRC) is investigating revolutionary and advanced universal, reliable, always available, cyber secure and affordable Communication, Navigation, Surveillance (CNS) options for all altitudes of UAS operations. In Spring 2015, NASA issued a Call for Proposals under NASA Research Announcements (NRA) NNH15ZEA001N, Amendment 7 Subtopic 2.4. Boeing was selected to conduct a study with the objective to determine the most promising candidate technologies for Unmanned Air Systems (UAS) air-to-air and air-to-ground data exchange and analyze their suitability in a post-NextGen NAS environment. The overall objectives are to develop UAS CNS requirements and then develop architectures that satisfy the requirements for UAS in both controlled and uncontrolled air space. This contract is funded under NASA's Aeronautics Research Mission Directorate's (ARMD) Aviation Operations and Safety Program (AOSP) Safe Autonomous Systems Operations (SASO) project and proposes technologies for the Unmanned Air Systems Traffic Management (UTM) service. There is a need for accommodating large-scale populations of Unmanned Air Systems (UAS) in the national air space. Scale obviously impacts capacity planning for Communication, Navigation, and Surveillance (CNS) technologies. For example, can wireless communications data links provide the necessary capacity for accommodating millions of small UASs (sUAS) nationwide? Does the communications network provide sufficient Internet Protocol (IP) address space to allow air traffic control to securely address both UAS teams as a whole as well as individual UAS within each team? Can navigation and surveillance approaches assure safe route planning and safe separation of vehicles even in crowded skies? Our objective is to identify revolutionary and advanced CNS alternatives supporting UASs operating at all altitudes and in all airspace while accurately navigating in the absence of navigational aids. These CNS alternatives must be reliable, redundant, always available, cyber-secure, and affordable for all types of vehicles, from small UAS to large transport category aircraft. The approach will identify CNS technology candidates that can meet the needs of the range of UAS missions and map them to specific air traffic management applications where they will be most beneficial and cost effective.
Decentralized reinforcement-learning control and emergence of motion patterns
NASA Astrophysics Data System (ADS)
Svinin, Mikhail; Yamada, Kazuyaki; Okhura, Kazuhiro; Ueda, Kanji
1998-10-01
In this paper we propose a system for studying the emergence of motion patterns in autonomous mobile robotic systems. The system implements an instance-based reinforcement learning control. Three spaces are of importance in the formulation of the control scheme: the work space, the sensor space, and the action space. An important feature of our system is that all these spaces are assumed to be continuous. The core part of the system is a classifier system. Based on the sensory state space analysis, the control is decentralized and is specified at the lowest level of the control system. However, the local controllers are implicitly connected through the perceived environment information. Therefore, they constitute a dynamic environment with respect to each other. The proposed control scheme is tested under simulation for a mobile robot in a navigation task. It is shown that some patterns of global behavior, such as collision avoidance, wall-following, and light-seeking, can emerge from the local controllers.
Switching Reinforcement Learning for Continuous Action Space
NASA Astrophysics Data System (ADS)
Nagayoshi, Masato; Murao, Hajime; Tamaki, Hisashi
Reinforcement Learning (RL) attracts much attention as a technique for realizing computational intelligence such as adaptive and autonomous decentralized systems. In general, however, it is not easy to put RL into practical use. One difficulty is designing a suitable action space for an agent, i.e., satisfying two requirements in trade-off: (i) keeping the characteristics (or structure) of the original search space as much as possible in order to seek strategies that lie close to the optimal, and (ii) reducing the search space as much as possible in order to expedite the learning process. In order to design a suitable action space adaptively, we propose a switching RL model that mimics the process of an infant's motor development, in which gross motor skills develop before fine motor skills. A method for switching controllers is then constructed by introducing and referring to the "entropy". Further, through computational experiments using robot navigation problems with one- and two-dimensional continuous action spaces, the validity of the proposed method has been confirmed.
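One plausible reading of the entropy-based switching criterion is sketched below: monitor the entropy of the action-selection probabilities (here a Boltzmann policy over Q-values) and switch from a coarse to a finer action set once that entropy drops, indicating the coarse controller has settled. This is an interpretation for illustration, not the paper's exact method or thresholds.

```python
# Hedged sketch: entropy of a Boltzmann action-selection policy as a switching signal.
import numpy as np

def policy_entropy(q_values, temperature=1.0):
    p = np.exp((np.asarray(q_values, float) - np.max(q_values)) / temperature)
    p /= p.sum()
    return float(-(p * np.log(p + 1e-12)).sum())

def should_switch_to_fine(q_values, n_actions, frac=0.3):
    """Switch when entropy falls below a fraction of its maximum, log(n_actions)."""
    return policy_entropy(q_values) < frac * np.log(n_actions)
```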
Bioinspired polarization navigation sensor for autonomous munitions systems
NASA Astrophysics Data System (ADS)
Giakos, G. C.; Quang, T.; Farrahi, T.; Deshpande, A.; Narayan, C.; Shrestha, S.; Li, Y.; Agarwal, M.
2013-05-01
Small unmanned aerial vehicles (SUAVs), micro air vehicles (MAVs), Automated Target Recognition (ATR), and munitions guidance require extreme operational agility and robustness, which can be partially offset by efficient bioinspired imaging sensor designs capable of providing enhanced guidance, navigation and control (GNC) capabilities. Bioinspired imaging technology can prove useful either for long-distance surveillance of targets in a cluttered environment, or at close distances limited by space surroundings and obstructions. The purpose of this study is to explore the phenomenology of image formation by different insect eye architectures, which would directly benefit the areas of defense and security, in the following four distinct areas: a) fabrication of the bioinspired sensor, b) optical architecture, c) topology, and d) artificial intelligence. The outcome of this study indicates that bioinspired imaging can impact the areas of defense and security significantly through dedicated designs fitting different combat scenarios and applications.
Autonomous Vision Navigation for Spacecraft in Lunar Orbit
NASA Astrophysics Data System (ADS)
Bader, Nolan A.
NASA aims to achieve unprecedented navigational reliability for the first manned lunar mission of the Orion spacecraft in 2023. A technique for accomplishing this is to integrate autonomous feature tracking as an added means of improving position and velocity estimation. In this thesis, a template matching algorithm and optical sensor are tested onboard three simulated lunar trajectories using linear covariance techniques under various conditions. A preliminary characterization of the camera gives insight into its ability to determine azimuth and elevation angles to points on the surface of the Moon. A navigation performance analysis shows that an optical camera sensor can aid in decreasing position and velocity errors, particularly in a loss of communication scenario. Furthermore, it is found that camera quality and computational capability are driving factors affecting the performance of such a system.
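The camera measurement described above can be illustrated with a short template-matching sketch: locate a stored surface-feature template in the current image by normalized cross-correlation and convert the pixel location to azimuth and elevation angles with a pinhole model. The intrinsic parameters are placeholders, not those used in the thesis.

```python
# Hedged sketch: feature tracking by normalized cross-correlation, returning the
# bearing angles used as navigation measurements.  Camera parameters are assumed.
import cv2
import numpy as np

FX = FY = 1200.0          # focal length in pixels (assumed)
CX, CY = 640.0, 512.0     # principal point (assumed)

def track_feature(image, template):
    """image, template: grayscale arrays of the same dtype, template smaller.
    Returns (azimuth, elevation) in radians to the best match, plus the score."""
    result = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)
    u = top_left[0] + template.shape[1] / 2.0
    v = top_left[1] + template.shape[0] / 2.0
    azimuth = np.arctan2(u - CX, FX)
    elevation = np.arctan2(v - CY, FY)
    return azimuth, elevation, score
```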
ARK: Autonomous mobile robot in an industrial environment
NASA Technical Reports Server (NTRS)
Nickerson, S. B.; Jasiobedzki, P.; Jenkin, M.; Jepson, A.; Milios, E.; Down, B.; Service, J. R. R.; Terzopoulos, D.; Tsotsos, J.; Wilkes, D.
1994-01-01
This paper describes research on the ARK (Autonomous Mobile Robot in a Known Environment) project. The technical objective of the project is to build a robot that can navigate in a complex industrial environment using maps with permanent structures. The environment is not altered in any way by adding easily identifiable beacons, and the robot relies on naturally occurring objects to use as visual landmarks for navigation. The robot is equipped with various sensors that can detect unmapped obstacles, landmarks and objects. In this paper we describe the robot's industrial environment, its architecture, a novel combined range and vision sensor, and our recent results in controlling the robot for real-time detection of objects using their color and in processing the robot's range and vision sensor data for navigation.
Underwater terrain-aided navigation system based on combination matching algorithm.
Li, Peijuan; Sheng, Guoliang; Zhang, Xiaofei; Wu, Jingqiu; Xu, Baochun; Liu, Xing; Zhang, Yao
2018-07-01
The terrain-aided navigation (TAN) system based on the iterated closest contour point (ICCP) algorithm diverges easily when the error of the indicative track of the strapdown inertial navigation system (SINS) is large. To address this, a Kalman filter is incorporated into the traditional ICCP algorithm: the difference between the matching result and the SINS output is used as the filter measurement, the cumulative error of the SINS is corrected in time by filter feedback correction, and the indicative track used in ICCP is thereby improved. The mathematical model of the autonomous underwater vehicle (AUV) integrated navigation system and the observation model of TAN are built. A proper number of matching points is selected by comparing simulation results for matching time and matching precision. Simulation experiments are carried out according to the ICCP algorithm and the mathematical model. The experiments show that navigation accuracy and stability are improved with the proposed combinational algorithm when a proper number of matching points is used. The integrated navigation system is effective in preventing divergence of the indicative track and can meet the underwater, long-duration, and high-precision requirements of navigation systems for autonomous underwater vehicles.
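The feedback-correction idea can be sketched as a small Kalman filter whose measurement is the terrain-matching position minus the SINS-indicated position, with the estimated error fed back to correct the indicated track. The error-state model and noise values below are illustrative, not the paper's AUV model.

```python
# Hedged sketch: Kalman correction of SINS position errors using ICCP match results.
import numpy as np

class TanCorrector:
    def __init__(self, dt=1.0):
        self.x = np.zeros(4)                          # [dN, dE, dVn, dVe] SINS errors
        self.P = np.eye(4) * 10.0
        self.F = np.eye(4); self.F[0, 2] = self.F[1, 3] = dt
        self.Q = np.diag([0.01, 0.01, 0.001, 0.001])
        self.H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
        self.R = np.eye(2) * 25.0                     # terrain-matching noise (m^2)

    def step(self, sins_pos, match_pos):
        # Propagate the error state, then update with z = match - SINS position.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        z = np.asarray(match_pos, float) - np.asarray(sins_pos, float)
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return np.asarray(sins_pos, float) + self.x[:2]   # feedback-corrected position
```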
Mini AERCam: A Free-Flying Robot for Space Inspection
NASA Technical Reports Server (NTRS)
Fredrickson, Steven
2001-01-01
The NASA Johnson Space Center Engineering Directorate is developing the Autonomous Extravehicular Robotic Camera (AERCam), a free-flying camera system for remote viewing and inspection of human spacecraft. The AERCam project team is currently developing a miniaturized version of AERCam known as Mini AERCam, a spherical nanosatellite 7.5 inches in diameter. Mini AERCam development builds on the success of AERCam Sprint, a 1997 Space Shuttle flight experiment, by integrating new on-board sensing and processing capabilities while simultaneously reducing volume by 80%. Achieving these productivity-enhancing capabilities in a smaller package depends on aggressive component miniaturization. Technology innovations being incorporated include micro electromechanical system (MEMS) gyros, "camera-on-a-chip" CMOS imagers, rechargeable xenon gas propulsion, rechargeable lithium ion battery, custom avionics based on the PowerPC 740 microprocessor, GPS relative navigation, digital radio frequency communications and tracking, micropatch antennas, digital instrumentation, and dense mechanical packaging. The Mini AERCam free-flyer will initially be integrated into an approximate flight-like configuration for laboratory demonstration on an airbearing table. A pilot-in-the-loop and hardware-in-the-loop simulation to simulate on-orbit navigation and dynamics will complement the airbearing table demonstration. The Mini AERCam lab demonstration is intended to form the basis for future development of an AERCam flight system that provides on-orbit views of the Space Shuttle and International Space Station unobtainable from fixed cameras, cameras on robotic manipulators, or cameras carried by space-walking crewmembers.
Navigation Aiding by a Hybrid Laser-Camera Motion Estimator for Micro Aerial Vehicles
Atman, Jamal; Popp, Manuel; Ruppelt, Jan; Trommer, Gert F.
2016-01-01
Micro Air Vehicles (MAVs) equipped with various sensors are able to carry out autonomous flights. However, the self-localization of autonomous agents is mostly dependent on Global Navigation Satellite Systems (GNSS). In order to provide an accurate navigation solution in the absence of GNSS signals, this article presents a hybrid sensor. The hybrid sensor is a deep integration of a monocular camera and a 2D laser rangefinder so that the motion of the MAV can be estimated. This realization is expected to be more flexible in terms of environments compared to laser-scan-matching approaches. The estimated ego-motion is then integrated in the MAV's navigation system. First, however, the pose between the two sensors must be known; it is obtained with an improved calibration method proposed here. For both calibration and ego-motion estimation, 3D-to-2D correspondences are used and the Perspective-3-Point (P3P) problem is solved. Moreover, the covariance estimation of the relative motion is presented. The experiments show very accurate calibration and navigation results. PMID:27649203
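The 3D-to-2D correspondence step can be illustrated with OpenCV's Perspective-n-Point solver: given laser-derived 3D points and their pixel projections, recover the relative pose. The intrinsics are placeholders, and OpenCV's iterative solver is used here as a stand-in for the article's P3P formulation.

```python
# Hedged sketch: relative pose from 3D-to-2D correspondences via PnP.
import cv2
import numpy as np

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])          # assumed camera intrinsics

def relative_pose(points_3d, points_2d):
    """points_3d: (N,3) laser points, points_2d: (N,2) pixel locations, N >= 4.
    Returns the rotation matrix and translation of the camera w.r.t. the points."""
    ok, rvec, tvec = cv2.solvePnP(np.asarray(points_3d, np.float32),
                                  np.asarray(points_2d, np.float32),
                                  K, None, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("PnP failed")
    R, _ = cv2.Rodrigues(rvec)            # rotation vector -> rotation matrix
    return R, tvec
```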
Global Precipitation Measurement (GPM) Orbit Design and Autonomous Maneuvers
NASA Technical Reports Server (NTRS)
Folta, David; Mendelsohn, Chad; Mailhe, Laurie
2003-01-01
The NASA Goddard Space Flight Center's Global Precipitation Measurement (GPM) mission must meet the challenge of measuring worldwide precipitation every three hours. The GPM core spacecraft, part of a constellation, will be required to maintain a circular orbit in a high drag environment at a near-critical inclination. Analysis shows that a mean orbit altitude of 407 km is necessary to prevent ground track repeating. Combined with goals to minimize maneuver operation impacts to science data collection and to enable reasonable long-term orbit predictions, the GPM project has decided to fly the GSFC autonomous maneuver system, AutoCon(TM). This system is a follow-up version of the highly successful New Millennium Program technology flown onboard the Earth Observing-1 formation flying mission. This paper presents the driving science requirements and goals of the GPM mission and shows how they will be met. Selection of the mean semi-major axis and eccentricity and the delta-V budget for several ballistic properties are presented. The architecture of the autonomous maneuvering system to meet the goals and requirements is presented along with simulations using GPM parameters. Additionally, the use of the GPM autonomous system to support collision avoidance and to aid other spacecraft systems during navigation outages is explored.
Autonomous Aerial Refueling Ground Test Demonstration—A Sensor-in-the-Loop, Non-Tracking Method
Chen, Chao-I; Koseluk, Robert; Buchanan, Chase; Duerner, Andrew; Jeppesen, Brian; Laux, Hunter
2015-01-01
An essential capability for an unmanned aerial vehicle (UAV) to extend its airborne duration without increasing the size of the aircraft is autonomous aerial refueling (AAR). This paper proposes a sensor-in-the-loop, non-tracking method for probe-and-drogue style autonomous aerial refueling tasks by combining sensitivity adjustments of a 3D Flash LIDAR camera with computer vision based image-processing techniques. The method overcomes the inherent ambiguity issues when reconstructing 3D information from traditional 2D images by taking advantage of ready-to-use 3D point cloud data from the camera, followed by well-established computer vision techniques. These techniques include curve fitting algorithms and outlier removal with the random sample consensus (RANSAC) algorithm to reliably estimate the drogue center in 3D space, as well as to establish the relative position between the probe and the drogue. To demonstrate the feasibility of the proposed method on a real system, a ground navigation robot was designed and fabricated. Results presented in the paper show that using images acquired from a 3D Flash LIDAR camera as real time visual feedback, the ground robot is able to track a moving simulated drogue and continuously narrow the gap between the robot and the target autonomously. PMID:25970254
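The RANSAC-based drogue-center estimate can be sketched as a circle fit to 3D rim points: repeatedly sample three points, compute their circumscribed circle, and keep the fit with the most inliers. The tolerance and iteration count are assumptions, and the flight pipeline includes additional image-processing steps not shown here.

```python
# Hedged sketch: RANSAC circle fit in 3-D to estimate the drogue centre.
import numpy as np

def circle_from_3pts(a, b, c):
    ab, ac = b - a, c - a
    n = np.cross(ab, ac)
    nn = float(np.dot(n, n))
    if nn < 1e-12:
        return None                                    # degenerate (collinear) sample
    center = a + (np.dot(ac, ac) * np.cross(n, ab) +
                  np.dot(ab, ab) * np.cross(ac, n)) / (2.0 * nn)
    return center, float(np.linalg.norm(center - a)), n / np.sqrt(nn)

def ransac_drogue_center(points, tol=0.02, iters=500, seed=0):
    """points: (N,3) candidate drogue-rim points.  Returns the best circle centre."""
    pts = np.asarray(points, float)
    rng = np.random.default_rng(seed)
    best_center, best_inliers = None, -1
    for _ in range(iters):
        fit = circle_from_3pts(*pts[rng.choice(len(pts), 3, replace=False)])
        if fit is None:
            continue
        center, radius, normal = fit
        rel = pts - center
        d_plane = rel @ normal                          # out-of-plane distance
        in_plane = rel - np.outer(d_plane, normal)
        d_radial = np.abs(np.linalg.norm(in_plane, axis=1) - radius)
        inliers = np.count_nonzero((np.abs(d_plane) < tol) & (d_radial < tol))
        if inliers > best_inliers:
            best_inliers, best_center = inliers, center
    return best_center
```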
An onboard navigation system which fulfills Mars aerocapture guidance requirements
NASA Technical Reports Server (NTRS)
Brand, Timothy J.; Fuhry, Douglas P.; Shepperd, Stanley W.
1989-01-01
The development of a candidate autonomous onboard Mars approach navigation scheme capable of supporting aerocapture into Mars orbit is discussed. An aerocapture guidance and navigation system that can run independently of the pre-aerocapture navigation was used to define a preliminary set of accuracy requirements at entry interface. These requirements are used to evaluate the proposed pre-aerocapture navigation scheme. This scheme uses optical sightings on Deimos with a star tracker and an inertial measurement unit as the source of navigation information. Preliminary results suggest that the approach will adequately support aerocapture into Mars orbit.
Robot navigation research using the HERMIES mobile robot
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnett, D.L.
1989-01-01
In recent years robot navigation has attracted much attention from researchers around the world. Not only are theoretical studies being simulated on sophisticated computers, but many mobile robots are now used as test vehicles for these theoretical studies. Various algorithms have been perfected for navigation in a known static environment; but navigation in an unknown and dynamic environment poses a much more challenging problem for researchers. Many different methodologies have been developed for autonomous robot navigation, but each methodology is usually restricted to a particular type of environment. One important research focus of the Center for Engineering Systems Advanced Research (CESAR) at Oak Ridge National Laboratory is autonomous navigation in unknown and dynamic environments using the series of HERMIES mobile robots. The research uses an expert system for high-level planning interfaced with C-coded routines for implementing the plans and for quick processing of data requested by the expert system. With this approach, the navigation is not restricted to one methodology, since the expert system can activate a rule module for the methodology best suited to the current situation. Rule modules can be added to the rule base as they are developed and tested. Modules are being developed or enhanced for navigating from a map, searching for a target, exploring, artificial potential-field navigation, navigation using edge detection, etc. This paper reports on the various rule modules and methods of navigation in use, or under development, at CESAR, using the HERMIES-IIB robot as a testbed. 13 refs., 5 figs., 1 tab.
Autonomous navigation and control of a Mars rover
NASA Technical Reports Server (NTRS)
Miller, D. P.; Atkinson, D. J.; Wilcox, B. H.; Mishkin, A. H.
1990-01-01
A Mars rover will need to be able to navigate autonomously kilometers at a time. This paper outlines the sensing, perception, planning, and execution monitoring systems that are currently being designed for the rover. The sensing is based around stereo vision. The interpretation of the images uses a registration of the depth map with a global height map provided by an orbiting spacecraft. Safe, low-energy paths are then planned through the map, and expectations of what the rover's articulation sensors should sense are generated. These expectations are then used to ensure that the planned path is being executed correctly.
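An illustrative sketch of how a "safe, low energy" cost map might be derived from such a height map (slope limit and weighting are assumed values, not the mission's): cells whose local slope exceeds a limit are blocked, and the remaining cells are costed by slope so a planner prefers gentle terrain.

```python
# Hedged sketch: slope-based traversal cost map from a height map.
import numpy as np

def cost_map(height, cell_size=1.0, max_slope_deg=20.0, k_energy=5.0):
    gy, gx = np.gradient(np.asarray(height, float), cell_size)
    slope = np.degrees(np.arctan(np.hypot(gx, gy)))       # local slope per cell
    cost = 1.0 + k_energy * slope / max_slope_deg         # flatter cells are cheaper
    cost[slope > max_slope_deg] = np.inf                  # unsafe: never traverse
    return cost
```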
Autonomous Mars ascent and orbit rendezvous for earth return missions
NASA Technical Reports Server (NTRS)
Edwards, H. C.; Balmanno, W. F.; Cruz, Manuel I.; Ilgen, Marc R.
1991-01-01
The details of the assessment of autonomous Mars ascent and orbit rendezvous for Earth return missions are presented. Analyses addressing navigation system assessments, trajectory planning, targeting approaches, flight control guidance strategies, and performance sensitivities are included. Tradeoffs in the analysis and design process are discussed.
Vision-Aided Autonomous Landing and Ingress of Micro Aerial Vehicles
NASA Technical Reports Server (NTRS)
Brockers, Roland; Ma, Jeremy C.; Matthies, Larry H.; Bouffard, Patrick
2012-01-01
Micro aerial vehicles have limited sensor suites and computational power. For reconnaissance tasks and to conserve energy, these systems need the ability to autonomously land at vantage points or enter buildings (ingress). But for autonomous navigation, information is needed to identify and guide the vehicle to the target. Vision algorithms can provide egomotion estimation and target detection using input from cameras that are easy to include in miniature systems.
Towards Autonomous Modular UAV Missions: The Detection, Geo-Location and Landing Paradigm
Kyristsis, Sarantis; Antonopoulos, Angelos; Chanialakis, Theofilos; Stefanakis, Emmanouel; Linardos, Christos; Tripolitsiotis, Achilles; Partsinevelos, Panagiotis
2016-01-01
Nowadays, various unmanned aerial vehicle (UAV) applications are becoming increasingly demanding since they require real-time, autonomous and intelligent functions. Towards this end, in the present study, a fully autonomous UAV scenario is implemented, including the tasks of area scanning, target recognition, geo-location, monitoring, following and finally landing on a high speed moving platform. The underlying methodology includes AprilTag target identification through Graphics Processing Unit (GPU) parallelized processing, image processing and several optimized location and approach algorithms employing gimbal movement, Global Navigation Satellite System (GNSS) readings and UAV navigation. For the experimentation, a commercial and a custom made quad-copter prototype were used, portraying a high- and a low-computational embedded platform alternative. Among the successful targeting and follow procedures, it is shown that the landing approach can be successfully performed even under high platform speeds. PMID:27827883
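The AprilTag step can be sketched with the pupil_apriltags Python bindings (an assumption; the study's GPU-parallelized detector is not publicly specified here): detect the tag in a grayscale frame and read back its pose relative to the camera. The intrinsics and tag size are placeholders.

```python
# Hedged sketch: AprilTag detection and pose read-back for platform tracking.
from pupil_apriltags import Detector
import numpy as np

FX, FY, CX, CY = 600.0, 600.0, 320.0, 240.0   # assumed camera intrinsics (pixels)
TAG_SIZE = 0.16                                # assumed tag edge length (metres)

detector = Detector(families="tag36h11")

def locate_platform(gray_frame):
    """gray_frame: uint8 grayscale image.  Returns the tag position in the camera
    frame (metres), or None if no tag is visible."""
    detections = detector.detect(gray_frame, estimate_tag_pose=True,
                                 camera_params=(FX, FY, CX, CY), tag_size=TAG_SIZE)
    if not detections:
        return None
    best = max(detections, key=lambda d: d.decision_margin)
    return np.asarray(best.pose_t).ravel()     # translation of tag w.r.t. camera
```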
Autonomous Underwater Navigation and Optical Mapping in Unknown Natural Environments.
Hernández, Juan David; Istenič, Klemen; Gracias, Nuno; Palomeras, Narcís; Campos, Ricard; Vidal, Eduard; García, Rafael; Carreras, Marc
2016-07-26
We present an approach for navigating in unknown environments while, simultaneously, gathering information for inspecting underwater structures using an autonomous underwater vehicle (AUV). To accomplish this, we first use our pipeline for mapping and planning collision-free paths online, which endows an AUV with the capability to autonomously acquire optical data in close proximity. With that information, we then propose a reconstruction pipeline to create a photo-realistic textured 3D model of the inspected area. These 3D models are also of particular interest to other fields of study in marine sciences, since they can serve as base maps for environmental monitoring, thus allowing change detection of biological communities and their environment over time. Finally, we evaluate our approach using the Sparus II, a torpedo-shaped AUV, conducting inspection missions in a challenging, real-world and natural scenario.
Requirements for an Integrated UAS CNS Architecture
NASA Technical Reports Server (NTRS)
Templin, Fred; Jain, Raj; Sheffield, Greg; Taboso, Pedro; Ponchak, Denise
2017-01-01
The National Aeronautics and Space Administration (NASA) Glenn Research Center (GRC) is investigating revolutionary and advanced universal, reliable, always available, cyber secure and affordable Communication, Navigation, Surveillance (CNS) options for all altitudes of UAS operations. In Spring 2015, NASA issued a Call for Proposals under NASA Research Announcements (NRA) NNH15ZEA001N, Amendment 7 Subtopic 2.4. Boeing was selected to conduct a study with the objective to determine the most promising candidate technologies for Unmanned Air Systems (UAS) air-to-air and air-to-ground data exchange and analyze their suitability in a post-NextGen NAS environment. The overall objectives are to develop UAS CNS requirements and then develop architectures that satisfy the requirements for UAS in both controlled and uncontrolled air space. This contract is funded under NASA's Aeronautics Research Mission Directorate's (ARMD) Aviation Operations and Safety Program (AOSP) Safe Autonomous Systems Operations (SASO) project and proposes technologies for the Unmanned Air Systems Traffic Management (UTM) service. Communications, Navigation and Surveillance (CNS) requirements must be developed in order to establish a CNS architecture supporting Unmanned Air Systems integration in the National Air Space (UAS in the NAS). These requirements must address cybersecurity, future communications, satellite-based navigation and APNT, and scalable surveillance and situational awareness. CNS integration, consolidation and miniaturization requirements are also important to support the explosive growth in small UAS deployment. Air Traffic Management (ATM) must also be accommodated to support critical Command and Control (C2) for Air Traffic Controllers (ATC). This document therefore presents UAS CNS requirements that will guide the architecture.
Multidisciplinary unmanned technology teammate (MUTT)
NASA Astrophysics Data System (ADS)
Uzunovic, Nenad; Schneider, Anne; Lacaze, Alberto; Murphy, Karl; Del Giorno, Mark
2013-01-01
The U.S. Army Tank Automotive Research, Development and Engineering Center (TARDEC) held an autonomous robot competition called CANINE in June 2012. The goal of the competition was to develop innovative and natural control methods for robots. This paper describes the winning technology, including the vision system, the operator interaction, and the autonomous mobility. The rules stated only gestures or voice commands could be used for control. The robots would learn a new object at the start of each phase, find the object after it was thrown into a field, and return the object to the operator. Each of the six phases became more difficult, including clutter of the same color or shape as the object, moving and stationary obstacles, and finding the operator who moved from the starting location to a new location. The Robotic Research Team integrated techniques in computer vision, speech recognition, object manipulation, and autonomous navigation. A multi-filter computer vision solution reliably detected the objects while rejecting objects of similar color or shape, even while the robot was in motion. A speech-based interface with short commands provided close to natural communication of complicated commands from the operator to the robot. An innovative gripper design allowed for efficient object pickup. A robust autonomous mobility and navigation solution for ground robotic platforms provided fast and reliable obstacle avoidance and course navigation. The research approach focused on winning the competition while remaining cognizant and relevant to real world applications.
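One element of such a multi-filter vision pipeline can be sketched as an HSV colour gate followed by a contour size and shape check, which rejects clutter of a different colour or aspect ratio. The thresholds below are illustrative, not the team's values.

```python
# Hedged sketch: colour + shape gating for object detection while in motion.
import cv2
import numpy as np

def find_object(bgr, hsv_lo=(100, 120, 60), hsv_hi=(130, 255, 255),
                min_area=400, max_aspect=2.0):
    """Return the pixel centroid of the largest blob passing the colour/shape gates."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in sorted(contours, key=cv2.contourArea, reverse=True):
        if cv2.contourArea(c) < min_area:
            break
        x, y, w, h = cv2.boundingRect(c)
        if max(w, h) / max(min(w, h), 1) <= max_aspect:   # reject wrong-shaped clutter
            return (x + w // 2, y + h // 2)
    return None
```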
Ultra-Wideband Tracking System Design for Relative Navigation
NASA Technical Reports Server (NTRS)
Ni, Jianjun David; Arndt, Dickey; Bgo, Phong; Dekome, Kent; Dusl, John
2011-01-01
This presentation briefly discusses a design effort for a prototype ultra-wideband (UWB) time-difference-of-arrival (TDOA) tracking system that is currently under development at NASA Johnson Space Center (JSC). The system is being designed for use in localization and navigation of a rover in a GPS-deprived environment for surface missions. In one application enabled by the UWB tracking, a robotic vehicle carrying equipment can autonomously follow a crewed rover from work site to work site so that resources can be carried from one landing mission to the next, thereby saving up-mass. The UWB Systems Group at JSC has developed a UWB TDOA High Resolution Proximity Tracking System which can achieve sub-inch tracking accuracy of a target within the radius of the tracking baseline [1]. By extending the tracking capability beyond the radius of the tracking baseline, a tracking system is being designed to enable relative navigation between two vehicles for surface missions. A prototype UWB TDOA tracking system has been designed, implemented, tested, and proven feasible for relative navigation of robotic vehicles. Future work includes testing the system with the application code to increase the tracking update rate and evaluating the linear tracking baseline to improve the flexibility of antenna mounting on the following vehicle.
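A hedged sketch of how TDOA measurements can be turned into a relative position by iterative least squares follows; the anchor geometry, initial guess and iteration count are illustrative, not the JSC system's design.

```python
# Hedged sketch: Gauss-Newton multilateration from time differences of arrival.
import numpy as np

C = 299792458.0   # m/s, propagation speed

def tdoa_solve(anchors, tdoas, ref=0, x0=None, iters=10):
    """anchors: (M,3) UWB antenna positions (M >= 4 for a 3-D fix).
    tdoas: (M-1,) time differences of arrival w.r.t. anchor `ref` [s].
    Returns the estimated target position (3,)."""
    anchors = np.asarray(anchors, float)
    others = [i for i in range(len(anchors)) if i != ref]
    meas = C * np.asarray(tdoas, float)                  # range differences [m]
    x = np.zeros(3) if x0 is None else np.array(x0, float)
    for _ in range(iters):
        d = np.linalg.norm(anchors - x, axis=1)
        pred = d[others] - d[ref]
        # Jacobian of each range difference w.r.t. the target position.
        J = (x - anchors[others]) / d[others, None] - (x - anchors[ref]) / d[ref]
        dx, *_ = np.linalg.lstsq(J, meas - pred, rcond=None)
        x += dx
    return x
```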
DOT National Transportation Integrated Search
2008-01-28
The Volpe Center designed, implemented, and deployed a Global Positioning System (GPS) Receiver Autonomous Integrity Monitoring (RAIM) prediction system in the mid 1990s to support both Air Force and Federal Aviation Administration (FAA) use of TSO C...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pin, F.G.; de Saussure, G.; Spelt, P.F.
1988-01-01
This paper describes recent research activities at the Center for Engineering Systems Advanced Research (CESAR) in the area of sensor-based reasoning, with emphasis given to their application and implementation on our HERMIES-IIB autonomous mobile vehicle. These activities, including navigation and exploration in a-priori unknown and dynamic environments, goal recognition, vision-guided manipulation, and sensor-driven machine learning, are discussed within the framework of a scenario in which an autonomous robot is asked to navigate through an unknown dynamic environment, explore, find and dock at a process control panel, learn the functioning of the panel, read and understand the status of the panel's meters and dials, and successfully manipulate the control devices of the panel to solve a maintenance emergency problem. A demonstration of the successful implementation of the algorithms on our HERMIES-IIB autonomous robot for resolution of this scenario is presented. Conclusions are drawn concerning the applicability of the methodologies to more general classes of problems, and implications for future work on sensor-driven reasoning for autonomous robots are discussed. 8 refs., 3 figs.
NASA Johnson Space Center: Mini AERCam Testing with GSS6560
NASA Technical Reports Server (NTRS)
Cryant, Scott P.
2004-01-01
This slide presentation reviews the testing of the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam) with the GPS/SBAS simulation system, GSS6560. It lists several GPS-based programs at NASA Johnson, including testing of the Shuttle GPS system, Space Integrated GPS/INS (SIGI) testing, the standalone ISS SIGI test, and testing of the SIGI for the Crew Return Vehicle. The Mini AERCam is a small, free-flying camera for remote inspections of the ISS; it uses precise relative navigation with differential carrier-phase GPS to provide situational awareness to operators. Closed-loop orbital testing of the Mini AERCam system, with and without the GSS6560 system, is reviewed.
Machine vision and appearance based learning
NASA Astrophysics Data System (ADS)
Bernstein, Alexander
2017-03-01
Smart algorithms are used in machine vision to organize or extract high-level information from the available data. The resulting high-level understanding of the content of images, received from a given visual sensing system and belonging to an appearance space, is only a key first step in solving various specific tasks such as mobile robot navigation in uncertain environments, road detection in autonomous driving systems, etc. Appearance-based learning has become very popular in the field of machine vision. In general, the appearance of a scene is a function of the scene content, the lighting conditions, and the camera position. The mobile robot localization problem is considered here in a machine learning framework via appearance space analysis. The problem is reduced to a regression-on-an-appearance-manifold problem, and recently developed regression-on-manifolds methods are used for its solution.
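To make the appearance-space idea concrete, here is a minimal, hypothetical sketch of appearance-based localization posed as regression: image descriptors are mapped to poses by distance-weighted nearest neighbours in appearance space. This is only a crude stand-in for the regression-on-manifolds methods the abstract refers to; the descriptor dimensions and database below are invented for illustration.

```python
import numpy as np

def knn_pose_regression(query_desc, db_descs, db_poses, k=3):
    """Estimate a pose by distance-weighted averaging of the k nearest
    appearance-space neighbours (a simple surrogate for regression on
    the appearance manifold)."""
    dists = np.linalg.norm(db_descs - query_desc, axis=1)
    idx = np.argsort(dists)[:k]
    w = 1.0 / (dists[idx] + 1e-9)
    w /= w.sum()
    return (w[:, None] * db_poses[idx]).sum(axis=0)

# Hypothetical database: 100 images summarised by 32-D descriptors with known (x, y) poses.
rng = np.random.default_rng(0)
db_descs = rng.normal(size=(100, 32))
db_poses = rng.uniform(0, 10, size=(100, 2))
print(knn_pose_regression(db_descs[7] + 0.01 * rng.normal(size=32), db_descs, db_poses))
```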
Software Construction and Analysis Tools for Future Space Missions
NASA Technical Reports Server (NTRS)
Lowry, Michael R.; Clancy, Daniel (Technical Monitor)
2002-01-01
NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools developed at NASA Ames that could contribute to meeting them: 1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; 2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.
Project Morpheus: Morpheus 1.5A Lander Failure Investigation Results
NASA Technical Reports Server (NTRS)
Devolites, Jennifer L.; Olansen, Jon B.; Munday, Stephen R.
2013-01-01
On August 9, 2012 the Morpheus 1.5A vehicle crashed shortly after liftoff from the Kennedy Space Center. The loss was limited to the vehicle itself, which had been pre-declared a test failure and not a mishap. The Morpheus project is demonstrating advanced technologies for in-space and planetary surface vehicles, including: autonomous flight control; landing site hazard identification and safe site selection; relative surface and hazard navigation; precision landing; modular, reusable flight software; and a high-performance, non-toxic, cryogenic liquid oxygen and liquid methane integrated main engine and attitude control propulsion system. A comprehensive failure investigation isolated the fault to the Inertial Measurement Unit (IMU) data path to the flight computer. Several improvements have been identified and implemented for the 1.5B and 1.5C vehicles.
2012-08-09
CAPE CANAVERAL, Fla. – At the Shuttle Landing Facility at NASA’s Kennedy Space Center in Florida, the Morpheus prototype lander begins to lift off of the ground during a free-flight test. Testing of the prototype lander had been ongoing at NASA’s Johnson Space Center in Houston in preparation for its first free-flight test at Kennedy Space Center. Morpheus was manufactured and assembled at JSC and Armadillo Aerospace. Morpheus is large enough to carry 1,100 pounds of cargo to the moon – for example, a humanoid robot, a small rover, or a small laboratory to convert moon dust into oxygen. The primary focus of the test is to demonstrate an integrated propulsion and guidance, navigation and control system that can fly a lunar descent profile to exercise the Autonomous Landing and Hazard Avoidance Technology, or ALHAT, safe landing sensors and closed-loop flight control. For more information on Project Morpheus, visit http://morpheuslander.jsc.nasa.gov/. Photo credit: NASA
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
The NASA Centennial Challenges prize, level one, is presented to team Mountaineers for successfully completing level one of the NASA 2014 Sample Return Robot Challenge, from left, Ryan Watson, Team Mountaineers; Lucas Behrens, Team Mountaineers; Jarred Strader, Team Mountaineers; Yu Gu, Team Mountaineers; Scott Harper, Team Mountaineers; Dorothy Rasco, NASA Deputy Associate Administrator for the Space Technology Mission Directorate; Laurie Leshin, Worcester Polytechnic Institute (WPI) President; David Miller, NASA Chief Technologist; Alexander Hypes, Team Mountaineers; Nick Ohi, Team Mountaineers; Marvin Cheng, Team Mountaineers; Sam Ortega, NASA Program Manager for Centennial Challenges; and Tanmay Mandal, Team Mountaineers, Saturday, June 14, 2014, at Worcester Polytechnic Institute (WPI) in Worcester, Mass. Team Mountaineers was the only team to complete the level one challenge. During the competition, teams were required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge was to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
The NASA Centennial Challenges prize, level one, is presented to team Mountaineers for successfully completing level one of the NASA 2014 Sample Return Robot Challenge, from left, Ken Stafford, WPI Challenge technical advisor; Colleen Shaver, WPI Challenge Manager; Ryan Watson, Team Mountaineers; Marvin Cheng, Team Mountaineers; Alexander Hypes, Team Mountaineers; Jarred Strader, Team Mountaineers; Lucas Behrens, Team Mountaineers; Yu Gu, Team Mountaineers; Nick Ohi, Team Mountaineers; Dorothy Rasco, NASA Deputy Associate Administrator for the Space Technology Mission Directorate; Scott Harper, Team Mountaineers; Tanmay Mandal, Team Mountaineers; David Miller, NASA Chief Technologist; Sam Ortega, NASA Program Manager for Centennial Challenges, Saturday, June 14, 2014, at Worcester Polytechnic Institute (WPI) in Worcester, Mass. Team Mountaineers was the only team to complete the level one challenge. During the competition, teams were required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge was to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
Miniature Robotic Spacecraft for Inspecting Other Spacecraft
NASA Technical Reports Server (NTRS)
Fredrickson, Steven; Abbott, Larry; Duran, Steve; Goode, Robert; Howard, Nathan; Jochim, David; Rickman, Steve; Straube, Tim; Studak, Bill; Wagenknecht, Jennifer;
2004-01-01
A report discusses the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam) -- a compact robotic spacecraft intended to be released from a larger spacecraft for exterior visual inspection of the larger spacecraft. The Mini AERCam is a successor to the AERCam Sprint -- a prior miniature robotic inspection spacecraft that was demonstrated in a space-shuttle flight experiment in 1997. The prototype of the Mini AERCam is a demonstration unit having approximately the form and function of a flight system. The Mini AERCam is approximately spherical with a diameter of about 7.5 in. (19 cm) and a weight of about 10 lb (4.5 kg), yet it has significant additional capabilities, relative to the 14-in. (36-cm), 35-lb (16-kg) AERCam Sprint. The Mini AERCam includes miniaturized avionics, instrumentation, communications, navigation, imaging, power, and propulsion subsystems, including two digital video cameras and a high-resolution still camera. The Mini AERCam is designed for either remote piloting or supervised autonomous operations, including station keeping and point-to-point maneuvering. The prototype has been tested on an air-bearing table and in a hardware-in-the-loop orbital simulation of the dynamics of maneuvering in proximity to the International Space Station.
NASA Technical Reports Server (NTRS)
Trawny, Nikolas; Huertas, Andres; Luna, Michael E.; Villalpando, Carlos Y.; Martin, Keith E.; Carson, John M.; Johnson, Andrew E.; Restrepo, Carolina; Roback, Vincent E.
2015-01-01
The Hazard Detection System (HDS) is a component of the ALHAT (Autonomous Landing and Hazard Avoidance Technology) sensor suite, which together provide a lander Guidance, Navigation and Control (GN&C) system with the relevant measurements necessary to enable safe precision landing under any lighting conditions. The HDS consists of a stand-alone compute element (CE), an Inertial Measurement Unit (IMU), and a gimbaled flash LIDAR sensor that are used, in real-time, to generate a Digital Elevation Map (DEM) of the landing terrain, detect candidate safe landing sites for the vehicle through Hazard Detection (HD), and generate hazard-relative navigation (HRN) measurements used for safe precision landing. Following an extensive ground and helicopter test campaign, ALHAT was integrated onto the Morpheus rocket-powered terrestrial test vehicle in March 2014. Morpheus and ALHAT then performed five successful free flights at the simulated lunar hazard field constructed at the Shuttle Landing Facility (SLF) at Kennedy Space Center, for the first time testing the full system on a lunar-like approach geometry in a relevant dynamic environment. During these flights, the HDS successfully generated DEMs, correctly identified safe landing sites and provided HRN measurements to the vehicle, marking the first autonomous landing of a NASA rocket-powered vehicle in hazardous terrain. This paper provides a brief overview of the HDS architecture and describes its in-flight performance.
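The flight HDS algorithms are not reproduced here, but the following toy sketch illustrates the basic notion of screening a Digital Elevation Map for safe sites using assumed slope and roughness thresholds; the thresholds and the synthetic DEM below are placeholders, not the ALHAT criteria.

```python
import numpy as np

def hazard_map(dem, cell_size, max_slope_deg=10.0, max_roughness=0.05):
    """Flag DEM cells as hazardous based on local slope and roughness.
    dem: 2D array of terrain heights (m); cell_size: grid spacing (m)."""
    gy, gx = np.gradient(dem, cell_size)
    slope = np.degrees(np.arctan(np.hypot(gx, gy)))
    # Roughness: deviation of each cell from the mean of its 3x3 neighbourhood.
    padded = np.pad(dem, 1, mode='edge')
    local_mean = sum(padded[i:i + dem.shape[0], j:j + dem.shape[1]]
                     for i in range(3) for j in range(3)) / 9.0
    roughness = np.abs(dem - local_mean)
    return (slope > max_slope_deg) | (roughness > max_roughness)

# Synthetic DEM with a boulder-like feature for illustration.
dem = np.random.default_rng(1).normal(scale=0.02, size=(50, 50))
dem[20:25, 20:25] += 0.5
print("hazardous cells:", int(hazard_map(dem, cell_size=0.1).sum()))
```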
Optimal rotation sequences for active perception
NASA Astrophysics Data System (ADS)
Nakath, David; Rachuy, Carsten; Clemens, Joachim; Schill, Kerstin
2016-05-01
One major objective of autonomous systems navigating in dynamic environments is gathering information needed for self localization, decision making, and path planning. To account for this, such systems are usually equipped with multiple types of sensors. As these sensors often have a limited field of view and a fixed orientation, the task of active perception breaks down to the problem of calculating alignment sequences which maximize the information gain regarding expected measurements. Action sequences that rotate the system according to the calculated optimal patterns then have to be generated. In this paper we present an approach for calculating these sequences for an autonomous system equipped with multiple sensors. We use a particle filter for multi-sensor fusion and state estimation. The planning task is modeled as a Markov decision process (MDP), where the system decides in each step what actions to perform next. The optimal control policy, which provides the best action depending on the current estimated state, maximizes the expected cumulative reward. The latter is computed from the expected information gain of all sensors over time using value iteration. The algorithm is applied to a manifold representation of the joint space of rotation and time. We show the performance of the approach in a spacecraft navigation scenario where the information gain is changing over time, caused by the dynamic environment and the continuous movement of the spacecraft.
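The following is a small, self-contained illustration of the value-iteration machinery the abstract refers to, applied to a discretised heading/time grid with a hypothetical information-gain reward. It is not the authors' particle-filter-based formulation; the state space, reward table, and action set are assumptions made purely for illustration.

```python
import numpy as np

n_headings, n_steps = 12, 10          # discretised sensor headings and planning horizon
actions = [-1, 0, 1]                  # rotate left, hold, rotate right (one bin per step)
gamma = 0.95

# Hypothetical expected information gain for each (time, heading) pair.
rng = np.random.default_rng(2)
reward = rng.uniform(0.0, 1.0, size=(n_steps, n_headings))

# Finite-horizon value iteration, sweeping backward in time.
V = np.zeros((n_steps + 1, n_headings))
policy = np.zeros((n_steps, n_headings), dtype=int)
for t in range(n_steps - 1, -1, -1):
    for h in range(n_headings):
        q = [reward[t, (h + a) % n_headings] + gamma * V[t + 1, (h + a) % n_headings]
             for a in actions]
        policy[t, h] = int(np.argmax(q))
        V[t, h] = max(q)

print("best first action from heading 0:", actions[policy[0, 0]])
```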
Satellite Imagery Assisted Road-Based Visual Navigation System
NASA Astrophysics Data System (ADS)
Volkova, A.; Gibbens, P. W.
2016-06-01
There is a growing demand for unmanned aerial systems as autonomous surveillance, exploration and remote sensing solutions. Among the key concerns for robust operation of these systems is the need to reliably navigate the environment without reliance on a global navigation satellite system (GNSS). This is of particular concern in Defence circles, but is also a major safety issue for commercial operations. In these circumstances, the aircraft needs to navigate relying only on information from on-board passive sensors such as digital cameras. The autonomous feature-based visual system presented in this work offers a novel integral approach to the modelling and registration of visual features that responds to the specific needs of the navigation system. It detects visual features from Google Earth* imagery to build a feature database. The same algorithm then detects features in an on-board camera's video stream. On one level this serves to localise the vehicle relative to the environment using Simultaneous Localisation and Mapping (SLAM). On a second level it correlates the features with the database to localise the vehicle with respect to the inertial frame. The performance of the presented visual navigation system was compared using satellite imagery from different years. Based on the comparison results, an analysis of the effects of seasonal, structural and qualitative changes of the imagery source on the performance of the navigation algorithm is presented. * The algorithm is independent of the source of satellite imagery and another provider can be used.
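A minimal sketch of the database-matching idea is given below using ORB features and brute-force matching in OpenCV; the image file names are placeholders, and the actual system's feature detector, descriptor, and SLAM back end may differ.

```python
import cv2
import numpy as np

# Placeholder image paths: a georeferenced satellite patch and an onboard camera frame.
sat_patch = cv2.imread("satellite_patch.png", cv2.IMREAD_GRAYSCALE)
cam_frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)
kp_sat, des_sat = orb.detectAndCompute(sat_patch, None)
kp_cam, des_cam = orb.detectAndCompute(cam_frame, None)

# Brute-force Hamming matching with cross-checking to reject weak correspondences.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_cam, des_sat), key=lambda m: m.distance)

# A homography estimated from the best matches relates camera pixels to the
# georeferenced patch, giving an absolute position fix for the aircraft.
if len(matches) >= 4:
    src = np.float32([kp_cam[m.queryIdx].pt for m in matches[:50]]).reshape(-1, 1, 2)
    dst = np.float32([kp_sat[m.trainIdx].pt for m in matches[:50]]).reshape(-1, 1, 2)
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    print("inlier matches:", int(mask.sum()))
```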
Structured Kernel Subspace Learning for Autonomous Robot Navigation.
Kim, Eunwoo; Choi, Sungjoon; Oh, Songhwai
2018-02-14
This paper considers two important problems for autonomous robot navigation in a dynamic environment, where the goal is to predict pedestrian motion and control a robot with the prediction for safe navigation. While there are several methods for predicting the motion of a pedestrian and controlling a robot to avoid incoming pedestrians, it is still difficult to navigate safely in a dynamic environment due to challenges such as the varying quality and complexity of training data with unwanted noise. This paper addresses these challenges simultaneously by proposing a robust kernel subspace learning algorithm based on recent advances in nuclear-norm and l1-norm minimization. We model the motion of a pedestrian and the robot controller using Gaussian processes. The proposed method efficiently approximates a kernel matrix used in Gaussian process regression by learning a low-rank structured matrix (with symmetric positive semi-definiteness) to find an orthogonal basis, which eliminates the effects of erroneous and inconsistent data. Based on structured kernel subspace learning, we propose a robust motion model and motion controller for safe navigation in dynamic environments. We evaluate the proposed robust kernel learning in various tasks, including regression, motion prediction, and motion control problems, and demonstrate that the proposed learning-based systems are robust against outliers and outperform existing regression and navigation methods.
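As a rough illustration of the underlying idea (replacing the full kernel matrix with a low-rank, symmetric positive semi-definite approximation before Gaussian process regression), consider the toy sketch below. The paper's structured nuclear-norm and l1-norm formulation is considerably more sophisticated than the simple truncated eigendecomposition used here; the data, kernel, and noise level are invented for illustration.

```python
import numpy as np

def rbf_kernel(X1, X2, length=1.0):
    """Squared-exponential kernel between two sets of inputs."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length ** 2)

def low_rank_psd(K, rank):
    """Rank-r symmetric PSD approximation via truncated eigendecomposition."""
    w, V = np.linalg.eigh(K)
    idx = np.argsort(w)[::-1][:rank]
    w = np.clip(w[idx], 0.0, None)
    return (V[:, idx] * w) @ V[:, idx].T

rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=60)    # noisy training data
Xs = np.linspace(-3, 3, 100)[:, None]               # test inputs

K = low_rank_psd(rbf_kernel(X, X), rank=10)          # low-rank "denoised" kernel
Ks = rbf_kernel(Xs, X)
alpha = np.linalg.solve(K + 0.1 * np.eye(len(X)), y)
mean = Ks @ alpha                                    # GP predictive mean
print(mean[:5])
```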
Real-time visual mosaicking and navigation on the seafloor
NASA Astrophysics Data System (ADS)
Richmond, Kristof
Remote robotic exploration holds vast potential for gaining knowledge about extreme environments accessible to humans only with great difficulty. Robotic explorers have been sent to other solar system bodies, and on this planet into inaccessible areas such as caves and volcanoes. In fact, the largest unexplored land area on earth lies hidden in the airless cold and intense pressure of the ocean depths. Exploration in the oceans is further hindered by water's high absorption of electromagnetic radiation, which both inhibits remote sensing from the surface, and limits communications with the bottom. The Earth's oceans thus provide an attractive target for developing remote exploration capabilities. As a result, numerous robotic vehicles now routinely survey this environment, from remotely operated vehicles piloted over tethers from the surface to torpedo-shaped autonomous underwater vehicles surveying the mid-waters. However, these vehicles are limited in their ability to navigate relative to their environment. This limits their ability to return to sites with precision without the use of external navigation aids, and to maneuver near and interact with objects autonomously in the water and on the sea floor. The enabling of environment-relative positioning on fully autonomous underwater vehicles will greatly extend their power and utility for remote exploration in the furthest reaches of the Earth's waters---even under ice and under ground---and eventually in extraterrestrial liquid environments such as Europa's oceans. This thesis presents an operational, fielded system for visual navigation of underwater robotic vehicles in unexplored areas of the seafloor. The system does not depend on external sensing systems, using only instruments on board the vehicle. As an area is explored, a camera is used to capture images and a composite view, or visual mosaic, of the ocean bottom is created in real time. Side-to-side visual registration of images is combined with dead-reckoned navigation information in a framework allowing the creation and updating of large, locally consistent mosaics. These mosaics are used as maps in which the vehicle can navigate and localize itself with respect to points in the environment. The system achieves real-time performance in several ways. First, wherever possible, direct sensing of motion parameters is used in place of extracting them from visual data. Second, trajectories are chosen to enable a hierarchical search for side-to-side links which limits the amount of searching performed without sacrificing robustness. Finally, the map estimation is formulated as a sparse, linear information filter allowing rapid updating of large maps. The visual navigation enabled by the work in this thesis represents a new capability for remotely operated vehicles, and an enabling capability for a new generation of autonomous vehicles which explore and interact with remote, unknown and unstructured underwater environments. The real-time mosaic can be used on current tethered vehicles to create pilot aids and provide a vehicle user with situational awareness of the local environment and the position of the vehicle within it. For autonomous vehicles, the visual navigation system enables precise environment-relative positioning and mapping, without requiring external navigation systems, opening the way for ever-expanding autonomous exploration capabilities. 
The utility of this system was demonstrated in the field at sites of scientific interest using the ROVs Ventana and Tiburon operated by the Monterey Bay Aquarium Research Institute. A number of sites in and around Monterey Bay, California were mosaicked using the system, culminating in a complete imaging of the wreck site of the USS Macon , where real-time visual mosaics containing thousands of images were generated while navigating using only sensor systems on board the vehicle.
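For readers unfamiliar with the information-filter formulation mentioned above, the following schematic shows how dead-reckoned motion links and pairwise image-registration constraints can be folded into a sparse information matrix and vector over image poses, then solved for a locally consistent map. The planar two-dimensional poses, noise levels, and loop-closure values are invented for illustration; the thesis system carries a far richer state.

```python
import numpy as np

n_frames, dim = 5, 2                    # five image poses, planar (x, y) only
N = n_frames * dim
Lam = np.zeros((N, N))                  # information matrix (sparse in practice)
eta = np.zeros(N)

def add_prior(i, z, sigma):
    """Anchor pose i near z (makes the information matrix invertible)."""
    global Lam, eta
    H = np.zeros((dim, N))
    H[:, i * dim:(i + 1) * dim] = np.eye(dim)
    Rinv = np.eye(dim) / sigma ** 2
    Lam += H.T @ Rinv @ H
    eta += H.T @ Rinv @ z

def add_relative_constraint(i, j, z, sigma):
    """Fold a measured offset z = pose_j - pose_i into the information form."""
    global Lam, eta
    H = np.zeros((dim, N))
    H[:, i * dim:(i + 1) * dim] = -np.eye(dim)
    H[:, j * dim:(j + 1) * dim] = np.eye(dim)
    Rinv = np.eye(dim) / sigma ** 2
    Lam += H.T @ Rinv @ H
    eta += H.T @ Rinv @ z

add_prior(0, np.zeros(2), 1e-3)                                 # fix the first pose
for k in range(n_frames - 1):                                   # dead-reckoned links
    add_relative_constraint(k, k + 1, np.array([1.0, 0.0]), 0.1)
add_relative_constraint(0, 4, np.array([4.05, -0.02]), 0.02)    # loop-closing registration

poses = np.linalg.solve(Lam, eta).reshape(n_frames, dim)        # recover the map estimate
print(poses)
```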
Autonomous precision landing using terrain-following navigation
NASA Technical Reports Server (NTRS)
Vaughan, R. M.; Gaskell, R. W.; Halamek, P.; Klumpp, A. R.; Synnott, S. P.
1991-01-01
Terrain-following navigation studies that have been done over the past two years in the navigation system section at JPL are described. A descent to Mars scenario based on Mars Rover and Sample Return mission profiles is described, and navigation and image processing issues pertaining to descent phases where landmark pictures can be obtained are examined. A covariance analysis is performed to verify that landmark measurements from a terrain-following navigation system can satisfy precision landing requirements. Image processing problems involving known landmarks in actual pictures are considered. Mission design alternatives that can alleviate some of these problems are suggested.
A Self-Tuning Kalman Filter for Autonomous Navigation Using the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Truong, Son H.
1999-01-01
Most navigation systems currently operated by NASA are ground-based and require extensive support to produce accurate results. Recently developed systems that use a Kalman filter and Global Positioning System (GPS) data for orbit determination greatly reduce dependency on ground support, and have the potential to provide significant economies for NASA spacecraft navigation. These systems, however, still rely on manual tuning by analysts. A sophisticated neuro-fuzzy component fully integrated with the flight navigation system can provide self-tuning capability for the Kalman filter and help the navigation system recover from estimation errors in real time.
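The paper's tuner is a neuro-fuzzy component; the sketch below instead uses a much cruder innovation-based heuristic, purely to illustrate what "self-tuning" of the process-noise level in a scalar Kalman filter can look like. The filter model, noise values, and adaptation rule are assumptions for illustration only.

```python
import numpy as np

def self_tuning_kf(zs, q0=1e-3, r=0.5, window=20):
    """Scalar random-walk Kalman filter that rescales its process noise q
    whenever recent normalised innovations are too large or too small
    (a crude stand-in for the neuro-fuzzy tuner described in the paper)."""
    x, P, q = 0.0, 1.0, q0
    innovations, estimates = [], []
    for z in zs:
        P += q                          # time update (random-walk dynamics)
        S = P + r                       # innovation variance
        nu = z - x                      # innovation
        K = P / S
        x += K * nu
        P *= (1 - K)
        innovations.append(nu ** 2 / S)
        if len(innovations) >= window:
            ratio = np.mean(innovations[-window:])   # ~1 if the filter is well tuned
            q *= np.clip(ratio, 0.5, 2.0)            # gentle self-tuning step
        estimates.append(x)
    return np.array(estimates), q

rng = np.random.default_rng(4)
truth = np.cumsum(rng.normal(0, 0.05, 500))          # slowly drifting state
meas = truth + rng.normal(0, np.sqrt(0.5), 500)
est, q_final = self_tuning_kf(meas)
print("adapted process noise:", q_final)
```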
Autonomous Navigation Apparatus With Neural Network for a Mobile Vehicle
NASA Technical Reports Server (NTRS)
Quraishi, Naveed (Inventor)
1996-01-01
An autonomous navigation system for a mobile vehicle arranged to move within an environment includes a plurality of sensors arranged on the vehicle and at least one neural network including an input layer coupled to the sensors, a hidden layer coupled to the input layer, and an output layer coupled to the hidden layer. The neural network produces output signals representing respective positions of the vehicle, such as the X coordinate, the Y coordinate, and the angular orientation of the vehicle. A plurality of patch locations within the environment are used to train the neural networks to produce the correct outputs in response to the distances sensed.
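A minimal forward pass matching the structure described in the abstract (sensor inputs, one hidden layer, outputs for X, Y, and heading) might look like the sketch below. The weights shown are random placeholders; the patent trains the network from distances measured at known patch locations in the environment.

```python
import numpy as np

rng = np.random.default_rng(5)
n_sensors, n_hidden = 8, 16             # e.g., eight range sensors on the vehicle

# Placeholder weights; in the patented system these are learned from training
# data collected at a set of known patch locations.
W1, b1 = rng.normal(size=(n_hidden, n_sensors)), np.zeros(n_hidden)
W2, b2 = rng.normal(size=(3, n_hidden)), np.zeros(3)

def estimate_pose(sensor_distances):
    """Map raw sensor distances to (x, y, heading) with a one-hidden-layer network."""
    h = np.tanh(W1 @ sensor_distances + b1)
    x, y, heading = W2 @ h + b2
    return x, y, heading

print(estimate_pose(rng.uniform(0.2, 5.0, size=n_sensors)))
```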
Daytime Water Detection by Fusing Multiple Cues for Autonomous Off-Road Navigation
NASA Technical Reports Server (NTRS)
Rankin, A. L.; Matthies, L. H.; Huertas, A.
2004-01-01
Detecting water hazards is a significant challenge to unmanned ground vehicle autonomous off-road navigation. This paper focuses on detecting the presence of water during the daytime using color cameras. A multi-cue approach is taken. Evidence of the presence of water is generated from color, texture, and the detection of reflections in stereo range data. A rule base for fusing water cues was developed by evaluating detection results from an extensive archive of data collection imagery containing water. This software has been implemented into a run-time passive perception subsystem and tested thus far under Linux on a Pentium based processor.
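A toy version of such a cue-fusion rule is sketched below: per-pixel confidence scores from color, texture, and stereo-reflection detectors (all hypothetical here) are combined with simple thresholds, loosely mirroring the rule-base approach described. The thresholds and random cue maps are illustrative only, not the JPL rule base.

```python
import numpy as np

def fuse_water_cues(color_score, texture_score, reflection_score,
                    strong=0.8, weak=0.4):
    """Combine per-pixel cue confidences (arrays in 0..1) into a water mask.
    A strong single cue, or at least two mutually supporting weak cues, marks water."""
    strong_any = (color_score > strong) | (reflection_score > strong)
    weak_pair = ((color_score > weak).astype(int) +
                 (texture_score > weak).astype(int) +
                 (reflection_score > weak).astype(int)) >= 2
    return strong_any | weak_pair

rng = np.random.default_rng(6)
shape = (120, 160)
mask = fuse_water_cues(rng.random(shape), rng.random(shape), rng.random(shape))
print("water pixels:", int(mask.sum()))
```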
Obstacle Avoidance On Roadways Using Range Data
NASA Astrophysics Data System (ADS)
Dunlay, R. Terry; Morgenthaler, David G.
1987-02-01
This report describes range data based obstacle avoidance techniques developed for use on an autonomous road-following robot vehicle. The purpose of these techniques is to detect and locate obstacles present in a road environment for navigation of a robot vehicle equipped with an active laser-based range sensor. Techniques are presented for obstacle detection, obstacle location, and coordinate transformations needed in the construction of Scene Models (symbolic structures representing the 3-D obstacle boundaries used by the vehicle's Navigator for path planning). These techniques have been successfully tested on an outdoor robotic vehicle, the Autonomous Land Vehicle (ALV), at speeds up to 3.5 km/hour.
Dynamic multisensor fusion for mobile robot navigation in an indoor environment
NASA Astrophysics Data System (ADS)
Jin, Taeseok; Lee, Jang-Myung; Luk, Bing L.; Tso, Shiu K.
2001-10-01
This study is a preliminary step toward developing a multi-purpose, robust autonomous carrier mobile robot to transport trolleys or heavy goods and to serve as a robotic nursing assistant in hospital wards. The aim of this paper is to present the use of multi-sensor data fusion (sonar, CCD camera, and IR sensors) for a map-building mobile robot to navigate, and to present an experimental mobile robot designed to operate autonomously within both indoor and outdoor environments. Smart sensory systems are crucial for successful autonomous systems. We explain the robot system architecture designed and implemented in this study and give only a short review of existing techniques, since several recent thorough books and review papers cover that material. Instead we focus on the main results with relevance to the intelligent service robot project at the Centre of Intelligent Design, Automation & Manufacturing (CIDAM). We conclude by discussing some possible future extensions of the project. The general principles of the navigation and guidance architecture are treated first, then the detailed functions of environment recognition and updating, obstacle detection, and motion assessment, together with the first results from the simulation runs.
Neuro-fuzzy controller to navigate an unmanned vehicle.
Selma, Boumediene; Chouraqui, Samira
2013-12-01
A neuro-fuzzy control method for an Unmanned Vehicle (UV) simulation is described. The objective is guiding an autonomous vehicle to a desired destination along a desired path in an environment characterized by a terrain and a set of distinct objects, such as obstacles like donkeys, traffic lights, and cars circulating in the trajectory. The vehicle's autonomous navigation ability and road-following precision are mainly influenced by its control strategy and real-time control performance. A fuzzy logic controller can describe the desired system behavior very well with simple "if-then" relations, but the designer has to derive the "if-then" rules manually by trial and error. On the other hand, neural networks perform function approximation of a system, but the solution obtained can neither be interpreted nor checked for plausibility. The two approaches are complementary: combining them, neural networks provide learning capability while fuzzy logic brings knowledge representation (neuro-fuzzy). In this paper, an artificial neural network fuzzy inference system (ANFIS) controller is described and implemented to navigate the autonomous vehicle. Results show several improvements in the control system adjusted by neuro-fuzzy techniques in comparison to previous methods like the Artificial Neural Network (ANN).
NASA Astrophysics Data System (ADS)
Brown, C. David; Ih, Charles S.; Arce, Gonzalo R.; Fertell, David A.
1987-01-01
Vision systems for mobile robots or autonomous vehicles navigating in an unknown terrain environment must provide a rapid and accurate method of segmenting the scene ahead into regions of pathway and background. A major distinguishing feature between the pathway and background is the three dimensional texture of these two regions. Typical methods of textural image segmentation are very computationally intensive, often lack the required robustness, and are incapable of sensing the three dimensional texture of various regions of the scene. A method is presented where scanned laser projected lines of structured light, viewed by a stereoscopically located single video camera, resulted in an image in which the three dimensional characteristics of the scene were represented by the discontinuity of the projected lines. This image was conducive to processing with simple regional operators to classify regions as pathway or background. Design of some operators and application methods, and demonstration on sample images are presented. This method provides rapid and robust scene segmentation capability that has been implemented on a microcomputer in near real time, and should result in higher speed and more reliable robotic or autonomous navigation in unstructured environments.
Non-destructive inspection in industrial equipment using robotic mobile manipulation
NASA Astrophysics Data System (ADS)
Maurtua, Iñaki; Susperregi, Loreto; Ansuategui, Ander; Fernández, Ane; Ibarguren, Aitor; Molina, Jorge; Tubio, Carlos; Villasante, Cristobal; Felsch, Torsten; Pérez, Carmen; Rodriguez, Jorge R.; Ghrissi, Meftah
2016-05-01
The MAINBOT project has developed service-robot-based applications to autonomously execute inspection tasks in extensive industrial plants, in equipment that is arranged horizontally (using ground robots) or vertically (climbing robots). The industrial objective has been to provide a means to help measure several physical parameters at multiple points with autonomous robots able to navigate and climb structures while handling non-destructive testing sensors. MAINBOT has validated the solutions in two solar thermal plants (cylindrical-parabolic collectors and central tower), which are very demanding from a mobile manipulation point of view, mainly due to their extension (e.g., a 50 MW thermal solar plant with 400 hectares, 400,000 mirrors, 180 km of absorber tubes, and a 140 m tall tower), the variability of conditions (outdoor, day-night), safety requirements, etc. Once the technology was validated in simulation, the system was deployed in real setups and different validation tests were carried out. In this paper two of the achievements related to the ground mobile inspection system are presented: (1) autonomous navigation, localization and planning algorithms to manage navigation over very large areas, and (2) non-destructive inspection operations: thermography-based detection algorithms to provide automatic inspection abilities to the robots.
Cerebellum Augmented Rover Development
NASA Technical Reports Server (NTRS)
King, Matthew
2005-01-01
Bio-Inspired Technologies and Systems (BITS) are a very natural result of thinking about Nature's way of solving problems. Knowledge of animal behaviors can be used in developing robotic behaviors intended for planetary exploration. This is the expertise of the JPL BITS Group and has served as a philosophical model for NMSU RioRobolab. Navigation is a vital function for any autonomous system. Systems must have the ability to determine a safe path between their current location and some target location. The MER mission, as well as other JPL rover missions, uses a method known as dead-reckoning to determine position information. Dead-reckoning uses wheel encoders to sense the wheels' rotation. In a sandy environment such as Mars, this method is highly inaccurate because the wheels will slip in the sand. Reducing positioning error will allow the speed of an autonomously navigating rover to be greatly increased. Therefore, local navigation based upon landmark tracking is desirable in planetary exploration. The BITS Group is developing navigation technology based upon landmark tracking. Integration of the current rover architecture with a cerebellar neural network tracking algorithm will demonstrate that this approach to navigation is feasible and should be implemented in future rover and spacecraft missions.
NASA Technical Reports Server (NTRS)
D'Souza, Christopher; Milenkovich, Zoran; Wilson, Zachary; Huich, David; Bendle, John; Kibler, Angela
2011-01-01
The Space Operations Simulation Center (SOSC) at the Lockheed Martin (LM) Waterton Campus in Littleton, Colorado is a dynamic test environment focused on Autonomous Rendezvous and Docking (AR&D) development testing and risk reduction activities. The SOSC supports multiple program pursuits and accommodates testing of Guidance, Navigation, and Control (GN&C) algorithms for relative navigation, hardware testing and characterization, as well as software and test process development. The SOSC consists of a high bay (60 meters long by 15.2 meters wide by 15.2 meters tall) with dual six degree-of-freedom (6DOF) motion simulators and a single fixed-base 6DOF robot. The large testing area (maximum sensor-to-target effective range of 60 meters) allows for large-scale, flight-like simulations of proximity maneuvers and docking events. The facility also has two apertures for access to external extended-range outdoor target test operations. In addition, the facility contains four Mission Operations Centers (MOCs) with connectivity to dual high bay control rooms and a data/video interface room. The high bay is rated at Class 300,000 cleanliness (maximum allowable count of particles 0.5 micrometers and larger) and includes orbital lighting simulation capabilities.
The study of stereo vision technique for the autonomous vehicle
NASA Astrophysics Data System (ADS)
Li, Pei; Wang, Xi; Wang, Jiang-feng
2015-08-01
Stereo vision technology using two or more cameras can recover 3D information within the field of view. This technology can effectively help the autonomous navigation system of an unmanned vehicle to judge pavement conditions within the field of view and to measure the obstacles on the road. In this paper, stereo vision technology for obstacle measurement and avoidance on an autonomous vehicle is studied, and the key techniques are analyzed and discussed. The system hardware is built and the software is debugged, and finally the measurement performance is illustrated with measured data. Experiments show that the 3D structure within the field of view can be reconstructed effectively by the stereo vision technology, providing a basis for pavement condition judgment. Compared with the radar used in unmanned vehicle navigation measurement systems, the stereo vision system has advantages such as low cost and long range, and it has good application prospects.
HERMIES-3: A step toward autonomous mobility, manipulation, and perception
NASA Technical Reports Server (NTRS)
Weisbin, C. R.; Burks, B. L.; Einstein, J. R.; Feezell, R. R.; Manges, W. W.; Thompson, D. H.
1989-01-01
HERMIES-III is an autonomous robot comprised of a seven degree-of-freedom (DOF) manipulator designed for human scale tasks, a laser range finder, a sonar array, an omni-directional wheel-driven chassis, multiple cameras, and a dual computer system containing a 16-node hypercube expandable to 128 nodes. The current experimental program involves performance of human-scale tasks (e.g., valve manipulation, use of tools), integration of a dexterous manipulator and platform motion in geometrically complex environments, and effective use of multiple cooperating robots (HERMIES-IIB and HERMIES-III). The environment in which the robots operate has been designed to include multiple valves, pipes, meters, obstacles on the floor, valves occluded from view, and multiple paths of differing navigation complexity. The ongoing research program supports the development of autonomous capability for HERMIES-IIB and III to perform complex navigation and manipulation under time constraints, while dealing with imprecise sensory information.
Sahl, Jason W; Fairfield, Nathaniel; Harris, J Kirk; Wettergreen, David; Stone, William C; Spear, John R
2010-03-01
The deep phreatic thermal explorer (DEPTHX) is an autonomous underwater vehicle designed to navigate an unexplored environment, generate high-resolution three-dimensional (3-D) maps, collect biological samples based on an autonomous sampling decision, and return to its origin. In the spring of 2007, DEPTHX was deployed in Zacatón, a deep (approximately 318 m), limestone, phreatic sinkhole (cenote) in northeastern Mexico. As DEPTHX descended, it generated a 3-D map based on the processing of range data from 54 onboard sonars. The vehicle collected water column samples and wall biomat samples throughout the depth profile of the cenote. Post-expedition sample analysis via comparative analysis of 16S rRNA gene sequences revealed a wealth of microbial diversity. Traditional Sanger gene sequencing combined with a barcoded-amplicon pyrosequencing approach revealed novel, phylum-level lineages from the domains Bacteria and Archaea; in addition, several novel subphylum lineages were also identified. Overall, DEPTHX successfully navigated and mapped Zacatón, and collected biological samples based on an autonomous decision, which revealed novel microbial diversity in a previously unexplored environment.
Systems and Methods for Automated Vessel Navigation Using Sea State Prediction
NASA Technical Reports Server (NTRS)
Huntsberger, Terrance L. (Inventor); Howard, Andrew B. (Inventor); Reinhart, Rene Felix (Inventor); Aghazarian, Hrand (Inventor); Rankin, Arturo (Inventor)
2017-01-01
Systems and methods for sea state prediction and autonomous navigation in accordance with embodiments of the invention are disclosed. One embodiment of the invention includes a method of predicting a future sea state including generating a sequence of at least two 3D images of a sea surface using at least two image sensors, detecting peaks and troughs in the 3D images using a processor, identifying at least one wavefront in each 3D image based upon the detected peaks and troughs using the processor, characterizing at least one propagating wave based upon the propagation of wavefronts detected in the sequence of 3D images using the processor, and predicting a future sea state using at least one propagating wave characterizing the propagation of wavefronts in the sequence of 3D images using the processor. Another embodiment includes a method of autonomous vessel navigation based upon a predicted sea state and target location.
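A one-dimensional cartoon of the processing chain in the claim is sketched below: find peaks in two successive surface-height profiles, estimate the wave propagation speed from the peak shift, and extrapolate the latest profile a short time ahead. The patented system operates on 3D stereo reconstructions of the sea surface; the synthetic profile, grid spacing, and prediction horizon here are invented for illustration.

```python
import numpy as np

def find_peaks_1d(h):
    """Indices of local maxima in a 1D surface-height profile."""
    return np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1

def predict_profile(h0, h1, dt, dx, horizon):
    """Estimate wave speed from peak displacement between two frames and
    shift the latest profile forward in time by `horizon` seconds."""
    p0, p1 = find_peaks_1d(h0), find_peaks_1d(h1)
    shift = np.median([min(p1 - q, key=abs) for q in p0])   # cells moved in dt
    speed = shift * dx / dt                                  # m/s
    cells = int(round(speed * horizon / dx))
    return np.roll(h1, cells), speed

x = np.linspace(0, 100, 400)
wave = lambda t: np.sin(2 * np.pi * (x - 3.0 * t) / 25.0)    # synthetic 3 m/s wave
pred, v = predict_profile(wave(0.0), wave(0.5), dt=0.5, dx=x[1] - x[0], horizon=2.0)
print("estimated wave speed (m/s):", round(v, 2))
```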
Systems and Methods for Automated Vessel Navigation Using Sea State Prediction
NASA Technical Reports Server (NTRS)
Aghazarian, Hrand (Inventor); Reinhart, Rene Felix (Inventor); Huntsberger, Terrance L. (Inventor); Rankin, Arturo (Inventor); Howard, Andrew B. (Inventor)
2015-01-01
Systems and methods for sea state prediction and autonomous navigation in accordance with embodiments of the invention are disclosed. One embodiment of the invention includes a method of predicting a future sea state including generating a sequence of at least two 3D images of a sea surface using at least two image sensors, detecting peaks and troughs in the 3D images using a processor, identifying at least one wavefront in each 3D image based upon the detected peaks and troughs using the processor, characterizing at least one propagating wave based upon the propagation of wavefronts detected in the sequence of 3D images using the processor, and predicting a future sea state using at least one propagating wave characterizing the propagation of wavefronts in the sequence of 3D images using the processor. Another embodiment includes a method of autonomous vessel navigation based upon a predicted sea state and target location.
NASA Technical Reports Server (NTRS)
Larimer, Stanley J.; Lisec, Thomas R.; Spiessbach, Andrew J.
1989-01-01
Under a contract with NASA's Jet Propulsion Laboratory, Martin Marietta has developed several alternative rover concepts for unmanned exploration of the planet Mars. One of those concepts, the 'Walking Beam', is the subject of this paper. This concept was developed with the goal of achieving many of the capabilities of more sophisticated articulated-leg walkers with a much simpler, more robust, less computationally demanding and more power efficient design. It consists of two large-base tripods nested one within the other which alternately translate with respect to each other along a 5-meter beam to propel the vehicle. The semiautonomous navigation system relies on terrain geometry sensors and tactile feedback from each foot to autonomously select a path which avoids hazards along a route designated from Earth. Both mobility and navigation features of this concept are discussed, including a top-level description of the vehicle's physical characteristics, deployment strategy, mobility elements, sensor suite, theory of operation, navigation and control processes, and estimated performance.
NASA Technical Reports Server (NTRS)
2004-01-01
This pair of pieced-together images was taken by the Mars Exploration Rover Spirit's left navigation camera looking aft on March 6, 2004. It reveals the long and rocky path of nearly 240 meters (787 feet) that Spirit had traveled since safely arriving at Gusev Crater on Jan. 3, 2004.
The lander can still be seen in the distance, but will never be 'home' again for the journeying rover. This image is also a tribute to the effectiveness of the autonomous navigation system that the rovers use during parts of their martian drives. Instead of driving directly through the 'hollow' seen in the middle right of the image, the autonomous navigation system guided Spirit around the high ridge bordering the hollow. In the two days after these images were taken, Spirit traveled roughly 60 meters (197 feet) farther toward its destination at the crater nicknamed 'Bonneville'.
NASA Astrophysics Data System (ADS)
Henderson, D. W.
Military users are becoming increasingly dependent on satellites for vital services related to communication, surveillance information, navigation, and meteorological data. Current military spacecraft, however, need the services of a ground support network which is vulnerable to a variety of threats. It has, therefore, been proposed to decrease the dependence of the satellites on the ground segment by improving satellite autonomy, and the Satellite Autonomy Program at the recently created Air Force Space Technology Center is developing the Autonomous Redundancy and Maintenance Management Subsystem (ARMMS) as a near-term generic autonomy solution. Attention is given to the implementation of autonomy and the technological requirements for ensuring autonomy.
Foliage discrimination using a rotating ladar
NASA Technical Reports Server (NTRS)
Castano, A.; Matthies, L.
2003-01-01
We present a real time algorithm that detects foliage using range from a rotating laser. Objects not classified as foliage are conservatively labeled as non-driving obstacles. In contrast to related work that uses range statistics to classify objects, we exploit the expected localities and continuities of an obstacle, in both space and time. Also, instead of attempting to find a single accurate discriminating factor for every ladar return, we hypothesize the class of some few returns and then spread the confidence (and classification) to other returns using the locality constraints. The Urbie robot is presently using this algorithm to discriminate drivable grass from obstacles during outdoor autonomous navigation tasks.
Autonomous navigation system. [gyroscopic pendulum for air navigation
NASA Technical Reports Server (NTRS)
Merhav, S. J. (Inventor)
1981-01-01
An inertial navigation system utilizing a servo-controlled two degree of freedom pendulum to obtain specific force components in the locally level coordinate system is described. The pendulum includes a leveling gyroscope and an azimuth gyroscope supported on a two gimbal system. The specific force components in the locally level coordinate system are converted to components in the geographical coordinate system by means of a single Euler transformation. The standard navigation equations are solved to determine longitudinal and lateral velocities. Finally, vehicle position is determined by a further integration.
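A highly simplified, flat-Earth illustration of the processing outlined above follows: rotate the locally level specific-force components into the geographic frame with a single Euler (heading) rotation, remove gravity, and integrate twice for velocity and position. The actual system solves the full navigation equations; the gravity model, frames, and inputs below are simplifying assumptions for illustration only.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])    # m/s^2, z-up geographic frame (flat-Earth)

def dead_reckon(specific_force_level, headings, dt):
    """Integrate locally level specific force into geographic velocity and position.
    specific_force_level: (N, 3) specific force in the level frame (m/s^2)
    headings: (N,) heading angles (rad) used for the single Euler rotation."""
    vel, pos, track = np.zeros(3), np.zeros(3), []
    for f, psi in zip(specific_force_level, headings):
        c, s = np.cos(psi), np.sin(psi)
        C = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])  # level -> geographic
        accel = C @ f + GRAVITY          # remove gravity to get kinematic acceleration
        vel = vel + accel * dt
        pos = pos + vel * dt
        track.append(pos.copy())
    return np.array(track)

# Example: constant 0.1 m/s^2 forward specific force with the vehicle held level.
N, dt = 600, 0.1
f = np.tile([0.1, 0.0, 9.81], (N, 1))    # forward push plus gravity reaction
track = dead_reckon(f, np.zeros(N), dt)
print("final position (m):", track[-1])
```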
Automation and robotics technology for intelligent mining systems
NASA Technical Reports Server (NTRS)
Welsh, Jeffrey H.
1989-01-01
The U.S. Bureau of Mines is approaching the problems of accidents and efficiency in the mining industry through the application of automation and robotics to mining systems. This technology can increase safety by removing workers from hazardous areas of the mines or from performing hazardous tasks. The short-term goal of the Automation and Robotics program is to develop technology that can be implemented in the form of an autonomous mining machine using current continuous mining machine equipment. In the longer term, the goal is to conduct research that will lead to new intelligent mining systems that capitalize on the capabilities of robotics. The Bureau of Mines Automation and Robotics program has been structured to produce the technology required for the short- and long-term goals. The short-term goal of application of automation and robotics to an existing mining machine, resulting in autonomous operation, is expected to be accomplished within five years. Key technology elements required for an autonomous continuous mining machine are well underway and include machine navigation systems, coal-rock interface detectors, machine condition monitoring, and intelligent computer systems. The Bureau of Mines program is described, including status of key technology elements for an autonomous continuous mining machine, the program schedule, and future work. Although the program is directed toward underground mining, much of the technology being developed may have applications for space systems or mining on the Moon or other planets.
Multi-agent robotic systems and applications for satellite missions
NASA Astrophysics Data System (ADS)
Nunes, Miguel A.
A revolution in the space sector is happening. It is expected that in the next decade there will be more satellites launched than in the previous sixty years of space exploration. Major challenges are associated with this growth of space assets, such as the autonomy and management of large groups of satellites, in particular of small satellites. There are two main objectives for this work. First, a flexible and distributed software architecture is presented to expand the possibilities of spacecraft autonomy and in particular autonomous motion in attitude and position. The approach taken is based on the concept of distributed software agents, also referred to as a multi-agent robotic system. Agents are defined as software programs that are social, reactive and proactive to autonomously maximize the chances of achieving the set goals. Part of the work is to demonstrate that a multi-agent robotic system is a feasible approach for different problems of autonomy such as satellite attitude determination and control and autonomous rendezvous and docking. The second main objective is to develop a method to optimize multi-satellite configurations in space, also known as satellite constellations. This automated method generates new optimal mega-constellation designs for Earth observations and fast revisit times on large ground areas. The optimal satellite constellation can be used by researchers as the baseline for new missions. The first contribution of this work is the development of a new multi-agent robotic system for distributing the attitude determination and control subsystem for HiakaSat. The multi-agent robotic system is implemented and tested on the satellite hardware-in-the-loop testbed that simulates a representative space environment. The results show that the newly proposed system for this particular case achieves an equivalent control performance when compared to the monolithic implementation. In terms of computational efficiency it is found that the multi-agent robotic system has a consistently lower CPU load of 0.29 +/- 0.03 compared to 0.35 +/- 0.04 for the monolithic implementation, a 17.1 % reduction. The second contribution of this work is the development of a multi-agent robotic system for the autonomous rendezvous and docking of multiple spacecraft. To compute the maneuvers, guidance, navigation and control algorithms are implemented as part of the multi-agent robotic system. The navigation and control functions are implemented using existing algorithms, but one important contribution of this section is the introduction of a new six degrees of freedom guidance method which is part of the guidance, navigation and control architecture. This new method is an explicit solution to the guidance problem, and is particularly useful for real time guidance for attitude and position, as opposed to typical guidance methods which are based on numerical solutions, and therefore are computationally intensive. A simulation scenario is run for docking four CubeSats deployed radially from a launch vehicle. Considering fully actuated CubeSats, the simulations show docking maneuvers that are successfully completed within 25 minutes, which is approximately 30% of a full orbital period in low Earth orbit. The final section investigates the problem of optimization of satellite constellations for fast revisit time, and introduces a new method to generate different constellation configurations that are evaluated with a genetic algorithm. Two case studies are presented.
The first is the optimization of a constellation for rapid coverage of the oceans of the globe in 24 hours or less. Results show that for an 80 km sensor swath width 50 satellites are required to cover the oceans with a 24 hour revisit time. The second constellation configuration study focuses on the optimization for the rapid coverage of the North Atlantic Tracks for air traffic monitoring in 3 hours or less. The results show that for a fixed swath width of 160 km and for a 3 hour revisit time 52 satellites are required.
TDRSS Onboard Navigation System (TONS) flight qualification experiment
NASA Technical Reports Server (NTRS)
Gramling, C. J.; Hart, R. C.; Folta, D. C.; Long, A. C.
1994-01-01
The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) is currently developing an operational Tracking and Data Relay Satellite (TDRS) System (TDRSS) Onboard Navigation System (TONS) to provide realtime, autonomous, high-accuracy navigation products to users of TDRSS. A TONS experiment was implemented on the Explorer Platform/Extreme Ultraviolet Explorer (EP/EUVE) spacecraft, launched June 7, 1992, to flight qualify the TONS operational system using TDRSS forward-link communications services. This paper provides a detailed evaluation of the flight hardware, an ultrastable oscillator (USO) and Doppler extractor (DE) card in one of the TDRSS user transponders and the ground-based prototype flight software performance, based on the 1 year of TONS experiment operation. The TONS experiment results are used to project the expected performance of the TONS 1 operational system. TONS 1 processes Doppler data derived from scheduled forward-link S-band services using a sequential estimation algorithm enhanced by a sophisticated process noise model to provide onboard orbit and frequency determination and time maintenance. TONS 1 will be the prime navigation system on the Earth Observing System (EOS)-AM1 spacecraft, currently scheduled for launch in 1998. Inflight evaluation of the USO and DE short-term and long-term stability indicates that the performance is excellent. Analysis of the TONS prototype flight software performance indicates that realtime onboard position accuracies of better than 25 meters root-mean-square are achievable with one tracking contact every one to two orbits for the EP/EUVE 525-kilometer altitude, 28.5 degree inclination orbit. The success of the TONS experiment demonstrates the flight readiness of TONS to support the EOS-AM1 mission.
Proceedings of the 20th International Symposium on Space Flight Dynamics
NASA Technical Reports Server (NTRS)
Woodard, Mark (Editor); Stengle, Tom (Editor)
2007-01-01
Topics include: Measuring Image Navigation and Registration Performance at the 3-Sigma Level Using Platinum Quality Landmarks; Flight Dynamics Performances of the MetOp A Satellite during the First Months of Operations; Visual Navigation - SARE Mission; Determining a Method of Enabling and Disabling the Integral Torque in the SDO Science and Inertial Mode Controllers; Guaranteeing Pointing Performance of the SDO Sun-Pointing Controllers in Light of Nonlinear Effects; SDO Delta H Mode Design and Analysis; Observing Mode Attitude Controller for the Lunar Reconnaissance Orbiter; Broken-Plane Maneuver Applications for Earth to Mars Trajectories; ExoMars Mission Analysis and Design - Launch, Cruise and Arrival Analyses; Mars Reconnaissance Orbiter Aerobraking Daily Operations and Collision Avoidance; Mars Reconnaissance Orbiter Interplanetary Cruise Navigation; Motion Parameters Determination of the SC and Phobos in the Project Phobos-Grunt; GRAS NRT Precise Orbit Determination: Operational Experience; Orbit Determination of LEO Satellites for a Single Pass through a Radar: Comparison of Methods; Orbit Determination System for Low Earth Orbit Satellites; Precise Orbit Determination for ALOS; Anti-Collision Function Design and Performances of the CNES Formation Flying Experiment on the PRISMA Mission; CNES Approaching Guidance Experiment within FFIORD; Maneuver Recovery Analysis for the Magnetospheric Multiscale Mission; SIMBOL-X: A Formation Flying Mission on HEO for Exploring the Universe; Spaceborne Autonomous and Ground Based Relative Orbit Control for the TerraSAR-X/TanDEM-X Formation; First In-Orbit Experience of TerraSAR-X Flight Dynamics Operations; Automated Target Planning for FUSE Using the SOVA Algorithm; Space Technology 5 Post-Launch Ground Attitude Estimation Experience; Standardizing Navigation Data: A Status Update; and A Study into the Method of Precise Orbit Determination of a HEO Orbiter by GPS and Accelerometer.
Autonomous navigation and mobility for a planetary rover
NASA Technical Reports Server (NTRS)
Miller, David P.; Mishkin, Andrew H.; Lambert, Kenneth E.; Bickler, Donald; Bernard, Douglas E.
1989-01-01
This paper presents an overview of the onboard subsystems that will be used in guiding a planetary rover. Particular emphasis is placed on the planning and sensing systems and their associated costs, particularly in computation. Issues that will be used in evaluating trades between the navigation system and mobility system are also presented.
Spacecraft angular velocity estimation algorithm for star tracker based on optical flow techniques
NASA Astrophysics Data System (ADS)
Tang, Yujie; Li, Jian; Wang, Gangyi
2018-02-01
Integrated navigation systems traditionally pair a gyro with a star tracker for high-precision navigation, at the cost of large volume, heavy weight, and high expense. With the development of autonomous navigation for deep space and small spacecraft, the star tracker has gradually been used directly for both attitude calculation and angular velocity measurement. At the same time, driven by the dynamic imaging requirements of remote sensing and other imaging satellites, measuring angular velocity under dynamic conditions to improve star tracker accuracy has become a focus of current research. We propose a gyroless approach to measuring angular rate that improves the dynamic performance of the star tracker. First, a morphology-based star extraction algorithm is used to extract star regions, and the stars in two successive images are matched by angular-distance voting. The displacement of each star image is then measured with an improved optical flow method. Finally, the triaxial angular velocity of the star tracker is computed from the star vectors by least squares. The method offers fast matching, strong noise robustness, and good dynamic performance, so the triaxial angular velocity of the star tracker can be obtained accurately. The star tracker can therefore achieve better tracking performance and dynamic attitude accuracy, laying a good foundation for wide application to various satellites and complex space missions.
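The least-squares rate solution described above follows from the kinematic relation dv/dt = v × ω for a star unit vector v expressed in the tracker frame. A minimal sketch, assuming matched unit vectors from two frames separated by a short interval and a small-angle rotation between them (the paper's extraction and matching steps are not reproduced here):

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def estimate_body_rate(stars_t0, stars_t1, dt):
    """Least-squares triaxial rate from matched unit star vectors at two epochs.
    Uses the small-angle model (v1 - v0)/dt ~ v0 x omega for each matched star."""
    A = np.vstack([skew(v0) for v0 in stars_t0])
    b = np.hstack([(v1 - v0) / dt for v0, v1 in zip(stars_t0, stars_t1)])
    omega, *_ = np.linalg.lstsq(A, b, rcond=None)
    return omega  # rad/s, expressed in the star tracker frame
```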
Cybersecurity for aerospace autonomous systems
NASA Astrophysics Data System (ADS)
Straub, Jeremy
2015-05-01
High-profile breaches have occurred across numerous information systems. One area where attacks are particularly problematic is autonomous control systems. This paper considers the aerospace information system, focusing on elements that interact with autonomous control systems (e.g., onboard UAVs). It discusses the trust placed in the autonomous systems and supporting systems (e.g., navigational aids) and how this trust can be validated. Approaches to remotely detecting UAV compromise without relying on the onboard software (which may itself be compromised) are discussed, and the paper considers how different levels of autonomy (task-based, goal-based, mission-based) affect this remote characterization.
Considerations for an Integrated UAS CNS Architecture
NASA Technical Reports Server (NTRS)
Templin, Fred L.; Jain, Raj; Sheffield, Greg; Taboso-Bellesteros, Pedro; Ponchak, Denise
2017-01-01
The National Aeronautics and Space Administration (NASA) Glenn Research Center (GRC) is investigating revolutionary and advanced universal, reliable, always available, cyber secure and affordable Communication, Navigation, Surveillance (CNS) options for all altitudes of UAS operations. In Spring 2015, NASA issued a Call for Proposals under NASA Research Announcements (NRA) NNH15ZEA001N, Amendment 7 Subtopic 2.4. Boeing was selected to conduct a study with the objective to determine the most promising candidate technologies for Unmanned Air Systems (UAS) air-to-air and air-to-ground data exchange and analyze their suitability in a post-NextGen NAS environment. The overall objectives are to develop UAS CNS requirements and then develop architectures that satisfy the requirements for UAS in both controlled and uncontrolled air space. This contract is funded under NASA's Aeronautics Research Mission Directorate (ARMD) Aviation Operations and Safety Program (AOSP) Safe Autonomous Systems Operations (SASO) project and proposes technologies for the Unmanned Air Systems Traffic Management (UTM) service. There is a need to accommodate large-scale populations of Unmanned Air Systems (UAS) in the national air space. Scale obviously impacts capacity planning for Communication, Navigation, and Surveillance (CNS) technologies. For example, can wireless communications data links provide the necessary capacity for accommodating millions of small UASs (sUAS) nationwide? Does the communications network provide sufficient Internet Protocol (IP) address space to allow air traffic control to securely address both UAS teams as a whole as well as individual UAS within each team? Can navigation and surveillance approaches assure safe route planning and safe separation of vehicles even in crowded skies? Our objective is to identify revolutionary and advanced CNS alternatives supporting UASs operating at all altitudes and in all airspace while accurately navigating in the absence of navigational aids. These CNS alternatives must be reliable, redundant, always available, cyber-secure, and affordable for all types of vehicles, from small UAS to large transport category aircraft. The approach will identify CNS technology candidates that can meet the needs of the range of UAS missions and match them to the specific air traffic management applications where they will be most beneficial and cost effective.
Orion Optical Navigation Progress Toward Exploration Mission 1
NASA Technical Reports Server (NTRS)
Holt, Greg N.; D'Souza, Christopher N.; Saley, David
2018-01-01
Optical navigation of human spacecraft was proposed on Gemini and implemented successfully on Apollo as a means of autonomously operating the vehicle in the event of lost communication with controllers on Earth. The Orion emergency return system utilizing optical navigation has matured in design over the last several years and is currently undergoing the final implementation and test phase in preparation for Exploration Mission 1 (EM-1) in 2019. The software development is past its Critical Design Review and is progressing through test and certification for human rating. The filter architecture uses a square-root-free UDU covariance factorization. Linear Covariance Analysis (LinCov) was used to analyze the measurement models and the measurement error models on a representative EM-1 trajectory. The Orion EM-1 flight camera was calibrated at the Johnson Space Center (JSC) electro-optics lab. To permanently stake the focal length of the camera, a 500 mm focal length refractive collimator was used. Two Engineering Design Unit (EDU) cameras and an EDU star tracker were used for a live-sky test in Denver. In-space imagery with high-fidelity truth metadata is rare, so these live-sky tests provide one of the closest real-world analogs to operational use. A hardware-in-the-loop test rig was developed in the Johnson Space Center Electro-Optics Lab to exercise the OpNav system prior to integrated testing on the Orion vehicle. The software is verified with synthetic images. Several hundred off-nominal images are also used to analyze robustness and fault detection in the software. These include effects such as stray light, excess radiation damage, and specular reflections, and are used to help verify the tuning parameters chosen for the algorithms, such as Earth atmosphere bias, minimum pixel intensity, and star detection thresholds.
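A square-root-free UDU factorization keeps the covariance in the form P = U·diag(d)·Uᵀ with U unit upper-triangular, which preserves symmetry and positive definiteness numerically. A minimal sketch of the factorization step only, not the Orion flight filter itself:

```python
import numpy as np

def ud_factorize(P):
    """Factor a symmetric positive-definite covariance as P = U @ diag(d) @ U.T,
    with U unit upper-triangular (Bierman-style UD factorization)."""
    P = np.array(P, dtype=float)
    n = P.shape[0]
    U = np.eye(n)
    d = np.zeros(n)
    for j in range(n - 1, -1, -1):
        d[j] = P[j, j]
        for i in range(j):
            U[i, j] = P[i, j] / d[j]
            # remove column j's contribution from the remaining leading submatrix
            for k in range(i + 1):
                P[k, i] -= U[k, j] * d[j] * U[i, j]
    return U, d

# Example: recover P from its factors
P = np.array([[2.0, 1.0], [1.0, 2.0]])
U, d = ud_factorize(P)
assert np.allclose(U @ np.diag(d) @ U.T, P)
```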
State estimation for autonomous flight in cluttered environments
NASA Astrophysics Data System (ADS)
Langelaan, Jacob Willem
Safe, autonomous operation in complex, cluttered environments is a critical challenge facing autonomous mobile systems. The research described in this dissertation was motivated by a particularly difficult example of autonomous mobility: flight of a small Unmanned Aerial Vehicle (UAV) through a forest. In cluttered environments (such as forests or natural and urban canyons) signals from navigation beacons such as GPS may frequently be occluded. Direct measurements of vehicle position are therefore unavailable, and information required for flight control, obstacle avoidance, and navigation must be obtained using only on-board sensors. However, payload limitations of small UAVs restrict both the mass and physical dimensions of sensors that can be carried. This dissertation describes the development and proof-of-concept demonstration of a navigation system that uses only a low-cost inertial measurement unit and a monocular camera. Microelectromechanical inertial measurement units are well suited to small UAV applications and provide measurements of acceleration and angular rate. However, they do not provide information about nearby obstacles (needed for collision avoidance) and their noise and bias characteristics lead to unbounded growth in computed position. A monocular camera can provide bearings to nearby obstacles and landmarks. These bearings can be used both to enable obstacle avoidance and to aid navigation. Presented here is a solution to the problem of estimating vehicle state (position, orientation and velocity) as well as positions of obstacles in the environment using only inertial measurements and bearings to obstacles. This is a highly nonlinear estimation problem, and standard estimation techniques such as the Extended Kalman Filter are prone to divergence in this application. In this dissertation a Sigma Point Kalman Filter is implemented, resulting in an estimator which is able to cope with the significant nonlinearities in the system equations and uncertainty in state estimates while remaining tractable for real-time operation. In addition, the issues of data association and landmark initialization are addressed. Estimator performance is examined through Monte Carlo simulations in both two and three dimensions for scenarios involving UAV flight in cluttered environments. Hardware tests and simulations demonstrate navigation through an obstacle-strewn environment by a small Unmanned Ground Vehicle.
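The Sigma Point (unscented) filter propagates a deterministically chosen set of 2n+1 points through the nonlinear models instead of linearizing them with Jacobians. A minimal sketch of sigma-point generation and the resulting mean and covariance, with the scaling parameter kappa chosen arbitrarily for illustration (not the dissertation's tuning):

```python
import numpy as np

def sigma_points(x, P, kappa=1.0):
    """2n+1 sigma points and weights for the unscented transform."""
    n = x.size
    S = np.linalg.cholesky((n + kappa) * P)      # columns are scaled covariance square roots
    pts = np.vstack([x, x + S.T, x - S.T])       # rows: mean, +columns, -columns
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return pts, w

def unscented_moments(f, x, P, kappa=1.0):
    """Mean and covariance of y = f(x) obtained by propagating sigma points."""
    pts, w = sigma_points(x, P, kappa)
    ys = np.array([f(p) for p in pts])
    mean = w @ ys
    cov = sum(wi * np.outer(y - mean, y - mean) for wi, y in zip(w, ys))
    return mean, cov
```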
NASA Astrophysics Data System (ADS)
Wright, Adam A.; Momin, Orko; Shin, Young Ho; Shakya, Rahul; Nepal, Kumud; Ahlgren, David J.
2010-01-01
This paper presents the application of a distributed systems architecture to an autonomous ground vehicle, Q, that participates in both the autonomous and navigation challenges of the Intelligent Ground Vehicle Competition. In the autonomous challenge the vehicle is required to follow a course, while avoiding obstacles and staying within the course boundaries, which are marked by white lines. For the navigation challenge, the vehicle is required to reach a set of target destinations, known as way points, with given GPS coordinates and avoid obstacles that it encounters in the process. Previously the vehicle utilized a single laptop to execute all processing activities, including image processing, sensor interfacing and data processing, path planning and navigation algorithms, and motor control. National Instruments' (NI) LabVIEW served as the programming language for software implementation. As an upgrade to last year's design, an NI compact Reconfigurable Input/Output system (cRIO) was incorporated into the system architecture. The cRIO is NI's solution for rapid prototyping and is equipped with a real-time processor, an FPGA, and modular input/output. Under the current system, the real-time processor handles the path planning and navigation algorithms, while the FPGA gathers and processes sensor data. This setup leaves the laptop to focus on running the image processing algorithm. Image processing, as previously presented by Nepal et al., is a multi-step line extraction algorithm and constitutes the largest processor load. This distributed approach results in faster image processing, which was previously Q's bottleneck. Additionally, the path planning and navigation algorithms are executed more reliably on the real-time processor due to its deterministic operation. The implementation of this architecture required exploration of various inter-system communication techniques. Data transfer between the laptop and the real-time processor using UDP packets was established as the most reliable protocol after testing various options. Improvement can be made to the system by migrating more algorithms to the hardware-based FPGA to further speed up the operations of the vehicle.
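A minimal sketch of the kind of UDP exchange described, with hypothetical addresses, port, and message fields (the actual system is implemented in LabVIEW on the cRIO and laptop, not in Python):

```python
import json
import socket

# Hypothetical endpoint for the real-time target; not taken from the paper.
CRIO_ADDR = ("192.168.1.10", 5005)

def send_lane_estimate(offset_m, heading_rad):
    """Laptop side: push the latest image-processing result to the real-time processor."""
    msg = json.dumps({"offset_m": offset_m, "heading_rad": heading_rad}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(msg, CRIO_ADDR)

def receive_lane_estimate(port=5005, timeout_s=0.1):
    """Real-time side: receive the most recent packet, or None if nothing fresh arrived."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", port))
        sock.settimeout(timeout_s)
        try:
            data, _ = sock.recvfrom(1024)
            return json.loads(data)
        except socket.timeout:
            return None
```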
PRoViScout: a planetary scouting rover demonstrator
NASA Astrophysics Data System (ADS)
Paar, Gerhard; Woods, Mark; Gimkiewicz, Christiane; Labrosse, Frédéric; Medina, Alberto; Tyler, Laurence; Barnes, David P.; Fritz, Gerald; Kapellos, Konstantinos
2012-01-01
Mobile systems exploring planetary surfaces will require more autonomy in the future than they have today. The EU FP7-SPACE project PRoViScout (2010-2012) establishes the building blocks of such autonomous exploration systems in terms of robotic vision, through a decision-based combination of navigation and scientific target selection, and integrates them into a framework ready for, and exposed to, field demonstration. The PRoViScout on-board system consists of mission management components such as an Executive, a Mars Mission On-Board Planner and Scheduler, a Science Assessment Module, and Navigation & Vision Processing modules. The platform hardware consists of the rover with its sensors and pointing devices. We report on the major building blocks and their functions and interfaces, with emphasis on the computer vision components such as image acquisition (using a novel zoomed 3D-Time-of-Flight & RGB camera), mapping from 3D-TOF data, panoramic image & stereo reconstruction, hazard and slope maps, visual odometry, and the recognition of potentially scientifically interesting targets.
Comparative analysis of ROS-based monocular SLAM methods for indoor navigation
NASA Astrophysics Data System (ADS)
Buyval, Alexander; Afanasyev, Ilya; Magid, Evgeni
2017-03-01
This paper presents a comparison of four of the most recent ROS-based monocular SLAM-related methods: ORB-SLAM, REMODE, LSD-SLAM, and DPPTAM, and analyzes their feasibility for a mobile robot application in an indoor environment. We tested these methods using video data recorded from a conventional wide-angle full HD webcam with a rolling shutter. The camera was mounted on a human-operated prototype of an unmanned ground vehicle, which followed a closed-loop trajectory. Both feature-based methods (ORB-SLAM, REMODE) and direct SLAM-related algorithms (LSD-SLAM, DPPTAM) demonstrated reasonably good results in detecting volumetric objects, corners, obstacles, and other local features. However, we encountered difficulties recovering the homogeneously colored walls typical of offices, since all of these methods left empty spaces in the reconstructed sparse 3D scene. This may cause an autonomously guided robot to collide with featureless walls and thus limits the applicability of maps obtained by the considered monocular SLAM-related methods for indoor robot navigation.
Design of an autonomous exterior security robot
NASA Technical Reports Server (NTRS)
Myers, Scott D.
1994-01-01
This paper discusses the requirements and preliminary design of a robotic vehicle intended to perform autonomous exterior perimeter security patrols around warehouse areas, ammunition supply depots, and industrial parks for the U.S. Department of Defense. The preliminary design allows for the operation of up to eight vehicles in a six-kilometer by six-kilometer zone with autonomous navigation and obstacle avoidance. In addition to detecting crawling intruders at 100 meters, the system must perform real-time inventory checking and database comparisons using a microwave tag system.
Relative Navigation of Formation-Flying Satellites
NASA Technical Reports Server (NTRS)
Long, Anne; Kelbel, David; Lee, Taesul; Leung, Dominic; Carpenter, J. Russell; Gramling, Cheryl
2002-01-01
This paper compares autonomous relative navigation performance for formations in eccentric, medium and high-altitude Earth orbits using Global Positioning System (GPS) Standard Positioning Service (SPS), crosslink, and celestial object measurements. For close formations, the relative navigation accuracy is highly dependent on the magnitude of the uncorrelated measurement errors. A relative navigation position accuracy of better than 10 centimeters root-mean-square (RMS) can be achieved for medium-altitude formations that can continuously track at least one GPS signal. A relative navigation position accuracy of better than 15 meters RMS can be achieved for high-altitude formations that have sparse tracking of the GPS signals. The addition of crosslink measurements can significantly improve relative navigation accuracy for formations that use sparse GPS tracking or celestial object measurements for absolute navigation.
Autonomous Agents on Expedition: Humans and Progenitor Ants and Planetary Exploration
NASA Astrophysics Data System (ADS)
Rilee, M. L.; Clark, P. E.; Curtis, S. A.; Truszkowski, W. F.
2002-01-01
The Autonomous Nano-Technology Swarm (ANTS) is an advanced mission architecture based on a social insect analog of many specialized spacecraft working together to achieve mission goals. The principal mission concept driving the ANTS architecture is a Main Belt Asteroid Survey in the 2020s that will involve a thousand or more nano-technology enabled, artificially intelligent, autonomous pico-spacecraft (< 1 kg). The objective of this survey is to construct a compendium of composition, shape, and other physical parameter observations of a significant fraction of asteroid belt objects. Such an atlas will be of primary scientific importance for the understanding of Solar System origins and evolution and will lay the foundation for future exploration and capitalization of space. As the capabilities enabling ANTS are developed over the next two decades, these capabilities will need to be proven. Natural milestones for this process include the deployment of progenitors to ANTS on human expeditions to space and remote missions with interfaces for human interaction and control. These progenitors can show up in a variety of forms ranging from spacecraft subsystems and advanced handheld sensors, through complete prototypical ANTS spacecraft. A critical capability to be demonstrated is reliable, long-term autonomous operations across the ANTS architecture. High level, mission-oriented behaviors are to be managed by a control / communications layer of the swarm, whereas common low level functions required of all spacecraft, e.g. attitude control and guidance and navigation, are handled autonomically on each spacecraft. At the higher levels of mission planning and social interaction deliberative techniques are to be used. For the asteroid survey, ANTS acts as a large community of cooperative agents while for precursor missions there arises the intriguing possibility of Progenitor ANTS and humans acting together as agents. For optimal efficiency and responsiveness for individual spacecraft at the lowest levels of control we have been studying control methods based on nonlinear dynamical systems. We describe the critically important autonomous control architecture of the ANTS mission concept and a sequence of partial implementations that feature increasingly autonomous behaviors. The scientific and engineering roles that these Progenitor ANTS could play in human missions or remote missions with near real time human interactions, particularly to the Moon and Mars, will be discussed.
Rule-based navigation control design for autonomous flight
NASA Astrophysics Data System (ADS)
Contreras, Hugo; Bassi, Danilo
2008-04-01
This article describes a navigation control system design based on a set of rules for following a desired trajectory. The full aircraft control considered here comprises a low-level stability control loop, based on classic PID controllers, and a higher-level navigation layer whose main job is to exercise lateral (course) control and altitude control while following the desired trajectory. The rules and PID gains were adjusted systematically according to the results of flight simulation. In spite of its simplicity, the rule-based navigation control proved to be robust, even under large perturbations such as crosswinds.
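A minimal sketch of the layered scheme described: a textbook PID step for the low-level loop and an illustrative pair of navigation rules for lateral control. The thresholds and gains below are assumptions, not the article's tuned values:

```python
def pid_step(state, error, dt, kp, ki, kd):
    """One step of a textbook PID controller; `state` carries (integral, previous_error)."""
    integral, prev = state
    integral += error * dt
    derivative = (error - prev) / dt
    output = kp * error + ki * integral + kd * derivative
    return output, (integral, error)

def course_rule(cross_track_m, course_error_rad):
    """Illustrative navigation rules: steer toward the track when far from it,
    otherwise just null the course error with small corrections."""
    if abs(cross_track_m) > 50.0:
        return -0.02 * cross_track_m        # commanded bank angle toward the track (rad)
    return -0.5 * course_error_rad          # fine correction near the track
```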
TDRSS Onboard Navigation System (TONS) experiment for the Explorer Platform (EP)
NASA Astrophysics Data System (ADS)
Gramling, C. J.; Hornstein, R. S.; Long, A. C.; Samii, M. V.; Elrod, B. D.
A TDRSS Onboard Navigation System (TONS) is currently being developed by NASA to provide a high-accuracy autonomous spacecraft navigation capability for users of TDRSS and its successor, the Advanced TDRSS. A TONS experiment will be performed in conjunction with the Explorer Platform (EP)/EUV Explorer mission to flight-qualify TONS Block I. This paper presents an overview of TDRSS on-board navigation goals and plans and the technical objectives of the TONS experiment. The operations concept of the experiment is described, including the characteristics of the ultrastable oscillator, the Doppler extractor, the signal-acquisition process, the TONS ground-support system, and the navigation flight software. A description of the on-board navigation algorithms and the rationale for their selection is also presented.
Miniaturized Autonomous Extravehicular Robotic Camera (Mini AERCam)
NASA Technical Reports Server (NTRS)
Fredrickson, Steven E.
2001-01-01
The NASA Johnson Space Center (JSC) Engineering Directorate is developing the Autonomous Extravehicular Robotic Camera (AERCam), a low-volume, low-mass free-flying camera system. AERCam project team personnel recently initiated development of a miniaturized version of AERCam known as Mini AERCam. The Mini AERCam target design is a spherical "nanosatellite" free-flyer 7.5 inches in diameter and weighing 10 pounds. Mini AERCam builds on the success of the AERCam Sprint STS-87 flight experiment by adding new on-board sensing and processing capabilities while simultaneously reducing volume by 80%. Achieving enhanced capability in a smaller package depends on applying miniaturization technology across virtually all subsystems. Technology innovations being incorporated include micro electromechanical system (MEMS) gyros, "camera-on-a-chip" CMOS imagers, a rechargeable xenon gas propulsion system, a rechargeable lithium ion battery, custom avionics based on the PowerPC 740 microprocessor, GPS relative navigation, digital radio frequency communications and tracking, micropatch antennas, digital instrumentation, and dense mechanical packaging. The Mini AERCam free-flyer will initially be integrated into an approximate flight-like configuration for demonstration on an air-bearing table. A pilot-in-the-loop and hardware-in-the-loop simulation of on-orbit navigation and dynamics will complement the air-bearing table demonstration. The Mini AERCam lab demonstration is intended to form the basis for future development of an AERCam flight system that provides beneficial on-orbit views unobtainable from fixed cameras, cameras on robotic manipulators, or cameras carried by EVA crewmembers.
Robust analysis of an underwater navigational strategy in electrically heterogeneous corridors.
Dimble, Kedar D; Ranganathan, Badri N; Keshavan, Jishnu; Humbert, J Sean
2016-08-01
Obstacles and other global stimuli provide relevant navigational cues to a weakly electric fish. In this work, robust analysis of a control strategy based on electrolocation for performing obstacle avoidance in electrically heterogeneous corridors is presented and validated. Static output feedback control is shown to achieve the desired goal of reflexive obstacle avoidance in such environments in simulation and experimentation. The proposed approach is computationally inexpensive and readily implementable on a small scale underwater vehicle, making underwater autonomous navigation feasible in real-time.
Real-time, autonomous precise satellite orbit determination using the global positioning system
NASA Astrophysics Data System (ADS)
Goldstein, David Ben
2000-10-01
The desire for autonomously generated, rapidly available, and highly accurate satellite ephemeris is growing with the proliferation of constellations of satellites and the cost and overhead of ground tracking resources. Autonomous Orbit Determination (OD) may be done on the ground in a post-processing mode or in real-time on board a satellite and may be accomplished days, hours, or immediately after observations are processed. The Global Positioning System (GPS) is now widely used as an alternative to ground tracking resources to supply observation data for satellite positioning and navigation. GPS is accurate, inexpensive, provides continuous coverage, and is an excellent choice for autonomous systems. In an effort to estimate precise satellite ephemeris in real-time on board a satellite, the Goddard Space Flight Center (GSFC) created the GPS Enhanced OD Experiment (GEODE) flight navigation software. This dissertation offers alternative methods and improvements to GEODE to increase on-board autonomy and real-time total position accuracy and precision without increasing computational burden. First, GEODE is modified to include a Gravity Acceleration Approximation Function (GAAF) to replace the traditional spherical harmonic representation of the gravity field. Next, an ionospheric correction method called Differenced Range Versus Integrated Doppler (DRVID) is applied to correct for ionospheric errors in the GPS measurements used in GEODE. Then, Dynamic Model Compensation (DMC) is added to estimate unmodeled and/or mismodeled forces in the dynamic model and to provide an alternative process noise variance-covariance formulation. Finally, a Genetic Algorithm (GA) is implemented in the form of Genetic Model Compensation (GMC) to optimize DMC forcing noise parameters. Application of GAAF, DRVID and DMC improved GEODE's position estimates by 28.3% when applied to GPS/MET data collected in the presence of Selective Availability (SA), 17.5% when SA is removed from the GPS/MET data, and 10.8% on SA-free TOPEX data. Position estimates with RSS errors below 1 meter are now achieved using SA-free TOPEX data. DRVID causes an increase in computational burden while GAAF and DMC reduce computational burden. The net effect of applying GAAF, DRVID and DMC is an improvement in GEODE's accuracy/precision without an increase in computational burden.
Science Benefits of Onboard Spacecraft Navigation
NASA Technical Reports Server (NTRS)
Cangahuala, Al; Bhaskaran, Shyam; Owen, Bill
2012-01-01
Primitive bodies (asteroids and comets), which have remained relatively unaltered since their formation, are important targets for scientific missions that seek to understand the evolution of the solar system. Often the first step is to fly by these bodies with robotic spacecraft. The key to maximizing data returns from these flybys is to determine the spacecraft trajectory relative to the target body - in short, navigate the spacecraft - with sufficient accuracy so that the target is guaranteed to be in the instruments' field of view. The most powerful navigation data in these scenarios are images taken by the spacecraft of the target against a known star field (onboard astrometry). Traditionally, the relative trajectory of the spacecraft must be estimated hours to days in advance using images collected by the spacecraft. This is because of (1) the long round-trip light times between the spacecraft and the Earth and (2) the time needed to downlink and process navigation data on the ground, make decisions based on the result, and build and uplink instrument pointing sequences from the results. The light time and processing time compromise navigation accuracy considerably, because there is not enough time to use more accurate data collected closer to the target - such data are more accurate because the angular capability of the onboard astrometry is essentially constant as the distance to the target decreases, resulting in better "plane-of-sky" knowledge of the target. Excellent examples of these timing limitations are high-speed comet encounters. Comets are difficult to observe up close; their orbits often limit scientists to brief, rapid flybys, and their coma further restricts viewers from seeing the nucleus in any detail, unless they can view the nucleus at close range. Comet nuclei details are typically discernable for much shorter durations than the roundtrip light time to Earth, so robotic spacecraft must be able to perform onboard navigation. This onboard navigation can be accomplished through a self-contained system that, by eliminating light time restrictions, dramatically improves the relative trajectory knowledge and control and subsequently increases the amount of quality data collected. Flybys are one-time events, so the system's underlying algorithms and software must be extremely robust. The autonomous software must also be able to cope with the unknown size, shape, and orientation of the previously unseen comet nucleus. Furthermore, algorithms must be reliable in the presence of imperfections and/or damage to onboard cameras accrued after many years of deep-space operations. The AutoNav operational flight software packages, developed by scientists at the Jet Propulsion Laboratory (JPL) under contract with NASA, meet all these requirements. They have been directly responsible for the successful encounters on all of NASA's close-up comet-imaging missions (see Figure 1). AutoNav is the only system to date that has autonomously tracked comet nuclei during encounters and performed autonomous interplanetary navigation. AutoNav has enabled five cometary flyby missions (Table 1) residing on four NASA spacecraft provided by three different spacecraft builders. Using this software, missions were able to process a combined total of nearly 1000 images previously unseen by humans.
By eliminating the need to navigate spacecraft from Earth, the accuracy gained by AutoNav during flybys compared to ground-based navigation is about 1 order of magnitude in targeting and 2 orders of magnitude in time of flight. These benefits ensure that pointing errors do not compromise data gathered during flybys. In addition, these benefits can be applied to flybys of other solar system objects, flybys at much slower relative velocities, mosaic imaging campaigns, and other proximity activities (e.g., orbiting, hovering, and descent/ascent).
NASA Technical Reports Server (NTRS)
Bishop, Robert H.; DeMars, Kyle; Trawny, Nikolas; Crain, Tim; Hanak, Chad; Carson, John M.; Christian, John
2016-01-01
The navigation filter architecture successfully deployed on the Morpheus flight vehicle is presented. The filter was developed as a key element of the NASA Autonomous Landing and Hazard Avoidance Technology (ALHAT) project and over the course of 15 free flights was integrated into the Morpheus vehicle, operations, and flight control loop. Flight testing was completed by demonstrating autonomous hazard detection and avoidance, integration of altimeter, surface relative velocity (velocimeter), and hazard relative navigation (HRN) measurements into the onboard dual-state inertial estimator Kalman filter software, and landing within 2 meters of the vertical testbed GPS-based navigation solution at the safe landing site target. Morpheus followed a trajectory that included an ascent phase followed by a partial descent-to-landing, although the proposed filter architecture is applicable to more general planetary precision entry, descent, and landings. The main new contribution is the incorporation of a sophisticated hazard relative navigation sensor, originally intended to locate safe landing sites, into the navigation system and its employment as a navigation sensor. The formulation of a dual-state inertial extended Kalman filter was designed to address the precision planetary landing problem when viewed as a rendezvous problem with an intended landing site. For the required precision navigation system that is capable of navigating along a descent-to-landing trajectory to a precise landing, the impact of attitude errors on the translational state estimation is included in a fully integrated navigation structure in which translation state estimation is combined with attitude state estimation. The map tie errors are estimated as part of the process, thereby creating a dual-state filter implementation. Also, the filter is implemented using inertial states rather than states relative to the target. External measurements include altimeter, velocimeter, star camera, terrain relative navigation sensor, and a hazard relative navigation sensor providing information regarding hazards on a map generated on-the-fly.
Towards Camera-LIDAR Fusion-Based Terrain Modelling for Planetary Surfaces: Review and Analysis
Shaukat, Affan; Blacker, Peter C.; Spiteri, Conrad; Gao, Yang
2016-01-01
In recent decades, terrain modelling and reconstruction techniques have attracted increasing research interest for precise short- and long-distance autonomous navigation, localisation and mapping within field robotics. One of the most challenging applications is in relation to autonomous planetary exploration using mobile robots. Rovers deployed to explore extraterrestrial surfaces are required to perceive and model the environment with little or no intervention from the ground station. To date, stereopsis represents the state-of-the-art method and can achieve short-distance planetary surface modelling. However, future space missions will require scene reconstruction at greater distance, fidelity and feature complexity, potentially using other sensors like Light Detection And Ranging (LIDAR). LIDAR has been extensively exploited for target detection, identification, and depth estimation in terrestrial robotics, but is still under development to become a viable technology for space robotics. This paper will first review current methods for scene reconstruction and terrain modelling using cameras in planetary robotics and LIDARs in terrestrial robotics; then we will propose camera-LIDAR fusion as a feasible technique to overcome the limitations of either of these individual sensors for planetary exploration. A comprehensive analysis will be presented to demonstrate the advantages of camera-LIDAR fusion in terms of range, fidelity, accuracy and computation.
Insect-Based Vision for Autonomous Vehicles: A Feasibility Study
NASA Technical Reports Server (NTRS)
Srinivasan, Mandyam V.
1999-01-01
The aims of the project were to use a high-speed digital video camera to pursue two questions: (1) to explore the influence of temporal imaging constraints on the performance of vision systems for autonomous mobile robots; and (2) to study the fine structure of insect flight trajectories in order to better understand the characteristics of flight control, orientation and navigation.
On-board autonomous attitude maneuver planning for planetary spacecraft using genetic algorithms
NASA Technical Reports Server (NTRS)
Kornfeld, Richard P.
2003-01-01
A key enabling technology that leads to greater spacecraft autonomy is the capability to autonomously and optimally slew the spacecraft from and to different attitudes while operating under a number of celestial and dynamic constraints. The task of finding an attitude trajectory that meets all the constraints is a formidable one, in particular for orbiting or fly-by spacecraft where the constraints and initial and final conditions are of a time-varying nature. This paper presents an approach for attitude path planning that makes full use of a priori constraint knowledge and is computationally tractable enough to be executed on-board a spacecraft. The approach is based on incorporating the constraints into a cost function and using a Genetic Algorithm to iteratively search for and optimize the solution. This results in a directed random search that explores a large part of the solution space while maintaining the knowledge of good solutions from iteration to iteration. A solution obtained this way may be used 'as is' or as an initial solution to initialize additional deterministic optimization algorithms. A number of example simulations are presented, including the case examples of a generic Europa Orbiter spacecraft in cruise as well as in orbit around Europa. The search times are typically on the order of minutes, thus demonstrating the viability of the presented approach. The results are applicable to all future deep space missions where greater spacecraft autonomy is required. In addition, onboard autonomous attitude planning greatly facilitates navigation and science observation planning, thus benefiting missions to planet Earth as well.
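A minimal sketch of the general idea: encode keep-out constraints as penalties in the cost of a candidate attitude path and search with a simple genetic loop. The path parameterization (intermediate boresight directions), population size, mutation scale, penalty weight, and the single Sun keep-out cone are assumptions for illustration, not the paper's formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(v):
    return v / np.linalg.norm(v)

def cost(path, start, goal, sun_dir, keepout_rad):
    """Total angular path length plus a large penalty for entering the keep-out cone."""
    pts = [start] + list(path) + [goal]
    length = sum(np.arccos(np.clip(a @ b, -1.0, 1.0)) for a, b in zip(pts[:-1], pts[1:]))
    penalty = sum(1e3 for p in path
                  if np.arccos(np.clip(p @ sun_dir, -1.0, 1.0)) < keepout_rad)
    return length + penalty

def ga_slew(start, goal, sun_dir, keepout_rad, n_via=3, pop=40, gens=200):
    """Directed random search over intermediate boresight directions."""
    popn = [[normalize(rng.normal(size=3)) for _ in range(n_via)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda p: cost(p, start, goal, sun_dir, keepout_rad))
        elite = popn[:pop // 4]                       # keep the best quarter
        children = []
        while len(children) < pop - len(elite):
            a, b = rng.choice(len(elite), 2, replace=False)
            cut = rng.integers(1, n_via) if n_via > 1 else 0
            child = elite[a][:cut] + elite[b][cut:]   # one-point crossover
            child = [normalize(p + 0.05 * rng.normal(size=3)) for p in child]  # mutation
            children.append(child)
        popn = elite + children
    return popn[0]
```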
On-Orbit Autonomous Assembly from Nanosatellites
NASA Technical Reports Server (NTRS)
Murchison, Luke S.; Martinez, Andres; Petro, Andrew
2015-01-01
The On-Orbit Autonomous Assembly from Nanosatellites (OAAN) project will demonstrate autonomous control algorithms for rendezvous and docking maneuvers; low-power reconfigurable magnetic docking technology; and compact, lightweight and inexpensive precision relative navigation using carrier-phase differential (CD) GPS with a three-degree-of-freedom ground demonstration. CDGPS is a specific relative position determination method that measures the phase of the GPS carrier wave to yield relative position data accurate to 0.4 inch (1 centimeter). CDGPS is a technology commonly found in the surveying industry. The development and demonstration of these technologies will fill a current gap in the availability of proven autonomous rendezvous and docking systems for small satellites.
Challenges in verification and validation of autonomous systems for space exploration
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Jonsson, Ari
2005-01-01
Space exploration applications offer a unique opportunity for the development and deployment of autonomous systems, due to limited communications, large distances, and great expense of direct operation. At the same time, the risk and cost of space missions leads to reluctance to taking on new, complex and difficult-to-understand technology. A key issue in addressing these concerns is the validation of autonomous systems. In recent years, higher-level autonomous systems have been applied in space applications. In this presentation, we will highlight those autonomous systems, and discuss issues in validating these systems. We will then look to future demands on validating autonomous systems for space, identify promising technologies and open issues.
Trajectory Generation and Path Planning for Autonomous Aerobots
NASA Technical Reports Server (NTRS)
Sharma, Shivanjli; Kulczycki, Eric A.; Elfes, Alberto
2007-01-01
This paper presents global path planning algorithms for the Titan aerobot based on user defined waypoints in 2D and 3D space. The algorithms were implemented using information obtained through a planner user interface. The trajectory planning algorithms were designed to accurately represent the aerobot's characteristics, such as minimum turning radius. Additionally, trajectory planning techniques were implemented to allow for surveying of a planar area based solely on camera fields of view, airship altitude, and the location of the planar area's perimeter. The developed paths allow for planar navigation and three-dimensional path planning. These calculated trajectories are optimized to produce the shortest possible path while still remaining within realistic bounds of airship dynamics.
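The planar-survey geometry described reduces to computing the ground swath from the camera field of view and airship altitude, and then spacing parallel passes accordingly. A worked sketch, with the overlap fraction as an assumption:

```python
import math

def survey_passes(altitude_m, fov_deg, area_width_m, overlap=0.2):
    """Lawnmower-pattern spacing from camera field of view and altitude."""
    swath = 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)   # ground footprint width
    spacing = swath * (1.0 - overlap)                                  # distance between passes
    n_passes = math.ceil(area_width_m / spacing)
    return swath, spacing, n_passes

# Example: 1 km altitude, 40 deg cross-track FOV, 5 km-wide survey area
print(survey_passes(1000.0, 40.0, 5000.0))   # ~728 m swath, ~582 m spacing, 9 passes
```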
Kim, Hoyeon; Cheang, U Kei; Kim, Min Jun
2017-01-01
In order to broaden the use of microrobots in practical fields, autonomous control algorithms such as obstacle avoidance must be further developed. However, most previous studies of microrobots used manual motion control to navigate past tight spaces and obstacles, while very few demonstrated autonomous motion. In this paper, we demonstrate a dynamic obstacle avoidance algorithm for bacteria-powered microrobots (BPMs) using electric fields in fluidic environments. A BPM consists of an artificial body, made of SU-8, and a dense layer of harnessed bacteria. BPMs can be controlled using externally applied electric fields owing to the electrokinetic properties of the bacteria. For developing dynamic obstacle avoidance for BPMs, a kinematic model of BPMs was utilized to prevent collisions, and a finite element model was used to characterize the deformation of the electric field near obstacle walls. In order to avoid fast-moving obstacles, we modified our previous static obstacle avoidance approach using a modified vector field histogram (VFH) method. To validate the advanced algorithm in experiments, magnetically controlled moving obstacles were used to intercept the BPMs as the BPMs moved from their initial positions to their final positions. The algorithm was able to successfully guide the BPMs to their respective goal positions while avoiding the dynamic obstacles.
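A minimal sketch of a vector-field-histogram style steering choice: build a polar obstacle density, discard blocked sectors, and head for the admissible sector closest to the goal. The sector count and thresholds are illustrative assumptions, not the paper's values:

```python
import numpy as np

def vfh_steer(goal_bearing, obstacle_bearings, obstacle_dists,
              n_sectors=36, block_threshold=1.0):
    """Choose the admissible heading sector closest to the goal bearing (angles in rad)."""
    width = 2.0 * np.pi / n_sectors
    hist = np.zeros(n_sectors)
    for b, d in zip(obstacle_bearings, obstacle_dists):
        hist[int((b % (2.0 * np.pi)) // width)] += 1.0 / max(d, 1e-9)   # nearer = heavier
    free = [k for k in range(n_sectors) if hist[k] < block_threshold]
    if not free:
        return None                                    # no admissible heading this step
    centers = [(k + 0.5) * width for k in free]
    err = [abs((c - goal_bearing + np.pi) % (2.0 * np.pi) - np.pi) for c in centers]
    return centers[int(np.argmin(err))]
```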
Autonomous Flight Rules - A Concept for Self-Separation in U.S. Domestic Airspace
NASA Technical Reports Server (NTRS)
Wing, David J.; Cotton, William B.
2011-01-01
Autonomous Flight Rules (AFR) are proposed as a new set of operating regulations in which aircraft navigate on tracks of their choice while self-separating from traffic and weather. AFR would exist alongside Instrument and Visual Flight Rules (IFR and VFR) as one of three available flight options for any appropriately trained and qualified operator with the necessary certified equipment. Historically, ground-based separation services evolved by necessity as aircraft began operating in the clouds and were unable to see each other. Today, technologies for global navigation, airborne surveillance, and onboard computing enable the functions of traffic conflict management to be fully integrated with navigation procedures onboard the aircraft. By self-separating, aircraft can operate with more flexibility and fewer restrictions than are required when using ground-based separation. The AFR concept is described in detail and provides practical means by which self-separating aircraft could share the same airspace as IFR and VFR aircraft without disrupting the ongoing processes of Air Traffic Control.
NASA Astrophysics Data System (ADS)
Huang, Wei; Yang, Xiao-xu; Han, Jun-feng; Wei, Yu; Zhang, Jing; Xie, Mei-lin; Yue, Peng
2016-01-01
A high-precision celestial navigation tracking platform that uses a mirror-servo structure is presented to overcome the drawbacks of conventional platforms, such as large volume, high rotational inertia, and slow response, thereby improving platform stability and tracking accuracy. Because the optical sensor and mirror are mounted on the middle gimbal, the stiffness and resonant frequency requirements for that gimbal are high. Finite element modal analysis theory was applied to study the dynamic characteristics of the middle gimbal, and ANSYS was used for the finite element dynamic simulation. The weak links in the structure were identified from the computed results, improvements were proposed, and the structure was reanalyzed. The lowest resonant frequency of the optimized middle gimbal avoids the bandwidth of the platform servo mechanism, is much higher than the disturbance frequency of the carrier aircraft, and reduces mechanical resonance of the framework. This work provides a theoretical basis for the overall structural optimization of a high-precision autonomous celestial navigation tracking mirror system.
Improving Car Navigation with a Vision-Based System
NASA Astrophysics Data System (ADS)
Kim, H.; Choi, K.; Lee, I.
2015-08-01
The real-time acquisition of accurate positions is very important for the proper operation of driver assistance systems and autonomous vehicles. Since current systems mostly depend on a GPS and map-matching technique, they show poor and unreliable performance in areas where GPS signals are blocked or weak. In this study, we propose a vision-oriented car navigation method based on sensor fusion of a GPS and in-vehicle sensors. We employ a single photo resection process to derive the position and attitude of the camera and thus those of the car. These image georeferencing results are combined with the other sensory data under a sensor fusion framework for more accurate estimation of the positions using an extended Kalman filter. The proposed system estimated positions with an accuracy of 15 m even though GPS signals were unavailable during the entire 15-minute test drive. The proposed vision-based system can be effectively utilized for the low-cost yet highly accurate and reliable navigation systems required for intelligent or autonomous vehicles.
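A generic extended Kalman filter measurement update of the kind used to blend image-georeferenced positions with GPS and in-vehicle sensor predictions; a sketch only, not the authors' filter design or state vector:

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """EKF measurement update.
    x, P: prior state and covariance; z: measurement; h: measurement function;
    H: Jacobian of h at x; R: measurement noise covariance."""
    y = z - h(x)                               # innovation
    S = H @ P @ H.T + R                        # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x_post = x + K @ y
    P_post = (np.eye(len(x)) - K @ H) @ P
    return x_post, P_post

# e.g. a 2D position fix from image georeferencing updating an assumed [x, y, vx, vy] state:
# H = np.array([[1., 0., 0., 0.], [0., 1., 0., 0.]]); R = np.diag([9.0, 9.0])  # (3 m)^2
```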
A method of real-time detection for distant moving obstacles by monocular vision
NASA Astrophysics Data System (ADS)
Jia, Bao-zhi; Zhu, Ming
2013-12-01
In this paper, we propose an approach for detecting distant moving obstacles, such as cars and bicycles, with a monocular camera that cooperates with ultrasonic sensors under low-cost constraints. The aim is to detect distant obstacles that move toward our autonomous navigation car so that the vehicle can raise an alarm and keep away from them. Frame differencing is applied to find obstacles after compensating for the camera's ego-motion. Each obstacle is then separated from the others in an independent region and given a confidence level indicating whether it is approaching. Results on an open dataset and on our own autonomous navigation car show that the method is effective for real-time detection of distant moving obstacles.
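A minimal sketch of frame differencing after coarse ego-motion compensation, here using an ORB-feature homography to warp the previous frame (the paper's own compensation method is not detailed in the abstract, so this is an assumed realization):

```python
import cv2
import numpy as np

def moving_obstacle_boxes(prev_gray, curr_gray, min_area=200):
    """Return bounding boxes of regions that still differ after ego-motion compensation."""
    orb = cv2.ORB_create(1000)
    k1, d1 = orb.detectAndCompute(prev_gray, None)
    k2, d2 = orb.detectAndCompute(curr_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)        # camera ego-motion model
    warped = cv2.warpPerspective(prev_gray, H, curr_gray.shape[::-1])
    diff = cv2.absdiff(curr_gray, warped)                       # frame differencing
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
```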
NASA Astrophysics Data System (ADS)
Martínez, Fredy; Martínez, Fernando; Jacinto, Edwar
2017-02-01
In this paper we propose an on-line motion planning strategy for autonomous robots in dynamic and locally observable environments. In this approach, we first visually identify geometric shapes in the environment by filtering images. An ART-2 network is then used to establish the similarity between patterns. The proposed algorithm allows a robot to establish its relative location in the environment and to define its navigation path based on images of the environment and their similarity to reference images. This is an efficient and minimalist method that uses the similarity of landmark view patterns to navigate to the desired destination. Laboratory tests on real prototypes demonstrate the performance of the algorithm.
Design and test of a simulation system for autonomous optic-navigated planetary landing
NASA Astrophysics Data System (ADS)
Cai, Sheng; Yin, Yanhe; Liu, Yanjun; He, Fengyun
2018-02-01
In this paper, a simulation system based on a commercial projector is proposed to test optical navigation algorithms for autonomous planetary landing in laboratory scenarios. The design of the optics, mechanics, and synchronization control is described, and the whole simulation system is set up and tested. Through calibration of the system, two main problems are resolved: synchronization between the projector and the CCD, and pixel-level shifting caused by the low repeatability of the DMD used in the projector. The experimental results show that the RMS errors of the pitch, yaw, and roll angles are 0.78', 0.48', and 2.95' compared with the theoretical calculation, which fulfills the requirements of experimental simulation for planetary landing in the laboratory.
Fast and reliable obstacle detection and segmentation for cross-country navigation
NASA Technical Reports Server (NTRS)
Talukder, A.; Manduchi, R.; Rankin, A.; Matthies, L.
2002-01-01
Obstacle detection is one of the main components of the control system of autonomous vehicles. In the case of indoor/urban navigation, obstacles are typically defined as surface points that are higher than the ground plane. This characterization, however, cannot be used in cross-country and unstructured environments, where the notion of ground plane is often not meaningful.
AUV SLAM and Experiments Using a Mechanical Scanning Forward-Looking Sonar
He, Bo; Liang, Yan; Feng, Xiao; Nian, Rui; Yan, Tianhong; Li, Minghui; Zhang, Shujing
2012-01-01
Navigation technology is one of the most important challenges in the application of autonomous underwater vehicles (AUVs), which navigate in the complex undersea environment. The ability to localize a robot and accurately map its surroundings simultaneously, namely the simultaneous localization and mapping (SLAM) problem, is a key prerequisite of truly autonomous robots. In this paper, a modified-FastSLAM algorithm is proposed and used in the navigation of our C-Ranger research platform, an open-frame AUV. A mechanical scanning imaging sonar is chosen as the active sensor for the AUV. The modified-FastSLAM implements the update relying on the on-board sensors of the C-Ranger. The algorithm employs a data association scheme that combines the single-particle maximum likelihood method with a modified negative evidence method, and uses rank-based resampling to overcome the particle depletion problem. In order to verify the feasibility of the proposed methods, both simulation experiments and sea trials of the C-Ranger were conducted. The experimental results show that the modified-FastSLAM employed for navigation of the C-Ranger AUV is much more effective and accurate than traditional methods.
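Rank-based resampling replaces raw weights with their ranks when drawing the new particle set, which damps the dominance of a few heavy particles and mitigates depletion. A minimal sketch under that interpretation; the paper's exact scheme may differ:

```python
import numpy as np

def rank_based_resample(particles, weights, rng=np.random.default_rng()):
    """Resample in proportion to weight rank rather than raw weight."""
    n = len(particles)
    order = np.argsort(weights)                 # lowest weight first
    ranks = np.empty(n)
    ranks[order] = np.arange(1, n + 1)          # rank 1 = lightest, rank n = heaviest
    p = ranks / ranks.sum()
    idx = rng.choice(n, size=n, replace=True, p=p)
    return [particles[i] for i in idx]
```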
Applications of Clocks to Space Navigation & "Planetary GPS"
NASA Technical Reports Server (NTRS)
Lichten, Stephen M.
2004-01-01
The ability to fly atomic clocks on GPS satellites has profoundly defined the capabilities and limitations of GPS in near-Earth applications. It is likely that future infrastructure for Lunar and Mars applications will be constrained by financial factors. The development of a low cost, small, high performance space clock -- or ultrahigh performance space clocks -- could revolutionize and drive the entire approach to GPS-like systems at the Moon (or Mars), and possibly even change the future of GPS at Earth. Many system trade studies are required. The performance of future GPS-like tracking systems at the Moon or Mars will depend critically on clock performance, availability of inertial sensors, and constellation coverage. Example: present-day GPS satellites carry 10^-13 clocks and require several updates per day. With 10^-15 clocks, a constellation at Mars could operate autonomously with updates just once per month. Use of GPS tracking at the Moon should be evaluated in a technical study.
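The quoted update rates follow from a rough linear drift model: time error is approximately the fractional frequency stability times the elapsed time, and the range error is the speed of light times the time error (random-walk clock noise ignored). A worked sketch:

```python
C = 299_792_458.0  # speed of light, m/s

def range_error_m(frac_stability, elapsed_s):
    """Clock-induced ranging error under a linear frequency-drift model."""
    return C * frac_stability * elapsed_s

print(range_error_m(1e-13, 86_400))        # ~2.6 m after one day   -> several uploads per day
print(range_error_m(1e-15, 30 * 86_400))   # ~0.8 m after one month -> roughly monthly uploads
```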
NASA Technical Reports Server (NTRS)
Bradley, Arthur; Dubowsky, Steven; Quinn, Roger; Marzwell, Neville
2005-01-01
Robots that operate independently of one another will not be adequate to accomplish the future exploration tasks of long-distance autonomous navigation, habitat construction, resource discovery, and material handling. Such activities will require that systems widely share information, plan and divide complex tasks, share common resources, and physically cooperate to manipulate objects. Recognizing the need for interoperable robots to accomplish the new exploration initiative, NASA's Office of Exploration Systems Research & Technology recently funded the development of the Joint Technical Architecture for Robotic Systems (JTARS). JTARS' charter is to identify the interface standards necessary to achieve interoperability among space robots. A JTARS working group (JTARS-WG) has been established comprising recognized leaders in the field of space robotics including representatives from seven NASA centers along with academia and private industry. The working group's early accomplishments include addressing key issues required for interoperability, defining which systems are within the project's scope, and framing the JTARS manuals around classes of robotic systems.
Machine Vision Applied to Navigation of Confined Spaces
NASA Technical Reports Server (NTRS)
Briscoe, Jeri M.; Broderick, David J.; Howard, Ricky; Corder, Eric L.
2004-01-01
The reliability of space related assets has been emphasized after the second loss of a Space Shuttle. The intricate nature of the hardware being inspected often requires a complete disassembly to perform a thorough inspection which can be difficult as well as costly. Furthermore, it is imperative that the hardware under inspection not be altered in any other manner than that which is intended. In these cases the use of machine vision can allow for inspection with greater frequency using less intrusive methods. Such systems can provide feedback to guide, not only manually controlled instrumentation, but autonomous robotic platforms as well. This paper serves to detail a method using machine vision to provide such sensing capabilities in a compact package. A single camera is used in conjunction with a projected reference grid to ascertain precise distance measurements. The design of the sensor focuses on the use of conventional components in an unconventional manner with the goal of providing a solution for systems that do not require or cannot accommodate more complex vision systems.
Safety Ellipse Motion with Coarse Sun Angle Optimization
NASA Technical Reports Server (NTRS)
Naasz, Bo
2005-01-01
The Hubble Space Telescope Robotic Servicing and De-orbit Mission (HRSDM) was to be performed by the unmanned Hubble Robotic Vehicle (HRV), consisting of a Deorbit Module (DM), responsible for the ultimate disposal of the Hubble Space Telescope (HST) at the end of science operations, and an Ejection Module (EM), responsible for robotically servicing the HST to extend its useful operational lifetime. HRSDM consisted of eight distinct phases: launch, pursuit, proximity operations, capture, servicing, EM jettison and disposal, science operations, and deorbit. The scope of this paper is limited to the Proximity Operations phase of HRSDM. It introduces a relative motion strategy useful for Autonomous Rendezvous and Docking (AR&D) or Formation Flying missions where safe circumnavigation trajectories or close-proximity operations (at ranges of tens or hundreds of meters) are required for extended periods of time. Parameters and algorithms used to model the relative motion of the HRV with respect to the HST during the Proximity Operations phase of HRSDM are described. Specifically, the Safety Ellipse (SE) concept, convenient parameters for describing SE motion, and a concept for initializing SE motion around a target vehicle to coarsely optimize sun and relative navigation sensor angles are presented. The effects of solar incidence angle variations on sun angle optimization, and the effects of orbital perturbations and navigation uncertainty on long-term SE motion, are discussed.
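As a generic illustration of the safety-ellipse idea (not the HRSDM flight algorithm), the Python sketch below propagates relative motion with the closed-form Clohessy-Wiltshire (Hill) solution and initializes a drift-free ellipse whose radial and cross-track oscillations are 90 degrees out of phase, so the trajectory never crosses the target's along-track (velocity) axis. The orbit radius and 50 m amplitudes are assumed values.

import numpy as np

MU = 3.986004418e14  # Earth gravitational parameter, m^3/s^2

def cw_propagate(x0, y0, z0, vx0, vy0, vz0, n, t):
    """Closed-form Clohessy-Wiltshire solution; x radial, y along-track, z cross-track."""
    s, c = np.sin(n * t), np.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = 6 * (s - n * t) * x0 + y0 - (2 / n) * (1 - c) * vx0 + ((4 * s - 3 * n * t) / n) * vy0
    z = c * z0 + (s / n) * vz0
    return x, y, z

def safety_ellipse_state(rho_radial, rho_cross, n):
    """Drift-free initial state whose radial and cross-track oscillations are 90 deg
    out of phase, so the trajectory never crosses the target's along-track axis.
    No-drift condition: vy0 + 2*n*x0 = 0 (both terms are zero here)."""
    x0, y0, z0 = 0.0, 2.0 * rho_radial, rho_cross   # y0 centers the 2:1 in-plane ellipse
    vx0, vy0, vz0 = n * rho_radial, 0.0, 0.0
    return x0, y0, z0, vx0, vy0, vz0

a = 6.378e6 + 560e3                           # assumed HST-like orbit radius, m
n = np.sqrt(MU / a**3)                        # mean motion, rad/s
state = safety_ellipse_state(50.0, 50.0, n)   # 50 m radial and cross-track amplitudes
for t in np.linspace(0.0, 2.0 * np.pi / n, 5):
    x, y, z = cw_propagate(*state, n, t)
    print(f"t={t:7.0f} s  radial={x:7.1f} m  along-track={y:7.1f} m  cross-track={z:7.1f} m")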
Preliminary Analyses of BeiDou Signal-In-Space Anomaly Since 2013
NASA Astrophysics Data System (ADS)
Wu, Y.; Ren, J.; Liu, W.
2016-06-01
The BeiDou navigation system has been operational since December 2012, and there is an increasing desire to use multiple constellations to improve positioning performance. The signal-in-space (SIS) anomaly caused by the ground control segment and the space vehicles is one of the major threats to integrity. For a young Global Navigation Satellite System, knowledge of historical SIS anomalies is very important, both for assessing the SIS integrity performance of a constellation and for supporting the assumptions used in ARAIM (Advanced Receiver Autonomous Integrity Monitoring). In this paper, the broadcast and precise ephemerides are pre-processed to avoid false anomaly identification. The SIS errors over the period March 2013 to February 2016 are computed by comparing the broadcast ephemerides with the precise ones. The time offsets between GPST (GPS time) and BDT (BeiDou time) are estimated and removed with an improved estimation algorithm. SIS worst-case user range errors (worst-UREs) are computed, and an RMS criterion is used to identify SIS anomalies. The results show that the probability of BeiDou SIS anomalies has been at the 10^-3 level over the last three years. Although BeiDou SIS integrity performance does not yet match that of GPS, the results indicate that BeiDou's integrity performance is improving.
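The paper does not list its worst-URE formulation, so the Python sketch below is only a hedged approximation of the general idea: for one satellite at one epoch, sample user locations on the visible part of the Earth, project the broadcast-minus-precise orbit error onto each line of sight, remove the clock error, and take the largest magnitude. The satellite position, error components, and sample count are illustrative assumptions.

import numpy as np

R_EARTH = 6_378_137.0  # m

def worst_ure(r_sat_ecef, orbit_err_ecef, clock_err_m, n_samples=20_000, seed=0):
    """Max |LOS-projected orbit error - clock error| over the visible Earth surface."""
    rng = np.random.default_rng(seed)
    # Uniform sample of points on the Earth sphere.
    v = rng.normal(size=(n_samples, 3))
    users = R_EARTH * v / np.linalg.norm(v, axis=1, keepdims=True)
    # Keep users that actually see the satellite (satellite above the local horizon).
    to_sat = r_sat_ecef - users
    visible = np.einsum("ij,ij->i", users, to_sat) > 0.0
    los = to_sat[visible] / np.linalg.norm(to_sat[visible], axis=1, keepdims=True)
    range_err = los @ orbit_err_ecef - clock_err_m
    return np.max(np.abs(range_err))

# Example: BeiDou MEO-like satellite over the equator with a broadcast orbit error of
# roughly 2 m radial and 1 m transverse (given here directly in ECEF) and 0.5 m of clock error.
r_sat = np.array([27_900e3, 0.0, 0.0])
orbit_err = np.array([2.0, 1.0, 0.0])
print(f"worst-URE ~ {worst_ure(r_sat, orbit_err, 0.5):.2f} m")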
Current status of endovascular catheter robotics.
Lumsden, Alan B; Bismuth, Jean
2018-06-01
In this review, we will detail the evolution of endovascular therapy as the basis for the development of catheter-based robotics. In parallel, we will outline the evolution of robotics in the surgical space and how the convergence of technology and the entrepreneurs who push this evolution has led to the development of endovascular robots. The current state of the art, future directions, and potential are summarized for the reader. Information in this review has been drawn primarily from our personal clinical and preclinical experience in the use of catheter robotics, coupled with ground-breaking work reported from a few other major centers that have embraced the technology's capabilities and opportunities. Several case studies demonstrating the unique capabilities of a precisely controlled catheter are presented. Most of the preclinical work was performed in the advanced imaging and navigation laboratory, a unique facility in which the interface of advanced imaging techniques and robotic guidance is being explored. Although this procedure employs a very high-tech approach to navigation inside the endovascular space, we convey the kinds of opportunities that this technology affords for integrating 3D imaging and 3D control. Further, we present the opportunity for semi-autonomous motion of these devices to a target. For the interventionist, enhanced precision can be achieved in a nearly radiation-free environment.
Fusing terrain and goals: agent control in urban environments
NASA Astrophysics Data System (ADS)
Kaptan, Varol; Gelenbe, Erol
2006-04-01
The changing face of contemporary military conflicts has forced a major shift of focus in tactical planning and evaluation from the classical Cold War battlefield to asymmetric, guerrilla-type warfare in densely populated urban areas. The new arena of conflict presents unique operational difficulties due to factors such as complex mobility restrictions and the necessity to preserve civilian lives and infrastructure. In this paper we present a novel method for autonomous agent control in an urban environment. Our approach is based on fusing terrain information and agent goals to transform the problem of navigation in a complex environment with many obstacles into the easier problem of navigation in a virtual obstacle-free space. The main advantage of our approach is its ability to act as an adapter layer for a number of efficient agent control techniques that normally show poor performance when applied to an environment with many complex obstacles. Because of its very low computational and space complexity at runtime, our method is also particularly well suited for simulation or control of very large numbers of agents (military as well as civilian) in a complex urban environment where traditional path planning may be too expensive or where a just-in-time decision with hard real-time constraints is required.
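The abstract does not spell out the transformation itself, so the following Python fragment is only a generic illustration of the broader idea of fusing terrain and goal information, not the authors' method: each agent steers along the sum of an attractive field toward its goal and a repulsive field derived from nearby obstacles, so downstream control logic sees a single preferred heading rather than raw obstacle geometry. The obstacle positions, gains, and influence radius are invented for the example.

import numpy as np

def steering_direction(pos, goal, obstacles, repulse_gain=200.0, influence=5.0):
    """Unit heading that blends goal attraction with obstacle repulsion."""
    to_goal = goal - pos
    direction = to_goal / (np.linalg.norm(to_goal) + 1e-9)
    for obs in obstacles:
        away = pos - obs
        d = np.linalg.norm(away)
        if 1e-9 < d < influence:
            # Repulsion grows as the agent nears the obstacle and fades beyond `influence`.
            direction += repulse_gain * (1.0 / d - 1.0 / influence) * away / d**3
    return direction / (np.linalg.norm(direction) + 1e-9)

# Example: agent at the origin, goal to the north-east, one building corner nearby.
pos = np.array([0.0, 0.0])
goal = np.array([40.0, 30.0])
obstacles = [np.array([3.0, 2.5])]
print(steering_direction(pos, goal, obstacles))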