Autonomous Deep-Space Optical Navigation Project
NASA Technical Reports Server (NTRS)
D'Souza, Christopher
2014-01-01
This project will advance the autonomous deep-space navigation capability applied to the Autonomous Rendezvous and Docking (AR&D) Guidance, Navigation and Control (GNC) system by testing it on hardware, particularly a flight processor, with a goal of limited testing in the Integrated Power, Avionics and Software (IPAS) facility with the ARCM (Asteroid Retrieval Crewed Mission) DRO (Distant Retrograde Orbit) AR&D scenario. The technology to be harnessed is called 'optical flow', also known as 'visual odometry'. It is being matured in automotive and SLAM (Simultaneous Localization and Mapping) applications but has yet to be applied to spacecraft navigation. In light of the tremendous potential of this technique, we believe that NASA needs to design an optical navigation architecture that uses it and is flexible enough to be applicable to navigating around planetary bodies, such as asteroids.
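As a rough illustration of the optical-flow / visual-odometry idea described above, the following Python sketch (using OpenCV on hypothetical grayscale frames) tracks sparse features between consecutive images; the resulting pixel displacements are the raw measurements such an architecture would hand to a pose estimator. Function and variable names are illustrative, not from the project.

```python
# Minimal sketch of sparse optical flow between two frames (OpenCV).
# frame_prev / frame_next are hypothetical 8-bit grayscale images.
import cv2
import numpy as np

def track_features(frame_prev, frame_next, max_corners=200):
    """Return matched pixel coordinates (prev, next) for tracked features."""
    # Detect corners in the first frame.
    p0 = cv2.goodFeaturesToTrack(frame_prev, maxCorners=max_corners,
                                 qualityLevel=0.01, minDistance=8)
    # Track them into the second frame with pyramidal Lucas-Kanade.
    p1, status, _err = cv2.calcOpticalFlowPyrLK(frame_prev, frame_next, p0, None)
    good = status.ravel() == 1
    return p0[good].reshape(-1, 2), p1[good].reshape(-1, 2)

# The per-feature displacements (next_pts - prev_pts) would then drive an
# egomotion / relative-pose estimate in the navigation filter.
```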
Baohua, Li; Wenjie, Lai; Yun, Chen; Zongming, Liu
2013-01-01
An autonomous navigation algorithm is presented that uses an integrated sensor combining a star sensor (FOV1) and an ultraviolet Earth sensor (FOV2). Star images are sampled by FOV1, and ultraviolet Earth images are sampled by FOV2. The star identification and star tracking algorithms are executed on FOV1, and the optical axis direction of FOV1 in the J2000.0 coordinate system is then calculated. The Earth-center vector in the FOV2 coordinate system is calculated from the coordinates of the ultraviolet Earth image. The autonomous navigation data of the satellite are computed by the integrated sensor from the FOV1 optical axis direction and the FOV2 Earth-center vector. The position accuracy of the autonomous navigation solution is improved from 1000 meters to 300 meters, and the velocity accuracy is improved from 100 m/s to 20 m/s. At the same time, the periodic sinusoidal errors of the autonomous navigation solution are eliminated. Autonomous satellite navigation with a sensor that integrates an ultraviolet Earth sensor and a star sensor is highly robust.
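The geometric core of the integrated-sensor measurement can be pictured as follows: the star sensor (FOV1) supplies the inertial attitude, the ultraviolet sensor (FOV2) supplies the Earth-center direction in its own frame, and rotating that direction into J2000 gives the nadir direction used by an orbit filter. This sketch assumes the inter-sensor alignment and the attitude matrix are already available; names are illustrative, not from the paper.

```python
import numpy as np

def inertial_nadir_direction(C_j2000_from_fov1, C_fov1_from_fov2, e_earth_fov2):
    """Rotate the measured Earth-center unit vector from the FOV2 frame
    into J2000 using the attitude solved by the star sensor (FOV1)."""
    e_fov1 = C_fov1_from_fov2 @ e_earth_fov2   # fixed inter-sensor alignment (assumed known)
    e_j2000 = C_j2000_from_fov1 @ e_fov1       # attitude from star identification/tracking
    return e_j2000 / np.linalg.norm(e_j2000)

# The spacecraft position direction is opposite the Earth-center direction:
# r_hat = -inertial_nadir_direction(...); an orbit filter then estimates the magnitude.
```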
Design and Development of the WVU Advanced Technology Satellite for Optical Navigation
NASA Astrophysics Data System (ADS)
Straub, Miranda
In order to meet the demands of future space missions, it is beneficial for spacecraft to have the capability to support autonomous navigation. This is true for both crewed and uncrewed vehicles. For crewed vehicles, autonomous navigation would allow the crew to safely navigate home in the event of a communication system failure. For uncrewed missions, autonomous navigation reduces the demand on ground-based infrastructure and could allow for more flexible operation. One promising technique for achieving these goals is through optical navigation. To this end, the present work considers how camera images of the Earth's surface could enable autonomous navigation of a satellite in low Earth orbit. Specifically, this study will investigate the use of coastlines and other natural land-water boundaries for navigation. Observed coastlines can be matched to a pre-existing coastline database in order to determine the location of the spacecraft. This paper examines how such measurements may be processed in an on-board extended Kalman filter (EKF) to provide completely autonomous estimates of the spacecraft state throughout the duration of the mission. In addition, future work includes implementing this work on a CubeSat mission within the WVU Applied Space Exploration Lab (ASEL). The mission titled WVU Advanced Technology Satellite for Optical Navigation (WATSON) will provide students with an opportunity to experience the life cycle of a spacecraft from design through operation while hopefully meeting the primary and secondary goals defined for mission success. The spacecraft design process, although simplified by CubeSat standards, will be discussed in this thesis as well as the current results of laboratory testing with the CubeSat model in the ASEL.
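As a point of reference for the filtering described above, a generic extended Kalman filter measurement update of this kind is sketched below. This is not the WATSON implementation; the state layout, landmark measurement model, and names are assumptions for illustration.

```python
import numpy as np

def ekf_update(x, P, z, h_func, H, R):
    """One EKF measurement update: state x, covariance P, measurement z,
    nonlinear measurement model h_func, Jacobian H, noise covariance R."""
    y = z - h_func(x)                      # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

def landmark_bearing(x, landmark_ecef):
    """Unit line-of-sight from the spacecraft position (assumed to be the first
    three states) to a matched coastline point; usable as h(x) above."""
    rho = landmark_ecef - x[:3]
    return rho / np.linalg.norm(rho)
```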
NASA Astrophysics Data System (ADS)
Theil, S.; Ammann, N.; Andert, F.; Franz, T.; Krüger, H.; Lehner, H.; Lingenauber, M.; Lüdtke, D.; Maass, B.; Paproth, C.; Wohlfeil, J.
2018-03-01
Since 2010 the German Aerospace Center has been working on the project Autonomous Terrain-based Optical Navigation (ATON). Its objective is the development of technologies which allow autonomous navigation of spacecraft in orbit around and during landing on celestial bodies like the Moon, planets, asteroids and comets. The project developed different image processing techniques and optical navigation methods as well as sensor data fusion. The setup—which is applicable to many exploration missions—consists of an inertial measurement unit, a laser altimeter, a star tracker and one or multiple navigation cameras. In the past years, several milestones have been achieved. It started with the setup of a simulation environment including the detailed simulation of camera images. This was continued by hardware-in-the-loop tests in the Testbed for Robotic Optical Navigation (TRON), where images were generated by real cameras in a simulated, downscaled lunar landing scene. Data were recorded in helicopter flight tests and post-processed in real time to increase the maturity of the algorithms and to optimize the software. Recently, two more milestones have been achieved. In late 2016, the whole navigation system setup was flown on an unmanned helicopter while processing all sensor information onboard in real time. For the latest milestone, the navigation system was tested in closed loop on the unmanned helicopter; for that purpose the ATON navigation system provided the navigation state for the guidance and control of the unmanned helicopter, replacing the GPS-based standard navigation system. The paper gives an introduction to the ATON project and its concept. The methods and algorithms of ATON are briefly described. The flight test results of the latest two milestones are presented and discussed.
Autonomous optical navigation using nanosatellite-class instruments: a Mars approach case study
NASA Astrophysics Data System (ADS)
Enright, John; Jovanovic, Ilija; Kazemi, Laila; Zhang, Harry; Dzamba, Tom
2018-02-01
This paper examines the effectiveness of small star trackers for orbital estimation. Autonomous optical navigation has been used for some time to provide local estimates of orbital parameters during close approach to celestial bodies. These techniques have been used extensively on spacecraft dating back to the Voyager missions, but often rely on long exposures and large instrument apertures. Using a hyperbolic Mars approach as a reference mission, we present an EKF-based navigation filter suitable for nanosatellite missions. Observations of Mars and its moons allow the estimator to correct initial errors in both position and velocity. Our results show that nanosatellite-class star trackers can produce good quality navigation solutions with low position (<300 m) and velocity (<0.15 m/s) errors as the spacecraft approaches periapse.
Experiment D009: Simple navigation
NASA Technical Reports Server (NTRS)
Silva, R. M.; Jorris, T. R.; Vallerie, E. M., III
1971-01-01
Space position-fixing techniques have been investigated by collecting data on the observable phenomena of space flight that could be used to solve the problem of autonomous navigation by the use of optical data and manual computations to calculate the position of a spacecraft. After completion of the developmental and test phases, the product of the experiment would be a manual-optical technique of orbital space navigation that could be used as a backup to onboard and ground-based spacecraft-navigation systems.
Navigation for the new millennium: Autonomous navigation for Deep Space 1
NASA Technical Reports Server (NTRS)
Reidel, J. E.; Bhaskaran, S.; Synnott, S. P.; Desai, S. D.; Bollman, W. E.; Dumont, P. J.; Halsell, C. A.; Han, D.; Kennedy, B. M.; Null, G. W.;
1997-01-01
The autonomous optical navigation system technology for the Deep Space 1 (DS1) mission is reported on. The DS1 navigation system will be the first to use autonomous navigation in deep space. The system's tasks are to: perform interplanetary cruise orbit determination using images of distant asteroids; control and maintain the orbit of the spacecraft with an ion propulsion system and conventional thrusters; and perform late knowledge updates of target position during close flybys in order to facilitate high quality data return from asteroid McAuliffe and comet West-Kohoutek-Ikemura. To accomplish these tasks, the following functions are required: picture planning; image processing; dynamical modeling and integration; planetary ephemeris and star catalog handling; orbit determination; data filtering and estimation; maneuver estimation; and spacecraft ephemeris updating. These systems and functions are described and preliminary performance data are presented.
Mobile robots IV; Proceedings of the Meeting, Philadelphia, PA, Nov. 6, 7, 1989
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolfe, W.J.; Chun, W.H.
1990-01-01
The present conference on mobile robot systems discusses high-speed machine perception based on passive sensing, wide-angle optical ranging, three-dimensional path planning for flying/crawling robots, navigation of autonomous mobile intelligence in an unstructured natural environment, mechanical models for the locomotion of a four-articulated-track robot, a rule-based command language for a semiautonomous Mars rover, and a computer model of the structured light vision system for a Mars rover. Also discussed are optical flow and three-dimensional information for navigation, feature-based reasoning trail detection, a symbolic neural-net production system for obstacle avoidance and navigation, intelligent path planning for robot navigation in an unknown environment, behaviors from a hierarchical control system, stereoscopic TV systems, the REACT language for autonomous robots, and a man-amplifying exoskeleton.
Autonomous Vision Navigation for Spacecraft in Lunar Orbit
NASA Astrophysics Data System (ADS)
Bader, Nolan A.
NASA aims to achieve unprecedented navigational reliability for the first manned lunar mission of the Orion spacecraft in 2023. A technique for accomplishing this is to integrate autonomous feature tracking as an added means of improving position and velocity estimation. In this thesis, a template matching algorithm and optical sensor are tested onboard three simulated lunar trajectories using linear covariance techniques under various conditions. A preliminary characterization of the camera gives insight into its ability to determine azimuth and elevation angles to points on the surface of the Moon. A navigation performance analysis shows that an optical camera sensor can aid in decreasing position and velocity errors, particularly in a loss of communication scenario. Furthermore, it is found that camera quality and computational capability are driving factors affecting the performance of such a system.
Design and test of a simulation system for autonomous optic-navigated planetary landing
NASA Astrophysics Data System (ADS)
Cai, Sheng; Yin, Yanhe; Liu, Yanjun; He, Fengyun
2018-02-01
In this paper, a simulation system based on a commercial projector is proposed to test optical navigation algorithms for autonomous planetary landing in laboratory scenarios. The design work on optics, mechanics and synchronization control is carried out, and the whole simulation system is set up and tested. Through calibration of the system, two main problems are resolved: synchronization between the projector and the CCD, and pixel-level shifting caused by the low repeatability of the DMD used in the projector. The experimental results show that the RMS errors of the pitch, yaw and roll angles are 0.78', 0.48', and 2.95' compared with the theoretical calculation, which fulfills the requirement of experimental simulation for planetary landing in the laboratory.
Visual Odometry for Autonomous Deep-Space Navigation
NASA Technical Reports Server (NTRS)
Robinson, Shane; Pedrotty, Sam
2016-01-01
Visual Odometry fills two critical needs shared by all future exploration architectures considered by NASA: Autonomous Rendezvous and Docking (AR&D), and autonomous navigation during loss of comm. To do this, a camera is combined with cutting-edge algorithms (called Visual Odometry) into a unit that provides accurate relative pose between the camera and the object in the imagery. Recent simulation analyses have demonstrated the ability of this new technology to reliably, accurately, and quickly compute a relative pose. This project advances the technology by both preparing the system to process flight imagery and creating an activity to capture said imagery. This technology can provide a pioneering optical navigation platform capable of supporting a wide variety of future mission scenarios: deep space rendezvous, asteroid exploration, and loss-of-comm operations.
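The "relative pose between the camera and the object" can be illustrated with a standard two-view geometry step using OpenCV. This is a generic visual-odometry sketch under assumed calibrated-camera inputs, not the flight algorithm itself, and the recovered translation is known only up to scale without range data.

```python
import cv2
import numpy as np

def relative_pose(pts_prev, pts_next, K):
    """Recover rotation R and unit-scale translation t between two views
    from matched image points (Nx2 float arrays) and camera matrix K (3x3)."""
    # Robustly estimate the essential matrix from the feature matches.
    E, mask = cv2.findEssentialMat(pts_prev, pts_next, K,
                                   method=cv2.RANSAC, prob=0.999, threshold=1.0)
    # Decompose it into the relative rotation and (unit) translation direction.
    _, R, t, _ = cv2.recoverPose(E, pts_prev, pts_next, K, mask=mask)
    return R, t
```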
Autonomous Underwater Navigation and Optical Mapping in Unknown Natural Environments.
Hernández, Juan David; Istenič, Klemen; Gracias, Nuno; Palomeras, Narcís; Campos, Ricard; Vidal, Eduard; García, Rafael; Carreras, Marc
2016-07-26
We present an approach for navigating in unknown environments while, simultaneously, gathering information for inspecting underwater structures using an autonomous underwater vehicle (AUV). To accomplish this, we first use our pipeline for mapping and planning collision-free paths online, which endows an AUV with the capability to autonomously acquire optical data in close proximity. With that information, we then propose a reconstruction pipeline to create a photo-realistic textured 3D model of the inspected area. These 3D models are also of particular interest to other fields of study in marine sciences, since they can serve as base maps for environmental monitoring, thus allowing change detection of biological communities and their environment over time. Finally, we evaluate our approach using the Sparus II, a torpedo-shaped AUV, conducting inspection missions in a challenging, real-world and natural scenario.
Compact autonomous navigation system (CANS)
NASA Astrophysics Data System (ADS)
Hao, Y. C.; Ying, L.; Xiong, K.; Cheng, H. Y.; Qiao, G. D.
2017-11-01
Autonomous navigation of satellites and constellations has a series of benefits, such as reducing operation cost and ground station workload, avoiding dependence on the ground during crises of war and natural disaster, increasing spacecraft autonomy, and so on. An autonomously navigating satellite is independent of ground station support. Many systems have been developed for autonomous satellite navigation in the past 20 years. Among them, the American MANS (Microcosm Autonomous Navigation System) [1] of Microcosm Inc. and ERADS [2][3] (Earth Reference Attitude Determination System) of Honeywell Inc. are well known. These systems anticipate a series of good features of autonomous navigation and aim at low cost, integrated structure, low power consumption and compact layout. ERADS is an integrated small 3-axis attitude sensor system with low cost and small volume. It has higher Earth-center measurement accuracy than a common IR sensor because the detected ultraviolet radiation zone of the atmosphere has a larger brightness gradient than the IR zone. However, ERADS is still a complex system because it must overcome many problems such as manufacture of the sapphire sphere lens, the birefringence effect of sapphire, the high-precision image-transfer optical fiber flattener, ultraviolet intensifier noise, and so on. The marginal spherical field of view of the ERADS sphere lens is used for star imaging, which may bring some disadvantages, i.e., the image energy and attitude measurement accuracy may be reduced due to the tilted image acceptance end of the fiber flattener in the FOV. In addition, Japan, Germany and Russia have developed visible Earth sensors for GEO [4][5]. Is there a way to develop a cheaper, easier and more accurate autonomous navigation system that can be used on all LEO spacecraft, especially LEO small and micro satellites? To address this problem we propose a new type of system: CANS (Compact Autonomous Navigation System) [6].
Autonomous Navigation for Deep Space Missions
NASA Technical Reports Server (NTRS)
Bhaskaran, Shyam
2012-01-01
Navigation (determining where the spacecraft is at any given time and controlling its path to achieve desired targets) is traditionally performed using ground-in-the-loop techniques: (1) data include 2-way radiometric (Doppler, range), interferometric (Delta-Differential One-way Range), and optical (images of natural bodies taken by the onboard camera) measurements; (2) data are received on the ground and processed to determine the orbit, and commands are sent to execute maneuvers to control the orbit. A self-contained, onboard, autonomous navigation system can: (1) eliminate delays due to round-trip light time; (2) eliminate the human factors in ground-based processing; (3) reduce the turnaround time for a navigation update from minutes down to seconds; (4) react to late-breaking data. At JPL, we have developed the framework and computational elements of an autonomous navigation system, called AutoNav. It was originally developed as one of the technologies for the Deep Space 1 mission, launched in 1998, and subsequently used on three other spacecraft for four different missions. The primary use has been on comet missions to track comets during flybys, and to impact one comet.
An onboard navigation system which fulfills Mars aerocapture guidance requirements
NASA Technical Reports Server (NTRS)
Brand, Timothy J.; Fuhry, Douglas P.; Shepperd, Stanley W.
1989-01-01
The development of a candidate autonomous onboard Mars approach navigation scheme capable of supporting aerocapture into Mars orbit is discussed. An aerocapture guidance and navigation system which can run independently of the preaerocapture navigation was used to define a preliminary set of accuracy requirements at entry interface. These requirements are used to evaluate the proposed preaerocapture navigation scheme. This scheme uses optical sightings on Deimos with a star tracker and an inertial measurement unit for instrumentation as the source of navigation information. Preliminary results suggest that the approach will adequately support aerocapture into Mars orbit.
Small Body Landing Accuracy Using In-Situ Navigation
NASA Technical Reports Server (NTRS)
Bhaskaran, Shyam; Nandi, Sumita; Broschart, Stephen; Wallace, Mark; Olson, Corwin; Cangahuala, L. Alberto
2011-01-01
Spacecraft landings on small bodies (asteroids and comets) can require target accuracies too stringent to be met using ground-based navigation alone, especially if specific landing site requirements must be met for safety or to meet science goals. In-situ optical observations coupled with onboard navigation processing can meet the tighter accuracy requirements to enable such missions. Recent developments in deep space navigation capability include a self-contained autonomous navigation system (used in flight on three missions) and a landmark tracking system (used experimentally on the Japanese Hayabusa mission). The merging of these two technologies forms a methodology to perform autonomous onboard navigation around small bodies. This paper presents an overview of these systems, as well as the results from Monte Carlo studies to quantify the achievable landing accuracies by using these methods. The sensitivity of the results to variations in spacecraft maneuver execution error, attitude control accuracy and unmodeled forces is examined. Cases for two bodies, a small asteroid and a mid-size comet, are presented.
Wang, Hao; Jiang, Jie; Zhang, Guangjun
2017-04-21
The simultaneous extraction of optical navigation measurements from a target celestial body and star images is essential for autonomous optical navigation. Generally, a single optical navigation sensor cannot simultaneously image the target celestial body and stars well-exposed because their irradiance difference is generally large. Multi-sensor integration or complex image processing algorithms are commonly utilized to solve the said problem. This study analyzes and demonstrates the feasibility of simultaneously imaging the target celestial body and stars well-exposed within a single exposure through a single field of view (FOV) optical navigation sensor using the well capacity adjusting (WCA) scheme. First, the irradiance characteristics of the celestial body are analyzed. Then, the celestial body edge model and star spot imaging model are established when the WCA scheme is applied. Furthermore, the effect of exposure parameters on the accuracy of star centroiding and edge extraction is analyzed using the proposed model. Optimal exposure parameters are also derived by conducting Monte Carlo simulation to obtain the best performance of the navigation sensor. Finally, laboratory and night-sky experiments are performed to validate the correctness of the proposed model and optimal exposure parameters.
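As context for the centroiding accuracy discussed above, the usual starting point is a plain intensity-weighted centroid over a small window around the star spot. The sketch below is that baseline computation, independent of the WCA-specific imaging models in the paper; the window and background value are assumed inputs.

```python
import numpy as np

def star_centroid(window, background=0.0):
    """Intensity-weighted centroid of a small image window around a star spot.
    Returns sub-pixel (row, col) coordinates within the window; assumes the
    window actually contains signal above the supplied background level."""
    w = np.clip(window.astype(float) - background, 0.0, None)
    total = w.sum()
    rows, cols = np.indices(w.shape)
    return (rows * w).sum() / total, (cols * w).sum() / total
```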
INL Autonomous Navigation System
DOE Office of Scientific and Technical Information (OSTI.GOV)
2005-03-30
The INL Autonomous Navigation System provides instructions for autonomously navigating a robot. The system permits high-speed autonomous navigation including obstacle avoidance, waypoint navigation and path planning in both indoor and outdoor environments.
Navigation Concepts for the James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Long, Anne; Leung, Dominic; Kelbel, David; Beckman, Mark; Grambling, Cheryl
2003-01-01
This paper evaluates the performance that can be achieved using candidate ground and onboard navigation approaches for operation of the James Webb Space Telescope, which will be in an orbit about the Sun-Earth L2 libration point. The ground navigation approach processes standard range and Doppler measurements from the Deep Space Network. The onboard navigation approach processes celestial object measurements and/or ground-to-spacecraft Doppler measurements to autonomously estimate the spacecraft's position and velocity and Doppler reference frequency. Particular attention is given to assessing the absolute position and velocity accuracy that can be achieved in the presence of the frequent spacecraft reorientations and momentum unloads planned for this mission. The ground navigation approach provides stable navigation solutions using a tracking schedule of one 30-minute contact per day. The onboard navigation approach that uses only optical quality celestial object measurements provides stable autonomous navigation solutions. This study indicates that unmodeled changes in the solar radiation pressure cross-sectional area and modeled momentum unload velocity changes are the major error sources. These errors can be mitigated by modeling these changes, by estimating corrections to compensate for the changes, or by including acceleration measurements.
Model-based software engineering for an optical navigation system for spacecraft
NASA Astrophysics Data System (ADS)
Franz, T.; Lüdtke, D.; Maibaum, O.; Gerndt, A.
2017-09-01
The project Autonomous Terrain-based Optical Navigation (ATON) at the German Aerospace Center (DLR) is developing an optical navigation system for future landing missions on celestial bodies such as the moon or asteroids. Image data obtained by optical sensors can be used for autonomous determination of the spacecraft's position and attitude. Camera-in-the-loop experiments in the Testbed for Robotic Optical Navigation (TRON) laboratory and flight campaigns with unmanned aerial vehicle (UAV) are performed to gather flight data for further development and to test the system in a closed-loop scenario. The software modules are executed in the C++ Tasking Framework that provides the means to concurrently run the modules in separated tasks, send messages between tasks, and schedule task execution based on events. Since the project is developed in collaboration with several institutes in different domains at DLR, clearly defined and well-documented interfaces are necessary. Preventing misconceptions caused by differences between various development philosophies and standards turned out to be challenging. After the first development cycles with manual Interface Control Documents (ICD) and manual implementation of the complex interactions between modules, we switched to a model-based approach. The ATON model covers a graphical description of the modules, their parameters and communication patterns. Type and consistency checks on this formal level help to reduce errors in the system. The model enables the generation of interfaces and unified data types as well as their documentation. Furthermore, the C++ code for the exchange of data between the modules and the scheduling of the software tasks is created automatically. With this approach, changing the data flow in the system or adding additional components (e.g., a second camera) have become trivial.
Wei, Wenhui; Gao, Zhaohui; Gao, Shesheng; Jia, Ke
2018-04-09
To meet the autonomy and reliability requirements of the navigation system, and drawing on the method of measuring speed from the spectral redshift information of natural celestial bodies, a new scheme consisting of a Strapdown Inertial Navigation System (SINS)/Spectral Redshift (SRS)/Geomagnetic Navigation System (GNS) is designed for autonomous integrated navigation. The principle of this SINS/SRS/GNS autonomous integrated navigation system is explored, and the corresponding mathematical model is established. Furthermore, a robust adaptive central difference particle filtering algorithm is proposed for this autonomous integrated navigation system. Simulation experiments are conducted and the results show that the designed SINS/SRS/GNS autonomous integrated navigation system possesses good autonomy, strong robustness and high reliability, thus providing a new solution for autonomous navigation technology.
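The paper's robust adaptive central difference particle filter is more elaborate than can be reproduced here; as a point of reference, one predict/update/resample cycle of a plain bootstrap particle filter for an integrated navigation state is sketched below. The process model, measurement likelihood, and resampling threshold are assumptions for illustration.

```python
import numpy as np

def particle_filter_step(particles, weights, z, propagate, likelihood, rng):
    """One bootstrap particle filter cycle.
    particles: (N, n) state samples; weights: (N,); z: current measurement."""
    # Predict: propagate each particle through the (noisy) dynamics model, e.g. SINS mechanization.
    particles = np.array([propagate(p, rng) for p in particles])
    # Update: reweight by the measurement likelihood (e.g., SRS/GNS measurements).
    weights = weights * np.array([likelihood(z, p) for p in particles])
    weights = weights / weights.sum()
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights**2) < 0.5 * len(weights):
        idx = rng.choice(len(weights), size=len(weights), p=weights)
        particles = particles[idx]
        weights = np.full(len(weights), 1.0 / len(weights))
    return particles, weights
```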
Optical Navigation for the Orion Vehicle
NASA Technical Reports Server (NTRS)
Crain, Timothy; Getchius, Joel; D'Souza, Christopher
2008-01-01
The Orion vehicle is being designed to provide nominal crew transport to the lunar transportation stack in low Earth orbit, crew abort during transit to the Moon, and crew return to Earth once lunar orbit is achieved. One of the design requirements levied on the Orion vehicle is the ability to return the vehicle and crew to Earth in the case of loss of communications and command with the Mission Control Center. Central to fulfilling this requirement is the ability of Orion to navigate autonomously. In low Earth orbit, this may be solved with the use of GPS, but in cis-lunar and lunar orbit this requires optical navigation. This paper documents the preliminary analyses performed by members of the Orion Orbit GN&C System team.
NASA Astrophysics Data System (ADS)
Bu, Yanlong; Zhang, Qiang; Ding, Chibiao; Tang, Geshi; Wang, Hang; Qiu, Rujin; Liang, Libo; Yin, Hejun
2017-02-01
This paper presents an interplanetary optical navigation algorithm based on two spherical celestial bodies. The remarkable characteristic of the method is that key navigation parameters can be estimated depending entirely on the known sizes and ephemerides of the two celestial bodies; in particular, positioning is realized through a single image and no longer relies on traditional terrestrial radio tracking. Actual Earth-Moon group photos captured by China's Chang'e-5T1 probe were used to verify the effectiveness of the algorithm. From 430,000 km away from the Earth, the camera pointing accuracy reaches 0.01° (one sigma) and the inertial positioning error is less than 200 km; meanwhile, the costs of ground control and human resources are greatly reduced. The algorithm is flexible, easy to implement, and can provide a reference for interplanetary autonomous navigation in the solar system.
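The geometry behind single-image positioning from two spherical bodies can be sketched as follows: each body's apparent angular radius gives a range, its observed direction gives a bearing, and combining the two body-relative fixes in the inertial frame yields a position estimate. The sketch below is a simplified illustration, not the paper's algorithm; ephemeris positions, radii, line-of-sight unit vectors, and angular radii are assumed inputs.

```python
import numpy as np

def range_from_angular_radius(body_radius_km, angular_radius_rad):
    """Distance to a spherical body from its known radius and apparent angular radius."""
    return body_radius_km / np.sin(angular_radius_rad)

def position_fix(bodies):
    """bodies: list of (ephemeris_position_km, radius_km, unit_los_inertial, angular_radius_rad).
    Returns the average of the per-body spacecraft position estimates."""
    fixes = []
    for pos_body, radius, u_los, ang in bodies:
        rng = range_from_angular_radius(radius, ang)
        fixes.append(pos_body - rng * u_los)   # spacecraft = body position minus range along LOS
    return np.mean(fixes, axis=0)
```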
Autonomous Navigation Using Celestial Objects
NASA Technical Reports Server (NTRS)
Folta, David; Gramling, Cheryl; Leung, Dominic; Belur, Sheela; Long, Anne
1999-01-01
In the twenty-first century, National Aeronautics and Space Administration (NASA) Enterprises envision frequent low-cost missions to explore the solar system, observe the universe, and study our planet. Satellite autonomy is a key technology required to reduce satellite operating costs. The Guidance, Navigation, and Control Center (GNCC) at the Goddard Space Flight Center (GSFC) currently sponsors several initiatives associated with the development of advanced spacecraft systems to provide autonomous navigation and control. Autonomous navigation has the potential both to increase spacecraft navigation system performance and to reduce total mission cost. By eliminating the need for routine ground-based orbit determination and special tracking services, autonomous navigation can streamline spacecraft ground systems. Autonomous navigation products can be included in the science telemetry and forwarded directly to the scientific investigators. In addition, autonomous navigation products are available onboard to enable other autonomous capabilities, such as attitude control, maneuver planning and orbit control, and communications signal acquisition. Autonomous navigation is required to support advanced mission concepts such as satellite formation flying. GNCC has successfully developed high-accuracy autonomous navigation systems for near-Earth spacecraft using NASA's space and ground communications systems and the Global Positioning System (GPS). Recently, GNCC has expanded its autonomous navigation initiative to include satellite orbits that are beyond the regime in which use of GPS is possible. Currently, GNCC is assessing the feasibility of using standard spacecraft attitude sensors and communication components to provide autonomous navigation for missions including: libration point, gravity assist, high-Earth, and interplanetary orbits. The concept being evaluated uses a combination of star, Sun, and Earth sensor measurements along with forward-link Doppler measurements from the command link carrier to autonomously estimate the spacecraft's orbit and reference oscillator's frequency. To support autonomous attitude determination and control and maneuver planning and control, the orbit determination accuracy should be on the order of kilometers in position and centimeters per second in velocity. A less accurate solution (one hundred kilometers in position) could be used for acquisition purposes for command and science downloads. This paper provides performance results for both libration point orbiting and high Earth orbiting satellites as a function of sensor measurement accuracy, measurement types, measurement frequency, initial state errors, and dynamic modeling errors.
Orion Optical Navigation Progress Toward Exploration: Mission 1
NASA Technical Reports Server (NTRS)
Holt, Greg N.; D'Souza, Christopher N.; Saley, David
2018-01-01
Optical navigation of human spacecraft was proposed on Gemini and implemented successfully on Apollo as a means of autonomously operating the vehicle in the event of lost communication with controllers on Earth. It shares a history with the "method of lunar distances" that was used in the 18th century and gained some notoriety after its use by Captain James Cook during his 1768 Pacific voyage of the HMS Endeavor. The Orion emergency return system utilizing optical navigation has matured in design over the last several years, and is currently undergoing the final implementation and test phase in preparation for Exploration Mission 1 (EM-1) in 2019. The software development is being worked as a Government Furnished Equipment (GFE) project delivered as an application within the Core Flight Software of the Orion camera controller module. The mathematical formulation behind the initial ellipse fit in the image processing is detailed in Christian. The non-linear least squares refinement then follows the technique of Mortari as an estimation process of the planetary limb using the sigmoid function. The Orion optical navigation system uses a body fixed camera, a decision that was driven by mass and mechanism constraints. The general concept of operations involves a 2-hour pass once every 24 hours, with passes specifically placed before all maneuvers to supply accurate navigation information to guidance and targeting. The pass lengths are limited by thermal constraints on the vehicle since the OpNav attitude generally deviates from the thermally stable tail-to-sun attitude maintained during the rest of the orbit coast phase. Calibration is scheduled prior to every pass due to the unknown nature of thermal effects on the lens distortion and the mounting platform deformations between the camera and star trackers. The calibration technique is described in detail by Christian, et al. and simultaneously estimates the Brown-Conrady coefficients and the Star Tracker/Camera interlock angles. Accurate attitude information is provided by the star trackers during each pass. Figure 1 shows the various phases of lunar return navigation when the vehicle is in autonomous operation with lost ground communication. The midcourse maneuvers are placed to control the entry interface conditions to the desired corridor for safe landing. The general form of optical navigation on Orion is where still images of the Moon or Earth are processed to find the apparent angular diameter and centroid in the camera focal plane. This raw data is transformed into range and bearing angle measurements using planetary data and precise star tracker inertial attitude. The measurements are then sent to the main flight computer's Kalman filter to update the onboard state vector. The images are, of course, collected over an arc to converge the state and estimate velocity. The same basic technique was used by Apollo to satisfy loss-of-comm, but Apollo used manual crew sightings with a vehicle-integral sextant instead of autonomously processing optical imagery. The software development is past its Critical Design Review, and is progressing through test and certification for human rating. In support of this, a hardware-in-the-loop test rig was developed in the Johnson Space Center Electro-Optics Lab to exercise the OpNav system prior to integrated testing on the Orion vehicle. Figure 2 shows the rig, which the test team has dubbed OCILOT (Orion Camera In the Loop Optical Testbed). 
Analysis performed to date shows a delivery that satisfies an allowable entry corridor as shown in Figure 3.
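A simplified version of the raw-data transformation described above (apparent angular diameter and centroid converted to range and bearing) is sketched below for an ideal pinhole camera. This is not the flight formulation, which fits the limb ellipse and refines it with a sigmoid-based nonlinear least squares; focal length in pixels, centroid measured relative to the principal point, and the planetary radius are assumed inputs.

```python
import numpy as np

def opnav_measurement(centroid_px, diameter_px, focal_length_px, body_radius_km):
    """Convert a planetary-disk centroid and apparent diameter (pixels)
    into bearing angles (rad) and range (km) for the navigation filter."""
    cx, cy = centroid_px                              # offsets from the principal point
    azimuth = np.arctan2(cx, focal_length_px)         # bearing about the camera boresight
    elevation = np.arctan2(cy, focal_length_px)
    half_angle = np.arctan2(0.5 * diameter_px, focal_length_px)
    rng = body_radius_km / np.sin(half_angle)         # range from the apparent angular radius
    return azimuth, elevation, rng
```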
Guidance and Navigation Software Architecture Design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) Test Bed
Eikenberry, Blake D.
2006-12-01
Fuzzy Logic Based Control for Autonomous Mobile Robot Navigation
Masmoudi, Mohamed Slim; Masmoudi, Mohamed
2016-01-01
This paper describes the design and implementation of a trajectory tracking controller using fuzzy logic for a mobile robot navigating in indoor environments. Most previous works used two independent controllers for navigation and obstacle avoidance. The main contribution of the paper can be summarized in the fact that we use only one fuzzy controller for both navigation and obstacle avoidance. The mobile robot used is equipped with a DC motor, nine infrared range (IR) sensors to measure the distance to obstacles, and two optical encoders to provide the actual position and speeds. To evaluate the performance of the intelligent navigation algorithms, different trajectories are used and simulated using MATLAB software and the SIMIAM navigation platform. Simulation results show the performance of the intelligent navigation algorithms in terms of simulation times and travelled path.
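A minimal single-controller fuzzy rule base of the kind the paper advocates (one controller handling both goal seeking and obstacle avoidance) can be sketched with simple membership functions. The ranges, rules, and output values below are illustrative, not the ones tuned for the robot described.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over the support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_steer(heading_error_deg, obstacle_dist_m):
    """Single fuzzy controller blending goal seeking and obstacle avoidance.
    Returns a steering rate command in deg/s (illustrative rules and gains)."""
    # Antecedent memberships.
    near = max(0.0, min(1.0, (0.8 - obstacle_dist_m) / 0.8))   # obstacle is NEAR
    left = tri(heading_error_deg, 0.0, 90.0, 180.0)            # goal is to the LEFT
    right = tri(heading_error_deg, -180.0, -90.0, 0.0)         # goal is to the RIGHT
    # Rule consequents: avoidance commands a hard turn, goal seeking a gentle one.
    rules = [(near, 60.0), (left, 30.0), (right, -30.0)]
    den = sum(w for w, _ in rules)
    # Weighted-average defuzzification of the active rules.
    return sum(w * c for w, c in rules) / den if den > 0 else 0.0
```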
Nonlinearity analysis of measurement model for vision-based optical navigation system
NASA Astrophysics Data System (ADS)
Li, Jianguo; Cui, Hutao; Tian, Yang
2015-02-01
In an autonomous optical navigation system based on line-of-sight vector observations, the nonlinearity of the measurement model is highly correlated with navigation performance. By quantitatively calculating the degree of nonlinearity of the focal plane model and the unit vector model, this paper determines which optical measurement model performs better. First, measurement equations and measurement noise statistics of these two line-of-sight measurement models are established based on the perspective projection co-linearity equation. Then the nonlinear effects of the measurement model on filter performance are analyzed within the framework of the extended Kalman filter, and the degrees of nonlinearity of the two measurement models are compared using the curvature measure theory from differential geometry. Finally, a simulation of star-tracker-based attitude determination is presented to confirm the superiority of the unit vector measurement model. Simulation results show that the magnitude of the curvature measure of nonlinearity is consistent with the filter performance, and that the unit vector measurement model yields higher estimation precision and faster convergence.
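The two line-of-sight parameterizations being compared can be written down directly. The sketch below gives both measurement functions for a camera-frame relative position p = (X, Y, Z); it is only the starting point for the curvature-based comparison, and the focal length is an assumed parameter.

```python
import numpy as np

def focal_plane_model(p_cam, focal_length=1.0):
    """Focal-plane measurement: perspective projection (x, y) of the line of sight."""
    X, Y, Z = p_cam
    return np.array([focal_length * X / Z, focal_length * Y / Z])

def unit_vector_model(p_cam):
    """Unit-vector measurement: the normalized line of sight itself."""
    p = np.asarray(p_cam, dtype=float)
    return p / np.linalg.norm(p)
```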
NASA Astrophysics Data System (ADS)
Huang, Wei; Yang, Xiao-xu; Han, Jun-feng; Wei, Yu; Zhang, Jing; Xie, Mei-lin; Yue, Peng
2016-01-01
A high-precision celestial navigation tracking platform adopts a control-mirror servo structure to overcome disadvantages such as large volume and rotational inertia and slow response speed, improving the stability and tracking accuracy of the platform. Because the optical sensor and mirror are installed on the middle gimbal, its stiffness and resonant frequency requirements are high. Based on finite element modal analysis theory, the dynamic characteristics of the middle gimbal are studied, and ANSYS is used for the finite element dynamic simulation analysis. From the computed results, the weak links of the structure are identified, improvements are proposed, and the structure is reanalyzed. The lowest resonant frequency of the optimized middle gimbal avoids the bandwidth of the platform servo mechanism and is much higher than the disturbance frequency of the carrier aircraft, reducing mechanical resonance of the framework. This provides a theoretical basis for the overall structural optimization design of a high-precision autonomous celestial navigation tracking mirror system.
NASA Astrophysics Data System (ADS)
Welch, Sharon S.
Topics discussed in this volume include aircraft guidance and navigation, optics for visual guidance of aircraft, spacecraft and missile guidance and navigation, lidar and ladar systems, microdevices, gyroscopes, cockpit displays, and automotive displays. Papers are presented on optical processing for range and attitude determination, aircraft collision avoidance using a statistical decision theory, a scanning laser aircraft surveillance system for carrier flight operations, star sensor simulation for astroinertial guidance and navigation, autonomous millimeter-wave radar guidance systems, and a 1.32-micron long-range solid state imaging ladar. Attention is also given to a microfabricated magnetometer using Young's modulus changes in magnetoelastic materials, an integrated microgyroscope, a pulsed diode ring laser gyroscope, self-scanned polysilicon active-matrix liquid-crystal displays, the history and development of coated contrast enhancement filters for cockpit displays, and the effect of the display configuration on the attentional sampling performance. (For individual items see A93-28152 to A93-28176, A93-28178 to A93-28180)
Autonomous Relative Navigation for Formation-Flying Satellites Using GPS
NASA Technical Reports Server (NTRS)
Gramling, Cheryl; Carpenter, J. Russell; Long, Anne; Kelbel, David; Lee, Taesul
2000-01-01
The Goddard Space Flight Center is currently developing advanced spacecraft systems to provide autonomous navigation and control of formation flyers. This paper discusses autonomous relative navigation performance for a formation of four eccentric, medium-altitude Earth-orbiting satellites using Global Positioning System (GPS) Standard Positioning Service (SPS) and "GPS-like" intersatellite measurements. The performance of several candidate relative navigation approaches is evaluated. These analyses indicate that an autonomous relative navigation position accuracy of 1 meter root-mean-square can be achieved by differencing high-accuracy filtered solutions if only measurements from common GPS space vehicles are used in the independently estimated solutions.
Terrain Navigation Concepts for Autonomous Vehicles
1984-06-01
Leighty, R. D.; U.S. Army Engineer Topographic Laboratories, Fort Belvoir, VA. The report discusses the pacing problem for developing autonomous vehicles that can efficiently move to designated locations in the real world, and the autonomous functions that can serve as general terrain navigation requirements for a discussion of autonomous vehicles.
Conceptual Design of a Communication-Based Deep Space Navigation Network
NASA Technical Reports Server (NTRS)
Anzalone, Evan J.; Chuang, C. H.
2012-01-01
As the need grows for increased autonomy and position knowledge accuracy to support missions beyond Earth orbit, engineers must develop more advanced navigation sensors and systems that operate independently of Earth-based analysis and processing. Several spacecraft are approaching this problem using inter-spacecraft radiometric tracking and onboard autonomous optical navigation methods. This paper proposes an alternative implementation to aid in spacecraft position fixing. The proposed Network-Based Navigation technique takes advantage of the communication data being sent between spacecraft and between spacecraft and ground control to embed navigation information. The navigation system uses these packets to provide navigation estimates to an onboard navigation filter to augment traditional ground-based radiometric tracking techniques. As opposed to using digital signal measurements to capture inherent information of the transmitted signal itself, this method relies on the embedded navigation packet headers to calculate a navigation estimate. This method is heavily dependent on clock accuracy, and initial results show the promising performance of a notional system.
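One way to picture the embedded-header idea is a one-way pseudorange computed from a transmit timestamp carried in a packet header and the local receive time. The header fields, units, and clock-bias handling below are assumptions for illustration, not the notional system's actual packet format.

```python
from dataclasses import dataclass

C_KM_S = 299_792.458  # speed of light, km/s

@dataclass
class NavHeader:
    transmit_time_s: float        # sender clock at transmission (embedded in the header)
    sender_position_km: tuple     # sender's own ephemeris estimate at that time

def pseudorange_km(header: NavHeader, receive_time_s: float, clock_bias_s: float = 0.0):
    """One-way light-time pseudorange from embedded header data; this would be
    passed to the onboard navigation filter as an additional measurement."""
    return C_KM_S * (receive_time_s - header.transmit_time_s - clock_bias_s)
```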
NASA Technical Reports Server (NTRS)
Wood, L. J.; Jones, J. B.; Mease, K. D.; Kwok, J. H.; Goltz, G. L.; Kechichian, J. A.
1984-01-01
A conceptual design is outlined for the navigation subsystem of the Autonomous Redundancy and Maintenance Management Subsystem (ARMMS). The principal function of this navigation subsystem is to maintain the spacecraft over a specified equatorial longitude to within ±3 deg. In addition, the navigation subsystem must detect and correct internal faults. It comprises elements for a navigation executive and for orbit determination, trajectory, maneuver planning, and maneuver command. Each of these elements is described. The navigation subsystem is to be used in the DSCS III spacecraft.
COBALT: A GN&C Payload for Testing ALHAT Capabilities in Closed-Loop Terrestrial Rocket Flights
NASA Technical Reports Server (NTRS)
Carson, John M., III; Amzajerdian, Farzin; Hines, Glenn D.; O'Neal, Travis V.; Robertson, Edward A.; Seubert, Carl; Trawny, Nikolas
2016-01-01
The COBALT (CoOperative Blending of Autonomous Landing Technology) payload is being developed within NASA as a risk reduction activity to mature, integrate and test ALHAT (Autonomous precision Landing and Hazard Avoidance Technology) systems targeted for infusion into near-term robotic and future human space flight missions. The initial COBALT payload instantiation is integrating the third-generation ALHAT Navigation Doppler Lidar (NDL) sensor, for ultra high-precision velocity plus range measurements, with the passive-optical Lander Vision System (LVS) that provides Terrain Relative Navigation (TRN) global-position estimates. The COBALT payload will be integrated onboard a rocket-propulsive terrestrial testbed and will provide precise navigation estimates and guidance planning during two flight test campaigns in 2017 (one open-loop and one closed-loop). The NDL is targeting performance capabilities desired for future Mars and Moon Entry, Descent and Landing (EDL). The LVS is already baselined for TRN on the Mars 2020 robotic lander mission. The COBALT platform will provide NASA with a new risk-reduction capability to test integrated EDL Guidance, Navigation and Control (GN&C) components in closed-loop flight demonstrations prior to the actual mission EDL.
Developing Autonomous Vehicles That Learn to Navigate by Mimicking Human Behavior
2006-09-28
Edwards, Dean B.
Final report on developing autonomous vehicles that learn to navigate by mimicking human behavior, with the goal of navigating in an unstructured environment to a specific target or location. Subject terms: autonomous vehicles, fuzzy logic, learning behavior. Long-term goals include use of the LAGR (Learning Applied to Ground Robots) platform.
A System for Fast Navigation of Autonomous Vehicles
1991-09-01
Singh, Sanjiv; Feng, Dai; Keller, Paul; Shaffer, Gary; Shi, Wen Fan; Shin, Dong Hun; West, J.
It is common in the control of autonomous vehicles to establish the necessary kinematic models but to ignore an explicit representation of the vehicle dynamics.
Intelligent Optical Systems Using Adaptive Optics
NASA Technical Reports Server (NTRS)
Clark, Natalie
2012-01-01
Until recently, the phrase adaptive optics generally conjured images of large deformable mirrors being integrated into telescopes to compensate for atmospheric turbulence. However, the development of smaller, cheaper devices has sparked interest for other aerospace and commercial applications. Variable focal length lenses, liquid crystal spatial light modulators, tunable filters, phase compensators, polarization compensation, and deformable mirrors are becoming increasingly useful for other imaging applications including guidance navigation and control (GNC), coronagraphs, foveated imaging, situational awareness, autonomous rendezvous and docking, non-mechanical zoom, phase diversity, and enhanced multi-spectral imaging. The active components presented here allow flexibility in the optical design, increasing performance. In addition, the intelligent optical systems presented offer advantages in size and weight and radiation tolerance.
Autonomous Navigation Improvements for High-Earth Orbiters Using GPS
NASA Technical Reports Server (NTRS)
Long, Anne; Kelbel, David; Lee, Taesul; Garrison, James; Carpenter, J. Russell; Bauer, F. (Technical Monitor)
2000-01-01
The Goddard Space Flight Center is currently developing autonomous navigation systems for satellites in high-Earth orbits where acquisition of the GPS signals is severely limited. This paper discusses autonomous navigation improvements for high-Earth orbiters and assesses projected navigation performance for these satellites using Global Positioning System (GPS) Standard Positioning Service (SPS) measurements. Navigation performance is evaluated as a function of signal acquisition threshold, measurement errors, and dynamic modeling errors using realistic GPS signal strength and user antenna models. These analyses indicate that an autonomous navigation position accuracy of better than 30 meters root-mean-square (RMS) can be achieved for high-Earth orbiting satellites using a GPS receiver with a very stable oscillator. This accuracy improves to better than 15 meters RMS if the GPS receiver's signal acquisition threshold can be reduced by 5 dB-Hertz to track weaker signals.
Natural Models for Autonomous Control of Spatial Navigation, Sensing, and Guidance. Part 1
2012-02-13
The ketocarotenoid pigment astaxanthin is deposited in the antennal scale of a stomatopod crustacean, Odontodactylus scyllarus. Positive correlation between partial polarization and the presence of astaxanthin indicates that the antennal scale polarizes light with astaxanthin. Both the optical properties and the fine structure of the polarizationally-active cuticle suggest that the dipole axes of the astaxanthin molecules are oriented nearly normal to the
Autonomous unmanned air vehicles (UAV) techniques
NASA Astrophysics Data System (ADS)
Hsu, Ming-Kai; Lee, Ting N.
2007-04-01
UAVs (Unmanned Air Vehicles) have great potential in different civilian applications, such as oil pipeline surveillance, precision farming, forest fire fighting, search and rescue, border patrol, etc. The related UAV industries can generate billions of dollars each year. However, the roadblock to adopting UAVs is that their operation runs up against FAA (Federal Aviation Administration) and ATC (Air Traffic Control) regulations. In this paper, we have reviewed the latest technologies and research on UAV navigation and obstacle avoidance. We have proposed a system design of Jittering Mosaic Image Processing (JMIP) with stereo vision and optical flow to fulfill the functionalities of autonomous UAVs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
EISLER, G. RICHARD
This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled ''Robust Planning for Autonomous Navigation of Mobile Robots In Unstructured, Dynamic Environments (AutoNav)''. The project goal was to develop an algorithmic-driven, multi-spectral approach to point-to-point navigation characterized by: segmented on-board trajectory planning, self-contained operation without human support for mission duration, and the development of appropriate sensors and algorithms to navigate unattended. The project was partially successful in achieving gains in sensing, path planning, navigation, and guidance. One of three experimental platforms, the Minimalist Autonomous Testbed, used a repetitive sense-and-re-plan combination to demonstrate the majority of elements necessary for autonomous navigation. However, a critical goal for overall success in arbitrary terrain, that of developing a sensor that is able to distinguish true obstacles that need to be avoided as a function of vehicle scale, still needs substantial research to bring to fruition.
Autonomous assistance navigation for robotic wheelchairs in confined spaces.
Cheein, Fernando Auat; Carelli, Ricardo; De la Cruz, Celso; Muller, Sandra; Bastos Filho, Teodiano F
2010-01-01
In this work, a visual interface for assisting a robotic wheelchair's navigation is presented. The visual interface is developed for navigation in confined spaces such as narrow corridors or corridor ends. The interface offers two navigation modes: non-autonomous and autonomous. Non-autonomous driving of the robotic wheelchair is performed by means of a hand joystick, which directs the motion of the vehicle within the environment. Autonomous driving is performed when the user of the wheelchair has to turn (90, 90 or 180 degrees) within the environment. The turning strategy is implemented by a maneuverability algorithm compatible with the kinematics of the wheelchair and by a SLAM (Simultaneous Localization and Mapping) algorithm. The SLAM algorithm provides the interface with information concerning the layout of the environment and the pose (position and orientation) of the wheelchair within it. Experimental and statistical results of the interface are also shown in this work.
Fuzzy Behavior Modulation with Threshold Activation for Autonomous Vehicle Navigation
NASA Technical Reports Server (NTRS)
Tunstel, Edward
2000-01-01
This paper describes fuzzy logic techniques used in a hierarchical behavior-based architecture for robot navigation. An architectural feature for threshold activation of fuzzy-behaviors is emphasized, which is potentially useful for tuning navigation performance in real world applications. The target application is autonomous local navigation of a small planetary rover. Threshold activation of low-level navigation behaviors is the primary focus. A preliminary assessment of its impact on local navigation performance is provided based on computer simulations.
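A minimal sketch of threshold activation of behaviors follows. The membership shapes, behavior names, and the 0.3 activation threshold are illustrative assumptions, not the paper's actual rule base.

```python
def near(x, full=0.5, none=2.0):
    """Degree to which a range reading x (m) is 'near': 1 below `full`, 0 beyond `none`."""
    return max(0.0, min(1.0, (none - x) / (none - full)))

def fuse_behaviors(obstacle_range_m, heading_error_rad, activation_threshold=0.3):
    """Weighted fusion of behavior outputs; sub-threshold behaviors are switched off."""
    behaviors = {
        # name: (activation degree, proposed steering command in rad)
        "avoid_obstacle": (near(obstacle_range_m), 0.5),
        "seek_goal": (min(1.0, abs(heading_error_rad) / 0.5), -heading_error_rad),
    }
    # Threshold activation: only sufficiently applicable behaviors contribute.
    active = [(w, u) for w, u in behaviors.values() if w >= activation_threshold]
    if not active:
        return 0.0  # no behavior sufficiently applicable; hold course
    return sum(w * u for w, u in active) / sum(w for w, _ in active)
```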
Autonomous navigation system based on GPS and magnetometer data
NASA Technical Reports Server (NTRS)
Julie, Thienel K. (Inventor); Richard, Harman R. (Inventor); Bar-Itzhack, Itzhack Y. (Inventor)
2004-01-01
This invention is drawn to an autonomous navigation system using the Global Positioning System (GPS) and magnetometers for low Earth orbit satellites. Because a magnetometer is reliable and always provides information on spacecraft attitude, rate, and orbit, the magnetometer-GPS configuration solves the GPS initialization problem, decreasing the convergence time for the navigation estimate and improving the overall accuracy. Ultimately the magnetometer-GPS configuration enables the system to avoid a costly and inherently less reliable gyro for rate estimation. Being autonomous, this invention would provide black-box spacecraft navigation, producing attitude, orbit, and rate estimates without any ground input, with high accuracy and reliability.
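As an illustration of the magnetometer half of such a filter, the sketch below predicts the geomagnetic field from a centered-dipole model so a navigation filter can form an innovation against the measured field. The constants and sign conventions are textbook centered-dipole approximations and are assumptions here; an operational system would presumably rely on a full IGRF-type field model.

```python
import numpy as np

B0 = 3.12e-5      # approximate equatorial surface field strength, tesla (centered dipole)
RE = 6_378_137.0  # Earth equatorial radius, m

def dipole_field_local(r_m, mag_latitude_rad):
    """Predicted field in local (radial, northward, eastward) components for a centered
    dipole; a filter innovation is (measured B rotated into this frame) minus this prediction."""
    k = B0 * (RE / r_m) ** 3
    b_radial = -2.0 * k * np.sin(mag_latitude_rad)  # points inward in the northern magnetic hemisphere
    b_north = k * np.cos(mag_latitude_rad)
    return np.array([b_radial, b_north, 0.0])       # an ideal dipole has no east component
```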
Autonomous Navigation of USAF Spacecraft
1983-12-01
(Figure labels from the sensor assembly drawing omitted: reference base plate, electronics module assembly (4 places), Porro prism and base mirror, thermal radiator.) ... involved in active satellite-to-satellite tracking for 14 days following one day of ground tracking. Earth geopotential resonance terms are the largest ... rotates a prism at 9 rps such that optical signals are injected into each telescope parallel to the received starlight. The angle between the two lines ...
Li, Tianlong; Chang, Xiaocong; Wu, Zhiguang; Li, Jinxing; Shao, Guangbin; Deng, Xinghong; Qiu, Jianbin; Guo, Bin; Zhang, Guangyu; He, Qiang; Li, Longqiu; Wang, Joseph
2017-09-26
Self-propelled micro- and nanoscale robots represent a rapidly emerging and fascinating robotics research area. However, designing autonomous and adaptive control systems for operating micro/nanorobots in complex and dynamically changing environments, which is a highly demanding feature, is still an unmet challenge. Here we describe a smart microvehicle for precise autonomous navigation in complicated environments and traffic scenarios. The fully autonomous navigation system of the smart microvehicle is composed of a microscope-coupled CCD camera, an artificial intelligence planner, and a magnetic field generator. The microscope-coupled CCD camera provides real-time localization of the chemically powered Janus microsphere vehicle and environmental detection for path planning to generate optimal collision-free routes, while the moving direction of the microrobot toward a reference position is determined by the external electromagnetic torque. Real-time object detection offers adaptive path planning in response to dynamically changing environments. We demonstrate that the autonomous navigation system can guide the vehicle movement in complex patterns, in the presence of dynamically changing obstacles, and in complex biological environments. Such a navigation system for micro/nanoscale vehicles, relying on vision-based closed-loop control and path planning, is highly promising for autonomous operation in the complex dynamic settings and unpredictable scenarios expected in a variety of realistic applications at the nanoscale.
Integrated polarization-dependent sensor for autonomous navigation
NASA Astrophysics Data System (ADS)
Liu, Ze; Zhang, Ran; Wang, Zhiwen; Guan, Le; Li, Bin; Chu, Jinkui
2015-01-01
Based on the navigation strategy of insects that exploit polarized skylight, an integrated polarization-dependent sensor for autonomous navigation is presented. The navigation sensor features a compact structure, high precision, strong robustness, and a simple manufacturing technique. The sensor is built by integrating a complementary metal-oxide-semiconductor (CMOS) imaging sensor with a multiorientation nanowire grid polarizer. By nanoimprint lithography, the multiorientation nanowire polarizer is fabricated in one step and the alignment error is eliminated. Statistical theory is added to the interval-division algorithm to calculate the polarization angle of the incident light. Laboratory and outdoor tests of the navigation sensor were implemented, and the errors of the measured angle are ±0.02 deg and ±1.3 deg, respectively. The results show that the proposed sensor has potential for application in autonomous navigation.
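The paper's interval-division algorithm is not reproduced here, but the underlying relation between multi-orientation analyzer intensities and the polarization angle can be sketched with linear Stokes parameters; the four analyzer orientations (0, 45, 90, and 135 degrees) are an assumed layout, not necessarily the sensor's actual grid pattern.

```python
import numpy as np

def aop_dolp(I0, I45, I90, I135):
    """Angle and degree of linear polarization from four analyzer intensities."""
    S0 = I0 + I90                      # total intensity
    S1 = I0 - I90                      # linear 0/90 component
    S2 = I45 - I135                    # linear 45/135 component
    aop = 0.5 * np.arctan2(S2, S1)     # angle of polarization, rad
    dolp = np.hypot(S1, S2) / max(S0, 1e-12)  # degree of linear polarization
    return aop, dolp
```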
New vision system and navigation algorithm for an autonomous ground vehicle
NASA Astrophysics Data System (ADS)
Tann, Hokchhay; Shakya, Bicky; Merchen, Alex C.; Williams, Benjamin C.; Khanal, Abhishek; Zhao, Jiajia; Ahlgren, David J.
2013-12-01
Improvements were made to the intelligence algorithms of an autonomously operating ground vehicle, Q, which competed in the 2013 Intelligent Ground Vehicle Competition (IGVC). The IGVC required the vehicle first to navigate between two white lines on a grassy obstacle course, then to pass through eight GPS waypoints, and finally to traverse an obstacle field. Modifications to Q included a new vision system with a more effective image processing algorithm for white line extraction. The path-planning algorithm was adapted to the new vision system, producing smoother, more reliable navigation. With these improvements, Q successfully completed the basic autonomous navigation challenge, finishing tenth out of over 50 teams.
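A rough sketch of white-line extraction of the kind described, using standard OpenCV calls; the HSV thresholds and Hough parameters are placeholders to be tuned for a grass course, not values from Q's actual vision system.

```python
import cv2
import numpy as np

def extract_white_lines(bgr_frame):
    """Return a white-paint mask and detected line segments from a course image."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    # Low saturation, high value roughly corresponds to white paint (thresholds are guesses).
    mask = cv2.inRange(hsv, (0, 0, 180), (180, 60, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    edges = cv2.Canny(mask, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                            minLineLength=40, maxLineGap=10)
    return mask, (lines if lines is not None else [])
```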
NASA Astrophysics Data System (ADS)
Endo, Yoichiro; Balloch, Jonathan C.; Grushin, Alexander; Lee, Mun Wai; Handelman, David
2016-05-01
Control of current tactical unmanned ground vehicles (UGVs) is typically accomplished through two alternative modes of operation, namely, low-level manual control using joysticks and high-level planning-based autonomous control. Each mode has its own merits as well as inherent mission-critical disadvantages. Low-level joystick control is vulnerable to communication delay and degradation, and high-level navigation often depends on uninterrupted GPS signals and/or energy-emissive (non-stealth) range sensors such as LIDAR for localization and mapping. To address these problems, we have developed a mid-level control technique where the operator semi-autonomously drives the robot relative to visible landmarks that are commonly recognizable by both humans and machines such as closed contours and structured lines. Our novel solution relies solely on optical and non-optical passive sensors and can be operated under GPS-denied, communication-degraded environments. To control the robot using these landmarks, we developed an interactive graphical user interface (GUI) that allows the operator to select landmarks in the robot's view and direct the robot relative to one or more of the landmarks. The integrated UGV control system was evaluated based on its ability to robustly navigate through indoor environments. The system was successfully field tested with QinetiQ North America's TALON UGV and Tactical Robot Controller (TRC), a ruggedized operator control unit (OCU). We found that the proposed system is indeed robust against communication delay and degradation, and provides the operator with steady and reliable control of the UGV in realistic tactical scenarios.
NASA Astrophysics Data System (ADS)
Lu, Shan; Zhang, Hanmo
2016-01-01
To meet the requirement of autonomous orbit determination, this paper proposes a fast curve-fitting method based on Earth ultraviolet features to obtain an accurate Earth vector direction and thereby achieve high-precision autonomous navigation. First, combining the stable characteristics of Earth's ultraviolet radiance with atmospheric radiative transfer modeling software, the paper simulates the Earth ultraviolet radiation model at different times and selects a proper observation band. Then a fast, improved edge extraction method combining the Sobel operator and local binary patterns (LBP) is used, which both eliminates noise efficiently and extracts Earth ultraviolet limb features accurately. Earth centroid locations on simulated images are estimated via least-squares fitting using part of the limb edges. Taking advantage of the estimated Earth vector direction and Earth distance, an extended Kalman filter (EKF) is applied to realize autonomous navigation. Experimental results indicate the proposed method achieves sub-pixel Earth centroid location estimation and greatly enhances autonomous celestial navigation precision.
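A minimal sketch of the limb-fitting stage: given limb-edge pixels extracted by a Sobel/LBP-style detector, an algebraic (Kasa) least-squares circle fit yields the Earth centroid and apparent radius in the image. This is a generic fit under the assumption of a circular limb, not the paper's exact estimator.

```python
import numpy as np

def fit_limb_circle(edge_px):
    """Least-squares (Kasa) circle fit to limb edge pixels.
    edge_px: (N, 2) array of (u, v) image coordinates."""
    pts = np.asarray(edge_px, dtype=float)
    u, v = pts[:, 0], pts[:, 1]
    # Solve u^2 + v^2 + a1*u + a2*v + a3 = 0 in the least-squares sense.
    A = np.column_stack([u, v, np.ones_like(u)])
    b = -(u**2 + v**2)
    (a1, a2, a3), *_ = np.linalg.lstsq(A, b, rcond=None)
    center = np.array([-a1 / 2.0, -a2 / 2.0])      # Earth centroid in pixels
    radius = np.sqrt(center @ center - a3)         # apparent Earth radius in pixels
    return center, radius
```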
Recursive Gradient Estimation Using Splines for Navigation of Autonomous Vehicles.
1985-07-01
C. N. Shen, US Army Armament Research and Development Center, Large Caliber Weapon Systems Laboratory, July 1985 (report-form fields omitted). ... which require autonomous vehicles. Essential to these robotic vehicles is an adequate and efficient computer vision system. A potentially more ...
High Speed Lunar Navigation for Crewed and Remotely Piloted Vehicles
NASA Technical Reports Server (NTRS)
Pedersen, L.; Allan, M.; To, V.; Utz, H.; Wojcikiewicz, W.; Chautems, C.
2010-01-01
Increased navigation speed is desirable for lunar rovers, whether autonomous, crewed, or remotely operated, but is hampered by the low gravity, high-contrast lighting, and rough terrain. We describe a lidar-based navigation system deployed on NASA's K10 autonomous rover and used to increase the terrain hazard situational awareness of the Lunar Electric Rover crew.
Autonomous navigation and obstacle avoidance for unmanned surface vehicles
NASA Astrophysics Data System (ADS)
Larson, Jacoby; Bruch, Michael; Ebken, John
2006-05-01
The US Navy and other Department of Defense (DoD) and Department of Homeland Security (DHS) organizations are increasingly interested in the use of unmanned surface vehicles (USVs) for a variety of missions and applications. In order for USVs to fill these roles, they must be capable of a relatively high degree of autonomous navigation. Space and Naval Warfare Systems Center, San Diego is developing core technologies required for robust USV operation in a real-world environment, primarily focusing on autonomous navigation, obstacle avoidance, and path planning.
Mobile Robot Designed with Autonomous Navigation System
NASA Astrophysics Data System (ADS)
An, Feng; Chen, Qiang; Zha, Yanfang; Tao, Wenyin
2017-10-01
With the rapid development of robot technology, robots appear more and more in all aspects of life and social production, and people place ever more requirements on them; one is that a robot be capable of autonomous navigation and able to recognize the road. Take the common household sweeping robot as an example, which can avoid obstacles, clean the floor, and automatically find its charging station; another example is the AGV tracking car, which can follow a route and reach its destination successfully. This paper introduces a robot navigation scheme based on SLAM, which can build a map of a completely unknown environment while simultaneously localizing the robot, so as to achieve autonomous navigation.
NASA Astrophysics Data System (ADS)
Trigo, Guilherme F.; Maass, Bolko; Krüger, Hans; Theil, Stephan
2018-01-01
Accurate autonomous navigation capabilities are essential for future lunar robotic landing missions with a pin-point landing requirement, since, in the absence of a direct line of sight to ground control during critical approach and landing phases, or when facing long signal delays, this capability is needed to establish a guidance solution that reaches the landing site reliably. This paper focuses on the processing and evaluation of data collected from flight tests that consisted of scaled descent scenarios in which an unmanned helicopter of approximately 85 kg approached a landing site from altitudes of 50 m down to 1 m over a downrange distance of 200 m. Printed crater targets were distributed along the ground track and their detection provided Earth-fixed measurements. The Crater Navigation (CNav) algorithm used to detect and match the crater targets is an unmodified method intended for real lunar imagery. We analyze the absolute position and attitude solutions of CNav obtained and recorded during these flight tests, and investigate the attainable quality of vehicle pose estimation using both CNav and measurements from a tactical-grade inertial measurement unit. The navigation filter proposed for this purpose corrects and calibrates the high-rate inertial propagation with the less frequent crater navigation fixes through a closed-loop, loosely coupled hybrid setup. Finally, the attainable accuracy of the fused solution is evaluated by comparison with the onboard ground-truth solution of a dual-antenna high-grade GNSS receiver. It is shown that CNav is an enabler for building autonomous navigation systems of high quality, suitable for exploration mission scenarios.
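The loosely coupled fusion described above can be sketched as a standard EKF position-fix update applied to the inertially propagated state; the state layout and noise matrices below are illustrative assumptions, not the flight filter's actual design.

```python
import numpy as np

def position_fix_update(x, P, z_pos, R_pos):
    """EKF measurement update with an Earth-fixed position fix (e.g. a crater-navigation fix).
    x: state vector with position in the first three components; P: covariance."""
    n = x.size
    H = np.zeros((3, n))
    H[:, :3] = np.eye(3)                          # position is observed directly
    y = z_pos - H @ x                             # innovation
    S = H @ P @ H.T + R_pos                       # innovation covariance
    K = P @ H.T @ np.linalg.solve(S, np.eye(3))   # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(n) - K @ H) @ P
    return x_new, P_new
```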
A Feedforward Control Approach to the Local Navigation Problem for Autonomous Vehicles
1994-05-02
AD-A282 787 " A Feedforward Control Approach to the Local Navigation Problem for Autonomous Vehicles Alonzo Kelly CMU-RI-TR-94-17 The Robotics...follow, or a direction to prefer, it cannot generate its own strategic goals. Therefore, it solves the local planning problem for autonomous vehicles . The... autonomous vehicles . It is intelligent because it uses range images that are generated from either a laser rangefinder or a stereo triangulation
Open-Loop Performance of COBALT Precision Landing Payload on a Commercial Sub-Orbital Rocket
NASA Technical Reports Server (NTRS)
Restrepo, Carolina I.; Carson, John M., III; Amzajerdian, Farzin; Seubert, Carl R.; Lovelace, Ronney S.; McCarthy, Megan M.; Tse, Teming; Stelling, Richard; Collins, Steven M.
2018-01-01
An open-loop flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) platform was conducted onboard the Masten Xodiac suborbital rocket testbed. The COBALT platform integrates NASA Guidance, Navigation and Control (GN&C) sensing technologies for autonomous, precise soft landing, including the Navigation Doppler Lidar (NDL) velocity and range sensor and the Lander Vision System (LVS) Terrain Relative Navigation (TRN) system. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a navigation solution that is independent of GPS and suitable for future autonomous planetary landing systems. COBALT was a passive payload during the open-loop tests: COBALT's sensors were actively taking data and processing it in real time, but the Xodiac rocket flew with its own GPS navigation system as a risk-reduction activity in the maturation of the technologies towards space flight. A future closed-loop test campaign is planned in which the COBALT navigation solution will be used to fly its host vehicle.
NASA Astrophysics Data System (ADS)
Qin, M.; Wan, X.; Shao, Y. Y.; Li, S. Y.
2018-04-01
Vision-based navigation has become an attractive solution for autonomous navigation in planetary exploration. This paper presents our work in designing and building an autonomous, vision-based, GPS-denied unmanned vehicle and developing an ARFM (Adaptive Robust Feature Matching) based VO (Visual Odometry) software package for its autonomous navigation. The hardware system is mainly composed of a binocular stereo camera, a pan-and-tilt unit, a master machine, and a tracked chassis. The ARFM-based VO software system contains four modules: camera calibration, ARFM-based 3D reconstruction, position and attitude calculation, and BA (Bundle Adjustment). Two VO experiments were carried out using both outdoor images from an open dataset and indoor images captured by our vehicle; the results demonstrate that our vision-based unmanned vehicle is able to achieve autonomous localization and has potential for future planetary exploration.
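A condensed visual-odometry step of the kind such software implements might look like the following, using ORB features and the essential-matrix pose recovery available in OpenCV. This simple sketch stands in for, and is much less robust than, the paper's ARFM matching and bundle-adjustment modules.

```python
import cv2
import numpy as np

def relative_pose(prev_gray, curr_gray, K):
    """Frame-to-frame rotation and (scale-free) translation for a VO pipeline.
    K: 3x3 camera intrinsic matrix from the calibration module."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(prev_gray, None)
    k2, d2 = orb.detectAndCompute(curr_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    p1 = np.float32([k1[m.queryIdx].pt for m in matches])
    p2 = np.float32([k2[m.trainIdx].pt for m in matches])
    # RANSAC on the essential matrix plays the role of robust matching here.
    E, inliers = cv2.findEssentialMat(p1, p2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, p1, p2, K, mask=inliers)
    return R, t
```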
Target Trailing With Safe Navigation With Colregs for Maritime Autonomous Surface Vehicles
NASA Technical Reports Server (NTRS)
Kuwata, Yoshiaki (Inventor); Aghazarian, Hrand (Inventor); Huntsberger, Terrance L. (Inventor); Howard, Andrew B. (Inventor); Wolf, Michael T. (Inventor); Zarzhitsky, Dimitri V. (Inventor)
2014-01-01
Systems and methods for operating autonomous waterborne vessels in a safe manner. The systems include hardware for identifying the locations and motions of other vessels, as well as the locations of stationary objects that represent navigation hazards. By applying to the acquired data a computational method that uses a maritime navigation algorithm based on Velocity Obstacles for avoiding hazards and obeying COLREGS, the autonomous vessel computes a safe and effective path to be followed in order to accomplish a desired navigational end result, while operating in a manner that avoids hazards and maintains compliance with standard navigational procedures defined by international agreement. The systems and methods have been successfully demonstrated on water with radar and stereo cameras as the perception sensors, and integrated with a higher-level planner for trailing a maneuvering target.
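The core Velocity Obstacle test can be sketched as a closest-point-of-approach check on each candidate own-ship velocity. The safety radius and look-ahead horizon below are illustrative parameters; the patented system additionally encodes COLREGS preferences (e.g. favoring starboard passings) when choosing among the safe candidates.

```python
import numpy as np

def violates_velocity_obstacle(p_own, v_candidate, p_obstacle, v_obstacle,
                               safety_radius_m, horizon_s=300.0):
    """True if a candidate own-ship velocity leads to a predicted conflict within the horizon."""
    p_rel = p_obstacle - p_own
    v_rel = v_candidate - v_obstacle
    denom = float(v_rel @ v_rel)
    # Time of closest approach of the relative motion (clamped to the future).
    t_cpa = 0.0 if denom < 1e-9 else max(0.0, -float(p_rel @ v_rel) / denom)
    if t_cpa > horizon_s:
        return False
    d_cpa = np.linalg.norm(p_rel + v_rel * t_cpa)
    return d_cpa < safety_radius_m
```

Candidate velocities failing this test are pruned; the planner then selects among the remaining velocities according to goal progress and rules-of-the-road preferences.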
Autonomous Navigation of Small Uavs Based on Vehicle Dynamic Model
NASA Astrophysics Data System (ADS)
Khaghani, M.; Skaloud, J.
2016-03-01
This paper presents a novel approach to autonomous navigation for small UAVs, in which the vehicle dynamic model (VDM) serves as the main process model within the navigation filter. The proposed method significantly increases the accuracy and reliability of autonomous navigation, especially for small UAVs with low-cost IMUs on board. This is achieved with no extra sensor added to the conventional INS/GNSS setup. This improvement is of special interest in the case of GNSS outages, where inertial coasting drifts very quickly. In the proposed architecture, the solution to the VDM equations provides the estimate of position, velocity, and attitude, which is updated within the navigation filter based on available observations, such as IMU data or GNSS measurements. The VDM is also fed with the control input to the UAV, which is available within the control/autopilot system. The filter is capable of estimating wind velocity and dynamic model parameters, in addition to navigation states and IMU sensor errors. Monte Carlo simulations reveal major improvements in navigation accuracy compared to a conventional INS/GNSS navigation system during the autonomous phase, when satellite signals are not available, for example due to physical obstruction or electromagnetic interference. For GNSS outages of a few minutes, position and attitude accuracy improves by orders of magnitude compared to inertial coasting. This means that during such a scenario, the position-velocity-attitude (PVA) determination is sufficiently accurate to navigate the UAV to a home position without any signal that depends on the vehicle's environment.
Reactive navigation for autonomous guided vehicle using neuro-fuzzy techniques
NASA Astrophysics Data System (ADS)
Cao, Jin; Liao, Xiaoqun; Hall, Ernest L.
1999-08-01
A neuro-fuzzy control method for navigation of an Autonomous Guided Vehicle robot is described. Robot navigation is defined as guiding a mobile robot to a desired destination or along a desired path in an environment characterized by terrain and a set of distinct objects, such as obstacles and landmarks. The autonomous navigation ability and road-following precision are mainly influenced by the control strategy and real-time control performance. Neural network and fuzzy logic control techniques can improve real-time control performance for mobile robots owing to their high robustness and error tolerance. For a mobile robot to navigate automatically and rapidly, an important factor is the ability to identify and classify the robot's current perceptual environment. In this paper, a new approach to perceptual environment feature identification and classification, based on the analysis of a classifying neural network and a neuro-fuzzy algorithm, is presented. The significance of this work lies in the development of a new method for mobile robot navigation.
Relative Navigation for Formation Flying of Spacecraft
NASA Technical Reports Server (NTRS)
Alonso, Roberto; Du, Ju-Young; Hughes, Declan; Junkins, John L.; Crassidis, John L.
2001-01-01
This paper presents a robust and efficient approach for relative navigation and attitude estimation of spacecraft flying in formation. This approach uses measurements from a new optical sensor that provides a line of sight vector from the master spacecraft to the secondary satellite. The overall system provides a novel, reliable, and autonomous relative navigation and attitude determination system, employing relatively simple electronic circuits with modest digital signal processing requirements and is fully independent of any external systems. Experimental calibration results are presented, which are used to achieve accurate line of sight measurements. State estimation for formation flying is achieved through an optimal observer design. Also, because the rotational and translational motions are coupled through the observation vectors, three approaches are suggested to separate both signals just for stability analysis. Simulation and experimental results indicate that the combined sensor/estimator approach provides accurate relative position and attitude estimates.
Keshavan, J; Gremillion, G; Escobar-Alvarez, H; Humbert, J S
2014-06-01
Safe, autonomous navigation by aerial microsystems in less-structured environments is a difficult challenge to overcome with current technology. This paper presents a novel visual-navigation approach that combines bioinspired wide-field processing of optic flow information with control-theoretic tools for synthesis of closed loop systems, resulting in robustness and performance guarantees. Structured singular value analysis is used to synthesize a dynamic controller that provides good tracking performance in uncertain environments without resorting to explicit pose estimation or extraction of a detailed environmental depth map. Experimental results with a quadrotor demonstrate the vehicle's robust obstacle-avoidance behaviour in a straight line corridor, an S-shaped corridor and a corridor with obstacles distributed in the vehicle's path. The computational efficiency and simplicity of the current approach offers a promising alternative to satisfying the payload, power and bandwidth constraints imposed by aerial microsystems.
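For illustration, a much simpler wide-field flow-balance heuristic than the paper's structured-singular-value controller is sketched below: left/right optic-flow imbalance, computed with pyramidal Lucas-Kanade tracking, is turned into a yaw command that steers away from nearer surfaces. Function names and the gain are assumptions for demonstration.

```python
import cv2
import numpy as np

def balance_steering(prev_gray, curr_gray, gain=1.0):
    """Yaw command from left/right optic-flow imbalance (corridor centering heuristic)."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300, qualityLevel=0.01, minDistance=7)
    if pts is None:
        return 0.0
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    ok = status.ravel().astype(bool)
    p0, p1 = pts.reshape(-1, 2)[ok], nxt.reshape(-1, 2)[ok]
    flow_mag = np.linalg.norm(p1 - p0, axis=1)
    mid = prev_gray.shape[1] / 2.0
    left = flow_mag[p0[:, 0] < mid].mean() if np.any(p0[:, 0] < mid) else 0.0
    right = flow_mag[p0[:, 0] >= mid].mean() if np.any(p0[:, 0] >= mid) else 0.0
    # Larger flow on one side indicates nearer surfaces there; turn away from it.
    return gain * (left - right) / (left + right + 1e-9)
```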
Laser Range and Bearing Finder for Autonomous Missions
NASA Technical Reports Server (NTRS)
Granade, Stephen R.
2004-01-01
NASA has recently reconfirmed its interest in autonomous systems as an enabling technology for future missions. In order for autonomous missions to be possible, highly capable relative sensor systems are needed to determine an object's distance, direction, and orientation. This is true whether the mission is autonomous in-space assembly, rendezvous and docking, or rover surface navigation. Advanced Optical Systems, Inc. has developed a wide-angle laser range and bearing finder (RBF) for autonomous space missions. The laser RBF has a number of features that make it well-suited for autonomous missions. It has an operating range of 10 m to 5 km, with a 5 deg field of view. Its wide field of view removes the need for scanning systems such as gimbals, eliminating moving parts and making the sensor simpler and space qualification easier. Its range accuracy is 1% or better. It is designed to operate either as a stand-alone sensor or in tandem with a sensor that returns range, bearing, and orientation at close ranges, such as NASA's Advanced Video Guidance Sensor. We have assembled the initial prototype and are currently testing it. We will discuss the laser RBF's design and specifications. Keywords: laser range and bearing finder, autonomous rendezvous and docking, space sensors, on-orbit sensors, advanced video guidance sensor
Relative Navigation of Formation Flying Satellites
NASA Technical Reports Server (NTRS)
Long, Anne; Kelbel, David; Lee, Taesul; Leung, Dominic; Carpenter, Russell; Gramling, Cheryl; Bauer, Frank (Technical Monitor)
2002-01-01
The Guidance, Navigation, and Control Center (GNCC) at Goddard Space Flight Center (GSFC) has successfully developed high-accuracy autonomous satellite navigation systems using the National Aeronautics and Space Administration's (NASA's) space and ground communications systems and the Global Positioning System (GPS). In addition, an autonomous navigation system that uses celestial object sensor measurements is currently under development and has been successfully tested using real Sun and Earth horizon measurements.The GNCC has developed advanced spacecraft systems that provide autonomous navigation and control of formation flyers in near-Earth, high-Earth, and libration point orbits. To support this effort, the GNCC is assessing the relative navigation accuracy achievable for proposed formations using GPS, intersatellite crosslink, ground-to-satellite Doppler, and celestial object sensor measurements. This paper evaluates the performance of these relative navigation approaches for three proposed missions with two or more vehicles maintaining relatively tight formations. High-fidelity simulations were performed to quantify the absolute and relative navigation accuracy as a function of navigation algorithm and measurement type. Realistically-simulated measurements were processed using the extended Kalman filter implemented in the GPS Enhanced Inboard Navigation System (GEONS) flight software developed by GSFC GNCC. Solutions obtained by simultaneously estimating all satellites in the formation were compared with the results obtained using a simpler approach based on differencing independently estimated state vectors.
Autonomous satellite navigation using starlight refraction angle measurements
NASA Astrophysics Data System (ADS)
Ning, Xiaolin; Wang, Longhua; Bai, Xinbei; Fang, Jiancheng
2013-05-01
An on-board autonomous navigation capability is required to reduce the operation costs and enhance the navigation performance of future satellites. Autonomous navigation by stellar refraction is a type of autonomous celestial navigation method that uses high-accuracy star sensors instead of Earth sensors to provide information regarding Earth's horizon. In previous studies, the refraction apparent height has typically been used for such navigation. However, the apparent height cannot be measured directly by a star sensor and can only be calculated by the refraction angle and an atmospheric refraction model. Therefore, additional errors are introduced by the uncertainty and nonlinearity of atmospheric refraction models, which result in reduced navigation accuracy and reliability. A new navigation method based on the direct measurement of the refraction angle is proposed to solve this problem. Techniques for the determination of the refraction angle are introduced, and a measurement model for the refraction angle is established. The method is tested and validated by simulations. When the starlight refraction height ranges from 20 to 50 km, a positioning accuracy of better than 100 m can be achieved for a low-Earth-orbit (LEO) satellite using the refraction angle, while the positioning accuracy of the traditional method using the apparent height is worse than 500 m under the same conditions. Furthermore, an analysis of the factors that affect navigation accuracy, including the measurement accuracy of the refraction angle, the number of visible refracted stars per orbit and the installation azimuth of star sensor, is presented. This method is highly recommended for small satellites in particular, as no additional hardware besides two star sensors is required.
Angles-only navigation for autonomous orbital rendezvous
NASA Astrophysics Data System (ADS)
Woffinden, David C.
The proposed thesis of this dissertation has both a practical element and theoretical component which aim to answer key questions related to the use of angles-only navigation for autonomous orbital rendezvous. The first and fundamental principle to this work argues that an angles-only navigation filter can determine the relative position and orientation (pose) between two spacecraft to perform the necessary maneuvers and close proximity operations for autonomous orbital rendezvous. Second, the implementation of angles-only navigation for on-orbit applications is looked upon with skeptical eyes because of its perceived limitation of determining the relative range between two vehicles. This assumed, yet little understood subtlety can be formally characterized with a closed-form analytical observability criteria which specifies the necessary and sufficient conditions for determining the relative position and velocity with only angular measurements. With a mathematical expression of the observability criteria, it can be used to (1) identify the orbital rendezvous trajectories and maneuvers that ensure the relative position and velocity are observable for angles-only navigation, (2) quantify the degree or level of observability and (3) compute optimal maneuvers that maximize observability. In summary, the objective of this dissertation is to provide both a practical and theoretical foundation for the advancement of autonomous orbital rendezvous through the use of angles-only navigation.
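The observability question can be probed numerically: stack bearing-measurement Jacobians propagated through Clohessy-Wiltshire dynamics and inspect the rank (or singular values) of the resulting matrix. The sketch below is an illustrative numerical check, not the dissertation's closed-form criterion; the mean motion, sample interval, and step count are assumed values. For a drifting relative trajectory this matrix is typically rank-deficient (the range direction is unobservable), and including a known maneuver restores full rank, consistent with the thesis above.

```python
import numpy as np
from scipy.linalg import expm

def cw_A(n):
    """Clohessy-Wiltshire dynamics matrix for state [x, y, z, vx, vy, vz]."""
    A = np.zeros((6, 6))
    A[:3, 3:] = np.eye(3)
    A[3, 0], A[3, 4] = 3 * n**2, 2 * n
    A[4, 3] = -2 * n
    A[5, 2] = -n**2
    return A

def bearing_H(x):
    """Jacobian of azimuth/elevation line-of-sight angles w.r.t. the relative state."""
    px, py, pz = x[:3]
    rho2 = px**2 + py**2
    rho = np.sqrt(rho2)
    r2 = rho2 + pz**2
    H = np.zeros((2, 6))
    H[0, 0], H[0, 1] = -py / rho2, px / rho2                                  # azimuth
    H[1, 0], H[1, 1], H[1, 2] = -px * pz / (r2 * rho), -py * pz / (r2 * rho), rho / r2  # elevation
    return H

def observability_rank(x0, n=0.0011, dt=10.0, steps=360):
    """Numerical rank of the stacked observability matrix along a coasting trajectory."""
    Phi = expm(cw_A(n) * dt)     # discrete state-transition matrix over dt
    rows, x, Phi_k = [], np.asarray(x0, dtype=float), np.eye(6)
    for _ in range(steps):
        rows.append(bearing_H(x) @ Phi_k)
        x = Phi @ x
        Phi_k = Phi @ Phi_k
    O = np.vstack(rows)
    s = np.linalg.svd(O, compute_uv=False)
    return int(np.sum(s > 1e-8 * s[0])), s
```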
Autonomous vision-based navigation for proximity operations around binary asteroids
NASA Astrophysics Data System (ADS)
Gil-Fernandez, Jesus; Ortega-Hernando, Guillermo
2018-02-01
Future missions to small bodies demand higher level of autonomy in the Guidance, Navigation and Control system for higher scientific return and lower operational costs. Different navigation strategies have been assessed for ESA's asteroid impact mission (AIM). The main objective of AIM is the detailed characterization of binary asteroid Didymos. The trajectories for the proximity operations shall be intrinsically safe, i.e., no collision in presence of failures (e.g., spacecraft entering safe mode), perturbations (e.g., non-spherical gravity field), and errors (e.g., maneuver execution error). Hyperbolic arcs with sufficient hyperbolic excess velocity are designed to fulfil the safety, scientific, and operational requirements. The trajectory relative to the asteroid is determined using visual camera images. The ground-based trajectory prediction error at some points is comparable to the camera Field Of View (FOV). Therefore, some images do not contain the entire asteroid. Autonomous navigation can update the state of the spacecraft relative to the asteroid at higher frequency. The objective of the autonomous navigation is to improve the on-board knowledge compared to the ground prediction. The algorithms shall fit in off-the-shelf, space-qualified avionics. This note presents suitable image processing and relative-state filter algorithms for autonomous navigation in proximity operations around binary asteroids.
Using neuromorphic optical sensors for spacecraft absolute and relative navigation
NASA Astrophysics Data System (ADS)
Shake, Christopher M.
We develop a novel attitude determination system (ADS) for use on nano spacecraft using neuromorphic optical sensors. The ADS is intended to support nano-satellite operations by providing low-cost, low-mass, low-volume, low-power, and redundant attitude determination capabilities with quick and straightforward onboard programmability for real-time spacecraft operations. The ADS is experimentally validated with commercial-off-the-shelf optical devices that perform sensing and image processing on the same circuit board and are biologically inspired by insects' vision systems, which measure optical flow while navigating in the environment. The firmware on the devices is modified to perform the additional biologically inspired task of tracking objects and to communicate with a PC/104 form-factor embedded computer running Real Time Application Interface Linux on a spacecraft simulator. Algorithms are developed for operation using optical flow, point tracking, and hybrid modes with the sensors, and the performance of the system in all three modes is assessed using a spacecraft simulator in the Advanced Autonomous Multiple Spacecraft (ADAMUS) laboratory at Rensselaer. An existing relative state determination method is identified to be combined with the novel ADS to create a self-contained navigation system for nano spacecraft. In simulation, this method did not reproduce its authors' reported results using only the conditions and equations already published. An improved target inertia tensor method is proposed as an update to the existing relative state method; it also did not perform as expected, but it is presented for others to build upon.
Autonomous satellite navigation with the Global Positioning System
NASA Technical Reports Server (NTRS)
Fuchs, A. J.; Wooden, W. H., II; Long, A. C.
1977-01-01
This paper discusses the potential of using the Global Positioning System (GPS) to provide autonomous navigation capability to NASA satellites in the 1980 era. Some of the driving forces motivating autonomous navigation are presented. These include such factors as advances in attitude control systems, onboard science annotation, and onboard gridding of imaging data. Simulation results which demonstrate baseline orbit determination accuracies using GPS data on Seasat, Landsat-D, and the Solar Maximum Mission are presented. Emphasis is placed on identifying error sources such as GPS time, GPS ephemeris, user timing biases, and user orbit dynamics, and in a parametric sense on evaluating their contribution to the orbit determination accuracies.
Autonomous GPS/INS navigation experiment for Space Transfer Vehicle
NASA Technical Reports Server (NTRS)
Upadhyay, Triveni N.; Cotterill, Stephen; Deaton, A. W.
1993-01-01
An experiment to validate the concept of developing an autonomous integrated spacecraft navigation system using on board Global Positioning System (GPS) and Inertial Navigation System (INS) measurements is described. The feasibility of integrating GPS measurements with INS measurements to provide a total improvement in spacecraft navigation performance, i.e. improvement in position, velocity and attitude information, was previously demonstrated. An important aspect of this research is the automatic real time reconfiguration capability of the system designed to respond to changes in a spacecraft mission under the control of an expert system.
Autonomous GPS/INS navigation experiment for Space Transfer Vehicle (STV)
NASA Technical Reports Server (NTRS)
Upadhyay, Triveni N.; Cotterill, Stephen; Deaton, A. Wayne
1991-01-01
An experiment to validate the concept of developing an autonomous integrated spacecraft navigation system using on board Global Positioning System (GPS) and Inertial Navigation System (INS) measurements is described. The feasibility of integrating GPS measurements with INS measurements to provide a total improvement in spacecraft navigation performance, i.e. improvement in position, velocity and attitude information, was previously demonstrated. An important aspect of this research is the automatic real time reconfiguration capability of the system designed to respond to changes in a spacecraft mission under the control of an expert system.
LABRADOR: a learning autonomous behavior-based robot for adaptive detection and object retrieval
NASA Astrophysics Data System (ADS)
Yamauchi, Brian; Moseley, Mark; Brookshire, Jonathan
2013-01-01
As part of the TARDEC-funded CANINE (Cooperative Autonomous Navigation in a Networked Environment) Program, iRobot developed LABRADOR (Learning Autonomous Behavior-based Robot for Adaptive Detection and Object Retrieval). LABRADOR was based on the rugged, man-portable, iRobot PackBot unmanned ground vehicle (UGV) equipped with an explosive ordnance disposal (EOD) manipulator arm and a custom gripper. For LABRADOR, we developed a vision-based object learning and recognition system that combined a TLD (track-learn-detect) filter based on object shape features with a color-histogram-based object detector. Our vision system was able to learn in real time to recognize objects presented to the robot. We also implemented a waypoint navigation system based on fused GPS, IMU (inertial measurement unit), and odometry data. We used this navigation capability to implement autonomous behaviors capable of searching a specified area using a variety of robust coverage strategies, including outward spiral, random bounce, random waypoint, and perimeter-following behaviors. While the full system was not integrated in time to compete in the CANINE competition event, we developed useful perception, navigation, and behavior capabilities that may be applied to future autonomous robot systems.
He, Bo; Zhang, Hongjin; Li, Chao; Zhang, Shujing; Liang, Yan; Yan, Tianhong
2011-01-01
This paper addresses an autonomous navigation method for the autonomous underwater vehicle (AUV) C-Ranger applying information-filter-based simultaneous localization and mapping (SLAM), and its sea trial experiments in Tuandao Bay (Shandong Province, P.R. China). Weak links in the information matrix in an extended information filter (EIF) can be pruned to achieve an efficient approach, the sparse EIF algorithm (SEIF-SLAM). All the basic update formulae can be implemented in constant time irrespective of the size of the map; hence the computational complexity is significantly reduced. A mechanical scanning imaging sonar is chosen as the active sensing device for the underwater vehicle, and a compensation method based on feedback of the AUV pose is presented to overcome distortion of the acoustic images due to vehicle motion. In order to verify the feasibility of the navigation methods proposed for the C-Ranger, a sea trial was conducted in Tuandao Bay. Experimental results and analysis show that the proposed navigation approach based on SEIF-SLAM improves navigation accuracy compared with the conventional method; moreover, the algorithm has a lower computational cost than EKF-SLAM. PMID:22346682
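The information-form update that makes SEIF-SLAM attractive can be sketched as follows: each measurement contributes additively to the information matrix, and it is that matrix whose weak links the sparsification step then prunes. This is a generic extended-information-filter update, not the authors' exact C-Ranger implementation.

```python
import numpy as np

def eif_measurement_update(eta, Lam, z, h_x, H, R):
    """Extended information filter measurement update.
    eta: information vector (Lam @ x_hat); Lam: information matrix;
    z: measurement; h_x: predicted measurement h(x_hat); H: measurement Jacobian; R: noise covariance."""
    Rinv = np.linalg.inv(R)
    x_hat = np.linalg.solve(Lam, eta)       # current mean, needed for the EKF-style linearization
    Lam_new = Lam + H.T @ Rinv @ H          # additive, sparsity-friendly information update
    eta_new = eta + H.T @ Rinv @ (z - h_x + H @ x_hat)
    return eta_new, Lam_new
```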
Autonomous satellite navigation by stellar refraction
NASA Technical Reports Server (NTRS)
Gounley, R.; White, R.; Gai, E.
1983-01-01
This paper describes an error analysis of an autonomous navigator using refraction measurements of starlight passing through the upper atmosphere. The analysis is based on a discrete linear Kalman filter. The filter generated steady-state values of navigator performance for a variety of test cases. Results of these simulations show that in low-earth orbit position-error standard deviations of less than 0.100 km may be obtained using only 40 star sightings per orbit.
Learning for autonomous navigation
NASA Technical Reports Server (NTRS)
Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric
2005-01-01
Autonomous off-road navigation of robotic ground vehicles has important applications on Earth and in space exploration. Progress in this domain has been retarded by the limited lookahead range of 3-D sensors and by the difficulty of preprogramming systems to understand the traversability of the wide variety of terrain they can encounter.
Synopsis of Precision Landing and Hazard Avoidance (PL&HA) Capabilities for Space Exploration
NASA Technical Reports Server (NTRS)
Robertson, Edward A.
2017-01-01
Until recently, robotic exploration missions to the Moon, Mars, and other solar system bodies relied upon controlled blind landings. Because terrestrial techniques for terrain relative navigation (TRN) had not yet evolved to support space exploration, landing dispersions were driven by the capabilities of inertial navigation systems combined with surface relative altimetry and velocimetry. Lacking tight control over the actual landing location, mission success depended on the statistical vetting of candidate landing areas within the predicted landing dispersion ellipse based on orbital reconnaissance data, combined with the ability of the spacecraft to execute a controlled landing in terms of touchdown attitude, attitude rates, and velocity. In addition, the sensors, algorithms, and processing technologies required to perform autonomous hazard detection and avoidance in real time during the landing sequence were not yet available. Over the past decade, NASA has invested substantial resources in the development, integration, and testing of autonomous precision landing and hazard avoidance (PL&HA) capabilities. In addition to substantially improving landing accuracy and safety, these autonomous PL&HA functions also offer access to targets of interest located within more rugged and hazardous terrain. Optical TRN systems are baselined on upcoming robotic landing missions to the Moon and Mars, and NASA JPL is investigating the development of a comprehensive PL&HA system for a Europa lander. These robotic missions will demonstrate and mature PL&HA technologies that are considered essential for future human exploration missions. PL&HA technologies also have applications to rendezvous and docking/berthing with other spacecraft, as well as proximity navigation, contact, and retrieval missions to smaller bodies with microgravity environments, such as asteroids.
A low-cost test-bed for real-time landmark tracking
NASA Astrophysics Data System (ADS)
Csaszar, Ambrus; Hanan, Jay C.; Moreels, Pierre; Assad, Christopher
2007-04-01
A low-cost vehicle test-bed system was developed to iteratively test, refine, and demonstrate navigation algorithms before attempting to transfer the algorithms to more advanced rover prototypes. The platform used here was a modified radio-controlled (RC) car. A microcontroller board and onboard laptop computer allow for either autonomous or remote operation via a computer workstation. The sensors onboard the vehicle represent the types currently used on NASA-JPL rover prototypes. For dead-reckoning navigation, optical wheel encoders, a single-axis gyroscope, and a 2-axis accelerometer were used. An ultrasound ranger is available to calculate distance as a substitute for the stereo vision systems presently used on rovers. The prototype also carries a small laptop computer with a USB camera and wireless transmitter to send real-time video to an off-board computer. A real-time user interface was implemented that combines an automatic image feature selector, tracking parameter controls, a streaming video viewer, and user-generated or autonomous driving commands. Using the test-bed, real-time landmark tracking was demonstrated by autonomously driving the vehicle through the JPL Mars yard. The algorithms tracked rocks as waypoints, generating coordinates for calculating relative motion and visually servoing to science targets. A limitation of the current system is serial computing (each additional landmark is tracked in order), but since each landmark is tracked independently, adding targets would not significantly diminish system speed if the processing were transferred to appropriate parallel hardware.
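The dead-reckoning layer of such a test-bed reduces to a short integration step. The sketch below assumes differential wheel encoders and heading from the single-axis gyro, which matches the sensor suite described but is not necessarily the exact filter used on the vehicle.

```python
import math

def dead_reckon_step(x, y, heading_rad, d_left_m, d_right_m, gyro_yaw_rate_rad_s, dt_s):
    """One dead-reckoning step: heading from the gyro, travel distance from the encoders."""
    d = 0.5 * (d_left_m + d_right_m)            # average wheel travel this step
    heading_rad += gyro_yaw_rate_rad_s * dt_s   # integrate yaw rate
    x += d * math.cos(heading_rad)
    y += d * math.sin(heading_rad)
    return x, y, heading_rad
```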
NASA Technical Reports Server (NTRS)
Cotariu, Steven S.
1991-01-01
Pattern recognition may supplement or replace certain navigational aids on spacecraft in docking or landing activities. The need to correctly identify terrain features remains critical in preparation for autonomous planetary landing. One technique that may solve this problem is optical correlation. Correlation has been successfully demonstrated under ideal conditions; however, noise significantly affects the ability of the correlator to accurately identify input signals. Optical correlation in the presence of noise must be successfully demonstrated before this technology can be incorporated into system design. An optical correlator is designed and constructed using a modified 2f configuration. Liquid crystal televisions (LCTVs) are used as the spatial light modulators (SLMs) for both the input and filter devices. The filter LCTV is characterized and an operating curve is developed. Determination of this operating curve is critical for reduction of input noise. Correlation of live input with a programmable filter is demonstrated.
Multi-Spacecraft Autonomous Positioning System
NASA Technical Reports Server (NTRS)
Anzalone, Evan
2015-01-01
As the number of spacecraft in simultaneous operation continues to grow, there is an increased dependency on ground-based navigation support. The current baseline system for deep space navigation utilizes Earth-based radiometric tracking, requiring long-duration observations to perform orbit determination and generate a state update. The age, complexity, and high utilization of the ground assets pose a risk to spacecraft navigation performance. In order to perform complex operations at large distances from Earth, such as extraterrestrial landing and proximity operations, autonomous systems are required. With increasingly complex mission operations, the need for frequent and Earth-independent navigation capabilities is further reinforced. The Multi-spacecraft Autonomous Positioning System (MAPS) takes advantage of the growing interspacecraft communication network and infrastructure to allow for Earth-autonomous state measurements to enable network-based space navigation. A notional concept of operations is given in figure 1. This network is already being implemented and routinely used in Martian communications through the use of the Mars Reconnaissance Orbiter and Mars Odyssey spacecraft as relays for surface assets. The growth of this communications architecture is continued through MAVEN and potential future commercial Mars telecom orbiters. This growing network provides an initial Mars-local capability for inter-spacecraft communication and navigation. These navigation updates are enabled by cross-communication between assets in the network, coupled with onboard navigation estimation routines that integrate packet travel time to generate ranging measurements. Inter-spacecraft communication allows for frequent state broadcasts and time updates from trusted references. The architecture is a software-based solution, enabling its implementation on a wide variety of current assets, with the operational constraints and measurement accuracy determined by onboard systems.
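The ranging idea behind MAPS, converting packet travel time into a pseudorange, can be written in a single expression. The clock-bias term is shown explicitly because, as with GNSS, the relative clock offset must be estimated by the onboard filter alongside position; the function and parameter names are illustrative, not from the MAPS software.

```python
C = 299_792_458.0  # speed of light, m/s

def pseudorange_from_packet(t_transmit_s, t_receive_s, clock_bias_s=0.0):
    """Pseudorange inferred from a time-stamped packet exchanged between two spacecraft.
    clock_bias_s is the receiver-minus-transmitter clock offset estimated by the filter."""
    return C * (t_receive_s - t_transmit_s - clock_bias_s)
```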
NASA Technical Reports Server (NTRS)
Parish, David W.; Grabbe, Robert D.; Marzwell, Neville I.
1994-01-01
A Modular Autonomous Robotic System (MARS), consisting of a modular autonomous vehicle control system that can be retrofitted onto any vehicle to convert it to autonomous control and that supports a modular payload for multiple applications, is being developed. The MARS design is scalable, reconfigurable, and cost-effective due to the use of modern open-system architecture design methodologies, including serial control bus technology to simplify system wiring and enhance scalability. The design is augmented with modular, object-oriented (C++) software implementing a hierarchy of five levels of control: teleoperated, continuous guidepath following, periodic guidepath following, absolute-position autonomous navigation, and relative-position autonomous navigation. The present effort is focused on producing a system that is commercially viable for routine autonomous patrolling of known, semistructured environments, such as environmental monitoring of chemical and petroleum refineries, exterior physical security and surveillance, perimeter patrolling, and intrafacility transport applications.
Precise laser gyroscope for autonomous inertial navigation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuznetsov, A G; Molchanov, A V; Izmailov, E A
2015-01-31
Requirements for gyroscopes of strapdown inertial navigation systems for aircraft applications are formulated. The construction of a ring helium-neon laser designed for autonomous navigation is described. The processes that determine the laser service life and the relation between the random error of the angular velocity measurement and the surface relief features of the cavity mirrors are analysed. The results of modelling one of the promising approaches to processing the laser gyroscope signals are presented.
Study of the Navigation Method for a Snake Robot Based on the Kinematics Model with MEMS IMU.
Zhao, Xu; Dou, Lihua; Su, Zhong; Liu, Ning
2018-03-16
A snake robot is a type of highly redundant mobile robot that differs significantly from tracked, wheeled, and legged robots. To address the issue of a snake robot performing self-localization in its application environment without orientation assistance, an autonomous navigation method is proposed based on the snake robot's motion characteristic constraints. The method realizes autonomous navigation of the snake robot without external nodes or assistance, using only its own Micro-Electromechanical-Systems (MEMS) Inertial Measurement Unit (IMU). First, it studies the snake robot's motion characteristics, builds the kinematics model, and then analyses the motion constraint characteristics and motion error propagation properties. Second, it explores the snake robot's navigation layout, proposes a constraint criterion and the fixed relationship, and formulates zero-state constraints based on the motion features and control modes of a snake robot. Finally, it realizes autonomous navigation positioning based on an Extended Kalman Filter (EKF) position estimation method under the constraints of its motion characteristics. With the self-developed snake robot, tests verify the proposed method, and the position error is less than 5% of the Total-Traveled-Distance (TDD). In a short-distance environment, this method meets the requirements for a snake robot to perform autonomous navigation and positioning in traditional applications, and it can be extended to other similar multi-link robots.
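Motion-characteristic constraints of this kind can enter the EKF as pseudo-measurements. The sketch below applies a generic constraint of the form C x = 0 (for example, zero lateral body-frame velocity, or zero velocity at an anchored gait phase) and is an assumption-level illustration rather than the paper's exact formulation.

```python
import numpy as np

def constraint_update(x, P, C, z_c=None, R_c=None):
    """Apply a motion-constraint pseudo-measurement z_c = C @ x as an EKF update.
    By default the constraint value is zero with a small assumed noise covariance."""
    m = C.shape[0]
    z_c = np.zeros(m) if z_c is None else z_c
    R_c = 1e-4 * np.eye(m) if R_c is None else R_c
    y = z_c - C @ x                      # constraint violation (innovation)
    S = C @ P @ C.T + R_c
    K = P @ C.T @ np.linalg.inv(S)
    x_new = x + K @ y
    P_new = (np.eye(x.size) - K @ C) @ P
    return x_new, P_new
```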
Open-Loop Flight Testing of COBALT GN&C Technologies for Precise Soft Landing
NASA Technical Reports Server (NTRS)
Carson, John M., III; Amzajerdian, Farzin; Seubert, Carl R.; Restrepo, Carolina I.
2017-01-01
A terrestrial, open-loop (OL) flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) platform was conducted onboard the Masten Xodiac suborbital rocket testbed, with support through the NASA Advanced Exploration Systems (AES), Game Changing Development (GCD), and Flight Opportunities (FO) Programs. The COBALT platform integrates NASA Guidance, Navigation and Control (GN&C) sensing technologies for autonomous, precise soft landing, including the Navigation Doppler Lidar (NDL) velocity and range sensor and the Lander Vision System (LVS) Terrain Relative Navigation (TRN) system. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a precise navigation solution that is independent of the Global Positioning System (GPS) and suitable for future, autonomous planetary landing systems. The OL campaign tested COBALT as a passive payload, with COBALT data collection and filter execution, but with the Xodiac vehicle Guidance and Control (G&C) loops closed on a Masten GPS-based navigation solution. The OL test was performed as a risk reduction activity in preparation for an upcoming 2017 closed-loop (CL) flight campaign in which Xodiac G&C will act on the COBALT navigation solution and the GPS-based navigation will serve only as a backup monitor.
NASA Astrophysics Data System (ADS)
Armstrong, Roy A.; Singh, Hanumant
2006-09-01
Optical imaging of coral reefs and other benthic communities present below one attenuation depth, the limit of effective airborne and satellite remote sensing, requires the use of in situ platforms such as autonomous underwater vehicles (AUVs). The Seabed AUV, which was designed for high-resolution underwater optical and acoustic imaging, was used to characterize several deep insular shelf reefs of Puerto Rico and the US Virgin Islands using digital imagery. The digital photo transects obtained by the Seabed AUV provided quantitative data on living coral, sponge, gorgonian, and macroalgal cover as well as coral species richness and diversity. Rugosity, an index of structural complexity, was derived from the pencil-beam acoustic data. The AUV benthic assessments could provide the required information for selecting unique areas of high coral cover, biodiversity and structural complexity for habitat protection and ecosystem-based management. Data from Seabed sensors and related imaging technologies are being used to conduct multi-beam sonar surveys, 3-D image reconstruction from a single camera, photo mosaicking, image based navigation, and multi-sensor fusion of acoustic and optical data.
Mamdani Fuzzy System for Indoor Autonomous Mobile Robot
NASA Astrophysics Data System (ADS)
Khan, M. K. A. Ahamed; Rashid, Razif; Elamvazuthi, I.
2011-06-01
Several control algorithms for autonomous mobile robot navigation have been proposed in the literature. Recently, the employment of non-analytical methods of computing such as fuzzy logic, evolutionary computation, and neural networks has demonstrated the utility and potential of these paradigms for intelligent control of mobile robot navigation. In this paper, a Mamdani fuzzy system for an autonomous mobile robot is developed. The paper begins with a discussion of the conventional controller, followed by a detailed description of the fuzzy logic controller.
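As an illustration of Mamdani inference of the kind described above, the following sketch maps a single obstacle-distance input to a turn-rate command through triangular membership functions, max-min aggregation and centroid defuzzification. The membership functions and rule base are illustrative assumptions, not the authors' controller.

```python
import numpy as np

# Minimal Mamdani-style fuzzy sketch: one input (obstacle distance, m) and one
# output (turn rate, normalised). Membership functions and rules are illustrative.

def tri(x, a, b, c):
    """Triangular membership with feet at a and c and peak at b."""
    rising = 1.0 if b == a else (x - a) / (b - a)
    falling = 1.0 if c == b else (c - x) / (c - b)
    return max(0.0, min(rising, falling))

def mamdani_turn_rate(distance):
    # Fuzzify the input
    near = tri(distance, 0.0, 0.0, 1.0)
    medium = tri(distance, 0.5, 1.5, 2.5)
    far = tri(distance, 2.0, 4.0, 4.0)

    # Consequent fuzzy sets over candidate turn-rate values
    out = np.linspace(0.0, 1.0, 101)
    hard = np.array([tri(u, 0.6, 1.0, 1.0) for u in out])
    gentle = np.array([tri(u, 0.1, 0.4, 0.7) for u in out])
    straight = np.array([tri(u, 0.0, 0.0, 0.2) for u in out])

    # Rules: IF near THEN hard turn; IF medium THEN gentle turn; IF far THEN straight.
    # Mamdani inference: clip each consequent by its rule strength, aggregate by max.
    agg = np.maximum.reduce([np.minimum(near, hard),
                             np.minimum(medium, gentle),
                             np.minimum(far, straight)])

    # Defuzzify by centroid
    return float(np.sum(out * agg) / np.sum(agg))

print(mamdani_turn_rate(0.8))   # closer obstacle -> larger commanded turn rate
print(mamdani_turn_rate(3.5))
```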
Draper Laboratory small autonomous aerial vehicle
NASA Astrophysics Data System (ADS)
DeBitetto, Paul A.; Johnson, Eric N.; Bosse, Michael C.; Trott, Christian A.
1997-06-01
The Charles Stark Draper Laboratory, Inc. and students from Massachusetts Institute of Technology and Boston University have cooperated to develop an autonomous aerial vehicle that won the 1996 International Aerial Robotics Competition. This paper describes the approach, system architecture and subsystem designs for the entry. This entry represents a combination of many technology areas: navigation, guidance, control, vision processing, human factors, packaging, power, real-time software, and others. The aerial vehicle, an autonomous helicopter, performs navigation and control functions using multiple sensors: differential GPS, inertial measurement unit, sonar altimeter, and a flux compass. The aerial vehicle transmits video imagery to the ground. A ground-based vision processor converts the image data into target position and classification estimates. The system was designed, built, and flown in less than one year and has provided many lessons about autonomous vehicle systems, several of which are discussed. In an appendix, our current research in augmenting the navigation system with vision-based estimates is presented.
NASA Technical Reports Server (NTRS)
Gramling, C. J.; Long, A. C.; Lee, T.; Ottenstein, N. A.; Samii, M. V.
1991-01-01
A Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) is currently being developed by NASA to provide a high accuracy autonomous navigation capability for users of TDRSS and its successor, the Advanced TDRSS (ATDRSS). The fully autonomous user onboard navigation system will support orbit determination, time determination, and frequency determination, based on observation of a continuously available, unscheduled navigation beacon signal. A TONS experiment will be performed in conjunction with the Explorer Platform (EP) Extreme Ultraviolet Explorer (EUVE) mission to flight qualify TONS Block 1. An overview is presented of TONS and a preliminary analysis of the navigation accuracy anticipated for the TONS experiment. Descriptions of the TONS experiment and the associated navigation objectives, as well as a description of the onboard navigation algorithms, are provided. The accuracy of the selected algorithms is evaluated based on the processing of realistic simulated TDRSS one-way forward-link Doppler measurements. The analysis process is discussed and the associated navigation accuracy results are presented.
Orion Optical Navigation Progress Toward Exploration Mission 1
NASA Technical Reports Server (NTRS)
Holt, Greg N.; D'Souza, Christopher N.; Saley, David
2018-01-01
Optical navigation of human spacecraft was proposed on Gemini and implemented successfully on Apollo as a means of autonomously operating the vehicle in the event of lost communication with controllers on Earth. The Orion emergency return system utilizing optical navigation has matured in design over the last several years, and is currently undergoing the final implementation and test phase in preparation for Exploration Mission 1 (EM-1) in 2019. The software development is past its Critical Design Review, and is progressing through test and certification for human rating. The filter architecture uses a square-root-free UDU covariance factorization. Linear Covariance Analysis (LinCov) was used to analyze the measurement models and the measurement error models on a representative EM-1 trajectory. The Orion EM-1 flight camera was calibrated at the Johnson Space Center (JSC) electro-optics lab. To permanently stake the focal length of the camera a 500 mm focal length refractive collimator was used. Two Engineering Design Unit (EDU) cameras and an EDU star tracker were used for a live-sky test in Denver. In-space imagery with high-fidelity truth metadata is rare so these live-sky tests provide one of the closest real-world analogs to operational use. A hardware-in-the-loop test rig was developed in the Johnson Space Center Electro-Optics Lab to exercise the OpNav system prior to integrated testing on the Orion vehicle. The software is verified with synthetic images. Several hundred off-nominal images are also used to analyze robustness and fault detection in the software. These include effects such as stray light, excess radiation damage, and specular reflections, and are used to help verify the tuning parameters chosen for the algorithms such as earth atmosphere bias, minimum pixel intensity, and star detection thresholds.
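The abstract above mentions a square-root-free UDU covariance factorization; the sketch below shows the underlying idea, factoring a covariance matrix as P = U D Uᵀ with U unit upper triangular and D diagonal, which is what Bierman/Thornton-style filters propagate in place of P for numerical robustness. The example matrix is arbitrary and this is not the Orion flight code.

```python
import numpy as np

# Minimal sketch of the UDU' factorization behind square-root-free Kalman
# filters: P = U @ diag(d) @ U.T with U unit upper triangular and d > 0.
# The example matrix is arbitrary; this is not the Orion flight implementation.

def udu_factorize(P):
    n = P.shape[0]
    P = P.astype(float).copy()
    U = np.eye(n)
    d = np.zeros(n)
    for j in range(n - 1, -1, -1):
        d[j] = P[j, j]
        U[:j, j] = P[:j, j] / d[j]
        # Remove column j's contribution from the remaining leading block
        P[:j, :j] -= d[j] * np.outer(U[:j, j], U[:j, j])
    return U, d

P = np.array([[4.0, 2.0, 0.6],
              [2.0, 3.0, 0.4],
              [0.6, 0.4, 1.0]])
U, d = udu_factorize(P)
print(np.allclose(U @ np.diag(d) @ U.T, P))   # True: factorization reproduces P
```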
Spatial abstraction for autonomous robot navigation.
Epstein, Susan L; Aroor, Anoop; Evanusa, Matthew; Sklar, Elizabeth I; Parsons, Simon
2015-09-01
Optimal navigation for a simulated robot relies on a detailed map and explicit path planning, an approach problematic for real-world robots that are subject to noise and error. This paper reports on autonomous robots that rely on local spatial perception, learning, and commonsense rationales instead. Despite realistic actuator error, learned spatial abstractions form a model that supports effective travel.
SLAM algorithm applied to robotics assistance for navigation in unknown environments.
Cheein, Fernando A Auat; Lopez, Natalia; Soria, Carlos M; di Sciascio, Fernando A; Pereira, Fernando Lobo; Carelli, Ricardo
2010-02-17
The combination of robotic tools with assistance technology determines a slightly explored area of applications and advantages for disability or elder people in their daily tasks. Autonomous motorized wheelchair navigation inside an environment, behaviour based control of orthopaedic arms or user's preference learning from a friendly interface are some examples of this new field. In this paper, a Simultaneous Localization and Mapping (SLAM) algorithm is implemented to allow the environmental learning by a mobile robot while its navigation is governed by electromyographic signals. The entire system is part autonomous and part user-decision dependent (semi-autonomous). The environmental learning executed by the SLAM algorithm and the low level behaviour-based reactions of the mobile robot are robotic autonomous tasks, whereas the mobile robot navigation inside an environment is commanded by a Muscle-Computer Interface (MCI). In this paper, a sequential Extended Kalman Filter (EKF) feature-based SLAM algorithm is implemented. The features correspond to lines and corners -concave and convex- of the environment. From the SLAM architecture, a global metric map of the environment is derived. The electromyographic signals that command the robot's movements can be adapted to the patient's disabilities. For mobile robot navigation purposes, five commands were obtained from the MCI: turn to the left, turn to the right, stop, start and exit. A kinematic controller to control the mobile robot was implemented. A low level behavior strategy was also implemented to avoid robot's collisions with the environment and moving agents. The entire system was tested in a population of seven volunteers: three elder, two below-elbow amputees and two young normally limbed patients. The experiments were performed within a closed low dynamic environment. Subjects took an average time of 35 minutes to navigate the environment and to learn how to use the MCI. The SLAM results have shown a consistent reconstruction of the environment. The obtained map was stored inside the Muscle-Computer Interface. The integration of a highly demanding processing algorithm (SLAM) with a MCI and the communication between both in real time have shown to be consistent and successful. The metric map generated by the mobile robot would allow possible future autonomous navigation without direct control of the user, whose function could be relegated to choose robot destinations. Also, the mobile robot shares the same kinematic model of a motorized wheelchair. This advantage can be exploited for wheelchair autonomous navigation.
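For readers unfamiliar with feature-based EKF-SLAM, the sketch below shows a single range-bearing measurement update against one mapped corner feature, including the measurement model and its Jacobian. The state layout, noise values and single-feature map are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

# Minimal sketch of one EKF-SLAM measurement update for a single point (corner)
# feature observed as range and bearing from the robot. The state is
# [xr, yr, theta, xf, yf]; all numbers are illustrative.

def range_bearing(x):
    """Predicted measurement h(x)."""
    xr, yr, th, xf, yf = x
    dx, dy = xf - xr, yf - yr
    b = np.arctan2(dy, dx) - th
    return np.array([np.hypot(dx, dy), np.arctan2(np.sin(b), np.cos(b))])

def jacobian(x):
    xr, yr, th, xf, yf = x
    dx, dy = xf - xr, yf - yr
    q = dx**2 + dy**2
    r = np.sqrt(q)
    return np.array([[-dx/r, -dy/r,  0,  dx/r,  dy/r],
                     [ dy/q, -dx/q, -1, -dy/q,  dx/q]])

x = np.array([0.0, 0.0, 0.1, 2.0, 1.0])       # robot pose + one mapped corner
P = np.diag([0.1, 0.1, 0.02, 0.5, 0.5])
R = np.diag([0.05**2, np.deg2rad(2.0)**2])    # range/bearing sensor noise

z = np.array([2.3, 0.40])                     # actual observation of the corner
H = jacobian(x)
y = z - range_bearing(x)
y[1] = np.arctan2(np.sin(y[1]), np.cos(y[1]))  # wrap the bearing residual
S = H @ P @ H.T + R
K = P @ H.T @ np.linalg.inv(S)
x = x + K @ y
P = (np.eye(5) - K @ H) @ P
print(x)   # both the robot pose and the feature estimate are corrected
```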
Mixed-mode VLSI optic flow sensors for in-flight control of a micro air vehicle
NASA Astrophysics Data System (ADS)
Barrows, Geoffrey L.; Neely, C.
2000-11-01
NRL is developing compact optic flow sensors for use in a variety of small-scale navigation and collision avoidance tasks. These sensors are being developed for use in micro air vehicles (MAVs), which are autonomous aircraft whose maximum dimension is on the order of 15 cm. To achieve desired weight specifications of 1-2 grams, mixed-signal VLSI circuitry is being used to develop compact focal plane sensors that directly compute optic flow. As an interim proof of principle, we have constructed a sensor comprising a focal plane sensor head with on-chip processing and a back-end PIC microcontroller. This interim sensor weighs approximately 25 grams and is able to measure optic flow with real-world and low-contrast textures. Variations of this sensor have been used to control the flight of a glider in real time to avoid collisions with walls.
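The focal-plane sensors described above compute optic flow directly in hardware; the sketch below shows the equivalent gradient-based (Lucas-Kanade style) computation for a single image patch, solving the brightness-constancy constraint in least squares. The synthetic images are illustrative.

```python
import numpy as np

# Minimal sketch of gradient-based (Lucas-Kanade style) optic flow over a single
# patch: solve Ix*u + Iy*v = -It in least squares for the flow (u, v). Mixed-
# signal focal-plane sensors approximate this kind of computation in analog.

def patch_flow(I0, I1):
    Ix = 0.5 * (np.roll(I0, -1, axis=1) - np.roll(I0, 1, axis=1))  # d/dx
    Iy = 0.5 * (np.roll(I0, -1, axis=0) - np.roll(I0, 1, axis=0))  # d/dy
    It = I1 - I0                                                   # d/dt
    A = np.stack([Ix[1:-1, 1:-1].ravel(), Iy[1:-1, 1:-1].ravel()], axis=1)
    b = -It[1:-1, 1:-1].ravel()
    flow, *_ = np.linalg.lstsq(A, b, rcond=None)
    return flow                                      # (u, v) in pixels/frame

# Smooth synthetic texture translated one pixel in +x between frames
xx, yy = np.meshgrid(np.arange(32), np.arange(32))
I0 = np.sin(0.3 * xx) + np.cos(0.2 * yy)
I1 = np.roll(I0, 1, axis=1)
print(patch_flow(I0, I1))                            # approximately (1, 0)
```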
Study of the Navigation Method for a Snake Robot Based on the Kinematics Model with MEMS IMU
Dou, Lihua; Su, Zhong; Liu, Ning
2018-01-01
A snake robot is a type of highly redundant mobile robot that significantly differs from a tracked robot, wheeled robot and legged robot. To address the issue of a snake robot performing self-localization in an application environment without orientation assistance, an autonomous navigation method is proposed based on the snake robot’s motion characteristic constraints. The method realizes autonomous navigation of the snake robot without external nodes or assistance, using only its own Micro-Electromechanical-Systems (MEMS) Inertial-Measurement-Unit (IMU). First, it studies the snake robot’s motion characteristics, builds the kinematics model, and then analyses the motion constraint characteristics and motion error propagation properties. Second, it explores the snake robot’s navigation layout, proposes a constraint criterion and the fixed relationship, and applies zero-state constraints based on the motion features and control modes of a snake robot. Finally, it realizes autonomous navigation positioning based on the Extended-Kalman-Filter (EKF) position estimation method under the constraints of its motion characteristics. With the self-developed snake robot, the test verifies the proposed method, and the position error is less than 5% of the Total-Traveled-Distance (TTD). In a short-distance environment, this method is able to meet the requirements of a snake robot to perform autonomous navigation and positioning in traditional applications and can be extended to other similar multi-link robots. PMID:29547515
New Vectorial Propulsion System and Trajectory Control Designs for Improved AUV Mission Autonomy.
Masmitja, Ivan; Gonzalez, Julian; Galarza, Cesar; Gomariz, Spartacus; Aguzzi, Jacopo; Del Rio, Joaquin
2018-04-17
Autonomous Underwater Vehicles (AUV) are proving to be a promising platform design for multidisciplinary autonomous operability with a wide range of applications in marine ecology and geoscience. Here, two novel contributions towards increasing the autonomous navigation capability of a new AUV prototype (the Guanay II) as a mix between a propelled vehicle and a glider are presented. Firstly, a vectorial propulsion system has been designed to provide full vehicle maneuverability in both horizontal and vertical planes. Furthermore, two controllers have been designed, based on fuzzy controls, to provide the vehicle with autonomous navigation capabilities. Due to the decoupled system property, the controllers in the horizontal plane have been designed separately from the vertical plane. This class of non-linear controllers has been used to interpret linguistic laws into different zones of functionality. This method provided good performance, acting as an interpolation between different rules or linear controls. Both improvements have been validated through simulations and field tests, displaying good performance results. Finally, the conclusion of this work is that the Guanay II AUV has a solid controller to perform autonomous navigation and carry out vertical immersions.
COBALT: Development of a Platform to Flight Test Lander GN&C Technologies on Suborbital Rockets
NASA Technical Reports Server (NTRS)
Carson, John M., III; Seubert, Carl R.; Amzajerdian, Farzin; Bergh, Chuck; Kourchians, Ara; Restrepo, Carolina I.; Villapando, Carlos Y.; O'Neal, Travis V.; Robertson, Edward A.; Pierrottet, Diego;
2017-01-01
The NASA COBALT Project (CoOperative Blending of Autonomous Landing Technologies) is developing and integrating new precision-landing Guidance, Navigation and Control (GN&C) technologies, along with developing a terrestrial flight-test platform for Technology Readiness Level (TRL) maturation. The current technologies include a third-generation Navigation Doppler Lidar (NDL) sensor for ultra-precise velocity and line-of-sight (LOS) range measurements, and the Lander Vision System (LVS) that provides passive-optical Terrain Relative Navigation (TRN) estimates of map-relative position. The COBALT platform is self-contained and includes the NDL and LVS sensors, blending filter, a custom compute element, power unit, and communication system. The platform incorporates a structural frame that has been designed to integrate with the payload frame onboard the new Masten Xodiac vertical take-off, vertical landing (VTVL) terrestrial rocket vehicle. Ground integration and testing is underway, and terrestrial flight testing onboard Xodiac is planned for 2017 with two flight campaigns: one open-loop and one closed-loop.
An Effective Terrain Aided Navigation for Low-Cost Autonomous Underwater Vehicles.
Zhou, Ling; Cheng, Xianghong; Zhu, Yixian; Dai, Chenxi; Fu, Jinbo
2017-03-25
Terrain-aided navigation is a potentially powerful solution for obtaining submerged position fixes for autonomous underwater vehicles. The application of terrain-aided navigation with high-accuracy inertial navigation systems has demonstrated meter-level navigation accuracy in sea trials. However, available sensors may be limited depending on the type of the mission. Such limitations, especially for low-grade navigation sensors, not only degrade the accuracy of traditional navigation systems, but further impact the ability to successfully employ terrain-aided navigation. To address this problem, a tightly-coupled navigation approach is presented to successfully estimate the critical sensor errors by incorporating raw sensor data directly into an augmented navigation system. Furthermore, three-dimensional distance errors are calculated, providing measurement updates through the particle filter for absolute and bounded position error. The development of the terrain-aided navigation system is elaborated for a vehicle equipped with a non-inertial-grade strapdown inertial navigation system, a 4-beam Doppler Velocity Log range sensor and a sonar altimeter. Using experimental data for navigation performance evaluation in areas with different terrain characteristics, the experiment results further show that the proposed method can be successfully applied to low-cost AUVs and significantly improves navigation performance.
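The heart of such a terrain-aided scheme is the particle-filter measurement update, sketched below: each particle's predicted terrain depth is looked up from a map and compared with the sonar/altimeter-derived measurement. The analytic stand-in for the bathymetric map and the noise levels are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a terrain-aided particle-filter measurement update: each
# particle carries a horizontal position hypothesis, its predicted terrain
# depth is looked up from a map, and particles whose prediction matches the
# measured depth are weighted up. The analytic "terrain map" and noise levels
# below are illustrative assumptions.

rng = np.random.default_rng(1)

def terrain_depth(x, y):
    """Stand-in for a gridded bathymetric map lookup."""
    return 100.0 + 5.0 * np.sin(0.01 * x) + 3.0 * np.cos(0.02 * y)

true_pos = np.array([250.0, 400.0])
measured_depth = terrain_depth(*true_pos) + rng.normal(0.0, 0.5)

# Particles spread around a drifted dead-reckoning estimate
particles = true_pos + rng.normal([30.0, -20.0], 50.0, size=(2000, 2))
weights = np.ones(len(particles)) / len(particles)

# Measurement update: weight by the likelihood of the measured depth
sigma = 0.5
predicted = terrain_depth(particles[:, 0], particles[:, 1])
weights *= np.exp(-0.5 * ((measured_depth - predicted) / sigma) ** 2)
weights /= np.sum(weights)

estimate = weights @ particles
print(estimate, true_pos)   # estimate is pulled toward terrain-consistent positions
```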
An Effective Terrain Aided Navigation for Low-Cost Autonomous Underwater Vehicles
Zhou, Ling; Cheng, Xianghong; Zhu, Yixian; Dai, Chenxi; Fu, Jinbo
2017-01-01
Terrain-aided navigation is a potentially powerful solution for obtaining submerged position fixes for autonomous underwater vehicles. The application of terrain-aided navigation with high-accuracy inertial navigation systems has demonstrated meter-level navigation accuracy in sea trials. However, available sensors may be limited depending on the type of the mission. Such limitations, especially for low-grade navigation sensors, not only degrade the accuracy of traditional navigation systems, but further impact the ability to successfully employ terrain-aided navigation. To address this problem, a tightly-coupled navigation approach is presented to successfully estimate the critical sensor errors by incorporating raw sensor data directly into an augmented navigation system. Furthermore, three-dimensional distance errors are calculated, providing measurement updates through the particle filter for absolute and bounded position error. The development of the terrain-aided navigation system is elaborated for a vehicle equipped with a non-inertial-grade strapdown inertial navigation system, a 4-beam Doppler Velocity Log range sensor and a sonar altimeter. Using experimental data for navigation performance evaluation in areas with different terrain characteristics, the experiment results further show that the proposed method can be successfully applied to low-cost AUVs and significantly improves navigation performance. PMID:28346346
Autonomous vehicle navigation utilizing fuzzy controls concepts for a next generation wheelchair.
Hansen, J D; Barrett, S F; Wright, C H G; Wilcox, M
2008-01-01
Three different positioning techniques were investigated to create an autonomous vehicle that could accurately navigate towards a goal: Global Positioning System (GPS), compass dead reckoning, and Ackerman steering. Each technique utilized a fuzzy logic controller that maneuvered a four-wheel car towards a target. The reliability and the accuracy of the navigation methods were investigated by modeling the algorithms in software and implementing them in hardware. To implement the techniques in hardware, positioning sensors were interfaced to a remote control car and a microprocessor. The microprocessor utilized the sensor measurements to orient the car with respect to the target. Next, a fuzzy logic control algorithm adjusted the front wheel steering angle to minimize the difference between the heading and bearing. After minimizing the heading error, the car maintained a straight steering angle along its path to the final destination. The results of this research can be used to develop applications that require precise navigation. The design techniques can also be implemented on alternate platforms such as a wheelchair to assist with autonomous navigation.
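The quantity the steering controller above acts on is the difference between the vehicle's compass heading and the bearing to the GPS target; a minimal sketch follows, with a flat-earth bearing approximation and a crisp proportional rule standing in for the fuzzy rule base (both are illustrative assumptions, not the authors' design).

```python
import math

# Minimal sketch of the heading-versus-bearing error a steering controller
# minimizes, computed from GPS fixes. The flat-earth bearing approximation and
# the proportional gain stand in for the paper's fuzzy rules (illustrative only).

def bearing_to_target(lat, lon, lat_t, lon_t):
    """Bearing (rad, from north, clockwise) using a local flat-earth approximation."""
    d_north = math.radians(lat_t - lat)
    d_east = math.radians(lon_t - lon) * math.cos(math.radians(lat))
    return math.atan2(d_east, d_north)

def heading_error(heading, bearing):
    """Wrap heading-minus-bearing into [-pi, pi)."""
    return (heading - bearing + math.pi) % (2.0 * math.pi) - math.pi

lat, lon = 41.312, -105.587          # current GPS fix (illustrative)
lat_t, lon_t = 41.315, -105.580      # target (illustrative)
heading = math.radians(35.0)         # compass heading

err = heading_error(heading, bearing_to_target(lat, lon, lat_t, lon_t))
steer = -0.8 * err                   # crisp stand-in for the fuzzy steering rule
print(math.degrees(err), math.degrees(steer))
```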
76 FR 21772 - Navigation Safety Advisory Council
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-18
Federal Register notice (Department of Homeland Security, Coast Guard, Docket No. USCG-2011-0204) announcing a Navigation Safety Advisory Council (NAVSAC) meeting. Agenda items include routing measures, marine information, diving safety, and aids to navigation systems, as well as a discussion of autonomous unmanned vessels and their implications for the Inland Navigation Rules.
NASA Technical Reports Server (NTRS)
Fuchs, A. J. (Editor)
1979-01-01
Onboard and real time image processing to enhance geometric correction of the data is discussed with application to autonomous navigation and attitude and orbit determination. Specific topics covered include: (1) LANDSAT landmark data; (2) star sensing and pattern recognition; (3) filtering algorithms for Global Positioning System; and (4) determining orbital elements for geostationary satellites.
Developments in Acoustic Navigation and Communication for High-Latitude Ocean Research
NASA Astrophysics Data System (ADS)
Gobat, J.; Lee, C.
2006-12-01
Developments in autonomous platforms (profiling floats, drifters, long-range gliders and propeller-driven vehicles) offer the possibility of unprecedented access to logistically difficult polar regions that challenge conventional techniques. Currently, however, navigation and telemetry for these platforms rely on satellite positioning and communications poorly suited for high-latitude applications where ice cover restricts access to the sea surface. A similar infrastructure offering basin-wide acoustic geolocation and telemetry would allow the community to employ autonomous platforms to address previously intractable problems in Arctic oceanography. Two recent efforts toward the development of such an infrastructure are reported here. As part of an observational array monitoring fluxes through Davis Strait, development of real-time RAFOS acoustic navigation for gliders has been ongoing since autumn 2004. To date, test deployments have been conducted in a 260 Hz field in the Pacific and 780 Hz fields off Norway and in Davis Strait. Real-time navigation accuracy of ~1 km is achievable. Autonomously navigating gliders will operate under ice cover beginning in autumn 2006. In addition to glider navigation development, the Davis Strait array moorings carry fixed RAFOS recorders to study propagation over a range of distances under seasonally varying ice cover. Results from the under-ice propagation and glider navigation experiments are presented. Motivated by the need to coordinate these types of development efforts, an international group of acousticians, autonomous platform developers, high-latitude oceanographers and marine mammal researchers gathered in Seattle, U.S.A. from 27 February to 1 March 2006 for an NSF Office of Polar Programs sponsored Acoustic Navigation and Communication for High-latitude Ocean Research (ANCHOR) workshop. Workshop participants focused on summarizing the current state of knowledge concerning Arctic acoustics, navigation and communications, developing an overarching system specification to guide community-wide engineering efforts and establishing an active community and steering group to guide long-term engineering efforts and ensure interoperability. This presentation will summarize ANCHOR workshop findings.
The use of x-ray pulsar-based navigation method for interplanetary flight
NASA Astrophysics Data System (ADS)
Yang, Bo; Guo, Xingcan; Yang, Yong
2009-07-01
As interplanetary missions become increasingly complex, the existing mature interplanetary navigation method, based mainly on radiometric tracking techniques of the Deep Space Network, cannot meet the rising demands of autonomous real-time navigation. This paper studies the application to interplanetary flight of a navigation technology under rapid development, X-ray pulsar-based navigation for spacecraft (XPNAV), and evaluates its performance with a computer simulation. XPNAV is an excellent autonomous real-time navigation method, and can provide comprehensive navigation information, including position, velocity, attitude, attitude rate and time. In the paper the fundamental principles and time transformation of XPNAV are analyzed, and then the Delta-correction XPNAV, blending the vehicle's trajectory dynamics with the pulse time-of-arrival differences at nominal and estimated spacecraft locations within an Unscented Kalman Filter (UKF), is discussed with a background mission of Mars Pathfinder during the heliocentric transfer orbit. XPNAV has an intractable problem of integer pulse phase cycle ambiguities, similar to GPS carrier-phase navigation. This article proposes a non-ambiguity assumption approach, based on an analysis of the search space array method, to resolve pulse phase cycle ambiguities between the nominal position and estimated position of the spacecraft. The simulation results show that the search space array method is computationally intensive and requires long processing time when the position errors are large, and that the non-ambiguity assumption method can solve the ambiguity problem quickly and reliably. It is deemed that an autonomous real-time integrated navigation system of XPNAV blended with DSN, celestial navigation, inertial navigation and so on will be the development direction of interplanetary flight navigation systems in the future.
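A minimal sketch of the Delta-correction observable follows: to first order, the pulse time-of-arrival difference between the nominal and estimated positions is the position error projected onto the pulsar line of sight, divided by the speed of light, and an integer-cycle ambiguity appears once that offset exceeds one pulse period. The pulsar direction, period and position error below are illustrative numbers only.

```python
import numpy as np

# Minimal sketch of the XPNAV Delta-correction measurement: the pulse
# time-of-arrival difference between the nominal and true spacecraft positions
# is, to first order, the position error projected on the pulsar line of sight,
# divided by c. If that offset exceeds one pulse period, an integer-cycle
# ambiguity appears, which the abstract's methods must resolve. Clock and
# relativistic terms are ignored; all numbers are illustrative.

c = 299_792.458                          # km/s
pulsar_dir = np.array([0.6, 0.64, 0.48])
pulsar_dir /= np.linalg.norm(pulsar_dir)
period = 0.0016                          # s (millisecond-pulsar class, assumed)

r_nominal = np.array([1.0e6, 2.0e6, -0.5e6])        # km, nominal trajectory point
r_true = r_nominal + np.array([80.0, -40.0, 25.0])  # km, true position

delta_t = pulsar_dir @ (r_true - r_nominal) / c     # measured-minus-predicted TOA
cycles = delta_t / period
print(delta_t, cycles)   # if |cycles| > 1, only the fractional phase is observed
```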
Survivability design for a hybrid underwater vehicle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Biao; Wu, Chao; Li, Xiang
A novel hybrid underwater robotic vehicle (HROV) capable of working to the full ocean depth has been developed. The battery powered vehicle operates in two modes: operate as an untethered autonomous vehicle in autonomous underwater vehicle (AUV) mode and operate under remote control connected to the surface vessel by a lightweight, fiber optic tether in remotely operated vehicle (ROV) mode. Considering the hazardous underwater environment at the limiting depth and the hybrid operating modes, survivability has been placed on an equal level with the other design attributes of the HROV since the beginning of the project. This paper reports the survivability design elements for the HROV including basic vehicle design of integrated navigation and integrated communication, emergency recovery strategy, distributed architecture, redundant bus, dual battery package, emergency jettison system and self-repairing control system.
INS integrated motion analysis for autonomous vehicle navigation
NASA Technical Reports Server (NTRS)
Roberts, Barry; Bazakos, Mike
1991-01-01
The use of inertial navigation system (INS) measurements to enhance the quality and robustness of motion analysis techniques used for obstacle detection is discussed with particular reference to autonomous vehicle navigation. The approach to obstacle detection used here employs motion analysis of imagery generated by a passive sensor. Motion analysis of imagery obtained during vehicle travel is used to generate range measurements to points within the field of view of the sensor, which can then be used to provide obstacle detection. Results obtained with an INS integrated motion analysis approach are reviewed.
Evaluation of Relative Navigation Algorithms for Formation-Flying Satellites
NASA Technical Reports Server (NTRS)
Kelbel, David; Lee, Taesul; Long, Anne; Carpenter, J. Russell; Gramling, Cheryl
2001-01-01
Goddard Space Flight Center is currently developing advanced spacecraft systems to provide autonomous navigation and control of formation flyers. This paper discusses autonomous relative navigation performance for formations in eccentric, medium, and high-altitude Earth orbits using Global Positioning System (GPS) Standard Positioning Service (SPS) and intersatellite range measurements. The performance of several candidate relative navigation approaches is evaluated. These analyses indicate that the relative navigation accuracy is primarily a function of the frequency of acquisition and tracking of the GPS signals. A relative navigation position accuracy of 0.5 meters root-mean-square (RMS) can be achieved for formations in medium-altitude eccentric orbits that can continuously track at least one GPS signal. A relative navigation position accuracy of better than 75 meters RMS can be achieved for formations in high-altitude eccentric orbits that have sparse tracking of the GPS signals. The addition of round-trip intersatellite range measurements can significantly improve relative navigation accuracy for formations with sparse tracking of the GPS signals.
An Efficient Model-Based Image Understanding Method for an Autonomous Vehicle.
1997-09-01
The problem discussed in this dissertation is the development of an efficient method for visual navigation of autonomous vehicles. The approach is to... autonomous vehicles. Thus the new method is implemented as a component of the image-understanding system in the autonomous mobile robot Yamabico-11 at...
Sandia National Laboratories proof-of-concept robotic security vehicle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrington, J.J.; Jones, D.P.; Klarer, P.R.
1989-01-01
Several years ago Sandia National Laboratories developed a prototype interior robot that could navigate autonomously inside a large complex building to aid and test interior intrusion detection systems. Recently the Department of Energy Office of Safeguards and Security has supported the development of a vehicle that will perform limited security functions autonomously in a structured exterior environment. The goal of the first phase of this project was to demonstrate the feasibility of an exterior robotic vehicle for security applications by using converted interior robot technology, if applicable. An existing teleoperational test bed vehicle with remote driving controls was modified and integrated with a newly developed command driving station and navigation system hardware and software to form the Robotic Security Vehicle (RSV) system. The RSV, also called the Sandia Mobile Autonomous Navigator (SANDMAN), has been successfully used to demonstrate that teleoperated security vehicles which can perform limited autonomous functions are viable and have the potential to decrease security manpower requirements and improve system capabilities. 2 refs., 3 figs.
Solar Thermal Utility-Scale Joint Venture Program (USJVP) Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
MANCINI,THOMAS R.
2001-04-01
Control technique for planetary rover
NASA Technical Reports Server (NTRS)
Nakatani, Ichiro; Kubota, Takashi; Adachi, Tadashi; Saitou, Hiroaki; Okamoto, Sinya
1994-01-01
Beginning next century, several schemes for sending a planetary rover to the moon or Mars are being planned. As part of the development program, autonomous navigation technology is being studied to allow the rover the ability to move autonomously over a long range of unknown planetary surface. In the previous study, we ran the autonomous navigation experiment on an outdoor test terrain by using a rover test-bed that was controlled by a conventional sense-plan-act method. In some cases during the experiment, a problem occurred with the rover moving into untraversable areas. To improve this situation, a new control technique has been developed that gives the rover the ability to react to the outputs of the proximity sensors, a reaction behavior if you will. We have developed a new rover test-bed system on which an autonomous navigation experiment was performed using the newly developed control technique. In this outdoor experiment, the new control technique effectively produced the control command for the rover to avoid obstacles and be guided to the goal point safely.
NASA Astrophysics Data System (ADS)
Hesar, Siamak G.; Parker, Jeffrey S.; Leonard, Jason M.; McGranaghan, Ryan M.; Born, George H.
2015-12-01
We study the application of Linked Autonomous Interplanetary Satellite Orbit Navigation (LiAISON) to track vehicles on the far side of the lunar surface. The LiAISON architecture is demonstrated to achieve accurate orbit determination solutions for various mission scenarios in the Earth-Moon system. Given the proper description of the force field, LiAISON is capable of producing absolute orbit determination solutions using relative satellite-to-satellite tracking observations alone. The lack of direct communication between Earth-based tracking stations and the far side of the Moon provides an ideal opportunity for implementing LiAISON. This paper presents a novel approach to use the LiAISON architecture to perform autonomous navigation of assets on the lunar far side surface. Relative measurements between a spacecraft placed in an EML-2 halo orbit and lunar surface asset(s) are simulated and processed. Comprehensive simulation results show that absolute states of the surface assets are observable with an achieved accuracy of the position estimate on the order of tens of meters.
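The sketch below illustrates the LiAISON-style crosslink observable: the range between the halo orbiter and the surface asset, and its partials with respect to both absolute positions, showing that a single measurement ties the two absolute states together (with an asymmetric force field such as the Earth-Moon environment, this is what makes both absolute states observable). The positions are notional and this is not the authors' simulation.

```python
import numpy as np

# Minimal sketch of the LiAISON-style crosslink observable: the range between a
# halo orbiter and a lunar far-side surface asset, plus its partial derivatives
# with respect to both absolute position vectors (the rows such a measurement
# contributes to the filter's H matrix). Positions below are notional,
# Moon-centred values.

def crosslink_range(r_orbiter, r_surface):
    rho = r_orbiter - r_surface
    rho_mag = np.linalg.norm(rho)
    d_range_d_orbiter = rho / rho_mag      # sensitivity to the orbiter position
    d_range_d_surface = -rho / rho_mag     # sensitivity to the surface asset
    return rho_mag, d_range_d_orbiter, d_range_d_surface

r_halo = np.array([6.45e4, 5.0e3, 2.0e4])   # km, notional EML-2 halo point
r_site = np.array([1.737e3, 0.0, 0.0])      # km, notional far-side surface point

range_km, H_orb, H_surf = crosslink_range(r_halo, r_site)
print(range_km, H_orb, H_surf)
```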
Fully autonomous navigation for the NASA cargo transfer vehicle
NASA Technical Reports Server (NTRS)
Wertz, James R.; Skulsky, E. David
1991-01-01
A great deal of attention has been paid to navigation during the close approach (less than or equal to 1 km) phase of spacecraft rendezvous. However, most spacecraft also require a navigation system which provides the necessary accuracy for placing both satellites within the range of the docking sensors. The Microcosm Autonomous Navigation System (MANS) is an on-board system which uses Earth-referenced attitude sensing hardware to provide precision orbit and attitude determination. The system is capable of functioning from LEO to GEO and beyond. Performance depends on the number of available sensors as well as mission geometry; however, extensive simulations have shown that MANS will provide 100 m to 400 m (3σ) position accuracy and 0.03 to 0.07 deg (3σ) attitude accuracy in low Earth orbit. The system is independent of any external source, including GPS. MANS is expected to have a significant impact on ground operations costs, mission definition and design, survivability, and the potential development of very low-cost, fully autonomous spacecraft.
Autonomous Navigation Error Propagation Assessment for Lunar Surface Mobility Applications
NASA Technical Reports Server (NTRS)
Welch, Bryan W.; Connolly, Joseph W.
2006-01-01
The NASA Vision for Space Exploration is focused on the return of astronauts to the Moon. While navigation systems have already been proven in the Apollo missions to the moon, the current exploration campaign will involve more extensive and extended missions requiring new concepts for lunar navigation. In this document, the results of an autonomous navigation error propagation assessment are provided. The analysis is intended to serve as the baseline error propagation analysis to which Earth-based and Lunar-based radiometric data are added, in order to compare the different architecture schemes and quantify the benefits of an integrated approach for handling lunar surface mobility applications near the Lunar South Pole or on the lunar farside.
Bourbakis, N G
1997-01-01
This paper presents a generic traffic priority language, called KYKLOFORTA, used by autonomous robots for collision-free navigation in a dynamic unknown or known navigation space. In a previous work by X. Grossmman (1988), a set of traffic control rules was developed for the navigation of the robots on the lines of a two-dimensional (2-D) grid, and a control center coordinated and synchronized their movements. In this work, the robots are considered autonomous: they are moving anywhere and in any direction inside the free space, and there is no need of a central control to coordinate and synchronize them. The requirements for each robot are i) visual perception, ii) range sensors, and iii) the ability of each robot to detect other moving objects in the same free navigation space and to determine the other objects' perceived size, velocity and direction. Based on these assumptions, a traffic priority language is needed for each robot, making it able to make decisions during navigation and avoid possible collisions with other moving objects. The traffic priority language proposed here is based on a primitive traffic priority alphabet and a set of rules which compose patterns of corridors for the application of the traffic priority rules.
Reactive Sequencing for Autonomous Navigation Evolving from Phoenix Entry, Descent, and Landing
NASA Technical Reports Server (NTRS)
Grasso, Christopher A.; Riedel, Joseph E.; Vaughan, Andrew T.
2010-01-01
Virtual Machine Language (VML) is an award-winning advanced procedural sequencing language in use on NASA deep-space missions since 1997, and was used for the successful entry, descent, and landing (EDL) of the Phoenix spacecraft onto the surface of Mars. Phoenix EDL utilized a state-oriented operations architecture which executed within the constraints of the existing VML 2.0 flight capability, compatible with the linear "land or die" nature of the mission. The intricacies of Phoenix EDL included the planned discarding of portions of the vehicle, the complex communications management for relay through on-orbit assets, the presence of temporally indeterminate physical events, and the need to rapidly catch up four days of sequencing should a reboot of the spacecraft flight computer occur shortly before atmospheric entry. These formidable operational challenges led to new techniques for packaging and coordinating reusable sequences called blocks using one-way synchronization via VML sequencing global variable events. The coordinated blocks acted as an ensemble to land the spacecraft, while individually managing various elements in as simple a fashion as possible. This paper outlines prototype VML 2.1 flight capabilities that have evolved from the one-way synchronization techniques in order to implement even more ambitious autonomous mission capabilities. Target missions for these new capabilities include autonomous touch-and-go sampling of cometary and asteroidal bodies, lunar landing of robotic missions, and ultimately landing of crewed lunar vehicles. Close proximity guidance, navigation, and control operations, on-orbit rendezvous, and descent and landing events featured in these missions require elaborate abort capability, manifesting highly non-linear scenarios that are so complex as to overtax traditional sequencing, or even the sort of one-way coordinated sequencing used during EDL. Foreseeing advanced command and control needs for small body and lunar landing guidance, navigation and control scenarios, work began three years ago on substantial upgrades to VML that are now being exercised in scenarios for lunar landing and comet/asteroid rendezvous. The advanced state-based approach includes coordinated state transition machines with distributed decision-making logic. These state machines are not merely sequences - they are reactive logic constructs capable of autonomous decision making within a well-defined domain. Combined with the JPL's AutoNav software used on Deep Space 1 and Deep Impact, the system allows spacecraft to autonomously navigate to an unmapped surface, soft-contact, and either land or ascend. The state machine architecture enabled by VML 2.1 has successfully performed sampling missions and lunar descent missions in a simulated environment, and is progressing toward flight capability. The authors are also investigating using the VML 2.1 flight director architecture to perform autonomous activities like rendezvous with a passive hypothetical Mars sample return capsule. The approach being pursued is similar to the touch-and-go sampling state machines, with the added complications associated with the search for, physical capture of, and securing of a separate spacecraft. Complications include optically finding and tracking the Orbiting Sample Capsule (OSC), keeping the OSC illuminated, making orbital adjustments, and physically capturing the OSC. Other applications could include autonomous science collection and fault compensation.
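The one-way synchronization pattern described above can be illustrated with a tiny example: one block raises an event flag when it finishes, and another block simply waits on that flag before proceeding, with no reply expected. The sketch is generic Python using threading, not VML, and the block names are invented for illustration.

```python
import threading
import time

# Minimal sketch of one-way synchronization between two cooperating sequences,
# in the spirit of VML blocks coordinating through sequencing global variables.
# This is plain Python, not VML; block names are illustrative.

comm_ready = threading.Event()          # stands in for a sequencing global/event

def comm_block():
    time.sleep(0.1)                     # e.g. reconfigure relay communications
    comm_ready.set()                    # one-way signal: no reply expected

def landing_block():
    comm_ready.wait()                   # block until the comm block has signalled
    print("comm configured; proceeding with descent sequence")

threads = [threading.Thread(target=f) for f in (comm_block, landing_block)]
for th in threads:
    th.start()
for th in threads:
    th.join()
```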
SLAM algorithm applied to robotics assistance for navigation in unknown environments
2010-01-01
Background The combination of robotic tools with assistance technology determines a slightly explored area of applications and advantages for disability or elder people in their daily tasks. Autonomous motorized wheelchair navigation inside an environment, behaviour based control of orthopaedic arms or user's preference learning from a friendly interface are some examples of this new field. In this paper, a Simultaneous Localization and Mapping (SLAM) algorithm is implemented to allow the environmental learning by a mobile robot while its navigation is governed by electromyographic signals. The entire system is part autonomous and part user-decision dependent (semi-autonomous). The environmental learning executed by the SLAM algorithm and the low level behaviour-based reactions of the mobile robot are robotic autonomous tasks, whereas the mobile robot navigation inside an environment is commanded by a Muscle-Computer Interface (MCI). Methods In this paper, a sequential Extended Kalman Filter (EKF) feature-based SLAM algorithm is implemented. The features correspond to lines and corners -concave and convex- of the environment. From the SLAM architecture, a global metric map of the environment is derived. The electromyographic signals that command the robot's movements can be adapted to the patient's disabilities. For mobile robot navigation purposes, five commands were obtained from the MCI: turn to the left, turn to the right, stop, start and exit. A kinematic controller to control the mobile robot was implemented. A low level behavior strategy was also implemented to avoid robot's collisions with the environment and moving agents. Results The entire system was tested in a population of seven volunteers: three elder, two below-elbow amputees and two young normally limbed patients. The experiments were performed within a closed low dynamic environment. Subjects took an average time of 35 minutes to navigate the environment and to learn how to use the MCI. The SLAM results have shown a consistent reconstruction of the environment. The obtained map was stored inside the Muscle-Computer Interface. Conclusions The integration of a highly demanding processing algorithm (SLAM) with a MCI and the communication between both in real time have shown to be consistent and successful. The metric map generated by the mobile robot would allow possible future autonomous navigation without direct control of the user, whose function could be relegated to choose robot destinations. Also, the mobile robot shares the same kinematic model of a motorized wheelchair. This advantage can be exploited for wheelchair autonomous navigation. PMID:20163735
Autonomous Navigation Results from the Mars Exploration Rover (MER) Mission
NASA Technical Reports Server (NTRS)
Maimone, Mark; Johnson, Andrew; Cheng, Yang; Willson, Reg; Matthies, Larry H.
2004-01-01
In January 2004, the Mars Exploration Rover (MER) mission landed two rovers, Spirit and Opportunity, on the surface of Mars. Several autonomous navigation capabilities were employed in space for the first time in this mission. In the Entry, Descent, and Landing (EDL) phase, both landers used a vision system called the Descent Image Motion Estimation System (DIMES) to estimate horizontal velocity during the last 2000 meters (m) of descent, by tracking features on the ground with a downlooking camera, in order to control retro-rocket firing to reduce horizontal velocity before impact. During surface operations, the rovers navigate autonomously using stereo vision for local terrain mapping and a local, reactive planning algorithm called Grid-based Estimation of Surface Traversability Applied to Local Terrain (GESTALT) for obstacle avoidance. In areas of high slip, stereo vision-based visual odometry has been used to estimate rover motion. As of mid-June, Spirit had traversed 3405 m, of which 1253 m were done autonomously; Opportunity had traversed 1264 m, of which 224 m were autonomous. These results have contributed substantially to the success of the mission and paved the way for increased levels of autonomy in future missions.
New Vectorial Propulsion System and Trajectory Control Designs for Improved AUV Mission Autonomy
Gonzalez, Julian; Galarza, Cesar; Aguzzi, Jacopo; del Rio, Joaquin
2018-01-01
Autonomous Underwater Vehicles (AUV) are proving to be a promising platform design for multidisciplinary autonomous operability with a wide range of applications in marine ecology and geoscience. Here, two novel contributions towards increasing the autonomous navigation capability of a new AUV prototype (the Guanay II) as a mix between a propelled vehicle and a glider are presented. Firstly, a vectorial propulsion system has been designed to provide full vehicle maneuverability in both horizontal and vertical planes. Furthermore, two controllers have been designed, based on fuzzy controls, to provide the vehicle with autonomous navigation capabilities. Due to the decoupled system property, the controllers in the horizontal plane have been designed separately from the vertical plane. This class of non-linear controllers has been used to interpret linguistic laws into different zones of functionality. This method provided good performance, acting as an interpolation between different rules or linear controls. Both improvements have been validated through simulations and field tests, displaying good performance results. Finally, the conclusion of this work is that the Guanay II AUV has a solid controller to perform autonomous navigation and carry out vertical immersions. PMID:29673224
Autonomous RPRV Navigation, Guidance and Control
NASA Technical Reports Server (NTRS)
Johnston, Donald E.; Myers, Thomas T.; Zellner, John W.
1983-01-01
Dryden Flight Research Center has the responsibility for flight testing of advanced remotely piloted research vehicles (RPRV) to explore highly maneuverable aircraft technology and to test advanced structural concepts and related aeronautical technologies that can yield important research results with significant cost benefits. The primary purpose is to provide the preliminary design of an upgraded automatic approach and landing control system and flight director display to improve landing performance and reduce pilot workload. A secondary purpose is to determine the feasibility of an onboard autonomous navigation, orbit, and landing capability for safe vehicle recovery in the event of loss of telemetry uplink communication with the vehicles. The current RPRV approach and landing method, the proposed automatic and manual approach and autoland system, and an autonomous navigation, orbit, and landing system concept which is based on existing operational technology are described.
ALHAT COBALT: CoOperative Blending of Autonomous Landing Technology
NASA Technical Reports Server (NTRS)
Carson, John M.
2015-01-01
The COBALT project is a flight demonstration of two NASA ALHAT (Autonomous precision Landing and Hazard Avoidance Technology) capabilities that are key for future robotic or human landing GN&C (Guidance, Navigation and Control) systems. The COBALT payload integrates the Navigation Doppler Lidar (NDL) for ultraprecise velocity and range measurements with the Lander Vision System (LVS) for Terrain Relative Navigation (TRN) position estimates. Terrestrial flight tests of the COBALT payload in an open-loop and closed-loop GN&C configuration will be conducted onboard a commercial, rocket-propulsive Vertical Test Bed (VTB) at a test range in Mojave, CA.
A Single RF Emitter-Based Indoor Navigation Method for Autonomous Service Robots.
Sherwin, Tyrone; Easte, Mikala; Chen, Andrew Tzer-Yeu; Wang, Kevin I-Kai; Dai, Wenbin
2018-02-14
Location-aware services are one of the key elements of modern intelligent applications. Numerous real-world applications such as factory automation, indoor delivery, and even search and rescue scenarios require autonomous robots to have the ability to navigate in an unknown environment and reach mobile targets with minimal or no prior infrastructure deployment. This research investigates and proposes a novel approach of dynamic target localisation using a single RF emitter, which will be used as the basis of allowing autonomous robots to navigate towards and reach a target. Through the use of multiple directional antennae, Received Signal Strength (RSS) is compared to determine the most probable direction of the targeted emitter, which is combined with the distance estimates to improve the localisation performance. The accuracy of the position estimate is further improved using a particle filter to mitigate the fluctuating nature of real-time RSS data. Based on the direction information, a motion control algorithm is proposed, using Simultaneous Localisation and Mapping (SLAM) and A* path planning to enable navigation through unknown complex environments. A number of navigation scenarios were developed in the context of factory automation applications to demonstrate and evaluate the functionality and performance of the proposed system.
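Two of the ingredients described above are sketched below under assumed parameters: a log-distance path-loss model that converts RSS into a coarse range estimate, and direction selection as the directional antenna reporting the strongest RSS. The path-loss exponent, reference power and antenna headings are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal sketch of RSS-based direction and range estimation: (1) invert a
# log-distance path-loss model to get a coarse range, and (2) pick the most
# probable emitter direction as the directional antenna with the strongest RSS.
# Path-loss exponent, reference power and antenna headings are assumptions.

P0 = -40.0     # dBm at reference distance d0 = 1 m (assumed)
N_EXP = 2.2    # path-loss exponent (assumed, environment dependent)

def rss_to_distance(rss_dbm):
    """Invert the log-distance path-loss model: RSS = P0 - 10 n log10(d / d0)."""
    return 10.0 ** ((P0 - rss_dbm) / (10.0 * N_EXP))

antenna_headings_deg = np.array([0.0, 90.0, 180.0, 270.0])
rss_per_antenna = np.array([-63.0, -55.0, -71.0, -68.0])   # dBm, one per antenna

direction = antenna_headings_deg[np.argmax(rss_per_antenna)]
distance = rss_to_distance(np.max(rss_per_antenna))
print(direction, distance)   # coarse bearing (deg) and range (m) to the emitter
```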
A Single RF Emitter-Based Indoor Navigation Method for Autonomous Service Robots
Sherwin, Tyrone; Easte, Mikala; Wang, Kevin I-Kai; Dai, Wenbin
2018-01-01
Location-aware services are one of the key elements of modern intelligent applications. Numerous real-world applications such as factory automation, indoor delivery, and even search and rescue scenarios require autonomous robots to have the ability to navigate in an unknown environment and reach mobile targets with minimal or no prior infrastructure deployment. This research investigates and proposes a novel approach of dynamic target localisation using a single RF emitter, which will be used as the basis of allowing autonomous robots to navigate towards and reach a target. Through the use of multiple directional antennae, Received Signal Strength (RSS) is compared to determine the most probable direction of the targeted emitter, which is combined with the distance estimates to improve the localisation performance. The accuracy of the position estimate is further improved using a particle filter to mitigate the fluctuating nature of real-time RSS data. Based on the direction information, a motion control algorithm is proposed, using Simultaneous Localisation and Mapping (SLAM) and A* path planning to enable navigation through unknown complex environments. A number of navigation scenarios were developed in the context of factory automation applications to demonstrate and evaluate the functionality and performance of the proposed system. PMID:29443906
Development of Navigation Doppler Lidar for Future Landing Mission
NASA Technical Reports Server (NTRS)
Amzajerdian, Farzin; Hines, Glenn D.; Petway, Larry B.; Barnes, Bruce W.; Pierrottet, Diego F.; Carson, John M., III
2016-01-01
A coherent Navigation Doppler Lidar (NDL) sensor has been developed under the Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) project to support future NASA missions to planetary bodies. This lidar sensor provides accurate surface-relative altitude and vector velocity data during the descent phase that can be used by an autonomous Guidance, Navigation, and Control (GN&C) system to precisely navigate the vehicle from a few kilometers above the ground to a designated location and execute a controlled soft touchdown. The operation and performance of the NDL was demonstrated through closed-loop flights onboard the rocket-propelled Morpheus vehicle in 2014. In Morpheus flights, conducted at the NASA Kennedy Space Center, the NDL data was used by an autonomous GN&C system to navigate and land the vehicle precisely at the selected location surrounded by hazardous rocks and craters. Since then, development efforts for the NDL have shifted toward enhancing performance, optimizing design, and addressing spaceflight size and mass constraints and environmental and reliability requirements. The next generation NDL, with expanded operational envelope and significantly reduced size, will be demonstrated in 2017 through a new flight test campaign onboard a commercial rocket-propelled test vehicle.
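A Doppler lidar of this kind measures a frequency shift on each beam; the sketch below converts per-beam Doppler shifts into line-of-sight speeds (v_los = λ f_d / 2 for a monostatic lidar) and solves three non-coplanar beams for a full velocity vector. The wavelength, beam geometry and shift values are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of how a multi-beam Doppler lidar yields a velocity vector:
# each beam's Doppler shift gives a line-of-sight speed v_los = lambda * f_d / 2,
# and three non-coplanar beam directions let you solve for the full vector.
# Wavelength, beam geometry and shifts below are illustrative assumptions.

wavelength = 1.55e-6    # m (typical fiber-laser band, assumed)

# Unit vectors of three canted beams in the vehicle frame (assumed geometry)
beams = np.array([[ 0.350,  0.000, -0.937],
                  [-0.175,  0.303, -0.937],
                  [-0.175, -0.303, -0.937]])
beams /= np.linalg.norm(beams, axis=1, keepdims=True)

doppler_shifts = np.array([1.2e6, 0.9e6, 1.0e6])        # Hz, measured per beam
v_los = wavelength * doppler_shifts / 2.0               # m/s along each beam

velocity = np.linalg.solve(beams, v_los)                # vehicle velocity vector
print(v_los, velocity)
```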
PointCom: semi-autonomous UGV control with intuitive interface
NASA Astrophysics Data System (ADS)
Rohde, Mitchell M.; Perlin, Victor E.; Iagnemma, Karl D.; Lupa, Robert M.; Rohde, Steven M.; Overholt, James; Fiorani, Graham
2008-04-01
Unmanned ground vehicles (UGVs) will play an important role in the nation's next-generation ground force. Advances in sensing, control, and computing have enabled a new generation of technologies that bridge the gap between manual UGV teleoperation and full autonomy. In this paper, we present current research on a unique command and control system for UGVs named PointCom (Point-and-Go Command). PointCom is a semi-autonomous command system for one or multiple UGVs. The system, when complete, will be easy to operate and will enable significant reduction in operator workload by utilizing an intuitive image-based control framework for UGV navigation and allowing a single operator to command multiple UGVs. The project leverages new image processing algorithms for monocular visual servoing and odometry to yield a unique, high-performance fused navigation system. Human Computer Interface (HCI) techniques from the entertainment software industry are being used to develop video-game style interfaces that require little training and build upon the navigation capabilities. By combining an advanced navigation system with an intuitive interface, a semi-autonomous control and navigation system is being created that is robust, user friendly, and less burdensome than many current generation systems.
Flight Testing ALHAT Precision Landing Technologies Integrated Onboard the Morpheus Rocket Vehicle
NASA Technical Reports Server (NTRS)
Carson, John M. III; Robertson, Edward A.; Trawny, Nikolas; Amzajerdian, Farzin
2015-01-01
A suite of prototype sensors, software, and avionics developed within the NASA Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) project was terrestrially demonstrated onboard the NASA Morpheus rocket-propelled Vertical Testbed (VTB) in 2014. The sensors included a LIDAR-based Hazard Detection System (HDS), a Navigation Doppler LIDAR (NDL) velocimeter, and a long-range Laser Altimeter (LAlt) that enable autonomous and safe precision landing of robotic or human vehicles on solid solar system bodies under varying terrain lighting conditions. The flight test campaign with the Morpheus vehicle involved a detailed integration and functional verification process, followed by tether testing and six successful free flights, including one night flight. The ALHAT sensor measurements were integrated into a common navigation solution through a specialized ALHAT Navigation filter that was employed in closed-loop flight testing within the Morpheus Guidance, Navigation and Control (GN&C) subsystem. Flight testing on Morpheus utilized ALHAT for safe landing site identification and ranking, followed by precise surface-relative navigation to the selected landing site. The successful autonomous, closed-loop flight demonstrations of the prototype ALHAT system have laid the foundation for the infusion of safe, precision landing capabilities into future planetary exploration missions.
Vision Based Navigation for Autonomous Cooperative Docking of CubeSats
NASA Astrophysics Data System (ADS)
Pirat, Camille; Ankersen, Finn; Walker, Roger; Gass, Volker
2018-05-01
A realistic rendezvous and docking navigation solution applicable to CubeSats is investigated. The scalability analysis of the ESA Autonomous Transfer Vehicle Guidance, Navigation & Control (GNC) performance and of the Russian docking system shows that the docking of two CubeSats would require a lateral control performance of the order of 1 cm. Line of sight constraints and multipath effects affecting Global Navigation Satellite System (GNSS) measurements in close proximity prevent the use of this sensor for the final approach. This consideration and the high control accuracy requirement led to the use of vision sensors for the final 10 m of the rendezvous and docking sequence. A single monocular camera on the chaser satellite and various sets of Light-Emitting Diodes (LEDs) on the target vehicle ensure the observability of the system throughout the approach trajectory. The simple and novel formulation of the measurement equations allows rotations to be unambiguously differentiated from translations between the target and chaser docking ports and allows a navigation performance better than 1 mm at docking. Furthermore, the non-linear measurement equations can be solved in order to provide an analytic navigation solution. This solution can be used to monitor the navigation filter solution and ensure its stability, adding an extra layer of robustness for autonomous rendezvous and docking. The navigation filter initialization is addressed in detail. The proposed method is able to differentiate LED signals from Sun reflections, as demonstrated by experimental data. The navigation filter uses comprehensive linearised coupled rotation/translation dynamics describing the chaser-to-target docking port motion. The handover between GNSS and vision sensor measurements is assessed. The performance of the navigation function along the approach trajectory is discussed.
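The paper derives its own analytic measurement equations for the LED pattern; purely as a point of reference, a standard monocular pose estimate from known LED positions can be sketched with an off-the-shelf PnP solver. The LED layout, camera matrix, and function names below are illustrative assumptions, not the authors' formulation.

import numpy as np
import cv2

# Assumed LED positions on the target docking port, in the target frame (metres).
LED_TARGET = np.array([[ 0.05,  0.05, 0.0],
                       [-0.05,  0.05, 0.0],
                       [-0.05, -0.05, 0.0],
                       [ 0.05, -0.05, 0.0]], dtype=np.float64)

def estimate_pose(led_pixels, camera_matrix, dist_coeffs=None):
    # led_pixels: Nx2 detected LED centroids (pixels), ordered to match LED_TARGET.
    ok, rvec, tvec = cv2.solvePnP(LED_TARGET, np.asarray(led_pixels, np.float64),
                                  camera_matrix, dist_coeffs)
    R, _ = cv2.Rodrigues(rvec)  # rotation of the target frame expressed in the camera frame
    return ok, R, tvec          # tvec: target docking port position in the camera frame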
Bio-inspired multi-mode optic flow sensors for micro air vehicles
NASA Astrophysics Data System (ADS)
Park, Seokjun; Choi, Jaehyuk; Cho, Jihyun; Yoon, Euisik
2013-06-01
Monitoring wide-field surrounding information is essential for vision-based autonomous navigation in micro air vehicles (MAV). Our image-cube (iCube) module, which consists of multiple sensors facing different angles in 3-D space, can be applied to wide-field-of-view optic flow estimation (μ-Compound eyes) and to attitude control (μ-Ocelli) in the Micro Autonomous Systems and Technology (MAST) platforms. In this paper, we report an analog/digital (A/D) mixed-mode optic-flow sensor, which generates both optic flows and normal images in different modes for μ-Compound eyes and μ-Ocelli applications. The sensor employs a time-stamp based optic flow algorithm, modified from the conventional EMD (Elementary Motion Detector) algorithm, to give an optimum partitioning of hardware blocks in the analog and digital domains as well as an adequate allocation of pixel-level, column-parallel, and chip-level signal processing. Temporal filtering, which would require large hardware resources if implemented in the digital domain, remains in a pixel-level analog processing unit. The rest of the blocks, including feature detection and time-stamp latching, are implemented using digital circuits in a column-parallel processing unit. Finally, time-stamp information is decoded into velocity using look-up tables, multiplications, and simple subtraction circuits in a chip-level processing unit, thus significantly reducing core digital processing power consumption. In the normal image mode, the sensor generates 8-b digital images using single-slope ADCs in the column unit. In the optic flow mode, the sensor estimates 8-b 1-D optic flows from the integrated mixed-mode algorithm core and 2-D optic flows with external time-stamp processing, respectively.
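The time-stamp principle can be summarised in a few lines: a detected feature latches a time stamp at one pixel, and the velocity follows from pixel pitch divided by the elapsed time when the same feature reaches the neighbouring pixel. The toy Python sketch below is only conceptual; the chip performs this in mixed analog/digital hardware and the pixel pitch is an assumed value.

PIXEL_PITCH_UM = 20.0  # assumed pixel spacing in micrometres

def optic_flow_1d(t_stamp_first_pixel, t_stamp_next_pixel):
    # Image-plane speed (um/s) between two adjacent pixels; sign gives direction.
    dt = t_stamp_next_pixel - t_stamp_first_pixel
    if dt == 0:
        return 0.0
    return PIXEL_PITCH_UM / dt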
2013-09-30
underwater acoustic communication technologies for autonomous distributed underwater networks, through innovative signal processing, coding, and navigation...in real environments, an offshore testbed has been developed to conduct field experiments. The testbed consists of four nodes and has been deployed...Leadership by the Connecticut Technology Council. Dr. Zhaohui Wang joined the faculty of the Department of Electrical and Computer Engineering at
Ultrafast optical ranging using microresonator soliton frequency combs
NASA Astrophysics Data System (ADS)
Trocha, P.; Karpov, M.; Ganin, D.; Pfeiffer, M. H. P.; Kordts, A.; Wolf, S.; Krockenberger, J.; Marin-Palomo, P.; Weimann, C.; Randel, S.; Freude, W.; Kippenberg, T. J.; Koos, C.
2018-02-01
Light detection and ranging is widely used in science and industry. Over the past decade, optical frequency combs were shown to offer advantages in optical ranging, enabling fast distance acquisition with high accuracy. Driven by emerging high-volume applications such as industrial sensing, drone navigation, or autonomous driving, there is now a growing demand for compact ranging systems. Here, we show that soliton Kerr comb generation in integrated silicon nitride microresonators provides a route to high-performance chip-scale ranging systems. We demonstrate dual-comb distance measurements with Allan deviations down to 12 nanometers at averaging times of 13 microseconds along with ultrafast ranging at acquisition rates of 100 megahertz, allowing for in-flight sampling of gun projectiles moving at 150 meters per second. Combining integrated soliton-comb ranging systems with chip-scale nanophotonic phased arrays could enable compact ultrafast ranging systems for emerging mass applications.
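The Allan deviation quoted above is a standard two-sample statistic over averaged blocks of distance samples. A minimal sketch, assuming a uniformly sampled distance stream (the sample period and block size are placeholders, not the experiment's values):

import numpy as np

def allan_deviation(distance_samples, sample_period_s, block_size):
    # Average the stream in non-overlapping blocks, then take the two-sample deviation.
    n_blocks = len(distance_samples) // block_size
    blocks = np.mean(np.reshape(distance_samples[:n_blocks * block_size],
                                (n_blocks, block_size)), axis=1)
    adev = np.sqrt(0.5 * np.mean(np.diff(blocks) ** 2))
    tau = block_size * sample_period_s
    return tau, adev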
Plenoptic Imager for Automated Surface Navigation
NASA Technical Reports Server (NTRS)
Zollar, Byron; Milder, Andrew; Milder, Andrew; Mayo, Michael
2010-01-01
An electro-optical imaging device is capable of autonomously determining the range to objects in a scene without the use of active emitters or multiple apertures. The novel, automated, low-power imaging system is based on a plenoptic camera design that was constructed as a breadboard system. Nanohmics proved the feasibility of the concept by designing an optical system for a prototype plenoptic camera, developing simulated plenoptic images and range-calculation algorithms, constructing a breadboard prototype plenoptic camera, and processing images (including range calculations) from the prototype system. The breadboard demonstration included an optical subsystem comprising a main aperture lens, a mechanical structure that holds an array of micro lenses at the focal distance from the main lens, and a structure that mates a CMOS imaging sensor at the correct distance from the micro lenses. The demonstrator also featured embedded electronics for camera readout, and a post-processor executing image-processing algorithms to provide ranging information.
Biologically inspired collision avoidance system for unmanned vehicles
NASA Astrophysics Data System (ADS)
Ortiz, Fernando E.; Graham, Brett; Spagnoli, Kyle; Kelmelis, Eric J.
2009-05-01
In this project, we collaborate with researchers in the neuroscience department at the University of Delaware to develop a Field Programmable Gate Array (FPGA)-based embedded computer, inspired by the brains of small vertebrates (fish). The mechanisms of object detection and avoidance in fish have been extensively studied by our Delaware collaborators. The midbrain optic tectum is a biological multimodal navigation controller capable of processing input from all senses that convey spatial information, including vision, audition, touch, and the lateral line (water current sensing in fish). Unfortunately, computational complexity makes these models too slow for use in real-time applications. These simulations are run offline on state-of-the-art desktop computers, presenting a gap between the application and the target platform: a low-power embedded device. EM Photonics has expertise in developing high-performance computers based on commodity platforms such as graphics cards (GPUs) and FPGAs. FPGAs offer (1) high computational power, low power consumption, and small footprint (in line with typical autonomous vehicle constraints), and (2) the ability to implement massively parallel computational architectures, which can be leveraged to closely emulate biological systems. Combining UD's brain modeling algorithms and the power of FPGAs, this computer enables autonomous navigation in complex environments, and further types of onboard neural processing in future applications.
Intelligent agents: adaptation of autonomous bimodal microsystems
NASA Astrophysics Data System (ADS)
Smith, Patrice; Terry, Theodore B.
2014-03-01
Autonomous bimodal microsystems exhibiting survivability behaviors and characteristics are able to adapt dynamically in any given environment. Equipped with a background-blending exoskeleton, such a system has the capability to stealthily detect and observe a self-chosen viewing area while exercising some measurable form of self-preservation by either flying or crawling away from a potential adversary. The robotic agent in this capacity activates a walk-fly algorithm, which uses a built-in multi-sensor processing and navigation subsystem or algorithm for visual guidance and best walk-fly path trajectory to evade capture or annihilation. The research detailed in this paper describes the theoretical walk-fly algorithm, which broadens the scope of spatial and temporal learning, locomotion, and navigational performance based on optical flow signals necessary for flight dynamics and walking stability. By observing a fly's travel and avoidance behaviors, and by understanding the reverse bioengineering research efforts of others, we were able to conceptualize an algorithm that works in conjunction with decision-making functions, sensory processing, and sensorimotor integration. Our findings suggest that this highly complex decentralized algorithm promotes in-flight or terrain-travel stability, which is highly suitable for non-aggressive micro platforms supporting search and rescue (SAR) and chemical and explosive detection (CED) purposes; a necessity in turbulent, non-violent structured or unstructured environments.
Open-Loop Flight Testing of COBALT Navigation and Sensor Technologies for Precise Soft Landing
NASA Technical Reports Server (NTRS)
Carson, John M., III; Restrepo, Caroline I.; Seubert, Carl R.; Amzajerdian, Farzin; Pierrottet, Diego F.; Collins, Steven M.; O'Neal, Travis V.; Stelling, Richard
2017-01-01
An open-loop flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) payload was conducted onboard the Masten Xodiac suborbital rocket testbed. The payload integrates two complementary sensor technologies that together provide a spacecraft with knowledge during planetary descent and landing to precisely navigate and softly touchdown in close proximity to targeted surface locations. The two technologies are the Navigation Doppler Lidar (NDL), for high-precision velocity and range measurements, and the Lander Vision System (LVS) for map-relative state estimates. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a very precise Terrain Relative Navigation (TRN) solution that is suitable for future, autonomous planetary landing systems that require precise and soft landing capabilities. During the open-loop flight campaign, the COBALT payload acquired measurements and generated a precise navigation solution, but the Xodiac vehicle planned and executed its maneuvers based on an independent, GPS-based navigation solution. This minimized the risk to the vehicle during the integration and testing of the new navigation sensing technologies within the COBALT payload.
Mixed-mode VLSI optic flow sensors for micro air vehicles
NASA Astrophysics Data System (ADS)
Barrows, Geoffrey Louis
We develop practical, compact optic flow sensors. To achieve the desired weight of 1-2 grams, mixed-mode and mixed-signal VLSI techniques are used to develop compact circuits that directly perform the computations necessary to measure optic flow. We discuss several implementations, including a version fully integrated in VLSI, and several "hybrid sensors" in which the front-end processing is performed with an analog chip and the back-end processing is performed with a microcontroller. We extensively discuss one-dimensional optic flow sensors based on the linear competitive feature tracker (LCFT) algorithm. Hardware implementations of this algorithm are shown to be able to measure visual motion at contrast levels on the order of several percent. We argue that the development of one-dimensional optic flow sensors is therefore reduced to a problem of engineering. We also introduce two related two-dimensional optic flow algorithms that are amenable to implementation in VLSI. This includes the planar competitive feature tracker (PCFT) algorithm and the trajectory method. These sensors are being developed to solve small-scale navigation problems in micro air vehicles, which are autonomous aircraft whose maximum dimension is on the order of 15 cm. We obtain a proof-of-principle of small-scale navigation by mounting a prototype sensor onto a toy glider and programming the sensor to control a rudder or an elevator to affect the glider's path during flight. We demonstrate the determination of altitude by measuring optic flow in the downward direction. We also demonstrate steering to avoid a collision with a wall, when the glider is tossed towards the wall at a shallow angle, by measuring the optic flow in the direction of the glider's left and right side.
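The altitude demonstration rests on a simple relation: for a downward-looking sensor on a vehicle translating horizontally at ground speed v over flat terrain, the optic flow rate is approximately omega = v / h, so altitude can be recovered as v / omega. A toy sketch with assumed numbers, not the thesis hardware:

def altitude_from_flow(ground_speed_mps, downward_flow_rad_per_s):
    # h = v / omega for purely translational motion over flat ground.
    if downward_flow_rad_per_s <= 0.0:
        raise ValueError("optic flow rate must be positive")
    return ground_speed_mps / downward_flow_rad_per_s

# Example: 10 m/s forward speed with 0.5 rad/s of downward flow implies roughly 20 m of altitude.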
Tracked robot controllers for climbing obstacles autonomously
NASA Astrophysics Data System (ADS)
Vincent, Isabelle
2009-05-01
Research in mobile robot navigation has demonstrated some success in navigating flat indoor environments while avoiding obstacles. However, the challenge of analyzing complex environments to climb obstacles autonomously has had very little success due to the complexity of the task. Unmanned ground vehicles currently exhibit simple autonomous behaviours compared to the human ability to move in the world. This paper presents the control algorithms designed for a tracked mobile robot to autonomously climb obstacles by varying its tracks configuration. Two control algorithms are proposed to solve the autonomous locomotion problem for climbing obstacles. First, a reactive controller evaluates the appropriate geometric configuration based on terrain and vehicle geometric considerations. Then, a reinforcement learning algorithm finds alternative solutions when the reactive controller gets stuck while climbing an obstacle. The methodology combines reactivity with learning. The controllers have been demonstrated in box and stair climbing simulations. The experiments illustrate the effectiveness of the proposed approach for crossing obstacles.
Survey of computer vision technology for UAV navigation
NASA Astrophysics Data System (ADS)
Xie, Bo; Fan, Xiang; Li, Sijian
2017-11-01
Navigation based on computer vision technology, which has the characteristics of strong independence and high precision and is not susceptible to electrical interference, has attracted more and more attention in the field of UAV navigation research. Early navigation projects based on computer vision technology were mainly applied to autonomous ground robots. In recent years, visual navigation systems have been widely applied to unmanned aircraft, deep space probes and underwater robots, which has further stimulated research on integrated navigation algorithms based on computer vision. In China, with the development of many types of UAV and the start of the third phase of the lunar exploration program, there has been significant progress in the study of visual navigation. The paper reviews the development of vision-based navigation in the field of UAV research and concludes that visual navigation is mainly applied to three aspects. (1) Acquisition of UAV navigation parameters: attitude, position and velocity information can be obtained from the relationship between sensor images and the carrier's attitude, between instant matching images and reference images, and between the carrier's velocity and features of sequential images. (2) Autonomous obstacle avoidance: among the many ways to achieve obstacle avoidance in UAV navigation, methods based on computer vision, including feature matching, template matching and image-frame methods, are mainly introduced. (3) Target tracking and positioning: using the obtained images, UAV position is calculated with the optical flow method, the MeanShift algorithm, the CamShift algorithm, Kalman filtering and particle filter algorithms. The paper also expounds three kinds of mainstream visual systems. (1) High-speed visual systems, which use a parallel structure so that image detection and processing are carried out at high speed; these are applied to rapid-response systems. (2) Distributed-network visual systems, with several discrete image acquisition sensors in different locations that transmit image data to a node processor to increase the sampling rate. (3) Visual systems combined with observers, which pair image sensors with external observers to compensate for the limitations of the visual equipment. To some degree, these systems overcome the shortcomings of early visual systems, including low frequency, low processing efficiency and strong noise. Finally, the difficulties of vision-based navigation in practical applications are briefly discussed: (1) due to the huge workload of image operations, the real-time performance of the system is poor; (2) due to large environmental impacts, the anti-interference ability of the system is poor; (3) because such systems work only in particular environments, their adaptability is poor.
Asteroid approach covariance analysis for the Clementine mission
NASA Technical Reports Server (NTRS)
Ionasescu, Rodica; Sonnabend, David
1993-01-01
The Clementine mission is designed to test Strategic Defense Initiative Organization (SDIO) technology, the Brilliant Pebbles and Brilliant Eyes sensors, by mapping the lunar surface and flying by the asteroid Geographos. The capability of two of the instruments available on board the spacecraft, the lidar (laser radar) and the UV/Visible camera, is used in the covariance analysis to obtain the spacecraft delivery uncertainties at the asteroid. These uncertainties are due primarily to asteroid ephemeris uncertainties. Onboard optical navigation reduces the uncertainty in the knowledge of the spacecraft position in the direction perpendicular to the incoming asymptote to a one-sigma value of under 1 km, at the closest approach distance of 100 km. The uncertainty in the knowledge of the encounter time is about 0.1 seconds for a flyby velocity of 10.85 km/s. The magnitude of these uncertainties is due largely to Center Finding Errors (CFE). These systematic errors represent the accuracy expected in locating the center of the asteroid in the optical navigation images, in the absence of a topographic model for the asteroid. The direction of the incoming asymptote cannot be estimated accurately until minutes before the asteroid flyby, and correcting for it would require autonomous navigation. Orbit determination errors dominate over maneuver execution errors, and the final delivery accuracy attained is basically the orbit determination uncertainty before the final maneuver.
Nature-Inspired Acoustic Sensor Projects
1999-08-24
The pager motors are worn on the wrists. Autonomous vehicle navigation: Yago – Yale Autonomous Go-Cart. Yago is used...a proximity sensor determined the presence of close-by objects missed by the sonars. Yago operated autonomously by avoiding obstacles. Problems being
Improved obstacle avoidance and navigation for an autonomous ground vehicle
NASA Astrophysics Data System (ADS)
Giri, Binod; Cho, Hyunsu; Williams, Benjamin C.; Tann, Hokchhay; Shakya, Bicky; Bharam, Vishal; Ahlgren, David J.
2015-01-01
This paper presents improvements made to the intelligence algorithms employed on Q, an autonomous ground vehicle, for the 2014 Intelligent Ground Vehicle Competition (IGVC). In 2012, the IGVC committee combined the formerly separate autonomous and navigation challenges into a single AUT-NAV challenge. In this new challenge, the vehicle is required to navigate through a grassy obstacle course and stay within the course boundaries (a lane of two white painted lines) that guide it toward a given GPS waypoint. Once the vehicle reaches this waypoint, it enters an open course where it is required to navigate to another GPS waypoint while avoiding obstacles. After reaching the final waypoint, the vehicle is required to traverse another obstacle course before completing the run. Q uses a modular parallel software architecture in which image processing, navigation, and sensor control algorithms run concurrently. A tuned navigation algorithm allows Q to smoothly maneuver through obstacle fields. For the 2014 competition, most revisions occurred in the vision system, which detects white lines and informs the navigation component. Barrel obstacles of various colors presented a new challenge for image processing: the previous color plane extraction algorithm would not suffice. To overcome this difficulty, laser range sensor data were overlaid on visual data. Q also participates in the Joint Architecture for Unmanned Systems (JAUS) challenge at IGVC. For 2014, significant updates were implemented: the JAUS component accepted a greater variety of messages and showed better compliance with the JAUS technical standard. With these improvements, Q secured second place in the JAUS competition.
Fasano, Giancarmine; Accardo, Domenico; Moccia, Antonio; Rispoli, Attilio
2010-01-01
This paper presents an innovative method for estimating the attitude of airborne electro-optical cameras with respect to the onboard autonomous navigation unit. The procedure is based on the use of attitude measurements under static conditions taken by an inertial unit and carrier-phase differential Global Positioning System to obtain accurate camera position estimates in the aircraft body reference frame, while image analysis allows line-of-sight unit vectors in the camera based reference frame to be computed. The method has been applied to the alignment of the visible and infrared cameras installed onboard the experimental aircraft of the Italian Aerospace Research Center and adopted for in-flight obstacle detection and collision avoidance. Results show an angular uncertainty on the order of 0.1° (rms). PMID:22315559
Development of autonomous grasping and navigating robot
NASA Astrophysics Data System (ADS)
Kudoh, Hiroyuki; Fujimoto, Keisuke; Nakayama, Yasuichi
2015-01-01
The ability to find and grasp target items in an unknown environment is important for working robots. We developed an autonomous navigating and grasping robot. The operations are locating a requested item, moving to where the item is placed, finding the item on a shelf or table, and picking the item up from the shelf or the table. To achieve these operations, we designed the robot with three functions: an autonomous navigating function that generates a map and a route in an unknown environment, an item position recognizing function, and a grasping function. We tested this robot in an unknown environment. It achieved a series of operations: moving to a destination, recognizing the positions of items on a shelf, picking up an item, placing it on a cart with its hand, and returning to the starting location. The results of this experiment demonstrate the applicability of such robots to reducing the human workforce required for these tasks.
NASA Technical Reports Server (NTRS)
Schenker, Paul S. (Editor)
1991-01-01
The volume on data fusion from multiple sources discusses fusing multiple views, temporal analysis and 3D motion interpretation, sensor fusion and eye-to-hand coordination, and integration in human shape perception. Attention is given to surface reconstruction, statistical methods in sensor fusion, fusing sensor data with environmental knowledge, computational models for sensor fusion, and evaluation and selection of sensor fusion techniques. Topics addressed include the structure of a scene from two and three projections, optical flow techniques for moving target detection, tactical sensor-based exploration in a robotic environment, and the fusion of human and machine skills for remote robotic operations. Also discussed are K-nearest-neighbor concepts for sensor fusion, surface reconstruction with discontinuities, a sensor-knowledge-command fusion paradigm for man-machine systems, coordinating sensing and local navigation, and terrain map matching using multisensing techniques for applications to autonomous vehicle navigation.
Bioinspired polarization navigation sensor for autonomous munitions systems
NASA Astrophysics Data System (ADS)
Giakos, G. C.; Quang, T.; Farrahi, T.; Deshpande, A.; Narayan, C.; Shrestha, S.; Li, Y.; Agarwal, M.
2013-05-01
Small unmanned aerial vehicles (SUAVs), micro air vehicles (MAVs), Automated Target Recognition (ATR), and munitions guidance require extreme operational agility and robustness, which can be partially offset by efficient bioinspired imaging sensor designs capable of providing enhanced guidance, navigation and control (GNC) capabilities. Bioinspired imaging technology can prove useful either for long-distance surveillance of targets in a cluttered environment, or at close distances limited by space surroundings and obstructions. The purpose of this study is to explore the phenomenology of image formation by different insect eye architectures, which would directly benefit the areas of defense and security, in the following four distinct areas: a) fabrication of the bioinspired sensor, b) optical architecture, c) topology, and d) artificial intelligence. The outcome of this study indicates that bioinspired imaging can impact the areas of defense and security significantly through dedicated designs fitting different combat scenarios and applications.
Crew-Aided Autonomous Navigation
NASA Technical Reports Server (NTRS)
Holt, Greg N.
2015-01-01
A sextant provides manual capability to perform star/planet-limb sightings and offers a cheap, simple, robust backup navigation source for exploration missions, independent from the ground. Sextant sightings from spacecraft were first exercised in Gemini and flew as the lost-communication backup for all Apollo missions. This study characterized the error sources of navigation-grade sextants to assess the feasibility of taking star and planetary limb sightings from inside a spacecraft. A series of similar studies was performed in the early/mid-1960s in preparation for the Apollo missions. This study modernized and updated those findings in addition to showing feasibility using Linear Covariance analysis techniques. The human eyeball is a remarkable piece of optical equipment and provides many advantages over camera-based systems, including dynamic range and detail resolution. This technique utilizes those advantages and provides important autonomy to the crew in the event of lost communication with the ground. It can also provide confidence and verification of low-TRL automated onboard systems. The technique is extremely flexible and is not dependent on any particular vehicle type. The investigation involved procuring navigation-grade sextants and characterizing their performance under a variety of conditions encountered in exploration missions. The JSC optical sensor lab and Orion mockup were the primary testing locations. For the accuracy assessment, a group of test subjects took sextant readings on calibrated targets while instrument/operator precision was measured. The study demonstrated repeatability of star/planet-limb sightings with bias and standard deviation around 10 arcseconds, then used high-fidelity simulations to verify that those accuracy levels met the needs for targeting mid-course maneuvers in preparation for Earth reentry.
USAF Development Of Optical Correlation Missile Guidance
NASA Astrophysics Data System (ADS)
Kaehr, Ronald; Spector, Marvin
1980-12-01
In 1965, the Advanced Development Program (ADP)-679A of the Avionics Laboratory initiated development of guidance systems for stand-off tactical missiles. Employing project engineering support from the Aeronautical Systems Division, WPAFB, the Avionics Laboratory funded multiple terminal guidance concepts and related midcourse navigation technology. Optical correlation techniques which utilize prestored reference information for autonomous target acquisition offered the best near-term opportunity for meeting mission goals. From among the systems studied and flight tested, Aimpoint* optical area guidance provided the best and most consistent performance. Funded development by the Air Force ended in 1974 with a MK-84 guided bomb drop test demonstration at White Sands Missile Range and the subsequent transfer of the tactical missile guidance development charter to the Air Force Armament Laboratory, Eglin AFB. A historical review of optical correlation development within the Avionics Laboratory is presented. Evolution of the Aimpoint system is specifically addressed. Finally, a brief discussion of trends in scene matching technology is presented.
Vector Pursuit Path Tracking for Autonomous Ground Vehicles
2000-08-01
...other geometric path-tracking techniques. An autonomous vehicle is one that is capable of automatic navigation. It is...Joint Architecture for Unmanned Ground Vehicles (JAUGS) working group meeting held at the University of Florida.
Bioinspired engineering of exploration systems for NASA and DoD
NASA Technical Reports Server (NTRS)
Thakoor, Sarita; Chahl, Javaan; Srinivasan, M. V.; Young, L.; Werblin, Frank; Hine, Butler; Zornetzer, Steven
2002-01-01
A new approach called bioinspired engineering of exploration systems (BEES) and its value for solving pressing NASA and DoD needs are described. Insects (for example honeybees and dragonflies) cope remarkably well with their world, despite possessing a brain containing less than 0.01% as many neurons as the human brain. Although most insects have immobile eyes with fixed focus optics and lack stereo vision, they use a number of ingenious, computationally simple strategies for perceiving their world in three dimensions and navigating successfully within it. We are distilling selected insect-inspired strategies to obtain novel solutions for navigation, hazard avoidance, altitude hold, stable flight, terrain following, and gentle deployment of payload. Such functionality provides potential solutions for future autonomous robotic space and planetary explorers. A BEES approach to developing lightweight low-power autonomous flight systems should be useful for flight control of such biomorphic flyers for both NASA and DoD needs. Recent biological studies of mammalian retinas confirm that representations of multiple features of the visual world are systematically parsed and processed in parallel. Features are mapped to a stack of cellular strata within the retina. Each of these representations can be efficiently modeled in semiconductor cellular nonlinear network (CNN) chips. We describe recent breakthroughs in exploring the feasibility of the unique blending of insect strategies of navigation with mammalian visual search, pattern recognition, and image understanding into hybrid biomorphic flyers for future planetary and terrestrial applications. We describe a few future mission scenarios for Mars exploration, uniquely enabled by these newly developed biomorphic flyers.
Autonomous navigation using lunar beacons
NASA Technical Reports Server (NTRS)
Khatib, A. R.; Ellis, J.; French, J.; Null, G.; Yunck, T.; Wu, S.
1983-01-01
The concept of using lunar beacon signal transmission for on-board navigation for earth satellites and near-earth spacecraft is described. The system would require powerful transmitters on the earth-side of the moon's surface and black box receivers with antennae and microprocessors placed on board spacecraft for autonomous navigation. Spacecraft navigation requires three position and three velocity elements to establish location coordinates. Two beacons could be soft-landed on the lunar surface at the limits of allowable separation and each would transmit a wide-beam signal with cones reaching GEO heights and be strong enough to be received by small antennae in near-earth orbit. The black box processor would perform on-board computation with one-way Doppler/range data and dynamical models. Alternatively, GEO satellites such as the GPS or TDRSS spacecraft can be used with interferometric techniques to provide decimeter-level accuracy for aircraft navigation.
Embedded Relative Navigation Sensor Fusion Algorithms for Autonomous Rendezvous and Docking Missions
NASA Technical Reports Server (NTRS)
DeKock, Brandon K.; Betts, Kevin M.; McDuffie, James H.; Dreas, Christine B.
2008-01-01
bd Systems (a subsidiary of SAIC) has developed a suite of embedded relative navigation sensor fusion algorithms to enable NASA autonomous rendezvous and docking (AR&D) missions. Translational and rotational Extended Kalman Filters (EKFs) were developed for integrating measurements based on the vehicles' orbital mechanics and high-fidelity sensor error models, and provide a solution with increased accuracy and robustness relative to any single relative navigation sensor. The filters were tested through stand-alone covariance analysis, closed-loop testing with a high-fidelity multi-body orbital simulation, and hardware-in-the-loop (HWIL) testing in the Marshall Space Flight Center (MSFC) Flight Robotics Laboratory (FRL).
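A generic extended Kalman filter measurement update of the kind used to fuse relative navigation sensors is sketched below; the matrices and measurement model are placeholders, not the bd Systems filter design.

import numpy as np

def ekf_update(x, P, z, h_func, H, R):
    # x, P: prior state and covariance; z: measurement vector;
    # h_func: nonlinear measurement model; H: its Jacobian at x; R: measurement noise covariance.
    y = z - h_func(x)                         # innovation
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_post = x + K @ y
    P_post = (np.eye(len(x)) - K @ H) @ P
    return x_post, P_post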
An Algorithm for Autonomous Formation Obstacle Avoidance
NASA Astrophysics Data System (ADS)
Cruz, Yunior I.
The level of human interaction with Unmanned Aerial Systems varies greatly from remotely piloted aircraft to fully autonomous systems. In the latter end of the spectrum, the challenge lies in designing effective algorithms to dictate the behavior of the autonomous agents. A swarm of autonomous Unmanned Aerial Vehicles requires collision avoidance and formation flight algorithms to negotiate environmental challenges it may encounter during the execution of its mission, which may include obstacles and chokepoints. In this work, a simple algorithm is developed to allow a formation of autonomous vehicles to perform point to point navigation while avoiding obstacles and navigating through chokepoints. Emphasis is placed on maintaining formation structures. Rather than breaking formation and individually navigating around the obstacle or through the chokepoint, vehicles are required to assemble into appropriately sized/shaped sub-formations, bifurcate around the obstacle or negotiate the chokepoint, and reassemble into the original formation at the far side of the obstruction. The algorithm receives vehicle and environmental properties as inputs and outputs trajectories for each vehicle from start to the desired ending location. Simulation results show that the algorithm safely routes all vehicles past the obstruction while adhering to the aforementioned requirements. The formation adapts and successfully negotiates the obstacles and chokepoints in its path while maintaining proper vehicle separation.
Navigation of robotic system using cricket motes
NASA Astrophysics Data System (ADS)
Patil, Yogendra J.; Baine, Nicholas A.; Rattan, Kuldip S.
2011-06-01
This paper presents a novel algorithm for self-mapping of the cricket motes that can be used for indoor navigation of autonomous robotic systems. The cricket system is a wireless sensor network that can provide an indoor localization service to its users via acoustic ranging techniques. The behavior of the ultrasonic transducer on the cricket mote is studied, and the regions where satisfactory distance measurements can be obtained are recorded. Placing the motes in these regions results in fine-grained mapping of the cricket motes. Trilateration is used to obtain a rigid coordinate system, but is insufficient if the network is to be used for navigation. A modified SLAM algorithm is applied to overcome the shortcomings of trilateration. Finally, the self-mapped cricket motes can be used for navigation of autonomous robotic systems in an indoor location.
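Trilateration from the mote ranges can be seeded with a linearised least-squares solve: subtracting the first range equation from the others removes the quadratic terms. A minimal 2-D sketch, with beacon positions assumed for illustration rather than taken from the paper:

import numpy as np

def trilaterate(beacon_xy, ranges):
    # beacon_xy: Nx2 known mote positions (m); ranges: N measured distances (m), N >= 3.
    B = np.asarray(beacon_xy, dtype=float)
    r = np.asarray(ranges, dtype=float)
    A = 2.0 * (B[1:] - B[0])
    y = r[0] ** 2 - r[1:] ** 2 + np.sum(B[1:] ** 2, axis=1) - np.dot(B[0], B[0])
    pos, *_ = np.linalg.lstsq(A, y, rcond=None)
    return pos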
Navigation strategies for multiple autonomous mobile robots moving in formation
NASA Technical Reports Server (NTRS)
Wang, P. K. C.
1991-01-01
The problem of deriving navigation strategies for a fleet of autonomous mobile robots moving in formation is considered. Here, each robot is represented by a particle with a spherical effective spatial domain and a specified cone of visibility. The global motion of each robot in the world space is described by the equations of motion of the robot's center of mass. First, methods for formation generation are discussed. Then, simple navigation strategies for robots moving in formation are derived. A sufficient condition for the stability of a desired formation pattern for a fleet of robots each equipped with the navigation strategy based on nearest neighbor tracking is developed. The dynamic behavior of robot fleets consisting of three or more robots moving in formation in a plane is studied by means of computer simulation.
An Analysis of Navigation Algorithms for Smartphones Using J2ME
NASA Astrophysics Data System (ADS)
Santos, André C.; Tarrataca, Luís; Cardoso, João M. P.
Embedded systems are considered one of the most potential areas for future innovations. Two embedded fields that will most certainly take a primary role in future innovations are mobile robotics and mobile computing. Mobile robots and smartphones are growing in number and functionalities, becoming a presence in our daily life. In this paper, we study the current feasibility of a smartphone to execute navigation algorithms. As a test case, we use a smartphone to control an autonomous mobile robot. We tested three navigation problems: Mapping, Localization and Path Planning. For each of these problems, an algorithm has been chosen, developed in J2ME, and tested on the field. Results show the current mobile Java capacity for executing computationally demanding algorithms and reveal the real possibility of using smartphones for autonomous navigation.
Autonomous Locator of Thermals (ALOFT) Autonomous Soaring Algorithm
2015-04-03
estimator used on the NRL CICADA Mk 3 micro air vehicle [13]. An extended Kalman filter (EKF) was designed to estimate the airspeed sensor bias and...Boulder, 2007. 13. A.D. Kahn and D.J. Edwards, "Navigation, Guidance and Control for the CICADA Expendable
Enabling Autonomous Navigation for Affordable Scooters.
Liu, Kaikai; Mulky, Rajathswaroop
2018-06-05
Despite the technical success of existing assistive technologies, for example, electric wheelchairs and scooters, they are still far from effective enough in helping those in need navigate to their destinations in a hassle-free manner. In this paper, we propose to improve the safety and autonomy of navigation by designing a cutting-edge autonomous scooter, thus allowing people with mobility challenges to ambulate independently and safely in possibly unfamiliar surroundings. We focus on indoor navigation scenarios for the autonomous scooter where the current location, maps, and nearby obstacles are unknown. To achieve semi-LiDAR functionality, we leverage gyro-based pose data to compensate for the laser motion in real time and create synthetic mapping of simple environments with regular shapes and deep hallways. Laser range finders are suitable for long ranges with limited resolution. Stereo vision, on the other hand, provides 3D structural data of nearby complex objects. To achieve simultaneous fine-grained resolution and long-range coverage in the mapping of cluttered and complex environments, we dynamically fuse the measurements from the stereo vision camera system, the synthetic laser scanner, and the LiDAR. We propose solutions to self-correct errors in data fusion and create a hybrid map to assist the scooter in achieving collision-free navigation in an indoor environment.
Guidance and control for unmanned ground vehicles
NASA Astrophysics Data System (ADS)
Bateman, Peter J.
1994-06-01
Techniques for the guidance, control, and navigation of unmanned ground vehicles are described in terms of the communication bandwidth requirements for driving and control of a vehicle remote from the human operator. Modes of operation are conveniently classified as conventional teleoperation, supervisory control, and fully autonomous control. The fundamental problem of maintaining a robust non-line-of-sight communications link between the human controller and the remote vehicle is discussed, as this provides the impetus for greater autonomy in the control system and the greatest scope for innovation. While supervisory control still requires the human operator to provide the primary navigational intelligence, fully autonomous operation requires that mission navigation is provided solely by on-board machine intelligence. Methods directed at achieving this performance are described using various active and passive sensing of the terrain for route navigation and obstacle detection. Emphasis is given to TV imagery and signal processing techniques for image understanding. Reference is made to the limitations of current microprocessor technology and suitable computer architectures. Some of the more recent control techniques involve the use of neural networks, fuzzy logic, and data fusion, and these are discussed in the context of road following and cross-country navigation. Examples of autonomous vehicle testbeds operated at various laboratories around the world are given.
ERIC Educational Resources Information Center
Doty, Keith L.
1999-01-01
Research on neural networks and hippocampal function demonstrating how mammals construct mental maps and develop navigation strategies is being used to create Intelligent Autonomous Mobile Robots (IAMRs). Such robots are able to recognize landmarks and navigate without "vision." (SK)
NASA Technical Reports Server (NTRS)
Winternitz, Luke
2017-01-01
This talk will describe two first-of-their-kind technology demonstrations attached to ongoing NASA science missions, both of which aim to extend the range of autonomous spacecraft navigation far from the Earth. First, we will describe the onboard GPS navigation system for the Magnetospheric Multiscale (MMS) mission which is currently operating in elliptic orbits reaching nearly halfway to the Moon. The MMS navigation system is a key outgrowth of a larger effort at NASA Goddard Space Flight Center to advance high-altitude Global Navigation Satellite System (GNSS) navigation on multiple fronts, including developing Global Positioning System receivers and onboard navigation software, running simulation studies, and leading efforts to characterize and protect signals at high-altitude in the so-called GNSS Space-Service Volume (SSV). In the second part of the talk, we will describe the Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) mission that aims to make the first in-space demonstration of X-ray pulsar navigation (XNAV). SEXTANT is attached to the NASA astrophysics mission Neutron-star Interior Composition ExploreR (NICER) whose International Space Station mounted X-ray telescope is investigating the fundamental physics of extremes in gravity, material density, and electromagnetic fields found in neutron stars, and whose instrument provides a nearly ideal navigation sensor for XNAV.
Augmentation method of XPNAV in Mars orbit based on Phobos and Deimos observations
NASA Astrophysics Data System (ADS)
Rong, Jiao; Luping, Xu; Zhang, Hua; Cong, Li
2016-11-01
Autonomous navigation for Mars probe spacecraft is required to reduce operation costs and enhance navigation performance in the future. X-ray pulsar-based navigation (XPNAV) is a potential candidate to meet this requirement. This paper addresses the use of Mars' natural satellites to improve XPNAV for Mars probe spacecraft. Two observation variables, the field angle and the direction vectors of Mars' natural satellites, are added to the XPNAV positioning system. The measurement model of the field angle and direction vectors is formulated by processing satellite images of Mars obtained from an optical camera. This measurement model is integrated into the spacecraft orbit dynamics to build the filter model. In order to estimate the position and velocity error of the spacecraft and reduce the impact of system noise on navigation precision, an adaptive divided difference filter (ADDF) is applied. Numerical simulation results demonstrate that the performance of the ADDF is better than that of the Unscented Kalman Filter (UKF), the DDF, and the EKF. In view of the invisibility of Mars' natural satellites in some cases, a visibility condition analysis is given and the augmented XPNAV in different visibility conditions is numerically simulated. The simulation results show that the navigation precision is evidently improved by using the augmented XPNAV based on the field angle and the direction vectors of Mars' natural satellites, in comparison with conventional XPNAV.
The Development of a Simulator System and Hardware Test Bed for Deep Space X-Ray Navigation
NASA Astrophysics Data System (ADS)
Doyle, Patrick T.
2013-03-01
Currently, there is considerable interest in developing technologies that will allow using photon measurements from celestial X-ray sources for deep space navigation. The impetus for this is that many envisioned future space missions will require spacecraft to have autonomous navigation capabilities. For missions close to Earth, Global Navigation Satellite Systems (GNSS) such as GPS are readily available for use, but for missions far from Earth, other alternatives must be provided. While existing systems such as the Deep Space Network (DSN) can be used, latencies associated with servicing a fleet of vehicles may not be compatible with some autonomous operations requiring timely updates of their navigation solution. Because of their somewhat predictable emissions, pulsars are ideal candidates for X-ray sources that can be used to provide key parameters for navigation. Algorithms and simulation tools that will enable designing and analyzing X-ray navigation concepts are presented. The development of a compact X-ray detector system is pivotal to the eventual deployment of such navigation systems. Therefore, results of a high-altitude balloon test to evaluate the design of a compact X-ray detector system are described as well.
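At its core, the pulsar measurement maps the offset between the observed pulse phase and the phase predicted at a reference location into a position correction along the pulsar line of sight, ambiguous by whole pulse periods. A simplified illustration follows; the constants and names are generic, not taken from the simulator described above.

C_M_PER_S = 299792458.0

def los_range_correction(measured_phase_cycles, predicted_phase_cycles, pulse_period_s):
    # Wrap the phase difference to (-0.5, 0.5] cycles, then scale to metres;
    # the result is ambiguous by integer multiples of c * pulse_period_s.
    dphi = (measured_phase_cycles - predicted_phase_cycles) % 1.0
    if dphi > 0.5:
        dphi -= 1.0
    return C_M_PER_S * dphi * pulse_period_s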
Development Of Autonomous Systems
NASA Astrophysics Data System (ADS)
Kanade, Takeo
1989-03-01
In the last several years at the Robotics Institute of Carnegie Mellon University, we have been working on two projects for developing autonomous systems: Navlab for the Autonomous Land Vehicle and Ambler for the Mars Rover. These two systems are for different purposes: the Navlab is a four-wheeled vehicle (van) for road and open terrain navigation, and the Ambler is a six-legged locomotor for Mars exploration. The two projects, however, share many common aspects. Both are large-scale integrated systems for navigation. In addition to the development of individual components (e.g., construction and control of the vehicle, vision and perception, and planning), integration of those component technologies into a system by means of an appropriate architecture is a major issue.
Autonomous formation flying based on GPS — PRISMA flight results
NASA Astrophysics Data System (ADS)
D'Amico, Simone; Ardaens, Jean-Sebastien; De Florio, Sergio
2013-01-01
This paper presents flight results from the early harvest of the Spaceborne Autonomous Formation Flying Experiment (SAFE) conducted in the frame of the Swedish PRISMA technology demonstration mission. SAFE represents one of the first demonstrations in low Earth orbit of an advanced guidance, navigation and control system for dual-spacecraft formations. Innovative techniques based on differential GPS-based navigation and relative orbital elements control are validated and tuned in orbit to fulfill the typical requirements of future distributed scientific instruments for remote sensing.
2013-05-01
Keywords: saliency, natural scene statistics. Research into the area of autonomous navigation for unmanned ground vehicles (UGV) has accelerated in...recent years. This is partly due to the success of programs such as the DARPA Grand Challenge and the dream of driverless cars, but is also due to the...There have been several major advances in autonomous navigation for unmanned ground vehicles in controlled urban environments in
Perception system and functions for autonomous navigation in a natural environment
NASA Technical Reports Server (NTRS)
Chatila, Raja; Devy, Michel; Lacroix, Simon; Herrb, Matthieu
1994-01-01
This paper presents the approach, algorithms, and processes we developed for the perception system of a cross-country autonomous robot. After a presentation of the tele-programming context we favor for intervention robots, we introduce an adaptive navigation approach well suited to the characteristics of complex natural environments. This approach led us to develop a heterogeneous perception system that manages several different terrain representations. The perception functionalities required during navigation are listed, along with the corresponding representations we consider. The main perception processes we developed are presented. They are integrated within an on-board control architecture we developed. First results of an ambitious experiment currently underway at LAAS are then presented.
High accuracy autonomous navigation using the global positioning system (GPS)
NASA Technical Reports Server (NTRS)
Truong, Son H.; Hart, Roger C.; Shoan, Wendy C.; Wood, Terri; Long, Anne C.; Oza, Dipak H.; Lee, Taesul
1997-01-01
The application of global positioning system (GPS) technology to the improvement of the accuracy and economy of spacecraft navigation is reported. High-accuracy autonomous navigation algorithms are currently being qualified in conjunction with the GPS attitude determination flyer (GADFLY) experiment for the small satellite technology initiative Lewis spacecraft. Preflight performance assessments indicated that these algorithms are able to provide a real-time total position accuracy of better than 10 m and a velocity accuracy of better than 0.01 m/s, with selective availability at typical levels. It is expected that the position accuracy will be increased to 2 m if corrections are provided by the GPS wide area augmentation system.
NASA Technical Reports Server (NTRS)
Wagenknecht, J.; Fredrickson, S.; Manning, T.; Jones, B.
2003-01-01
Engineers at NASA Johnson Space Center have designed, developed, and tested a nanosatellite-class free-flyer intended for future external inspection and remote viewing of human spaceflight activities. The technology demonstration system, known as the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam), has been integrated into the approximate form and function of a flight system. The primary focus has been to develop a system capable of providing external views of the International Space Station. The Mini AERCam system is spherical-shaped and less than eight inches in diameter. It has a full suite of guidance, navigation, and control hardware and software, and is equipped with two digital video cameras and a high resolution still image camera. The vehicle is designed for either remotely piloted operations or supervised autonomous operations. Tests have been performed in both a six degree-of-freedom closed-loop orbital simulation and on an air-bearing table. The Mini AERCam system can also be used as a test platform for evaluating algorithms and relative navigation for autonomous proximity operations and docking around the Space Shuttle Orbiter or the ISS.
Li, Kui; Wang, Lei; Lv, Yanhong; Gao, Pengyu; Song, Tianxiao
2015-01-01
Getting a land vehicle’s accurate position, azimuth and attitude rapidly is significant for vehicle based weapons’ combat effectiveness. In this paper, a new approach to acquire a vehicle’s accurate position and orientation is proposed. It uses a biaxial optical detection platform (BODP) to aim at and lock onto no less than three pre-set cooperative targets, whose accurate positions are measured beforehand. Then, it calculates the vehicle’s accurate position, azimuth and attitude from the rough position and orientation provided by vehicle based navigation systems and no less than three couples of azimuth and pitch angles measured by the BODP. The proposed approach does not depend on the Global Navigation Satellite System (GNSS), thus it is autonomous and difficult to interfere with. Meanwhile, it only needs a rough position and orientation as the algorithm’s iterative initial value; consequently, it does not have high performance requirements for the Inertial Navigation System (INS), odometer and other vehicle based navigation systems, even in high-precision applications. This paper described the system’s working procedure, presented the theoretical derivation of the algorithm, and then verified its effectiveness through simulation and vehicle experiments. The simulation and experimental results indicate that the proposed approach can achieve positioning and orientation accuracy of 0.2 m and 20″ respectively in less than 3 min. PMID:26492249
An Architecture for Autonomous Rovers on Future Planetary Missions
NASA Astrophysics Data System (ADS)
Ocon, J.; Avilés, M.; Graziano, M.
2018-04-01
This paper proposes an architecture for autonomous planetary rovers. The architecture combines a set of characteristics required in this type of system: a high level of abstraction, reactive event-based activity execution, and autonomous navigation.
Communication and Control for Fleets of Autonomous Underwater Vehicles
2006-10-30
Washington State University (WSU) on fuzzy logic control systems [2-4] and autonomous vehicles [5-10]. The ALWSE-MC program developed at NAVSEA CSS was...rotating head sonar on crawlers as an additional sensor for navigation. We have previously investigated the use of video cameras on autonomous vehicles for...simulates autonomous vehicles performing mine reconnaissance/mapping, clearance, and surveillance in a littoral region. Three simulations were performed
Acoustic Communications and Navigation for Mobile Under-Ice Sensors
2017-02-04
Final report (04/02/2017). ...development and fielding of a new acoustic communications and navigation system for use on autonomous platforms (gliders and profiling floats) under the...contact below the ice. Subject terms: Arctic Ocean, Undersea Workstations & Vehicles, Signal Processing, Navigation, Underwater Acoustics
Cooper, Andrew James; Redman, Chelsea Anne; Stoneham, David Mark; Gonzalez, Luis Felipe; Etse, Victor Kwesi
2015-08-28
This paper presents an unmanned aircraft system (UAS) that uses a probabilistic model for autonomous front-on environmental sensing or photography of a target. The system is based on low-cost and readily available sensors, is intended for dynamic environments, and has the general aim of improving the capabilities of dynamic waypoint-based navigation systems for a low-cost UAS. The behavioural dynamics of target movement for the design of a Kalman filter and Markov model-based prediction algorithm are included. Geometrical concepts and the Haversine formula are applied to the maximum likelihood case in order to make a prediction regarding a future state of the target, thus delivering a new waypoint for autonomous navigation. The results of the application to aerial filming with a low-cost UAS are presented, achieving the desired goal of a maintained front-on perspective without significant constraint on the route or pace of target movement. PMID:26343680
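As a minimal sketch of the spherical-geometry step mentioned above (illustrative only; the Kalman filter and Markov prediction of target motion are omitted, and the coordinates, bearing and speed are made up), the Haversine formula measures great-circle separation and the standard destination-point formula projects the predicted target motion into a new waypoint:

import math

R_EARTH = 6371000.0  # mean Earth radius in metres

def haversine(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points (inputs in radians, output in metres).
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * R_EARTH * math.asin(math.sqrt(a))

def project_waypoint(lat, lon, bearing, speed, dt):
    # Predict the target's position after dt seconds along a constant bearing.
    d = speed * dt / R_EARTH                     # angular distance travelled
    lat2 = math.asin(math.sin(lat) * math.cos(d) +
                     math.cos(lat) * math.sin(d) * math.cos(bearing))
    lon2 = lon + math.atan2(math.sin(bearing) * math.sin(d) * math.cos(lat),
                            math.cos(d) - math.sin(lat) * math.sin(lat2))
    return lat2, lon2

lat, lon = math.radians(-27.47), math.radians(153.02)
wp = project_waypoint(lat, lon, math.radians(45.0), speed=3.0, dt=10.0)
print([math.degrees(x) for x in wp], haversine(lat, lon, *wp))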
Navigation of autonomous vehicles for oil spill cleaning in dynamic and uncertain environments
NASA Astrophysics Data System (ADS)
Jin, Xin; Ray, Asok
2014-04-01
In the context of oil spill cleaning by autonomous vehicles in dynamic and uncertain environments, this paper presents a multi-resolution algorithm that seamlessly integrates the concepts of local navigation and global navigation based on the sensory information; the objective here is to enable adaptive decision making and online replanning of vehicle paths. The proposed algorithm provides a complete coverage of the search area for clean-up of the oil spills and does not suffer from the problem of having local minima, which is commonly encountered in potential-field-based methods. The efficacy of the algorithm is tested on a high-fidelity player/stage simulator for oil spill cleaning in a harbour, where the underlying oil weathering process is modelled as 2D random-walk particle tracking. A preliminary version of this paper was presented by X. Jin and A. Ray as 'Coverage Control of Autonomous Vehicles for Oil Spill Cleaning in Dynamic and Uncertain Environments', Proceedings of the American Control Conference, Washington, DC, June 2013, pp. 2600-2605.
Real-Time Hazard Detection and Avoidance Demonstration for a Planetary Lander
NASA Technical Reports Server (NTRS)
Epp, Chirold D.; Robertson, Edward A.; Carson, John M., III
2014-01-01
The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project is chartered to develop and mature to a Technology Readiness Level (TRL) of six an autonomous system combining guidance, navigation and control with terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. In addition to precision landing close to a pre-mission defined landing location, the ALHAT System must be capable of autonomously identifying and avoiding surface hazards in real-time to enable a safe landing under any lighting conditions. This paper provides an overview of the recent results of the ALHAT closed loop hazard detection and avoidance flight demonstrations on the Morpheus Vertical Testbed (VTB) at the Kennedy Space Center, including results and lessons learned. This effort is also described in the context of a technology path in support of future crewed and robotic planetary exploration missions based upon the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN).
77 FR 27202 - 36(b)(1) Arms Sales Notification
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-09
... includes: Electronic Warfare Systems, Command, Control, Communication, Computers and Intelligence/Communication, Navigational and Identifications (C4I/CNI), Autonomic Logistics Global Support System (ALGS...
RAIM availability for supplemental GPS navigation
DOT National Transportation Integrated Search
1992-06-29
This paper examines GPS receiver autonomous integrity monitoring (RAIM) availability for supplemental navigation based on the approximate radial-error protection (ARP) method. This method applies ceiling levels for the ARP figure of merit to screen o...
Integrated vision-based GNC for autonomous rendezvous and capture around Mars
NASA Astrophysics Data System (ADS)
Strippoli, L.; Novelli, G.; Gil Fernandez, J.; Colmenarejo, P.; Le Peuvedic, C.; Lanza, P.; Ankersen, F.
2015-06-01
Integrated GNC (iGNC) is an activity aimed at designing, developing and validating the GNC for autonomously performing the rendezvous and capture phase of the Mars sample return mission as defined during the Mars Sample Return Orbiter (MSRO) ESA study. The validation cycle includes testing in an end-to-end simulator, in a real-time avionics-representative test bench and, finally, in a dynamic hardware-in-the-loop test bench for assessing the feasibility, performance and figures of merit of the baseline approach defined during the MSRO study, for both nominal and contingency scenarios. The on-board software (OBSW) is tailored to work with the sensor, actuator and orbit baseline proposed in MSRO. The whole rendezvous is based on optical navigation, aided by RF Doppler during the search and first orbit determination of the orbiting sample. The simulated rendezvous phase also includes the non-linear orbit synchronization, based on a dedicated non-linear guidance algorithm robust to Mars ascent vehicle (MAV) injection accuracy or MAV failures resulting in elliptic target orbits. The search phase is very demanding for the image processing (IP) due to the very high visual magnitude of the target with respect to the stellar background, and the attitude GNC requires very high pointing stability to fulfil IP constraints. A trade-off of innovative, autonomous navigation filters indicates the unscented Kalman filter (UKF) as the approach that provides the best results in terms of robustness, response to non-linearities and performance compatible with the computational load. At short range, an optimized IP based on a convex hull algorithm has been developed in order to guarantee LoS and range measurements from hundreds of metres down to capture.
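For readers unfamiliar with the UKF mentioned in the trade-off above, the following generic unscented-transform sketch (not the iGNC flight software; the measurement function, state and covariance are invented) shows the mechanism that handles the non-linear line-of-sight and range measurements: sigma points drawn from the state covariance are pushed through the non-linear function and recombined into a mean and covariance.

import numpy as np

def unscented_transform(mean, cov, f, kappa=1.0):
    # Basic unscented transform: propagate a Gaussian (mean, cov) through nonlinear f.
    n = mean.size
    S = np.linalg.cholesky((n + kappa) * cov)
    sigma = np.vstack([mean, mean + S.T, mean - S.T])   # 2n+1 sigma points
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    y = np.array([f(s) for s in sigma])
    y_mean = w @ y
    d = y - y_mean
    y_cov = (w[:, None] * d).T @ d
    return y_mean, y_cov

# Example: line-of-sight angle and range to a target at the origin,
# observed from an uncertain relative position (x, y).
h = lambda s: np.array([np.arctan2(s[1], s[0]), np.hypot(s[0], s[1])])
m, P = unscented_transform(np.array([500.0, 40.0]), np.diag([25.0, 25.0]), h)
print(m, P)

Because no Jacobians are needed, the same mechanism copes with strongly non-linear rendezvous geometry at a cost comparable to an EKF, which matches the robustness/complexity trade noted in the study.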
Li, Hong; Liu, Mingyong; Zhang, Feihu
2017-01-01
This paper presents a multi-objective evolutionary algorithm for bio-inspired geomagnetic navigation of an Autonomous Underwater Vehicle (AUV). Inspired by biological navigation behavior, the solution is proposed without using a priori information, relying simply on magnetotaxis searching. However, geomagnetic anomalies have a significant influence on the geomagnetic navigation system, since they often disrupt the distribution of the geomagnetic field. An extreme value region may easily appear in abnormal regions, which can leave the AUV lost during the navigation phase. This paper proposes an improved bio-inspired algorithm with behavior constraints to allow the AUV to escape from the abnormal region. First, the navigation problem is formulated as an optimization problem. Second, an environmental monitoring operator is introduced to determine whether the algorithm has fallen into a geomagnetic anomaly region. Then, a behavior constraint operator is employed to get out of the abnormal region. Finally, the termination condition is triggered. Compared to the state of the art, the proposed approach effectively overcomes the disturbance of geomagnetic anomalies. The simulation results demonstrate the reliability and feasibility of the proposed approach in complex environments. PMID:28747884
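A toy Python sketch of the behaviour described above, under heavy assumptions (the linear field model, thresholds and candidate headings are invented, and the evolutionary, multi-objective part of the real method is reduced to a greedy search): each step either follows the candidate heading whose field best matches the destination's geomagnetic signature, or, when the monitoring operator flags an anomaly, takes a constrained escape move.

import numpy as np

rng = np.random.default_rng(1)

def magnetotaxis_step(pos, measure, reference, target_sig, step=1.0,
                      anomaly_thresh=300.0, n_candidates=8):
    # measure(pos): sensed geomagnetic components; reference(pos): model prediction;
    # target_sig: geomagnetic signature of the destination.
    angles = np.linspace(0.0, 2.0 * np.pi, n_candidates, endpoint=False)
    headings = np.stack([np.cos(angles), np.sin(angles)], axis=1)
    if np.linalg.norm(measure(pos) - reference(pos)) > anomaly_thresh:
        # Environmental monitor detects an anomaly: apply the behaviour constraint
        # (here a random escape heading) instead of trusting the corrupted field.
        return pos + step * headings[rng.integers(n_candidates)]
    costs = [np.linalg.norm(measure(pos + step * h) - target_sig) for h in headings]
    return pos + step * headings[int(np.argmin(costs))]

# Toy field whose components vary linearly with position (illustrative only).
field = lambda p: np.array([30000.0 + 5.0 * p[0], -2000.0 + 3.0 * p[1], 45000.0])
target = field(np.array([100.0, 50.0]))
pos = np.array([0.0, 0.0])
for _ in range(200):
    pos = magnetotaxis_step(pos, field, field, target)
print(pos)   # drifts toward the location carrying the destination's signature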
Spacecraft angular velocity estimation algorithm for star tracker based on optical flow techniques
NASA Astrophysics Data System (ADS)
Tang, Yujie; Li, Jian; Wang, Gangyi
2018-02-01
An integrated navigation system often uses a traditional gyro and a star tracker for high-precision navigation, with the shortcomings of large volume, heavy weight and high cost. With the development of autonomous navigation for deep space and small spacecraft, star trackers have gradually been used for attitude calculation and angular velocity measurement directly. At the same time, given the dynamic imaging requirements of remote sensing and other imaging satellites, measuring angular velocity under dynamic conditions to improve the accuracy of the star tracker is a focus of future research. We propose an approach to measure the angular rate without a gyro and so improve the dynamic performance of the star tracker. First, a star extraction algorithm based on morphology is used to extract the star regions, and the stars in the two images are matched by angular-distance voting. The displacement of each star image is then measured by an improved optical flow method. Finally, the triaxial angular velocity of the star tracker is calculated from the star vectors using the least-squares method. The method has the advantages of fast matching, strong noise immunity and good dynamic performance, and the triaxial angular velocity of the star tracker can be obtained accurately. As a result, the star tracker can achieve better tracking performance and dynamic attitude accuracy, laying a good foundation for the wide application of various satellites and complex space missions.
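A minimal sketch of the final least-squares step (the star extraction, angular-distance voting and optical-flow stages are omitted, and the star vectors below are synthetic): for inertially fixed stars observed in the rotating tracker frame, dv/dt = -omega x v, so each matched star pair contributes three linear equations in the body rate omega.

import numpy as np

def skew(v):
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def angular_rate(stars_prev, stars_curr, dt):
    # Least-squares body rate from matched unit star vectors in two frames,
    # using (v_curr - v_prev)/dt ~ -omega x v_prev = skew(v_prev) @ omega.
    A = np.vstack([skew(v) for v in stars_prev])
    b = np.concatenate([(vc - vp) / dt for vp, vc in zip(stars_prev, stars_curr)])
    omega, *_ = np.linalg.lstsq(A, b, rcond=None)
    return omega

true_w = np.array([0.01, -0.02, 0.005])                      # rad/s
prev = [np.array(v) / np.linalg.norm(v)
        for v in ([1.0, 0.0, 0.2], [0.0, 1.0, -0.1], [0.3, 0.4, 1.0])]
dt = 0.1
curr = [v - np.cross(true_w, v) * dt for v in prev]          # small-angle rotation
print(angular_rate(prev, curr, dt))                          # recovers true_w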
Research of autonomous celestial navigation based on new measurement model of stellar refraction
NASA Astrophysics Data System (ADS)
Yu, Cong; Tian, Hong; Zhang, Hui; Xu, Bo
2014-09-01
Autonomous celestial navigation based on stellar refraction has attracted widespread attention for its high accuracy and full autonomy. In this navigation method, establishing an accurate stellar refraction measurement model is the foundation and key issue for achieving high-accuracy navigation. However, existing measurement models are limited by the uncertainty of atmospheric parameters. Temperature, pressure and other factors which affect stellar refraction within the Earth's stratosphere are studied, and a model of atmospheric variation with altitude is derived on the basis of standard atmospheric data. Furthermore, a novel measurement model of stellar refraction over a continuous range of altitudes from 20 km to 50 km is produced by modifying the fixed-altitude (25 km) measurement model, a state equation including orbit perturbations is established, and a simulation is performed using an improved extended Kalman filter. The results show that the new model improves the navigation accuracy and has practical application value.
Navigation Aiding by a Hybrid Laser-Camera Motion Estimator for Micro Aerial Vehicles.
Atman, Jamal; Popp, Manuel; Ruppelt, Jan; Trommer, Gert F
2016-09-16
Micro Air Vehicles (MAVs) equipped with various sensors are able to carry out autonomous flights. However, the self-localization of autonomous agents is mostly dependent on Global Navigation Satellite Systems (GNSS). In order to provide an accurate navigation solution in the absence of GNSS signals, this article presents a hybrid sensor: a deep integration of a monocular camera and a 2D laser rangefinder from which the motion of the MAV is estimated. This realization is expected to be more flexible in terms of environments compared to laser-scan-matching approaches. The estimated ego-motion is then integrated in the MAV's navigation system. First, however, the relative pose between the two sensors is obtained with a proposed improved calibration method. For both calibration and ego-motion estimation, 3D-to-2D correspondences are used and the Perspective-3-Point (P3P) problem is solved. Moreover, the covariance estimation of the relative motion is presented. The experiments show very accurate calibration and navigation results.
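To illustrate the 3D-to-2D pose-estimation step that underlies both the calibration and the ego-motion estimation, here is a generic Perspective-n-Point sketch using OpenCV (the intrinsics, point set and pose are fabricated; the article solves the specific Perspective-3-Point case with its own improved calibration, which is not reproduced here):

import numpy as np
import cv2

K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])          # assumed camera intrinsics
dist = np.zeros(5)

# 3D points (e.g. measured by the laser rangefinder) and a known "true" pose,
# used here only to synthesise consistent image observations.
pts_3d = np.array([[0.5, 0.1, 2.0], [-0.3, 0.2, 2.5],
                   [0.2, -0.4, 1.8], [0.0, 0.3, 2.2]])
true_rvec = np.array([0.05, -0.10, 0.02])
true_tvec = np.array([0.10, -0.05, 0.30])
pts_2d, _ = cv2.projectPoints(pts_3d, true_rvec, true_tvec, K, dist)

# Recover the pose of the 3D-point frame in the camera frame from the correspondences.
ok, rvec, tvec = cv2.solvePnP(pts_3d, pts_2d, K, dist)
print(ok, rvec.ravel(), tvec.ravel())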
Development of Mission Enabling Infrastructure — Cislunar Autonomous Positioning System (CAPS)
NASA Astrophysics Data System (ADS)
Cheetham, B. W.
2017-10-01
Advanced Space, LLC is developing the Cislunar Autonomous Positioning System (CAPS) which would provide a scalable and evolvable architecture for navigation to reduce ground congestion and improve operations for missions throughout cislunar space.
DEMONSTRATION OF AUTONOMOUS AIR MONITORING THROUGH ROBOTICS
This project included modifying an existing teleoperated robot to include autonomous navigation, large object avoidance, and air monitoring and demonstrating that prototype robot system in indoor and outdoor environments. An existing teleoperated "Surveyor" robot developed by ARD...
Lessons Learned from OSIRIS-Rex Autonomous Navigation Using Natural Feature Tracking
NASA Technical Reports Server (NTRS)
Lorenz, David A.; Olds, Ryan; May, Alexander; Mario, Courtney; Perry, Mark E.; Palmer, Eric E.; Daly, Michael
2017-01-01
The Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) spacecraft is scheduled to launch in September 2016 to embark on an asteroid sample return mission. It is expected to rendezvous with the asteroid Bennu, navigate to the surface, collect a sample (July 2020), and return the sample to Earth (September 2023). The original mission design called for using one of two flash lidar units to provide autonomous navigation to the surface. Following preliminary design and initial development of the lidars, reliability issues with the hardware and test program prompted the project to begin development of an alternative navigation technique to be used as a backup to the lidar. At the critical design review, Natural Feature Tracking (NFT) was added to the mission. NFT is an onboard optical navigation system that compares observed images to a set of asteroid terrain models which are rendered in real time from a catalog stored in memory on the flight computer. Onboard knowledge of the spacecraft state is then updated by a Kalman filter using the measured residuals between the rendered reference images and the actual observed images. The asteroid terrain models used by NFT are built from a shape model generated from observations collected during earlier phases of the mission and include both terrain shape and albedo information about the asteroid surface. As a result, the success of NFT is highly dependent on selecting a set of topographic features that can be both identified during descent and reliably rendered using the available shape model data. During development, the OSIRIS-REx team faced significant challenges in developing a process conducive to robust operation. This was especially true for the terrain models to be used as the spacecraft gets close to the asteroid, where higher-fidelity models are required for reliable image correlation. This paper presents some of the challenges and lessons learned from the development of the NFT system, which includes not just the flight hardware and software but also the development of the terrain models used to generate the onboard rendered images.
Autonomous Flight Safety System
NASA Technical Reports Server (NTRS)
Simpson, James
2010-01-01
The Autonomous Flight Safety System (AFSS) is an independent, self-contained subsystem mounted onboard a launch vehicle. AFSS was developed by, and is owned by, the US Government. It autonomously makes flight termination/destruct decisions using configurable software-based rules implemented on redundant flight processors, using data from redundant GPS/IMU navigation sensors. AFSS implements rules determined by the appropriate Range Safety officials.
Experiments in teleoperator and autonomous control of space robotic vehicles
NASA Technical Reports Server (NTRS)
Alexander, Harold L.
1990-01-01
A research program and strategy are described which include fundamental teleoperation issues and autonomous-control issues of sensing and navigation for satellite robots. The program consists of developing interfaces for visual operation and studying the consequences of interface designs as well as developing navigation and control technologies based on visual interaction. A space-robot-vehicle simulator is under development for use in virtual-environment teleoperation experiments and neutral-buoyancy investigations. These technologies can be utilized in a study of visual interfaces to address tradeoffs between head-tracking and manual remote cameras, panel-mounted and helmet-mounted displays, and stereoscopic and monoscopic display systems. The present program can provide significant data for the development of control experiments for autonomously controlled satellite robots.
Optical navigation during the Voyager Neptune encounter
NASA Technical Reports Server (NTRS)
Riedel, J. E.; Owen, W. M., Jr.; Stuve, J. A.; Synnott, S. P.; Vaughan, R. M.
1990-01-01
Optical navigation techniques were required to successfully complete the planetary exploration phase of the NASA deep-space Voyager mission. The last of Voyager's planetary encounters, with Neptune, posed unique problems from an optical navigation standpoint. In this paper we briefly review general aspects of the optical navigation process as practiced during the Voyager mission, and discuss in detail particular features of the Neptune encounter which affected optical navigation. New approaches to the centerfinding problem were developed for both stars and extended bodies, and these are described. Results of the optical navigation data analysis are presented, as well as a description of the optical orbit determination system and results of its use during encounter. Partially as a result of the optical navigation processing, results of scientific significance were obtained. These results include the discovery and orbit determination of several new satellites of Neptune and the determination of the size of Triton, Neptune's largest moon.
Development of a GPS/INS/MAG navigation system and waypoint navigator for a VTOL UAV
NASA Astrophysics Data System (ADS)
Meister, Oliver; Mönikes, Ralf; Wendel, Jan; Frietsch, Natalie; Schlaile, Christian; Trommer, Gert F.
2007-04-01
Unmanned aerial vehicles (UAVs) can be used for versatile surveillance and reconnaissance missions. If a UAV is capable of flying automatically on a predefined path, the range of possible applications is widened significantly. This paper addresses the development of an integrated GPS/INS/MAG navigation system and a waypoint navigator for a small vertical take-off and landing (VTOL) unmanned four-rotor helicopter with a take-off weight below 1 kg. The core of the navigation system consists of low-cost inertial sensors which are continuously aided with GPS, a magnetometer compass, and barometric height information. Because the yaw angle becomes unobservable during hovering flight, the integration with a magnetic compass is mandatory. This integration must be robust with respect to errors caused by terrestrial magnetic field deviation and interference from surrounding electronic devices as well as ferrous metals. The described Kalman filter integration concept overcomes the problem that erroneous magnetic measurements lead to attitude errors in the roll and pitch axes. The algorithm provides long-term stable navigation information even during GPS outages, which is mandatory for the flight control of the UAV. In the second part of the paper the guidance algorithms are discussed in detail. These algorithms allow the UAV to operate in a semi-autonomous position-hold mode as well as a fully autonomous waypoint mode. In the position-hold mode the helicopter maintains its position regardless of wind disturbances, which eases the pilot's job during hold-and-stare missions. The autonomous waypoint navigator enables flight beyond visual range and beyond the range of the radio link. Flight test results of the implemented modes of operation are shown.
Cheap or Robust? The practical realization of self-driving wheelchair technology.
Burhanpurkar, Maya; Labbe, Mathieu; Guan, Charlie; Michaud, Francois; Kelly, Jonathan
2017-07-01
To date, self-driving experimental wheelchair technologies have been either inexpensive or robust, but not both. Yet, in order to achieve real-world acceptance, both qualities are fundamentally essential. We present a unique approach to achieve inexpensive and robust autonomous and semi-autonomous assistive navigation for existing fielded wheelchairs, of which there are approximately 5 million units in Canada and the United States alone. Our prototype wheelchair platform is capable of localization and mapping, as well as robust obstacle avoidance, using only a commodity RGB-D sensor and wheel odometry. As a specific example of the navigation capabilities, we focus on the single most common navigation problem: the traversal of narrow doorways in arbitrary environments. The software we have developed is generalizable to corridor following, desk docking, and other navigation tasks that are either extremely difficult or impossible for people with upper-body mobility impairments.
Local navigation and fuzzy control realization for autonomous guided vehicle
NASA Astrophysics Data System (ADS)
El-Konyaly, El-Sayed H.; Saraya, Sabry F.; Shehata, Raef S.
1996-10-01
This paper addresses the problem of local navigation for an autonomous guided vehicle (AGV) in a structured environment that contains static and dynamic obstacles. Information about the environment is obtained via a CCD camera. The problem is formulated as a dynamic feedback control problem in which speed and steering decisions are made on the fly while the AGV is moving. A decision element (DE) that uses local information is proposed. The DE guides the vehicle in the environment by producing appropriate navigation decisions. Dynamic models of a three-wheeled vehicle for driving and steering mechanisms are derived. The interaction between them is performed via the local feedback DE. A controller, based on fuzzy logic, is designed to drive the vehicle safely in an intelligent and human-like manner. The effectiveness of the navigation and control strategies in driving the AGV is illustrated and evaluated.
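A toy fragment in the spirit of the fuzzy steering stage described above (membership functions, rule consequents and units are invented, and the paper's controller also handles driving speed and the CCD-derived decision element): the lateral path error is fuzzified into three sets and mapped to a steering command by weighted-average defuzzification.

def tri(x, a, b, c):
    # Triangular membership function with peak at b and support (a, c).
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_steering(error):
    # error: lateral path error in metres; returns a steering command in radians.
    mu = {"neg": tri(error, -2.0, -1.0, 0.0),
          "zero": tri(error, -1.0, 0.0, 1.0),
          "pos": tri(error, 0.0, 1.0, 2.0)}
    # One rule per fuzzy set; consequents are chosen so a positive error
    # yields a corrective positive steering command.
    consequent = {"neg": -0.3, "zero": 0.0, "pos": 0.3}
    num = sum(mu[k] * consequent[k] for k in mu)
    den = sum(mu.values())
    return num / den if den else 0.0

for e in (-1.5, -0.4, 0.0, 0.7):
    print(e, round(fuzzy_steering(e), 3))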
Development and Evaluation of Positioning Systems for Autonomous Vehicle Navigation
2001-12-01
generation of autonomous vehicles to utilize NTV technology is built on a commercially-available vehicle built by ASV. The All-Purpose Remote Transport...larger scale, AFRL and CIMAR are involved in the development of a standard approach in the design and specification of autonomous vehicles being...
Visual Requirements for Human Drivers and Autonomous Vehicles
DOT National Transportation Integrated Search
2016-03-01
Identification of published literature between 1995 and 2013, focusing on determining the quantity and quality of visual information needed under both driving modes (i.e., human and autonomous) to navigate the road safely, especially as it pertains t...
GPS navigation algorithms for Autonomous Airborne Refueling of Unmanned Air Vehicles
NASA Astrophysics Data System (ADS)
Khanafseh, Samer Mahmoud
Unmanned Air Vehicles (UAVs) have recently generated great interest because of their potential to perform hazardous missions without risking loss of life. If autonomous airborne refueling is possible for UAVs, mission range and endurance will be greatly enhanced. However, concerns about UAV-tanker proximity, dynamic mobility and safety demand that the relative navigation system meets stringent requirements on accuracy, integrity, and continuity. In response, this research focuses on developing high-performance GPS-based navigation architectures for Autonomous Airborne Refueling (AAR) of UAVs. The AAR mission is unique because of the potentially severe sky blockage introduced by the tanker. To address this issue, a high-fidelity dynamic sky blockage model was developed and experimentally validated. In addition, robust carrier phase differential GPS navigation algorithms were derived, including a new method for high-integrity reacquisition of carrier cycle ambiguities for recently-blocked satellites. In order to evaluate navigation performance, world-wide global availability and sensitivity covariance analyses were conducted. The new navigation algorithms were shown to be sufficient for turn-free scenarios, but improvement in performance was necessary to meet the difficult requirements for a general refueling mission with banked turns. Therefore, several innovative methods were pursued to enhance navigation performance. First, a new theoretical approach was developed to quantify the position-domain integrity risk in cycle ambiguity resolution problems. A mechanism to implement this method with partially-fixed cycle ambiguity vectors was derived, and it was used to define tight upper bounds on AAR navigation integrity risk. A second method, where a new algorithm for optimal fusion of measurements from multiple antennas was developed, was used to improve satellite coverage in poor visibility environments such as in AAR. Finally, methods for using data-link extracted measurements as an additional inter-vehicle ranging measurement were also introduced. The algorithms and methods developed in this work are generally applicable to realize high-performance GPS-based navigation in partially obstructed environments. Navigation performance for AAR was quantified through covariance analysis, and it was shown that the stringent navigation requirements for this application are achievable. Finally, a real-time implementation of the algorithms was developed and successfully validated in autopiloted flight tests.
A reactive system for open terrain navigation: Performance and limitations
NASA Technical Reports Server (NTRS)
Langer, D.; Rosenblatt, J.; Hebert, M.
1994-01-01
We describe a core system for autonomous navigation in outdoor natural terrain. The system consists of three parts: a perception module which processes range images to identify untraversable regions of the terrain, a local map management module which maintains a representation of the environment in the vicinity of the vehicle, and a planning module which issues commands to the vehicle controller. Our approach relies on the concept of 'early traversability evaluation' and on the use of reactive planning to generate commands that drive the vehicle. We argue that this approach leads to a robust and efficient navigation system. We illustrate our approach with an experiment in which a vehicle travelled autonomously for one kilometer through unmapped cross-country terrain.
NASA Astrophysics Data System (ADS)
Cheng, Xiang-Qin; Qu, Jing-Yuan; Yan, Zhe-Ping; Bian, Xin-Qian
2010-03-01
In order to improve the security and reliability for autonomous underwater vehicle (AUV) navigation, an H∞ robust fault-tolerant controller was designed after analyzing variations in state-feedback gain. Operating conditions and the design method were then analyzed so that the control problem could be expressed as a mathematical optimization problem. This permitted the use of linear matrix inequalities (LMI) to solve for the H∞ controller for the system. When considering different actuator failures, these conditions were then also mathematically expressed, allowing the H∞ robust controller to solve for these events and thus be fault-tolerant. Finally, simulation results showed that the H∞ robust fault-tolerant controller could provide precise AUV navigation control with strong robustness.
Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) Project Status as of May 2010
NASA Technical Reports Server (NTRS)
Striepe, Scott A.; Epp, Chirold D.; Robertson, Edward A.
2010-01-01
This paper presents the current status of NASA's Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) Project. The ALHAT team has completed several flight tests and two major design analysis cycles. These tests and analyses examine terrain relative navigation sensors, hazard detection and avoidance sensors and algorithms, hazard relative navigation algorithms, and the guidance and navigation system that uses these ALHAT functions. The next flight test is scheduled for July 2010. The paper contains results from the completed flight tests and analysis cycles. ALHAT system status and upcoming tests and analyses are also addressed. The current ALHAT plans as of May 2010 are discussed. Application of the ALHAT system to landing on bodies other than the Moon is included.
NASA Astrophysics Data System (ADS)
Croft, John; Deily, John; Hartman, Kathy; Weidow, David
1998-01-01
In the twenty-first century, NASA envisions frequent low-cost missions to explore the solar system, observe the universe, and study our planet. To realize NASA's goal, the Guidance, Navigation, and Control Center (GNCC) at the Goddard Space Flight Center sponsors technology programs that enhance spacecraft performance, streamline processes and ultimately enable cheaper science. Our technology programs encompass control system architectures, sensor and actuator components, electronic systems, design and development of algorithms, embedded systems and space vehicle autonomy. Through collaboration with government, universities, non-profit organizations, and industry, the GNCC incrementally develops key technologies that conquer NASA's challenges. This paper presents an overview of several innovative technology initiatives for the autonomous guidance, navigation, and control (GN&C) of satellites.
Improving mobile robot localization: grid-based approach
NASA Astrophysics Data System (ADS)
Yan, Junchi
2012-02-01
Autonomous mobile robots have been widely studied, not only as advanced facilities for industrial and daily-life automation, but also as testbeds in robotics competitions for extending the frontier of current artificial intelligence. In many such contests, the robot is supposed to navigate on a ground with a grid layout. Based on this observation, we present a localization error correction method that exploits the geometric features of the tile patterns. On top of classical inertia-based positioning, our approach employs three fiber-optic sensors assembled under the bottom of the robot in an equilateral-triangle layout. The sensor apparatus, together with the proposed supporting algorithm, is designed to detect a line's direction (vertical or horizontal) by monitoring grid-crossing events. As a result, the line coordinate information can be fused to rectify the cumulative localization deviation from inertial positioning. The proposed method is analyzed theoretically in terms of its error bound and has also been implemented and tested on a custom-developed two-wheel autonomous mobile robot.
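A minimal sketch of the correction idea (the tile spacing, coordinates and function name are illustrative; the paper's three-sensor layout additionally identifies which line direction was crossed): when a grid-line crossing is detected, the coordinate perpendicular to that line is snapped to the nearest known line, bounding the drift of the inertia-based estimate.

import numpy as np

GRID = 0.30  # assumed tile spacing in metres

def correct_on_crossing(est_xy, axis):
    # axis = 0: a line of constant x was crossed; axis = 1: constant y.
    # The crossed coordinate snaps to the nearest grid line; the other keeps
    # its dead-reckoned value.
    xy = np.array(est_xy, float)
    xy[axis] = round(xy[axis] / GRID) * GRID
    return xy

# Dead reckoning says (0.93, 1.41) when a constant-x line crossing is sensed.
print(correct_on_crossing([0.93, 1.41], axis=0))   # -> [0.9, 1.41]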
Demonstration of automated proximity and docking technologies
NASA Astrophysics Data System (ADS)
Anderson, Robert L.; Tsugawa, Roy K.; Bryan, Thomas C.
An autodock was demonstrated using straightforward techniques and real sensor hardware. A simulation testbed was established and validated. The sensor design was refined with improved optical performance and image-processing noise mitigation techniques, and the sensor is ready for production from off-the-shelf components. The autonomous spacecraft architecture is defined, covering sensors, docking hardware, propulsion, and avionics. The Guidance, Navigation and Control architecture and requirements are developed. Modular structures suitable for automated control are used. The spacecraft system manager functions, including configuration, resource, and redundancy management, are defined. The requirements for an autonomous spacecraft executive are defined; high-level decision-making, mission planning, and mission contingency recovery are a part of this. The next step is to perform flight demonstrations. After the presentation the following question was asked: How do you define validation? There are two components to the definition of validation: software simulation with formal and rigorous validation, and hardware and facility performance validated with respect to software already validated against an analytical profile.
Maiden Voyage of the Under-Ice Float
NASA Astrophysics Data System (ADS)
Shcherbina, A.; D'Asaro, E. A.; Light, B.; Deming, J. W.; Rehm, E.
2016-02-01
The Under-Ice Float (UIF) is a new autonomous platform for sea ice and upper-ocean observations in the marginal ice zone (MIZ). UIF is based on the Mixed Layer Lagrangian Float design, inheriting its accurate buoyancy control and relatively heavy payload capability. A major challenge for sustained autonomous observations in the MIZ is detection of open water for navigation and telemetry surfacings. UIF employs a new surface classification algorithm based on spectral analysis of surface roughness sensed by an upward-looking sonar. A prototype UIF was deployed in the MIZ of the central Arctic Ocean in late August 2015. The main payload of the first UIF was a bio-optical suite consisting of upward- and downward-looking hyperspectral radiometers; temperature, salinity, chlorophyll, turbidity, and dissolved oxygen sensors; and a high-definition photo camera. In the early stages of its mission, the float successfully avoided ice, detected leads, surfaced in open water, and transmitted data and photographs. We will present the analysis of these observations from the full UIF mission extending into the freeze-up season.
Experiments in autonomous robotics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamel, W.R.
1987-01-01
The Center for Engineering Systems Advanced Research (CESAR) is performing basic research in autonomous robotics for energy-related applications in hazardous environments. The CESAR research agenda includes a strong experimental component to assure practical evaluation of new concepts and theories. An evolutionary sequence of mobile research robots has been planned to support research in robot navigation, world sensing, and object manipulation. A number of experiments have been performed in studying robot navigation and path planning with planar sonar sensing. Future experiments will address more complex tasks involving three-dimensional sensing, dexterous manipulation, and human-scale operations.
Improving geolocation and spatial accuracies with the modular integrated avionics group (MIAG)
NASA Astrophysics Data System (ADS)
Johnson, Einar; Souter, Keith
1996-05-01
The modular integrated avionics group (MIAG) is a single unit approach to combining position, inertial and baro-altitude/air data sensors to provide optimized navigation, guidance and control performance. Lear Astronics Corporation is currently working within the navigation community to upgrade existing MIAG performance with precise GPS positioning mechanization tightly integrated with inertial, baro and other sensors. Among the immediate benefits are the following: (1) accurate target location in dynamic conditions; (2) autonomous launch and recovery using airborne avionics only; (3) precise flight path guidance; and (4) improved aircraft and payload stability information. This paper will focus on the impact of using the MIAG with its multimode navigation accuracies on the UAV targeting mission. Gimbaled electro-optical sensors mounted on a UAV can be used to determine ground coordinates of a target at the center of the field of view by a series of vector rotation and scaling computations. The accuracy of the computed target coordinates is dependent on knowing the UAV position and the UAV-to-target offset computation. Astronics performed a series of simulations to evaluate the effects that the improved angular and position data available from the MIAG have on target coordinate accuracy.
Terminal Homing for Autonomous Underwater Vehicle Docking
2016-06-01
underwater domain, accurate navigation. Above the water, light and electromagnetic signals travel well through air and space, media that allow for a...The use of docking stations for autonomous underwater vehicles (AUV) provides the ability to keep a vehicle on...
Advancing Underwater Acoustic Communication for Autonomous Distributed Networks via Sparse Channel Sensing, Coding, and...
2014-09-30
underwater acoustic communication technologies for autonomous distributed underwater networks, through innovative signal processing, coding, and...coding: 3) OFDM modulated dynamic coded cooperation in underwater acoustic channels; Localization, Networking, and Testbed: 4) On-demand
Terrain shape estimation from optical flow, using Kalman filtering
NASA Astrophysics Data System (ADS)
Hoff, William A.; Sklair, Cheryl W.
1990-01-01
As one moves through a static environment, the visual world as projected on the retina seems to flow past. This apparent motion, called optical flow, can be an important source of depth perception for autonomous robots. An important application is in planetary exploration - the landing vehicle must find a safe landing site in rugged terrain, and an autonomous rover must be able to navigate safely through this terrain. In this paper, we describe a solution to this problem. Image edge points are tracked between frames of a motion sequence, and the range to the points is calculated from the displacement of the edge points and the known motion of the camera. Kalman filtering is used to incrementally improve the range estimates to those points, and provide an estimate of the uncertainty in each range. Errors in camera motion and image point measurement can also be modelled with Kalman filtering. A surface is then interpolated to these points, providing a complete map from which hazards such as steeply sloping areas can be detected. Using the method of extended Kalman filtering, our approach allows arbitrary camera motion. Preliminary results of an implementation are presented, and show that the resulting range accuracy is on the order of 1-2% of the range.
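A scalar Kalman-filter sketch of the incremental range refinement described above (all numbers are synthetic, and the triangulation of range from edge-point displacement is replaced by ready-made noisy measurements): the prediction accounts for the camera's known forward motion, and each frame's measurement tightens both the range estimate and its uncertainty.

import numpy as np

def kf_range(meas, forward_motion, r0=10.0, p0=25.0, q=0.01, r_noise=1.0):
    # State: range to one tracked feature. Predict by subtracting the camera's
    # forward translation; update with each frame's noisy range measurement.
    x, P = r0, p0
    history = []
    for z, dz in zip(meas, forward_motion):
        x, P = x - dz, P + q                  # predict
        K = P / (P + r_noise)                 # Kalman gain
        x, P = x + K * (z - x), (1 - K) * P   # update
        history.append((x, P))
    return history

rng = np.random.default_rng(0)
steps, truth = 20, 8.0
motion = np.full(steps, 0.1)                  # camera advances 0.1 m per frame
true_range = truth - np.cumsum(motion)
meas = true_range + rng.normal(0.0, 1.0, steps)
for x, P in kf_range(meas, motion)[-3:]:
    print(round(x, 2), round(P, 3))           # estimate converges, variance shrinks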
On exploration of geometrically constrained space by medicinal leeches Hirudo verbana.
Adamatzky, Andrew
2015-04-01
Leeches are fascinating creatures: they have simple modular nervous circuitry yet exhibit a rich spectrum of behavioural modes. Leeches could be ideal blueprints for designing flexible soft robots which are modular, multi-functional, fault-tolerant, easy to control, capable of navigating using optical, mechanical and chemical sensory inputs, and have autonomous inter-segmental coordination and adaptive decision-making. With future designs of leech robots in mind, we study how leeches behave in geometrically constrained spaces. The core results of the paper deal with leeches exploring a row of rooms arranged along a narrow corridor. In laboratory experiments we find that rooms closer to the ends of the corridor are explored by leeches more often than rooms in the middle of the corridor. Also, in a series of scoping experiments, we evaluate the leeches' capability to navigate in mazes towards sources of vibration and chemo-attraction. We believe our results lay a foundation for future development of robots mimicking the behaviour of leeches. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
1987-06-01
The study of human driving of automotive vehicles is an important aid to the development of viable autonomous vehicle navigation...of human driving which could provide some different insights into possible approaches to autonomous vehicle control. At the start of this work, it was...advanced work in the behavioral aspects of human driving. Research of this nature can have a significant impact on the development of autonomous vehicles
Detection of obstacles on runway using Ego-Motion compensation and tracking of significant features
NASA Technical Reports Server (NTRS)
Kasturi, Rangachar (Principal Investigator); Camps, Octavia (Principal Investigator); Gandhi, Tarak; Devadiga, Sadashiva
1996-01-01
This report describes a method for obstacle detection on a runway for autonomous navigation and landing of an aircraft. Detection is done in the presence of extraneous features such as tire marks. Suitable features are extracted from the image, and warping using approximately known camera and plane parameters is performed in order to compensate for ego-motion as far as possible. The residual disparity after warping is estimated using an optical flow algorithm. Features are tracked from frame to frame so as to obtain more reliable estimates of their motion. Corrections are made to the motion parameters from the residual disparities using a robust method, and features having large residual disparities are flagged as obstacles. A sensitivity analysis of the procedure is also presented. Nelson's optical flow constraint is proposed to separate moving obstacles from stationary ones. A Bayesian framework is used at every stage so that the confidence in the estimates can be determined.
Optical surgical navigation system causes pulse oximeter malfunction.
Satoh, Masaaki; Hara, Tetsuhito; Tamai, Kenji; Shiba, Juntaro; Hotta, Kunihisa; Takeuchi, Mamoru; Watanabe, Eiju
2015-01-01
An optical surgical navigation system is used as a navigator to facilitate surgical approaches, and pulse oximeters provide valuable information for anesthetic management. However, saw-tooth waves on the monitor of a pulse oximeter and the inability of the pulse oximeter to accurately record percutaneous arterial oxygen saturation were observed when a surgeon started an optical navigation system. The current case is thought to be the first report of this navigation system interfering with pulse oximetry. The causes of pulse jamming and how to manage an optical navigation system are discussed.
Libration Point Navigation Concepts Supporting the Vision for Space Exploration
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Folta, David C.; Moreau, Michael C.; Quinn, David A.
2004-01-01
This work examines the autonomous navigation accuracy achievable for a lunar exploration trajectory from a translunar libration point lunar navigation relay satellite, augmented by signals from the Global Positioning System (GPS). We also provide a brief analysis comparing the libration point relay to lunar orbit relay architectures, and discuss some issues of GPS usage for cis-lunar trajectories.
Navigation of military and space unmanned ground vehicles in unstructured terrains
NASA Technical Reports Server (NTRS)
Lescoe, Paul; Lavery, David; Bedard, Roger
1991-01-01
Development of unmanned vehicles for local navigation in terrains unstructured by humans is reviewed. Modes of navigation include teleoperation or remote control, computer assisted remote driving (CARD), and semiautonomous navigation (SAN). A first implementation of a CARD system was successfully tested using the Robotic Technology Test Vehicle developed by Jet Propulsion Laboratory. Stereo pictures were transmitted to a remotely located human operator, who performed the sensing, perception, and planning functions of navigation. A computer provided range and angle measurements and the path plan was transmitted to the vehicle which autonomously executed the path. This implementation is to be enhanced by providing passive stereo vision and a reflex control system for autonomously stopping the vehicle if blocked by an obstacle. SAN achievements include implementation of a navigation testbed on a six wheel, three-body articulated rover vehicle, development of SAN algorithms and code, integration of SAN software onto the vehicle, and a successful feasibility demonstration that represents a step forward towards the technology required for long-range exploration of the lunar or Martian surface. The vehicle includes a passive stereo vision system with real-time area-based stereo image correlation, a terrain matcher, a path planner, and a path execution planner.
ARK: Autonomous mobile robot in an industrial environment
NASA Technical Reports Server (NTRS)
Nickerson, S. B.; Jasiobedzki, P.; Jenkin, M.; Jepson, A.; Milios, E.; Down, B.; Service, J. R. R.; Terzopoulos, D.; Tsotsos, J.; Wilkes, D.
1994-01-01
This paper describes research on the ARK (Autonomous Mobile Robot in a Known Environment) project. The technical objective of the project is to build a robot that can navigate in a complex industrial environment using maps with permanent structures. The environment is not altered in any way by adding easily identifiable beacons; the robot relies on naturally occurring objects as visual landmarks for navigation. The robot is equipped with various sensors that can detect unmapped obstacles, landmarks and objects. In this paper we describe the robot's industrial environment, its architecture, a novel combined range and vision sensor, and our recent results in controlling the robot, in the real-time detection of objects using their color, and in processing the robot's range and vision sensor data for navigation.
Ribbon networks for modeling navigable paths of autonomous agents in virtual environments.
Willemsen, Peter; Kearney, Joseph K; Wang, Hongling
2006-01-01
This paper presents the Environment Description Framework (EDF) for modeling complex networks of intersecting roads and pathways in virtual environments. EDF represents information about the layout of streets and sidewalks, the rules that govern behavior on roads and walkways, and the locations of agents with respect to navigable structures. The framework serves as the substrate on which behavior programs for autonomous vehicles and pedestrians are built. Pathways are modeled as ribbons in space. The ribbon structure provides a natural coordinate frame for defining the local geometry of navigable surfaces. EDF includes a powerful runtime interface supported by robust and efficient code for locating objects on the ribbon network, for mapping between Cartesian and ribbon coordinates, and for determining behavioral constraints imposed by the environment.
Underwater terrain-aided navigation system based on combination matching algorithm.
Li, Peijuan; Sheng, Guoliang; Zhang, Xiaofei; Wu, Jingqiu; Xu, Baochun; Liu, Xing; Zhang, Yao
2018-07-01
Considering that a terrain-aided navigation (TAN) system based on the iterated closest contour point (ICCP) algorithm diverges easily when the error of the strapdown inertial navigation system (SINS) indicated track is large, a Kalman filter is adopted in the traditional ICCP algorithm: the difference between the matching result and the SINS output is used as the Kalman filter measurement, the cumulative error of the SINS is corrected in time by filter feedback correction, and the indicated track used in ICCP is improved. The mathematical model of the autonomous underwater vehicle (AUV) integrated navigation system and the observation model of TAN are built. A suitable number of matching points is designated by comparing simulation results for matching time and matching precision. Simulation experiments are carried out according to the ICCP algorithm and the mathematical model. It can be concluded from the simulation experiments that navigation accuracy and stability are improved with the proposed combinational algorithm when a suitable number of matching points is used. The integrated navigation system is effective in preventing divergence of the indicated track and can meet the underwater, long-duration, high-precision requirements of navigation systems for autonomous underwater vehicles. Copyright © 2017. Published by Elsevier Ltd.
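A minimal sketch of the combination step (the two-state error model, noise matrices and numbers are assumptions, and the ICCP contour matching itself is not shown): the difference between the terrain-match fix and the SINS-indicated position feeds a Kalman filter on the position error, whose estimate is fed back to correct the indicated track.

import numpy as np

def error_filter_step(dx, P, z, Q=np.eye(2) * 0.5, R=np.eye(2) * 9.0):
    # dx: current estimate of the SINS position error; z: measurement, i.e.
    # (terrain-match fix) - (SINS indicated position).
    P = P + Q                                  # random-walk error growth
    K = P @ np.linalg.inv(P + R)
    dx = dx + K @ (z - dx)
    P = (np.eye(2) - K) @ P
    return dx, P

sins_pos = np.array([1000.0, 2000.0])          # indicated position
match_fix = np.array([1012.0, 1990.0])         # fix from the ICCP matching
dx, P = error_filter_step(np.zeros(2), np.eye(2) * 100.0, match_fix - sins_pos)
print(sins_pos + dx, np.diag(P))               # feedback-corrected track point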
Pulsar Timing and Its Application for Navigation and Gravitational Wave Detection
NASA Astrophysics Data System (ADS)
Becker, Werner; Kramer, Michael; Sesana, Alberto
2018-02-01
Pulsars are natural cosmic clocks. On long timescales they rival the precision of terrestrial atomic clocks. Using a technique called pulsar timing, the exact measurement of pulse arrival times allows a number of applications, ranging from testing theories of gravity to detecting gravitational waves. Also an external reference system suitable for autonomous space navigation can be defined by pulsars, using them as natural navigation beacons, not unlike the use of GPS satellites for navigation on Earth. By comparing pulse arrival times measured on-board a spacecraft with predicted pulse arrivals at a reference location (e.g. the solar system barycenter), the spacecraft position can be determined autonomously and with high accuracy everywhere in the solar system and beyond. We describe the unique properties of pulsars that suggest that such a navigation system will certainly have its application in future astronautics. We also describe the on-going experiments to use the clock-like nature of pulsars to "construct" a galactic-sized gravitational wave detector for low-frequency (f_GW ~ 10^-9 - 10^-7 Hz) gravitational waves. We present the current status and provide an outlook for the future.
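A toy sketch of the timing-based positioning idea (the pulsar directions and offset are invented, and a real system must also estimate clock error and apply relativistic corrections): if each measured arrival-time difference relative to the reference location satisfies dt_i = (n_i . r) / c, a least-squares solve over several pulsars recovers the spacecraft offset r.

import numpy as np

C = 299792458.0  # speed of light, m/s

def position_from_pulsars(directions, dt):
    # Solve dt_i = (n_i . r) / c for the offset r from the reference point
    # (e.g. the solar system barycenter) in a least-squares sense.
    A = np.asarray(directions, float)     # (N, 3) unit vectors to the pulsars
    b = np.asarray(dt, float) * C         # timing offsets converted to metres
    r, *_ = np.linalg.lstsq(A, b, rcond=None)
    return r

n = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.6, 0.0, 0.8]])
true_r = np.array([1.5e7, -3.0e6, 8.0e6])      # metres
dt = (n @ true_r) / C
print(position_from_pulsars(n, dt))            # recovers true_r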
Navigation Architecture For A Space Mobile Network
NASA Technical Reports Server (NTRS)
Valdez, Jennifer E.; Ashman, Benjamin; Gramling, Cheryl; Heckler, Gregory W.; Carpenter, Russell
2016-01-01
The Tracking and Data Relay Satellite System (TDRSS) Augmentation Service for Satellites (TASS) is a proposed beacon service to provide a global, space-based GPS augmentation service based on the NASA Global Differential GPS (GDGPS) System. The TASS signal will be tied to the GPS time system and usable as an additional ranging and Doppler radiometric source. Additionally, it will provide data vital to autonomous navigation in the near Earth regime, including space weather information, TDRS ephemerides, Earth Orientation Parameters (EOP), and forward commanding capability. TASS benefits include enhancing situational awareness, enabling increased autonomy, and providing near real-time command access for user platforms. As NASA Headquarters Space Communication and Navigation Office (SCaN) begins to move away from a centralized network architecture and towards a Space Mobile Network (SMN) that allows for user initiated services, autonomous navigation will be a key part of such a system. This paper explores how a TASS beacon service enables the Space Mobile Networking paradigm, what a typical user platform would require, and provides an in-depth analysis of several navigation scenarios and operations concepts.
NASA Astrophysics Data System (ADS)
Emter, Thomas; Petereit, Janko
2014-05-01
An integrated multi-sensor fusion framework for localization and mapping for autonomous navigation in unstructured outdoor environments, based on extended Kalman filters (EKF), is presented. The sensors for localization include an inertial measurement unit, a GPS, a fiber optic gyroscope, and wheel odometry. Additionally, a 3D LIDAR is used for simultaneous localization and mapping (SLAM). A 3D map is built while, concurrently, a localization within the 2D map established so far is estimated from the current LIDAR scan. Despite the longer run-time of the SLAM algorithm compared to the EKF update, a high update rate is still guaranteed by carefully joining and synchronizing the two parallel localization estimators.
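A stripped-down EKF sketch in the spirit of the localization filter described above (the unicycle model, noise matrices and simulated GPS offsets are assumptions; the actual framework additionally fuses the fiber optic gyroscope, IMU and LIDAR SLAM estimates): wheel odometry drives the high-rate prediction, and lower-rate GPS position fixes correct the accumulated drift.

import numpy as np

def predict(x, P, v, w, dt, Q):
    # Unicycle motion model driven by odometry speed v and turn rate w.
    th = x[2]
    x = x + np.array([v * np.cos(th) * dt, v * np.sin(th) * dt, w * dt])
    F = np.array([[1.0, 0.0, -v * np.sin(th) * dt],
                  [0.0, 1.0,  v * np.cos(th) * dt],
                  [0.0, 0.0, 1.0]])
    return x, F @ P @ F.T + Q

def update_gps(x, P, z, R):
    # Position-only GPS update.
    H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    return x, (np.eye(3) - K @ H) @ P

x, P = np.zeros(3), np.eye(3)
Q, R = np.diag([0.01, 0.01, 0.001]), np.diag([2.0, 2.0])
for k in range(50):
    x, P = predict(x, P, v=1.0, w=0.05, dt=0.1, Q=Q)
    if k % 10 == 9:                        # GPS arrives at a lower rate
        x, P = update_gps(x, P, np.array([x[0] + 0.3, x[1] - 0.2]), R)
print(x, np.diag(P))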
First Results from a Hardware-in-the-Loop Demonstration of Closed-Loop Autonomous Formation Flying
NASA Technical Reports Server (NTRS)
Gill, E.; Naasz, Bo; Ebinuma, T.
2003-01-01
A closed-loop system for the demonstration of formation flying technologies has been developed at NASA s Goddard Space Flight Center. Making use of a GPS signal simulator with a dual radio frequency outlet, the system includes two GPS space receivers as well as a powerful onboard navigation processor dedicated to the GPS-based guidance, navigation, and control of a satellite formation in real-time. The closed-loop system allows realistic simulations of autonomous formation flying scenarios, enabling research in the fields of tracking and orbit control strategies for a wide range of applications. A sample scenario has been set up where the autonomous transition of a satellite formation from an initial along-track separation of 800 m to a final distance of 100 m has been demonstrated. As a result, a typical control accuracy of about 5 m has been achieved which proves the applicability of autonomous formation flying techniques to formations of satellites as close as 50 m.
Robot navigation research using the HERMIES mobile robot
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnett, D.L.
1989-01-01
In recent years robot navigation has attracted much attention from researchers around the world. Not only are theoretical studies being simulated on sophisticated computers, but many mobile robots are now used as test vehicles for these theoretical studies. Various algorithms have been perfected for navigation in a known static environment; but navigation in an unknown and dynamic environment poses a much more challenging problem for researchers. Many different methodologies have been developed for autonomous robot navigation, but each methodology is usually restricted to a particular type of environment. One important research focus of the Center for Engineering Systems Advanced Research (CESAR) at Oak Ridge National Laboratory is autonomous navigation in unknown and dynamic environments using the series of HERMIES mobile robots. The research uses an expert system for high-level planning interfaced with C-coded routines for implementing the plans, and for quick processing of data requested by the expert system. In using this approach, the navigation is not restricted to one methodology since the expert system can activate a rule module for the methodology best suited for the current situation. Rule modules can be added to the rule base as they are developed and tested. Modules are being developed or enhanced for navigating from a map, searching for a target, exploring, artificial potential-field navigation, navigation using edge detection, etc. This paper reports on the various rule modules and methods of navigation in use, or under development, at CESAR, using the HERMIES-IIB robot as a testbed. 13 refs., 5 figs., 1 tab.
Autonomous navigation and control of a Mars rover
NASA Technical Reports Server (NTRS)
Miller, D. P.; Atkinson, D. J.; Wilcox, B. H.; Mishkin, A. H.
1990-01-01
A Mars rover will need to be able to navigate autonomously for kilometers at a time. This paper outlines the sensing, perception, planning, and execution monitoring systems that are currently being designed for the rover. The sensing is based around stereo vision. The interpretation of the images uses a registration of the depth map with a global height map provided by an orbiting spacecraft. Safe, low-energy paths are then planned through the map, and expectations of what the rover's articulation sensors should sense are generated. These expectations are then used to ensure that the planned path is being executed correctly.
The JPL roadmap for Deep Space navigation
NASA Technical Reports Server (NTRS)
Martin-Mur, Tomas J.; Abraham, Douglas S.; Berry, David; Bhaskaran, Shyam; Cesarone, Robert J.; Wood, Lincoln
2006-01-01
This paper reviews the tentative set of deep space missions that will be supported by NASA's Deep Space Mission System in the next twenty-five years, and extracts the driving set of navigation capabilities that these missions will require. There will be many challenges including the support of new mission navigation approaches such as formation flying and rendezvous in deep space, low-energy and low-thrust orbit transfers, precise landing and ascent vehicles, and autonomous navigation. Innovative strategies and approaches will be needed to develop and field advanced navigation capabilities.
Autonomous Mars ascent and orbit rendezvous for earth return missions
NASA Technical Reports Server (NTRS)
Edwards, H. C.; Balmanno, W. F.; Cruz, Manuel I.; Ilgen, Marc R.
1991-01-01
The details of the assessment of autonomous Mars ascent and orbit rendezvous for earth return missions are presented. Analyses addressing navigation system assessments, trajectory planning, targeting approaches, flight control guidance strategies, and performance sensitivities are included. Tradeoffs in the analysis and design process are discussed.
Vision-Aided Autonomous Landing and Ingress of Micro Aerial Vehicles
NASA Technical Reports Server (NTRS)
Brockers, Roland; Ma, Jeremy C.; Matthies, Larry H.; Bouffard, Patrick
2012-01-01
Micro aerial vehicles have limited sensor suites and computational power. For reconnaissance tasks and to conserve energy, these systems need the ability to autonomously land at vantage points or enter buildings (ingress). But for autonomous navigation, information is needed to identify and guide the vehicle to the target. Vision algorithms can provide egomotion estimation and target detection using input from cameras that are easy to include in miniature systems.
Towards Autonomous Modular UAV Missions: The Detection, Geo-Location and Landing Paradigm
Kyristsis, Sarantis; Antonopoulos, Angelos; Chanialakis, Theofilos; Stefanakis, Emmanouel; Linardos, Christos; Tripolitsiotis, Achilles; Partsinevelos, Panagiotis
2016-01-01
Nowadays, various unmanned aerial vehicle (UAV) applications are becoming increasingly demanding, since they require real-time, autonomous and intelligent functions. Towards this end, in the present study, a fully autonomous UAV scenario is implemented, including the tasks of area scanning, target recognition, geo-location, monitoring, following and finally landing on a high-speed moving platform. The underlying methodology includes AprilTag target identification through Graphics Processing Unit (GPU) parallelized processing, image processing and several optimized location and approach algorithms employing gimbal movement, Global Navigation Satellite System (GNSS) readings and UAV navigation. For the experimentation, a commercial and a custom-made quad-copter prototype were used, representing high- and low-computational embedded platform alternatives. Beyond the successful targeting and following procedures, it is shown that the landing approach can be performed successfully even at high platform speeds. PMID:27827883
Towards Autonomous Modular UAV Missions: The Detection, Geo-Location and Landing Paradigm.
Kyristsis, Sarantis; Antonopoulos, Angelos; Chanialakis, Theofilos; Stefanakis, Emmanouel; Linardos, Christos; Tripolitsiotis, Achilles; Partsinevelos, Panagiotis
2016-11-03
Nowadays, various unmanned aerial vehicle (UAV) applications are becoming increasingly demanding, since they require real-time, autonomous and intelligent functions. Towards this end, in the present study, a fully autonomous UAV scenario is implemented, including the tasks of area scanning, target recognition, geo-location, monitoring, following and finally landing on a high-speed moving platform. The underlying methodology includes AprilTag target identification through Graphics Processing Unit (GPU) parallelized processing, image processing and several optimized location and approach algorithms employing gimbal movement, Global Navigation Satellite System (GNSS) readings and UAV navigation. For the experimentation, a commercial and a custom-made quad-copter prototype were used, representing high- and low-computational embedded platform alternatives. Beyond the successful targeting and following procedures, it is shown that the landing approach can be performed successfully even at high platform speeds.
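For illustration only, and not the authors' GPU-parallelized pipeline: the sketch below shows the basic AprilTag detection step using the open-source pupil_apriltags Python bindings. The camera index, tag family, and the offset-based gimbal cue are assumptions.

# Hypothetical sketch: detect an AprilTag in a camera frame and derive a pointing offset.
import cv2
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")   # tag family is an assumption

cap = cv2.VideoCapture(0)                  # on-board camera stream (device index is illustrative)
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for det in detector.detect(gray):
        cx, cy = det.center                # tag centre in pixel coordinates
        h, w = gray.shape
        # Normalized offsets from the image centre; a gimbal controller could drive
        # these toward zero to keep the landing target centred during descent.
        dx = (cx - w / 2.0) / (w / 2.0)
        dy = (cy - h / 2.0) / (h / 2.0)
        print(f"tag {det.tag_id}: offset x={dx:+.2f}, y={dy:+.2f}")
cap.release()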
Multidisciplinary unmanned technology teammate (MUTT)
NASA Astrophysics Data System (ADS)
Uzunovic, Nenad; Schneider, Anne; Lacaze, Alberto; Murphy, Karl; Del Giorno, Mark
2013-01-01
The U.S. Army Tank Automotive Research, Development and Engineering Center (TARDEC) held an autonomous robot competition called CANINE in June 2012. The goal of the competition was to develop innovative and natural control methods for robots. This paper describes the winning technology, including the vision system, the operator interaction, and the autonomous mobility. The rules stated only gestures or voice commands could be used for control. The robots would learn a new object at the start of each phase, find the object after it was thrown into a field, and return the object to the operator. Each of the six phases became more difficult, including clutter of the same color or shape as the object, moving and stationary obstacles, and finding the operator who moved from the starting location to a new location. The Robotic Research Team integrated techniques in computer vision, speech recognition, object manipulation, and autonomous navigation. A multi-filter computer vision solution reliably detected the objects while rejecting objects of similar color or shape, even while the robot was in motion. A speech-based interface with short commands provided close to natural communication of complicated commands from the operator to the robot. An innovative gripper design allowed for efficient object pickup. A robust autonomous mobility and navigation solution for ground robotic platforms provided fast and reliable obstacle avoidance and course navigation. The research approach focused on winning the competition while remaining cognizant and relevant to real world applications.
NASA Technical Reports Server (NTRS)
Rutishauser, David K.; Epp, Chirold; Robertson, Ed
2012-01-01
The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project is chartered to develop and mature to a Technology Readiness Level (TRL) of six an autonomous system combining guidance, navigation and control with terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. The ALHAT System must be capable of identifying and avoiding surface hazards to enable a safe and accurate landing to within tens of meters of designated and certified landing sites anywhere on a planetary surface under any lighting conditions. Since its inception in 2006, the ALHAT Project has executed four field test campaigns to characterize and mature sensors and algorithms that support real-time hazard detection and global/local precision navigation for planetary landings. The driving objective for Government Fiscal Year 2012 (GFY2012) is to successfully demonstrate autonomous, real-time, closed loop operation of the ALHAT system in a realistic free flight scenario on Earth using the Morpheus lander developed at the Johnson Space Center (JSC). This goal represents an aggressive target consistent with a lean engineering culture of rapid prototyping and development. This culture is characterized by prioritizing early implementation to gain practical lessons learned and then building on this knowledge with subsequent prototyping design cycles of increasing complexity culminating in the implementation of the baseline design. This paper provides an overview of the ALHAT/Morpheus flight demonstration activities in GFY2012, including accomplishments, current status, results, and lessons learned. The ALHAT/Morpheus effort is also described in the context of a technology path in support of future crewed and robotic planetary exploration missions based upon the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN).
Autonomous Navigation of the SSTI/Lewis Spacecraft Using the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Hart, R. C.; Long, A. C.; Lee, T.
1997-01-01
The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) is pursuing the application of Global Positioning System (GPS) technology to improve the accuracy and economy of spacecraft navigation. High-accuracy autonomous navigation algorithms are being flight qualified in conjunction with GSFC's GPS Attitude Determination Flyer (GADFLY) experiment on the Small Satellite Technology Initiative (SSTI) Lewis spacecraft, which is scheduled for launch in 1997. Preflight performance assessments indicate that these algorithms can provide a real-time total position accuracy of better than 10 meters (1 sigma) and velocity accuracy of better than 0.01 meter per second (1 sigma), with selective availability at typical levels. This accuracy is projected to improve to the 2-meter level if corrections to be provided by the GPS Wide Area Augmentation System (WAAS) are included.
DOT National Transportation Integrated Search
2008-01-28
The Volpe Center designed, implemented, and deployed a Global Positioning System (GPS) Receiver Autonomous Integrity Monitoring (RAIM) prediction system in the mid 1990s to support both Air Force and Federal Aviation Administration (FAA) use of TSO C...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pin, F.G.; de Saussure, G.; Spelt, P.F.
1988-01-01
This paper describes recent research activities at the Center for Engineering Systems Advanced Research (CESAR) in the area of sensor-based reasoning, with emphasis being given to their application and implementation on our HERMIES-IIB autonomous mobile vehicle. These activities, including navigation and exploration in a-priori unknown and dynamic environments, goal recognition, vision-guided manipulation and sensor-driven machine learning, are discussed within the framework of a scenario in which an autonomous robot is asked to navigate through an unknown dynamic environment, explore, find and dock at the panel, read and understand the status of the panel's meters and dials, learn the functioning of a process control panel, and successfully manipulate the control devices of the panel to solve a maintenance emergency problem. A demonstration of the successful implementation of the algorithms on our HERMIES-IIB autonomous robot for resolution of this scenario is presented. Conclusions are drawn concerning the applicability of the methodologies to more general classes of problems, and implications for future work on sensor-driven reasoning for autonomous robots are discussed. 8 refs., 3 figs.
Satellite Imagery Assisted Road-Based Visual Navigation System
NASA Astrophysics Data System (ADS)
Volkova, A.; Gibbens, P. W.
2016-06-01
There is a growing demand for unmanned aerial systems as autonomous surveillance, exploration and remote sensing solutions. Among the key concerns for robust operation of these systems is the need to reliably navigate the environment without reliance on a global navigation satellite system (GNSS). This is of particular concern in Defence circles, but is also a major safety issue for commercial operations. In these circumstances, the aircraft needs to navigate relying only on information from on-board passive sensors such as digital cameras. The autonomous feature-based visual system presented in this work offers a novel, integral approach to the modelling and registration of visual features that responds to the specific needs of the navigation system. It detects visual features from Google Earth* imagery to build a feature database. The same algorithm then detects features in an on-board camera's video stream. On one level this serves to localise the vehicle relative to the environment using Simultaneous Localisation and Mapping (SLAM). On a second level it correlates them with the database to localise the vehicle with respect to the inertial frame. The performance of the presented visual navigation system was compared using satellite imagery from different years. Based on the comparison results, an analysis of the effects of seasonal, structural and qualitative changes of the imagery source on the performance of the navigation algorithm is presented. * The algorithm is independent of the source of satellite imagery and another provider can be used
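A minimal sketch, not the paper's detector, of one common way to match features between a satellite reference tile and an on-board camera frame using ORB descriptors in OpenCV; the file names and thresholds are illustrative assumptions.

# Hypothetical feature-matching sketch between a reference tile and a camera frame.
import cv2

reference = cv2.imread("satellite_tile.png", cv2.IMREAD_GRAYSCALE)   # from a pre-built database
frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)         # current camera image
assert reference is not None and frame is not None, "illustrative file names; supply real images"

orb = cv2.ORB_create(nfeatures=2000)
kp_ref, des_ref = orb.detectAndCompute(reference, None)
kp_img, des_img = orb.detectAndCompute(frame, None)

# Brute-force Hamming matching with a ratio test to reject ambiguous matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
matches = matcher.knnMatch(des_img, des_ref, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

print(f"{len(good)} putative correspondences; a RANSAC homography fit over these "
      "would localise the camera frame within the reference tile.")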
Structured Kernel Subspace Learning for Autonomous Robot Navigation.
Kim, Eunwoo; Choi, Sungjoon; Oh, Songhwai
2018-02-14
This paper considers two important problems for autonomous robot navigation in a dynamic environment, where the goal is to predict pedestrian motion and control a robot with the prediction for safe navigation. While there are several methods for predicting the motion of a pedestrian and controlling a robot to avoid incoming pedestrians, it is still difficult to navigate safely in a dynamic environment due to challenges such as the varying quality and complexity of training data with unwanted noise. This paper addresses these challenges simultaneously by proposing a robust kernel subspace learning algorithm based on recent advances in nuclear-norm and l1-norm minimization. We model the motion of a pedestrian and the robot controller using Gaussian processes. The proposed method efficiently approximates a kernel matrix used in Gaussian process regression by learning a low-rank structured matrix (with symmetric positive semi-definiteness) to find an orthogonal basis, which eliminates the effects of erroneous and inconsistent data. Based on structured kernel subspace learning, we propose a robust motion model and motion controller for safe navigation in dynamic environments. We evaluate the proposed robust kernel learning in various tasks, including regression, motion prediction, and motion control problems, and demonstrate that the proposed learning-based systems are robust against outliers and outperform existing regression and navigation methods.
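As orientation only, and not the authors' nuclear-norm formulation: the sketch below builds a low-rank, symmetric positive semi-definite approximation of a Gaussian-process kernel matrix with a truncated eigendecomposition; the kernel, rank and synthetic data are assumptions.

# Illustrative low-rank PSD kernel approximation.
import numpy as np

def rbf_kernel(X, lengthscale=1.0):
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def low_rank_psd_approx(K, rank):
    """Keep the top eigenpairs and clip negatives so the result stays PSD."""
    w, V = np.linalg.eigh(K)
    idx = np.argsort(w)[::-1][:rank]
    w_top = np.clip(w[idx], 0.0, None)
    return (V[:, idx] * w_top) @ V[:, idx].T

X = np.random.default_rng(0).normal(size=(200, 2))   # synthetic pedestrian positions
K = rbf_kernel(X)
K_r = low_rank_psd_approx(K, rank=20)
print("relative approximation error:", np.linalg.norm(K - K_r) / np.linalg.norm(K))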
GPS/Optical/Inertial Integration for 3D Navigation Using Multi-Copter Platforms
NASA Technical Reports Server (NTRS)
Dill, Evan T.; Young, Steven D.; Uijt De Haag, Maarten
2017-01-01
In concert with the continued advancement of a UAS traffic management system (UTM), the proposed uses of autonomous unmanned aerial systems (UAS) have become more prevalent in both the public and private sectors. To facilitate this anticipated growth, a reliable three-dimensional (3D) positioning, navigation, and mapping (PNM) capability will be required to enable operation of these platforms in challenging environments where global navigation satellite systems (GNSS) may not be available continuously, especially when the platform's mission requires maneuvering through different and difficult environments such as outdoor open-sky, outdoor under foliage, outdoor-urban and indoor, and may include transitions between these environments. There may not be a single method to solve the PNM problem for all environments. The research presented in this paper is a subset of a broader research effort, described in [1]. The research is focused on combining data from dissimilar sensor technologies to create an integrated navigation and mapping method that can enable reliable operation in both outdoor and structured indoor environments. The integrated navigation and mapping design utilizes a Global Positioning System (GPS) receiver, an Inertial Measurement Unit (IMU), a monocular digital camera, and three short- to medium-range laser scanners. This paper describes specifically the techniques necessary to effectively integrate the monocular camera data within the established mechanization. To evaluate the developed algorithms, a hexacopter was built, equipped with the discussed sensors, and both hand-carried and flown through representative environments. This paper highlights the effect that the monocular camera has on the aforementioned sensor integration scheme's reliability, accuracy and availability.
Real-time visual mosaicking and navigation on the seafloor
NASA Astrophysics Data System (ADS)
Richmond, Kristof
Remote robotic exploration holds vast potential for gaining knowledge about extreme environments accessible to humans only with great difficulty. Robotic explorers have been sent to other solar system bodies, and on this planet into inaccessible areas such as caves and volcanoes. In fact, the largest unexplored land area on earth lies hidden in the airless cold and intense pressure of the ocean depths. Exploration in the oceans is further hindered by water's high absorption of electromagnetic radiation, which both inhibits remote sensing from the surface, and limits communications with the bottom. The Earth's oceans thus provide an attractive target for developing remote exploration capabilities. As a result, numerous robotic vehicles now routinely survey this environment, from remotely operated vehicles piloted over tethers from the surface to torpedo-shaped autonomous underwater vehicles surveying the mid-waters. However, these vehicles are limited in their ability to navigate relative to their environment. This limits their ability to return to sites with precision without the use of external navigation aids, and to maneuver near and interact with objects autonomously in the water and on the sea floor. The enabling of environment-relative positioning on fully autonomous underwater vehicles will greatly extend their power and utility for remote exploration in the furthest reaches of the Earth's waters---even under ice and under ground---and eventually in extraterrestrial liquid environments such as Europa's oceans. This thesis presents an operational, fielded system for visual navigation of underwater robotic vehicles in unexplored areas of the seafloor. The system does not depend on external sensing systems, using only instruments on board the vehicle. As an area is explored, a camera is used to capture images and a composite view, or visual mosaic, of the ocean bottom is created in real time. Side-to-side visual registration of images is combined with dead-reckoned navigation information in a framework allowing the creation and updating of large, locally consistent mosaics. These mosaics are used as maps in which the vehicle can navigate and localize itself with respect to points in the environment. The system achieves real-time performance in several ways. First, wherever possible, direct sensing of motion parameters is used in place of extracting them from visual data. Second, trajectories are chosen to enable a hierarchical search for side-to-side links which limits the amount of searching performed without sacrificing robustness. Finally, the map estimation is formulated as a sparse, linear information filter allowing rapid updating of large maps. The visual navigation enabled by the work in this thesis represents a new capability for remotely operated vehicles, and an enabling capability for a new generation of autonomous vehicles which explore and interact with remote, unknown and unstructured underwater environments. The real-time mosaic can be used on current tethered vehicles to create pilot aids and provide a vehicle user with situational awareness of the local environment and the position of the vehicle within it. For autonomous vehicles, the visual navigation system enables precise environment-relative positioning and mapping, without requiring external navigation systems, opening the way for ever-expanding autonomous exploration capabilities. 
The utility of this system was demonstrated in the field at sites of scientific interest using the ROVs Ventana and Tiburon operated by the Monterey Bay Aquarium Research Institute. A number of sites in and around Monterey Bay, California were mosaicked using the system, culminating in a complete imaging of the wreck site of the USS Macon, where real-time visual mosaics containing thousands of images were generated while navigating using only sensor systems on board the vehicle.
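As a rough illustration of the side-to-side image registration step described above (not the thesis' registration framework): phase correlation gives a translation estimate between two overlapping seafloor frames; the file names and interpretation are assumptions.

# Hypothetical image-to-image registration sketch using phase correlation.
import cv2
import numpy as np

img_a = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
img_b = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
assert img_a is not None and img_b is not None, "illustrative file names; supply real frames"

(shift_x, shift_y), response = cv2.phaseCorrelate(img_a.astype(np.float32),
                                                  img_b.astype(np.float32))
print(f"estimated image shift: ({shift_x:.1f}, {shift_y:.1f}) px, confidence {response:.2f}")

# In a mosaicking pipeline, this relative shift (scaled by altitude and camera geometry)
# would enter the map filter alongside dead-reckoned navigation as one more constraint.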
NASA Astrophysics Data System (ADS)
Napoli, Jay
2016-05-01
Precision fiber optic gyroscopes (FOGs) are critical components for an array of platforms and applications ranging from stabilization and pointing orientation of payloads and platforms to navigation and control for unmanned and autonomous systems. In addition, FOG-based inertial systems provide extremely accurate data for geo-referencing systems. Significant improvements in the performance of FOGs and FOG-based inertial systems at KVH are due, in large part, to advancements in the design and manufacture of optical fiber, as well as in manufacturing operations and signal processing. Open loop FOGs, such as those developed and manufactured by KVH Industries, offer tactical-grade performance in a robust, small package. The success of KVH FOGs and FOG-based inertial systems is due to innovations in key fields, including the development of proprietary D-shaped fiber with an elliptical core, and KVH's unique ThinFiber. KVH continually improves its FOG manufacturing processes and signal processing, which result in improved accuracies across its entire FOG product line. KVH acquired its FOG capabilities, including its patented E•Core fiber, when the company purchased Andrew Corporation's Fiber Optic Group in 1997. E•Core fiber is unique in that the light-guiding core - critical to the FOG's performance - is elliptically shaped. The elliptical core produces a fiber that has low loss and high polarization-maintaining ability. In 2010, KVH developed its ThinFiber, a 170-micron diameter fiber that retains the full performance characteristics of E•Core fiber. ThinFiber has enabled the development of very compact, high-performance open-loop FOGs, which are also used in a line of FOG-based inertial measurement units and inertial navigation systems.
Autonomous precision landing using terrain-following navigation
NASA Technical Reports Server (NTRS)
Vaughan, R. M.; Gaskell, R. W.; Halamek, P.; Klumpp, A. R.; Synnott, S. P.
1991-01-01
Terrain-following navigation studies that have been done over the past two years in the navigation system section at JPL are described. A descent-to-Mars scenario based on Mars Rover and Sample Return mission profiles is described, and navigation and image processing issues pertaining to descent phases where landmark pictures can be obtained are examined. A covariance analysis is performed to verify that landmark measurements from a terrain-following navigation system can satisfy precision landing requirements. Image processing problems involving known landmarks in actual pictures are considered. Mission design alternatives that can alleviate some of these problems are suggested.
A Self-Tuning Kalman Filter for Autonomous Navigation Using the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Truong, Son H.
1999-01-01
Most navigation systems currently operated by NASA are ground-based, and require extensive support to produce accurate results. Recently developed systems that use Kalman filter and GPS (Global Positioning Systems) data for orbit determination greatly reduce dependency on ground support, and have potential to provide significant economies for NASA spacecraft navigation. These systems, however, still rely on manual tuning from analysts. A sophisticated neuro-fuzzy component fully integrated with the flight navigation system can perform the self-tuning capability for the Kalman filter and help the navigation system recover from estimation errors in real time.
A Self-Tuning Kalman Filter for Autonomous Navigation using the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Truong, S. H.
1999-01-01
Most navigation systems currently operated by NASA are ground-based, and require extensive support to produce accurate results. Recently developed systems that use Kalman filter and GPS data for orbit determination greatly reduce dependency on ground support, and have potential to provide significant economies for NASA spacecraft navigation. These systems, however, still rely on manual tuning from analysts. A sophisticated neuro-fuzzy component fully integrated with the flight navigation system can perform the self-tuning capability for the Kalman filter and help the navigation system recover from estimation errors in real time.
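For illustration only, and not NASA's neuro-fuzzy scheme: one simple self-tuning idea is to scale the process-noise covariance whenever the measurement innovations become statistically inconsistent with their predicted covariance. The thresholds and scalar formulation below are assumptions.

# Innovation-based adaptive process-noise tuning (scalar sketch).
import numpy as np

def adaptive_q(innovations, S_pred, Q, inflate=2.0, deflate=0.9):
    """innovations: recent residuals; S_pred: predicted innovation variance (scalar case)."""
    nis = np.mean(np.square(innovations)) / S_pred   # normalized innovation squared, ideally ~1
    if nis > 2.0:          # filter is overconfident -> inflate process noise
        return Q * inflate
    if nis < 0.5:          # filter is too pessimistic -> relax process noise
        return Q * deflate
    return Q

Q = 0.01
Q = adaptive_q(innovations=np.array([1.2, -0.8, 1.5, 0.9]), S_pred=0.5, Q=Q)
print("tuned process noise:", Q)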
NASA Technical Reports Server (NTRS)
Stuart, J. R.
1984-01-01
The evolution of NASA's planetary navigation techniques is traced, and radiometric and optical data types are described. Doppler navigation; the Deep Space Network; differenced two-way range techniques; differential very long baseline interferometry; and optical navigation are treated. The Doppler system enables a spacecraft in cruise at high absolute declination to be located within a total angular uncertainty of 1/4 microrad. The two-station range measurement provides a 1 microrad backup at low declinations. Optical data locate the spacecraft relative to the target to an angular accuracy of 5 microrad. Earth-based radio navigation and its less accurate but target-relative counterpart, optical navigation, thus form complementary measurement sources, which provide a powerful sensory system to produce high-precision orbit estimates.
Autonomous Navigation Apparatus With Neural Network for a Mobile Vehicle
NASA Technical Reports Server (NTRS)
Quraishi, Naveed (Inventor)
1996-01-01
An autonomous navigation system for a mobile vehicle arranged to move within an environment includes a plurality of sensors arranged on the vehicle and at least one neural network including an input layer coupled to the sensors, a hidden layer coupled to the input layer, and an output layer coupled to the hidden layer. The neural network produces output signals representing respective positions of the vehicle, such as the X coordinate, the Y coordinate, and the angular orientation of the vehicle. A plurality of patch locations within the environment are used to train the neural networks to produce the correct outputs in response to the distances sensed.
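A minimal sketch of the idea, not the patented system: a small feed-forward network mapping a vector of range-sensor readings to an (x, y, heading) estimate. The layer sizes are assumptions and the weights are random placeholders standing in for the "patch location" training described above.

# Tiny feed-forward pose network (forward pass only).
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_hidden = 16, 32

# In practice these weights would be learned from labeled patch locations.
W1, b1 = rng.normal(size=(n_hidden, n_sensors)) * 0.1, np.zeros(n_hidden)
W2, b2 = rng.normal(size=(3, n_hidden)) * 0.1, np.zeros(3)

def predict_pose(sensor_readings):
    h = np.tanh(W1 @ sensor_readings + b1)      # hidden layer
    x, y, heading = W2 @ h + b2                 # output layer: position and orientation
    return x, y, heading

print(predict_pose(rng.uniform(0.0, 5.0, size=n_sensors)))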
Daytime Water Detection by Fusing Multiple Cues for Autonomous Off-Road Navigation
NASA Technical Reports Server (NTRS)
Rankin, A. L.; Matthies, L. H.; Huertas, A.
2004-01-01
Detecting water hazards is a significant challenge to unmanned ground vehicle autonomous off-road navigation. This paper focuses on detecting the presence of water during the daytime using color cameras. A multi-cue approach is taken. Evidence of the presence of water is generated from color, texture, and the detection of reflections in stereo range data. A rule base for fusing water cues was developed by evaluating detection results from an extensive archive of data collection imagery containing water. This software has been implemented into a run-time passive perception subsystem and tested thus far under Linux on a Pentium based processor.
Obstacle Avoidance On Roadways Using Range Data
NASA Astrophysics Data System (ADS)
Dunlay, R. Terry; Morgenthaler, David G.
1987-02-01
This report describes range data based obstacle avoidance techniques developed for use on an autonomous road-following robot vehicle. The purpose of these techniques is to detect and locate obstacles present in a road environment for navigation of a robot vehicle equipped with an active laser-based range sensor. Techniques are presented for obstacle detection, obstacle location, and coordinate transformations needed in the construction of Scene Models (symbolic structures representing the 3-D obstacle boundaries used by the vehicle's Navigator for path planning). These techniques have been successfully tested on an outdoor robotic vehicle, the Autonomous Land Vehicle (ALV), at speeds up to 3.5 km/hour.
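A hedged sketch of the kind of coordinate transformation and obstacle test described above, not the ALV implementation: a laser range return is converted from the sensor frame to a vehicle frame with z up, and flagged as an obstacle if it protrudes above the assumed flat road. Mounting height, pitch and thresholds are assumptions.

# Range return -> vehicle-frame point -> simple height-threshold obstacle test.
import numpy as np

SENSOR_HEIGHT = 2.0                     # sensor mounting height above the road (m, assumption)
SENSOR_PITCH_DOWN = np.radians(15.0)    # sensor pitched down toward the road (assumption)

def range_to_vehicle_frame(r, azimuth, elevation):
    """Spherical range return (sensor frame) -> Cartesian point in the vehicle frame (z up)."""
    p_sensor = np.array([r * np.cos(elevation) * np.cos(azimuth),
                         r * np.cos(elevation) * np.sin(azimuth),
                         r * np.sin(elevation)])
    c, s = np.cos(SENSOR_PITCH_DOWN), np.sin(SENSOR_PITCH_DOWN)
    R = np.array([[c, 0.0, s],
                  [0.0, 1.0, 0.0],
                  [-s, 0.0, c]])        # rotate the down-pitched sensor axes into vehicle axes
    return R @ p_sensor + np.array([0.0, 0.0, SENSOR_HEIGHT])

def is_obstacle(point, height_threshold=0.3):
    """Flag returns that protrude well above the assumed flat road surface."""
    return point[2] > height_threshold

p = range_to_vehicle_frame(r=4.0, azimuth=np.radians(5.0), elevation=np.radians(-5.0))
print(np.round(p, 2), "-> obstacle" if is_obstacle(p) else "-> road surface")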
Dynamic multisensor fusion for mobile robot navigation in an indoor environment
NASA Astrophysics Data System (ADS)
Jin, Taeseok; Lee, Jang-Myung; Luk, Bing L.; Tso, Shiu K.
2001-10-01
This study is a preliminary step toward developing a multi-purpose, robust autonomous carrier mobile robot to transport trolleys or heavy goods and to serve as a robotic nursing assistant in hospital wards. The aim of this paper is to present the use of multi-sensor data fusion, from sonar, a CCD camera and IR sensors, for a map-building mobile robot to navigate, and to present an experimental mobile robot designed to operate autonomously within both indoor and outdoor environments. Smart sensory systems are crucial for successful autonomous systems. We give an explanation of the robot system architecture designed and implemented in this study and a short review of existing techniques, since several recent thorough books and review papers exist on this topic. We focus instead on the main results with relevance to the intelligent service robot project at the Centre of Intelligent Design, Automation & Manufacturing (CIDAM). We conclude by discussing some possible future extensions of the project. The paper first deals with the general principles of the navigation and guidance architecture, then with the detailed functions of environment recognition and updating, obstacle detection and motion assessment, together with first results from the simulation runs.
Neuro-fuzzy controller to navigate an unmanned vehicle.
Selma, Boumediene; Chouraqui, Samira
2013-12-01
A neuro-fuzzy control method for an unmanned vehicle (UV) simulation is described. The objective is to guide an autonomous vehicle to a desired destination along a desired path in an environment characterized by a terrain and a set of distinct objects, such as obstacles (donkeys, traffic lights and cars) circulating along the trajectory. The vehicle's autonomous navigation ability and road-following precision are mainly influenced by its control strategy and real-time control performance. A fuzzy logic controller can describe the desired system behavior very well with simple "if-then" relations, allowing the designer to derive the "if-then" rules manually by trial and error. Neural networks, on the other hand, can approximate the behavior of a system but can neither interpret the solution obtained nor check whether it is plausible. The two approaches are complementary: combining them, neural networks contribute learning capability while fuzzy logic brings knowledge representation (neuro-fuzzy). In this paper, an artificial neural network fuzzy inference system (ANFIS) controller is described and implemented to navigate the autonomous vehicle. Results show several improvements in the control system adjusted by neuro-fuzzy techniques in comparison with previous methods such as an artificial neural network (ANN).
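For illustration only, and not the paper's ANFIS: the fuzzy half of such a controller can be pictured as a few "if-then" rules mapping an obstacle bearing to a steering command, using triangular memberships and weighted-average defuzzification. The membership shapes and consequent angles below are assumptions.

# Toy fuzzy steering rules: obstacle LEFT / CENTER / RIGHT -> steer away.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    return max(0.0, min((x - a) / (b - a) if b != a else 1.0,
                        (c - x) / (c - b) if c != b else 1.0))

def fuzzy_steer(obstacle_bearing_deg):
    mu_left   = tri(obstacle_bearing_deg, -90.0, -45.0, 0.0)
    mu_center = tri(obstacle_bearing_deg, -30.0, 0.0, 30.0)
    mu_right  = tri(obstacle_bearing_deg, 0.0, 45.0, 90.0)
    # Crisp consequents (degrees): left obstacle -> steer right, centered -> pick a side, etc.
    consequents = np.array([25.0, 30.0, -25.0])
    weights = np.array([mu_left, mu_center, mu_right])
    if weights.sum() == 0.0:
        return 0.0                       # no obstacle influence
    return float(weights @ consequents / weights.sum())

print(fuzzy_steer(-20.0))   # obstacle slightly left -> positive (rightward) steering command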
NASA Astrophysics Data System (ADS)
Brown, C. David; Ih, Charles S.; Arce, Gonzalo R.; Fertell, David A.
1987-01-01
Vision systems for mobile robots or autonomous vehicles navigating in an unknown terrain environment must provide a rapid and accurate method of segmenting the scene ahead into regions of pathway and background. A major distinguishing feature between the pathway and background is the three dimensional texture of these two regions. Typical methods of textural image segmentation are very computationally intensive, often lack the required robustness, and are incapable of sensing the three dimensional texture of various regions of the scene. A method is presented where scanned laser projected lines of structured light, viewed by a stereoscopically located single video camera, resulted in an image in which the three dimensional characteristics of the scene were represented by the discontinuity of the projected lines. This image was conducive to processing with simple regional operators to classify regions as pathway or background. Design of some operators and application methods, and demonstration on sample images are presented. This method provides rapid and robust scene segmentation capability that has been implemented on a microcomputer in near real time, and should result in higher speed and more reliable robotic or autonomous navigation in unstructured environments.
Non-destructive inspection in industrial equipment using robotic mobile manipulation
NASA Astrophysics Data System (ADS)
Maurtua, Iñaki; Susperregi, Loreto; Ansuategui, Ander; Fernández, Ane; Ibarguren, Aitor; Molina, Jorge; Tubio, Carlos; Villasante, Cristobal; Felsch, Torsten; Pérez, Carmen; Rodriguez, Jorge R.; Ghrissi, Meftah
2016-05-01
The MAINBOT project has developed service-robot-based applications to autonomously execute inspection tasks in extensive industrial plants, on equipment that is arranged horizontally (using ground robots) or vertically (climbing robots). The industrial objective has been to provide a means to help measure several physical parameters at multiple points with autonomous robots, able to navigate and climb structures while handling non-destructive testing sensors. MAINBOT has validated the solutions in two solar thermal plants (cylindrical-parabolic collectors and central tower), which are very demanding from a mobile manipulation point of view, mainly due to their extension (e.g. a 50 MW solar thermal plant with 400 hectares, 400,000 mirrors, 180 km of absorber tubes and a 140 m high tower), the variability of conditions (outdoor, day-night), safety requirements, etc. Once the technology was validated in simulation, the system was deployed in real setups and different validation tests were carried out. In this paper, two of the achievements related to the ground mobile inspection system are presented: (1) autonomous navigation, localization and planning algorithms to manage navigation over huge extensions, and (2) non-destructive inspection operations: thermography-based detection algorithms to provide automatic inspection abilities to the robots.
Cerebellum Augmented Rover Development
NASA Technical Reports Server (NTRS)
King, Matthew
2005-01-01
Bio-Inspired Technologies and Systems (BITS) are a very natural result of thinking about Nature's way of solving problems. Knowledge of animal behaviors can be used in developing robotic behaviors intended for planetary exploration. This is the expertise of the JPL BITS Group and has served as a philosophical model for NMSU RioRobolab. Navigation is a vital function for any autonomous system. Systems must have the ability to determine a safe path between their current location and some target location. The MER mission, as well as other JPL rover missions, uses a method known as dead reckoning to determine position information. Dead reckoning uses wheel encoders to sense the wheels' rotation. In a sandy environment such as Mars, this method is highly inaccurate because the wheels slip in the sand. Reducing positioning error will allow the speed of an autonomously navigating rover to be greatly increased. Therefore, local navigation based upon landmark tracking is desirable in planetary exploration. The BITS Group is developing navigation technology based upon landmark tracking. Integration of the current rover architecture with a cerebellar neural network tracking algorithm will demonstrate that this approach to navigation is feasible and should be implemented in future rover and spacecraft missions.
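As a hedged sketch of the wheel-encoder dead reckoning mentioned above (not the MER flight software): a differential-drive pose is advanced from encoder increments; the wheel geometry and tick counts are illustrative, and the unmodeled wheel slip is exactly what degrades this method in sand.

# Differential-drive dead reckoning from encoder increments.
import math

WHEEL_RADIUS = 0.13        # metres (assumption)
TRACK_WIDTH = 1.0          # distance between left and right wheels, metres (assumption)
TICKS_PER_REV = 2048       # encoder resolution (assumption)

def dead_reckon(pose, left_ticks, right_ticks):
    """Advance (x, y, heading) from one pair of encoder increments."""
    x, y, th = pose
    dl = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    dr = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    ds = 0.5 * (dl + dr)               # distance travelled by the rover centre
    dth = (dr - dl) / TRACK_WIDTH      # heading change
    x += ds * math.cos(th + 0.5 * dth)
    y += ds * math.sin(th + 0.5 * dth)
    return (x, y, th + dth)

pose = (0.0, 0.0, 0.0)
for left, right in [(500, 520), (480, 510), (505, 505)]:   # synthetic encoder increments
    pose = dead_reckon(pose, left, right)
print(pose)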
The study of stereo vision technique for the autonomous vehicle
NASA Astrophysics Data System (ADS)
Li, Pei; Wang, Xi; Wang, Jiang-feng
2015-08-01
Stereo vision technology using two or more cameras can recover 3D information about the field of view. This technology can effectively help the autonomous navigation system of an unmanned vehicle to judge the pavement conditions within the field of view and to measure the obstacles on the road. In this paper, stereo vision for obstacle-avoidance measurement on an autonomous vehicle is studied, and the key techniques are analyzed and discussed. The system hardware is built and the software is debugged, and finally the measurement performance is illustrated with measured data. Experiments show that the 3D structure within the field of view can be reconstructed effectively by the stereo vision technique, providing the basis for pavement condition judgment. Compared with the navigation radar used in unmanned vehicle measurement systems, the stereo vision system has advantages such as low cost, and it has good application prospects.
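For orientation only (not the paper's system): a standard dense-disparity sketch with OpenCV block matching, followed by the textbook depth relation Z = f*B/d. The file names, focal length and baseline are assumptions.

# Disparity from a rectified stereo pair, then depth from Z = f*B/d.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
assert left is not None and right is not None, "illustrative file names; supply a rectified pair"

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0   # StereoBM output is fixed-point

FOCAL_PX = 700.0    # focal length in pixels (assumption)
BASELINE_M = 0.30   # camera separation in metres (assumption)

valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = FOCAL_PX * BASELINE_M / disparity[valid]   # Z = f*B/d
print("median scene depth:", float(np.median(depth[valid])), "m")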
HERMIES-3: A step toward autonomous mobility, manipulation, and perception
NASA Technical Reports Server (NTRS)
Weisbin, C. R.; Burks, B. L.; Einstein, J. R.; Feezell, R. R.; Manges, W. W.; Thompson, D. H.
1989-01-01
HERMIES-III is an autonomous robot comprised of a seven degree-of-freedom (DOF) manipulator designed for human scale tasks, a laser range finder, a sonar array, an omni-directional wheel-driven chassis, multiple cameras, and a dual computer system containing a 16-node hypercube expandable to 128 nodes. The current experimental program involves performance of human-scale tasks (e.g., valve manipulation, use of tools), integration of a dexterous manipulator and platform motion in geometrically complex environments, and effective use of multiple cooperating robots (HERMIES-IIB and HERMIES-III). The environment in which the robots operate has been designed to include multiple valves, pipes, meters, obstacles on the floor, valves occluded from view, and multiple paths of differing navigation complexity. The ongoing research program supports the development of autonomous capability for HERMIES-IIB and III to perform complex navigation and manipulation under time constraints, while dealing with imprecise sensory information.
Development of a Coherent Lidar for Aiding Precision Soft Landing on Planetary Bodies
NASA Technical Reports Server (NTRS)
Amzajerdian, Farzin; Pierrottet, Diego; Tolson, Robert H.; Powell, Richard W.; Davidson, John B.; Peri, Frank
2005-01-01
Coherent lidar can play a critical role in future planetary exploration missions by providing key guidance, navigation, and control (GNC) data necessary for navigating planetary landers to the pre-selected site and achieving autonomous, safe soft landing. Although the landing accuracy has steadily improved over time to approximately 35 km for the recent Mars Exploration Rovers due to better approach navigation, a drastically different guidance, navigation and control concept is required to meet future mission requirements. For example, future rovers will require better than 6 km landing accuracy for Mars and better than 1 km for the Moon, plus maneuvering capability to avoid hazardous terrain features. For this purpose, an all-fiber coherent lidar is being developed to address the call for advancement of entry, descent, and landing technologies. This lidar will be capable of providing precision range to the ground and approach velocity data, and in the case of landing on Mars, it will also measure the atmospheric wind and density. The lidar obtains high resolution range information from a frequency modulated-continuous wave (FM-CW) laser beam whose instantaneous frequency varies linearly with time, and the ground vector velocity is directly extracted from the Doppler frequency shift. Utilizing the high concentration of aerosols in the Mars atmosphere (approximately two orders of magnitude higher than on Earth), the lidar can measure wind velocity with a few watts of optical power. Operating in the 1.57 micron wavelength regime, the lidar can use the differential absorption (DIAL) technique to measure the average CO2 concentration along the laser beam, which is directly proportional to the Martian atmospheric density. Employing fiber optic components allows for multi-functional lidar operation while facilitating a highly efficient, compact and reliable design suitable for integration into a spacecraft with limited mass, size, and power resources.
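For orientation only, these are the standard textbook FM-CW relations rather than figures from the cited work: for a linear chirp of bandwidth B swept over a period T_m, the measured beat frequency f_b gives range, and the Doppler shift f_d gives line-of-sight velocity,

R \approx \frac{c\, T_m\, f_b}{2B}, \qquad v_{\mathrm{LOS}} = \frac{\lambda\, f_d}{2},

where c is the speed of light and \lambda the laser wavelength.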
Sahl, Jason W; Fairfield, Nathaniel; Harris, J Kirk; Wettergreen, David; Stone, William C; Spear, John R
2010-03-01
The deep phreatic thermal explorer (DEPTHX) is an autonomous underwater vehicle designed to navigate an unexplored environment, generate high-resolution three-dimensional (3-D) maps, collect biological samples based on an autonomous sampling decision, and return to its origin. In the spring of 2007, DEPTHX was deployed in Zacatón, a deep (approximately 318 m), limestone, phreatic sinkhole (cenote) in northeastern Mexico. As DEPTHX descended, it generated a 3-D map based on the processing of range data from 54 onboard sonars. The vehicle collected water column samples and wall biomat samples throughout the depth profile of the cenote. Post-expedition sample analysis via comparative analysis of 16S rRNA gene sequences revealed a wealth of microbial diversity. Traditional Sanger gene sequencing combined with a barcoded-amplicon pyrosequencing approach revealed novel, phylum-level lineages from the domains Bacteria and Archaea; in addition, several novel subphylum lineages were also identified. Overall, DEPTHX successfully navigated and mapped Zacatón, and collected biological samples based on an autonomous decision, which revealed novel microbial diversity in a previously unexplored environment.
Systems and Methods for Automated Vessel Navigation Using Sea State Prediction
NASA Technical Reports Server (NTRS)
Huntsberger, Terrance L. (Inventor); Howard, Andrew B. (Inventor); Reinhart, Rene Felix (Inventor); Aghazarian, Hrand (Inventor); Rankin, Arturo (Inventor)
2017-01-01
Systems and methods for sea state prediction and autonomous navigation in accordance with embodiments of the invention are disclosed. One embodiment of the invention includes a method of predicting a future sea state including generating a sequence of at least two 3D images of a sea surface using at least two image sensors, detecting peaks and troughs in the 3D images using a processor, identifying at least one wavefront in each 3D image based upon the detected peaks and troughs using the processor, characterizing at least one propagating wave based upon the propagation of wavefronts detected in the sequence of 3D images using the processor, and predicting a future sea state using at least one propagating wave characterizing the propagation of wavefronts in the sequence of 3D images using the processor. Another embodiment includes a method of autonomous vessel navigation based upon a predicted sea state and target location.
Systems and Methods for Automated Vessel Navigation Using Sea State Prediction
NASA Technical Reports Server (NTRS)
Aghazarian, Hrand (Inventor); Reinhart, Rene Felix (Inventor); Huntsberger, Terrance L. (Inventor); Rankin, Arturo (Inventor); Howard, Andrew B. (Inventor)
2015-01-01
Systems and methods for sea state prediction and autonomous navigation in accordance with embodiments of the invention are disclosed. One embodiment of the invention includes a method of predicting a future sea state including generating a sequence of at least two 3D images of a sea surface using at least two image sensors, detecting peaks and troughs in the 3D images using a processor, identifying at least one wavefront in each 3D image based upon the detected peaks and troughs using the processor, characterizing at least one propagating wave based upon the propagation of wavefronts detected in the sequence of 3D images using the processor, and predicting a future sea state using at least one propagating wave characterizing the propagation of wavefronts in the sequence of 3D images using the processor. Another embodiment includes a method of autonomous vessel navigation based upon a predicted sea state and target location.
NASA Technical Reports Server (NTRS)
Larimer, Stanley J.; Lisec, Thomas R.; Spiessbach, Andrew J.
1989-01-01
Under a contract with NASA's Jet Propulsion Laboratory, Martin Marietta has developed several alternative rover concepts for unmanned exploration of the planet Mars. One of those concepts, the 'Walking Beam', is the subject of this paper. This concept was developed with the goal of achieving many of the capabilities of more sophisticated articulated-leg walkers with a much simpler, more robust, less computationally demanding and more power-efficient design. It consists of two large-base tripods nested one within the other which alternately translate with respect to each other along a 5-meter beam to propel the vehicle. The semiautonomous navigation system relies on terrain geometry sensors and tactile feedback from each foot to autonomously select a path which avoids hazards along a route designated from Earth. Both mobility and navigation features of this concept are discussed, including a top-level description of the vehicle's physical characteristics, deployment strategy, mobility elements, sensor suite, theory of operation, navigation and control processes, and estimated performance.
NASA Technical Reports Server (NTRS)
2004-01-01
This pair of pieced-together images was taken by the Mars Exploration Rover Spirit's left navigation camera looking aft on March 6, 2004. It reveals the long and rocky path of nearly 240 meters (787 feet) that Spirit had traveled since safely arriving at Gusev Crater on Jan. 3, 2004.
The lander can still be seen in the distance, but will never be 'home' again for the journeying rover. This image is also a tribute to the effectiveness of the autonomous navigation system that the rovers use during parts of their martian drives. Instead of driving directly through the 'hollow' seen in the middle right of the image, the autonomous navigation system guided Spirit around the high ridge bordering the hollow. In the two days after these images were taken, Spirit traveled roughly 60 meters (197 feet) farther toward its destination at the crater nicknamed 'Bonneville'.
Navigation-guided optic canal decompression for traumatic optic neuropathy: Two case reports.
Bhattacharjee, Kasturi; Serasiya, Samir; Kapoor, Deepika; Bhattacharjee, Harsha
2018-06-01
Two cases of traumatic optic neuropathy presented with profound loss of vision. Both cases had received a course of intravenous corticosteroids elsewhere but did not improve. They underwent navigation-guided optic canal decompression via an external transcaruncular approach, following which both cases showed visual improvement. Postoperative visual evoked potentials and optical coherence tomography of the retinal nerve fibre layer showed improvement. These case reports emphasize the role of stereotactic navigation technology for optic canal decompression in cases of traumatic optic neuropathy.
Three-dimensional motor schema based navigation
NASA Technical Reports Server (NTRS)
Arkin, Ronald C.
1989-01-01
Reactive schema-based navigation is possible in space domains by extending the methods developed for ground-based navigation found within the Autonomous Robot Architecture (AuRA). Reformulation of two dimensional motor schemas for three dimensional applications is a straightforward process. The manifold advantages of schema-based control persist, including modular development, amenability to distributed processing, and responsiveness to environmental sensing. Simulation results show the feasibility of this methodology for space docking operations in a cluttered work area.
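As a hedged sketch of schema-based reactive control, not the AuRA implementation: each active motor schema emits a velocity vector and the commanded motion is their sum, here in 3D as for a cluttered docking approach. The gains, influence radius and positions are assumptions.

# Motor-schema vector summation: move-to-goal plus avoid-obstacle.
import numpy as np

def move_to_goal(position, goal, gain=1.0):
    d = goal - position
    n = np.linalg.norm(d)
    return gain * d / n if n > 1e-9 else np.zeros(3)

def avoid_obstacle(position, obstacle, influence=2.0, gain=1.5):
    d = position - obstacle
    n = np.linalg.norm(d)
    if n > influence or n < 1e-9:
        return np.zeros(3)
    return gain * (influence - n) / influence * d / n   # push away, stronger when closer

pos = np.array([0.0, 0.0, 0.0])
goal = np.array([10.0, 0.0, 2.0])          # e.g. a docking target
obstacle = np.array([1.0, 0.5, 0.2])       # nearby clutter

command = move_to_goal(pos, goal) + avoid_obstacle(pos, obstacle)
print("commanded velocity vector:", np.round(command, 2))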
Autonomous navigation system. [gyroscopic pendulum for air navigation
NASA Technical Reports Server (NTRS)
Merhav, S. J. (Inventor)
1981-01-01
An inertial navigation system utilizing a servo-controlled two degree of freedom pendulum to obtain specific force components in the locally level coordinate system is described. The pendulum includes a leveling gyroscope and an azimuth gyroscope supported on a two gimbal system. The specific force components in the locally level coordinate system are converted to components in the geographical coordinate system by means of a single Euler transformation. The standard navigation equations are solved to determine longitudinal and lateral velocities. Finally, vehicle position is determined by a further integration.
Miniature wide field-of-view star trackers for spacecraft attitude sensing and navigation
NASA Technical Reports Server (NTRS)
Mccarty, William; Curtis, Eric; Hull, Anthony; Morgan, William
1993-01-01
Introducing a family of miniature, wide field-of-view star trackers for low cost, high performance spacecraft attitude determination and navigation applications. These devices, derivative of the WFOV Star Tracker Camera developed cooperatively by OCA Applied Optics and the Lawrence Livermore National Laboratory for the Brilliant Pebbles program, offer a suite of options addressing a wide range of spacecraft attitude measurement and control requirements. These sensors employ much wider fields than are customary (ranging between 20 and 60 degrees) to assure enough bright stars for quick and accurate attitude determinations without long integration intervals. The key benefits of this approach are light weight, low power, reduced data processing loads and high information carrier rates for wide ACS bandwidths. Devices described range from the proven OCA/LLNL WFOV Star Tracker Camera (a low-cost, space-qualified star-field imager that uses the spacecraft's own computer for centroiding and position-finding), to a new autonomous subsystem design featuring dual-redundant cameras and completely self-contained star-field data processing with output quaternion solutions accurate to 100 micro-rad, 3 sigma, for stand-alone applications.
Interplanetary approach optical navigation with applications
NASA Technical Reports Server (NTRS)
Jerath, N.
1978-01-01
The use of optical data from onboard television cameras for the navigation of interplanetary spacecraft during the planet approach phase is investigated. Three optical data types were studied: the planet limb with auxiliary celestial references, the satellite-star method, and the planet-star two-camera method. Analysis and modelling issues related to the nature and information content of the optical methods were examined. Dynamic and measurement system modelling, data sequence design, measurement extraction, model estimation and orbit determination, as they relate to optical navigation, are discussed, and the various error sources are analyzed. The methodology developed was applied to the Mariner 9 and the Viking Mars missions. Navigation accuracies were evaluated at the control and knowledge points, with particular emphasis devoted to the combined use of radio and optical data. A parametric probability analysis technique was developed to evaluate navigation performance as a function of system reliabilities.
Autonomous navigation and mobility for a planetary rover
NASA Technical Reports Server (NTRS)
Miller, David P.; Mishkin, Andrew H.; Lambert, Kenneth E.; Bickler, Donald; Bernard, Douglas E.
1989-01-01
This paper presents an overview of the onboard subsystems that will be used in guiding a planetary rover. Particular emphasis is placed on the planning and sensing systems and their associated costs, particularly in computation. Issues that will be used in evaluating trades between the navigation system and mobility system are also presented.
Autonomous Navigation With Ground Station One-Way Forward-Link Doppler Data
NASA Technical Reports Server (NTRS)
Horstkamp, G. M.; Niklewski, D. J.; Gramling, C. J.
1996-01-01
The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) has spent several years developing operational onboard navigation systems (ONS's) to provide real time autonomous, highly accurate navigation products for spacecraft using NASA's space and ground communication systems. The highly successful Tracking and Data Relay Satellite (TDRSS) ONS (TONS) experiment on the Explorer Platform/Extreme Ultraviolet (EP/EUV) spacecraft, launched on June 7, 1992, flight demonstrated the ONS for high accuracy navigation using TDRSS forward link communication services. In late 1994, a similar ONS experiment was performed using EP/EUV flight hardware (the ultrastable oscillator and Doppler extractor card in one of the TDRSS transponders) and ground system software to demonstrate the feasibility of using an ONS with ground station forward link communication services. This paper provides a detailed evaluation of ground station-based ONS performance of data collected over a 20 day period. The ground station ONS (GONS) experiment results are used to project the expected performance of an operational system. The GONS processes Doppler data derived from scheduled ground station forward link services using a sequential estimation algorithm enhanced by a sophisticated process noise model to provide onboard orbit and frequency determination. Analysis of the GONS experiment performance indicates that real time onboard position accuracies of better than 125 meters (1 sigma) are achievable with two or more 5-minute contacts per day for the EP/EUV 525 kilometer altitude, 28.5 degree inclination orbit. GONS accuracy is shown to be a function of the fidelity of the onboard propagation model, the frequency/geometry of the tracking contacts, and the quality of the tracking measurements. GONS provides a viable option for using autonomous navigation to reduce operational costs for upcoming spacecraft missions with moderate position accuracy requirements.
Concepts for fast acquisition in optical communications systems
NASA Astrophysics Data System (ADS)
Wilkerson, Brandon L.; Giggenbach, Dirk; Epple, Bernhard
2006-09-01
As free-space laser communications systems proliferate due to improved technology and transmission techniques, optical communication networks comprised of ground stations, aircraft, high altitude platforms, and satellites become an attainable goal. An important consideration for optical networks is the ability of optical communication terminals (OCT) to quickly locate one another and align their laser beams to initiate the acquisition sequence. This paper investigates promising low-cost technologies and novel approaches that will facilitate the targeting and acquisition tasks between counter terminals. Specifically, two critical technology areas are investigated: position determination (which includes location and attitude determination) and inter-terminal communications. A feasibility study identified multiple-antenna global navigation satellite system (GNSS) systems and GNSS-aided inertial systems as possible position determination solutions. Personal satellite communication systems (e.g. Iridium or Inmarsat), third generation cellular technology (IMT-2000/UMTS), and a relatively new air traffic surveillance technology called Autonomous Dependent Surveillance-Broadcast (ADS-B) were identified as possible inter-terminal communication solutions. A GNSS-aided inertial system and an ADS-B system were integrated into an OCT to demonstrate their utility in a typical optical communication scenario. Testing showed that these technologies have high potential in future OCTs, although improvements can be made to both to increase tracking accuracy.
Results of NASA's First Autonomous Formation Flying Experiment: Earth Observing-1 (EO-1)
NASA Technical Reports Server (NTRS)
Folta, David C.; Hawkins, Albin; Bauer, Frank H. (Technical Monitor)
2001-01-01
NASA's first autonomous formation flying mission completed its primary goal of demonstrating an advanced technology called enhanced formation flying. To enable this technology, the Guidance, Navigation, and Control center at the Goddard Space Flight Center (GSFC) implemented a universal 3-axis formation flying algorithm in an autonomous executive flight code onboard the New Millennium Program's (NMP) Earth Observing-1 (EO-1) spacecraft. This paper describes the mathematical background of the autonomous formation flying algorithm and the onboard flight design and presents the validation results of this unique system. Results from functionality assessment through fully autonomous maneuver control are presented as comparisons between the onboard EO-1 operational autonomous control system called AutoCon(tm), its ground-based predecessor, and a standalone algorithm.
Cybersecurity for aerospace autonomous systems
NASA Astrophysics Data System (ADS)
Straub, Jeremy
2015-05-01
High-profile breaches have occurred across numerous information systems. One area where attacks are particularly problematic is autonomous control systems. This paper considers the aerospace information system, focusing on elements that interact with autonomous control systems (e.g., onboard UAVs). It discusses the trust placed in the autonomous systems and supporting systems (e.g., navigational aids) and how this trust can be validated. Approaches to remotely detecting UAV compromise, without relying on the onboard software (which may itself be compromised) as part of the process, are discussed. How different levels of autonomy (task-based, goal-based, mission-based) affect this remote characterization is also considered.
A Self-Tuning Kalman Filter for Autonomous Spacecraft Navigation
NASA Technical Reports Server (NTRS)
Truong, Son H.
1998-01-01
Most navigation systems currently operated by NASA are ground-based, and require extensive support to produce accurate results. Recently developed systems that use Kalman Filter and Global Positioning System (GPS) data for orbit determination greatly reduce dependency on ground support, and have potential to provide significant economies for NASA spacecraft navigation. Current techniques of Kalman filtering, however, still rely on manual tuning from analysts, and cannot help in optimizing autonomy without compromising accuracy and performance. This paper presents an approach to produce a high accuracy autonomous navigation system fully integrated with the flight system. The resulting system performs real-time state estimation by using an Extended Kalman Filter (EKF) implemented with high-fidelity state dynamics model, as does the GPS Enhanced Orbit Determination Experiment (GEODE) system developed by the NASA Goddard Space Flight Center. Augmented to the EKF is a sophisticated neural-fuzzy system, which combines the explicit knowledge representation of fuzzy logic with the learning power of neural networks. The fuzzy-neural system performs most of the self-tuning capability and helps the navigation system recover from estimation errors. The core requirement is a method of state estimation that handles uncertainties robustly, capable of identifying estimation problems, flexible enough to make decisions and adjustments to recover from these problems, and compact enough to run on flight hardware. The resulting system can be extended to support geosynchronous spacecraft and high-eccentricity orbits. Mathematical methodology, systems and operations concepts, and implementation of a system prototype are presented in this paper. Results from the use of the prototype to evaluate optimal control algorithms implemented are discussed. Test data and major control issues (e.g., how to define specific roles for fuzzy logic to support the self-learning capability) are also discussed. In addition, architecture of a complete end-to-end candidate flight system that provides navigation with highly autonomous control using data from GPS is presented.
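The paper's neural-fuzzy tuner adjusts filter parameters online; as a much cruder stand-in for that idea, the sketch below scales the process-noise covariance based on the normalized innovation squared (NIS), a common heuristic for self-tuning filters. The dynamics, gains, and thresholds are assumptions for illustration, not the GEODE or prototype flight values.

import numpy as np

def adaptive_kf_step(x, P, Q, F, H, R, z, nis_target=1.0, gain=0.1):
    """One Kalman step with a simple innovation-based process-noise adjustment.

    This is a heuristic stand-in for a learned (e.g., neural-fuzzy) tuner:
    if innovations are consistently larger than the filter expects, inflate Q;
    if they are smaller, deflate it.
    """
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Innovation and its covariance
    y = z - H @ x
    S = H @ P @ H.T + R
    nis = float(y.T @ np.linalg.solve(S, y)) / len(z)   # normalized innovation squared
    # Measurement update
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    # Self-tuning: nudge Q toward consistency with the observed innovations
    Q = Q * (1.0 + gain * (nis - nis_target))
    return x, P, Q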
State estimation for autonomous flight in cluttered environments
NASA Astrophysics Data System (ADS)
Langelaan, Jacob Willem
Safe, autonomous operation in complex, cluttered environments is a critical challenge facing autonomous mobile systems. The research described in this dissertation was motivated by a particularly difficult example of autonomous mobility: flight of a small Unmanned Aerial Vehicle (UAV) through a forest. In cluttered environments (such as forests or natural and urban canyons) signals from navigation beacons such as GPS may frequently be occluded. Direct measurements of vehicle position are therefore unavailable, and information required for flight control, obstacle avoidance, and navigation must be obtained using only on-board sensors. However, payload limitations of small UAVs restrict both the mass and physical dimensions of sensors that can be carried. This dissertation describes the development and proof-of-concept demonstration of a navigation system that uses only a low-cost inertial measurement unit and a monocular camera. Microelectromechanical inertial measurement units are well suited to small UAV applications and provide measurements of acceleration and angular rate. However, they do not provide information about nearby obstacles (needed for collision avoidance), and their noise and bias characteristics lead to unbounded growth in computed position. A monocular camera can provide bearings to nearby obstacles and landmarks. These bearings can be used both to enable obstacle avoidance and to aid navigation. Presented here is a solution to the problem of estimating vehicle state (position, orientation and velocity) as well as positions of obstacles in the environment using only inertial measurements and bearings to obstacles. This is a highly nonlinear estimation problem, and standard estimation techniques such as the Extended Kalman Filter are prone to divergence in this application. In this dissertation a Sigma Point Kalman Filter is implemented, resulting in an estimator that is able to cope with the significant nonlinearities in the system equations and uncertainty in state estimates while remaining tractable for real-time operation. In addition, the issues of data association and landmark initialization are addressed. Estimator performance is examined through Monte Carlo simulations in both two and three dimensions for scenarios involving UAV flight in cluttered environments. Hardware tests and simulations demonstrate navigation through an obstacle-strewn environment by a small Unmanned Ground Vehicle.
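For reference, the core of a Sigma Point (unscented) Kalman Filter is the deterministic sigma-point set drawn from the state mean and covariance. The sketch below shows the standard generation and weighting step; it is a generic textbook formulation, not the estimator implemented in the dissertation.

import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """Generate 2n+1 sigma points and their weights for the unscented transform."""
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)           # scaled matrix square root
    pts = np.zeros((2 * n + 1, n))
    pts[0] = x
    for i in range(n):
        pts[1 + i] = x + S[:, i]
        pts[1 + n + i] = x - S[:, i]
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))  # mean weights
    wc = wm.copy()                                  # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    return pts, wm, wc

# Propagating through a nonlinear function f and recovering the mean:
# ys = np.array([f(p) for p in pts]); y_mean = wm @ ys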
NASA Astrophysics Data System (ADS)
Wright, Adam A.; Momin, Orko; Shin, Young Ho; Shakya, Rahul; Nepal, Kumud; Ahlgren, David J.
2010-01-01
This paper presents the application of a distributed systems architecture to an autonomous ground vehicle, Q, that participates in both the autonomous and navigation challenges of the Intelligent Ground Vehicle Competition. In the autonomous challenge the vehicle is required to follow a course, while avoiding obstacles and staying within the course boundaries, which are marked by white lines. For the navigation challenge, the vehicle is required to reach a set of target destinations, known as way points, with given GPS coordinates and avoid obstacles that it encounters in the process. Previously the vehicle utilized a single laptop to execute all processing activities including image processing, sensor interfacing and data processing, path planning and navigation algorithms, and motor control. National Instruments' (NI) LabVIEW served as the programming language for software implementation. As an upgrade to last year's design, an NI compact Reconfigurable Input/Output system (cRIO) was incorporated into the system architecture. The cRIO is NI's solution for rapid prototyping and is equipped with a real-time processor, an FPGA, and modular input/output. Under the current system, the real-time processor handles the path planning and navigation algorithms, while the FPGA gathers and processes sensor data. This setup leaves the laptop free to focus on running the image processing algorithm. Image processing, as previously presented by Nepal et al., is a multi-step line extraction algorithm and constitutes the largest processor load. This distributed approach speeds up the image processing algorithm, which was previously Q's bottleneck. Additionally, the path planning and navigation algorithms are executed more reliably on the real-time processor due to its deterministic operation. The implementation of this architecture required exploration of various inter-system communication techniques. Data transfer between the laptop and the real-time processor using UDP packets was established as the most reliable protocol after testing various options. Improvement can be made to the system by migrating more algorithms to the hardware-based FPGA to further speed up the operations of the vehicle.
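The inter-processor data link described above passes UDP packets between the laptop and the real-time processor. A minimal Python sketch of that pattern is shown below; the address, port, and payload fields are hypothetical placeholders, and the actual vehicle software is written in LabVIEW rather than Python.

import json
import socket

RT_TARGET_ADDR = ("192.168.1.10", 5005)   # hypothetical real-time processor address

# Sender side (e.g., the laptop publishing lane-line detections)
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
payload = json.dumps({"lane_offset_m": 0.42, "heading_err_deg": -3.1}).encode()
tx.sendto(payload, RT_TARGET_ADDR)

# Receiver side (e.g., the real-time processor's navigation loop)
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("", 5005))
data, addr = rx.recvfrom(1024)            # blocks until a datagram arrives
msg = json.loads(data.decode())
print("received", msg, "from", addr)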
A Behavior-Based Strategy for Single and Multi-Robot Autonomous Exploration
Cepeda, Jesus S.; Chaimowicz, Luiz; Soto, Rogelio; Gordillo, José L.; Alanís-Reyes, Edén A.; Carrillo-Arce, Luis C.
2012-01-01
In this paper, we consider the problem of autonomous exploration of unknown environments with single and multiple robots. This is a challenging task, with several potential applications. We propose a simple yet effective approach that combines a behavior-based navigation with an efficient data structure to store previously visited regions. This allows robots to safely navigate, disperse and efficiently explore the environment. A series of experiments performed using a realistic robotic simulator and a real testbed scenario demonstrate that our technique effectively distributes the robots over the environment and allows them to quickly accomplish their mission in large open spaces, narrow cluttered environments, dead-end corridors, as well as rooms with minimum exits.
Design of an autonomous exterior security robot
NASA Technical Reports Server (NTRS)
Myers, Scott D.
1994-01-01
This paper discusses the requirements and preliminary design of a robotic vehicle for performing autonomous exterior perimeter security patrols around warehouse areas, ammunition supply depots, and industrial parks for the U.S. Department of Defense. The preliminary design allows for the operation of up to eight vehicles in a six-kilometer by six-kilometer zone with autonomous navigation and obstacle avoidance. In addition to detecting crawling intruders at 100 meters, the system must perform real-time inventory checking and database comparisons using a microwave tag system.
Rhee, Seung Joon; Park, Shi Hwan; Cho, He Myung
2014-01-01
Purpose The purpose of this study is to compare and analyze the precision of optical and electromagnetic navigation systems in total knee arthroplasty (TKA). Materials and Methods We retrospectively reviewed 60 patients who underwent TKA using an optical navigation system and 60 patients who underwent TKA using an electromagnetic navigation system from June 2010 to March 2012. The mechanical axes measured on preoperative radiographs and by the intraoperative navigation systems were compared between the groups. The postoperative positions of the femoral and tibial components in the sagittal and coronal planes were assessed. Results The difference between the mechanical axis measured on the preoperative radiograph and that measured by the intraoperative navigation system was 0.6 degrees more varus in the electromagnetic navigation group than in the optical navigation group, but the difference between the two groups was not statistically significant (p>0.05). The positions of the femoral and tibial components in the sagittal and coronal planes on the postoperative radiographs also showed no statistically significant difference between the two groups (p>0.05). Conclusions In TKA, both optical and electromagnetic navigation systems showed high accuracy and reproducibility, and the measurements from the postoperative radiographs showed no significant difference between the two groups. PMID:25505703
Searching Lost People with Uavs: the System and Results of the Close-Search Project
NASA Astrophysics Data System (ADS)
Molina, P.; Colomina, I.; Vitoria, T.; Silva, P. F.; Skaloud, J.; Kornus, W.; Prades, R.; Aguilera, C.
2012-07-01
This paper will introduce the goals, concept and results of the project named CLOSE-SEARCH, which stands for 'Accurate and safe EGNOS-SoL Navigation for UAV-based low-cost Search-And-Rescue (SAR) operations'. The main goal is to integrate a medium-size, helicopter-type Unmanned Aerial Vehicle (UAV), a thermal imaging sensor and an EGNOS-based multi-sensor navigation system, including an Autonomous Integrity Monitoring (AIM) capability, to support search operations in difficult-to-access areas and/or night operations. The focus of the paper is three-fold. Firstly, the operational and technical challenges of the proposed approach are discussed, such as ultra-safe multi-sensor navigation system, the use of combined thermal and optical vision (infrared plus visible) for person recognition and Beyond-Line-Of-Sight communications among others. Secondly, the implementation of the integrity concept for UAV platforms is discussed herein through the AIM approach. Based on the potential of the geodetic quality analysis and on the use of the European EGNOS system as a navigation performance starting point, AIM approaches integrity from the precision standpoint; that is, the derivation of Horizontal and Vertical Protection Levels (HPLs, VPLs) from a realistic precision estimation of the position parameters is performed and compared to predefined Alert Limits (ALs). Finally, some results from the project test campaigns are described to report on particular project achievements. Together with actual Search-and-Rescue teams, the system was operated in realistic, user-chosen test scenarios. In this context, and specially focusing on the EGNOS-based UAV navigation, the AIM capability and also the RGB/thermal imaging subsystem, a summary of the results is presented.
Relative Navigation of Formation-Flying Satellites
NASA Technical Reports Server (NTRS)
Long, Anne; Kelbel, David; Lee, Taesul; Leung, Dominic; Carpenter, J. Russell; Grambling, Cheryl
2002-01-01
This paper compares autonomous relative navigation performance for formations in eccentric, medium and high-altitude Earth orbits using Global Positioning System (GPS) Standard Positioning Service (SPS), crosslink, and celestial object measurements. For close formations, the relative navigation accuracy is highly dependent on the magnitude of the uncorrelated measurement errors. A relative navigation position accuracy of better than 10 centimeters root-mean-square (RMS) can be achieved for medium-altitude formations that can continuously track at least one GPS signal. A relative navigation position accuracy of better than 15 meters RMS can be achieved for high-altitude formations that have sparse tracking of the GPS signals. The addition of crosslink measurements can significantly improve relative navigation accuracy for formations that use sparse GPS tracking or celestial object measurements for absolute navigation.
Rule-based navigation control design for autonomous flight
NASA Astrophysics Data System (ADS)
Contreras, Hugo; Bassi, Danilo
2008-04-01
This article describes a navigation control system design based on a set of rules for following a desired trajectory. The full control of the aircraft considered here comprises a low-level stability control loop, based on a classic PID controller, and a higher-level navigation layer whose main job is to exercise lateral (course) control and altitude control while trying to follow the desired trajectory. The rules and PID gains were adjusted systematically according to the results of flight simulation. In spite of its simplicity, the rule-based navigation control proved to be robust, even with large perturbations such as crosswinds.
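As a rough illustration of the layered structure described above (rule-based course commands on top of a PID stability loop), the sketch below shows a toy lateral-guidance rule feeding a PID controller. The gains, thresholds, and intercept geometry are arbitrary assumptions, not the article's tuned values.

import math

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, err, dt):
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def course_rule(cross_track_err_m, course_err_rad):
    """Rule-based lateral guidance: bias the commanded course toward the track,
    with a hard cap so large cross-track errors do not command aggressive turns."""
    correction = math.atan2(cross_track_err_m, 100.0)     # gentle intercept geometry
    correction = max(-math.radians(30), min(math.radians(30), correction))
    return course_err_rad + correction

roll_pid = PID(kp=0.8, ki=0.05, kd=0.2)   # inner-loop stability controller (toy gains)
cmd = roll_pid.step(course_rule(cross_track_err_m=15.0, course_err_rad=0.05), dt=0.02)
print("roll command (rad):", cmd)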
TDRSS Onboard Navigation System (TONS) experiment for the Explorer Platform (EP)
NASA Astrophysics Data System (ADS)
Gramling, C. J.; Hornstein, R. S.; Long, A. C.; Samii, M. V.; Elrod, B. D.
A TDRSS Onboard Navigation System (TONS) is currently being developed by NASA to provide a high-accuracy autonomous spacecraft navigation capability for users of TDRSS and its successor, the Advanced TDRSS. A TONS experiment will be performed in conjunction with the Explorer Platform (EP)/EUV Explorer mission to flight-qualify TONS Block I. This paper presents an overview of TDRSS on-board navigation goals and plans and the technical objectives of the TONS experiment. The operations concept of the experiment is described, including the characteristics of the ultrastable oscillator, the Doppler extractor, the signal-acquisition process, the TONS ground-support system, and the navigation flight software. A description of the on-board navigation algorithms and the rationale for their selection is also presented.
Preliminary Results of NASA's First Autonomous Formation Flying Experiment: Earth Observing-1 (EO-1)
NASA Technical Reports Server (NTRS)
Folta, David; Hawkins, Albin
2001-01-01
NASA's first autonomous formation flying mission is completing a primary goal of demonstrating an advanced technology called enhanced formation flying. To enable this technology, the Guidance, Navigation, and Control center at the Goddard Space Flight Center has implemented an autonomous universal three-axis formation flying algorithm in executive flight code onboard the New Millennium Program's (NMP) Earth Observing-1 (EO-1) spacecraft. This paper describes the mathematical background of the autonomous formation flying algorithm and the onboard design and presents the preliminary validation results of this unique system. Results from functionality assessment and autonomous maneuver control are presented as comparisons between the onboard EO-1 operational autonomous control system called AutoCon(tm), its ground-based predecessor, and a stand-alone algorithm.
The Role of X-Rays in Future Space Navigation and Communication
NASA Technical Reports Server (NTRS)
Winternitz, Luke M. B.; Gendreau, Keith C.; Hasouneh, Monther A.; Mitchell, Jason W.; Fong, Wai H.; Lee, Wing-Tsz; Gavriil, Fotis; Arzoumanian, Zaven
2013-01-01
In the near future, applications using X-rays will enable autonomous navigation and time distribution throughout the solar system, high capacity and low-power space data links, highly accurate attitude sensing, and extremely high-precision formation flying capabilities. Each of these applications alone has the potential to revolutionize mission capabilities, particularly beyond Earth orbit. This paper will outline the NASA Goddard Space Flight Center vision and efforts toward realizing the full potential of X-ray navigation and communications.
Robust analysis of an underwater navigational strategy in electrically heterogeneous corridors.
Dimble, Kedar D; Ranganathan, Badri N; Keshavan, Jishnu; Humbert, J Sean
2016-08-01
Obstacles and other global stimuli provide relevant navigational cues to a weakly electric fish. In this work, robust analysis of a control strategy based on electrolocation for performing obstacle avoidance in electrically heterogeneous corridors is presented and validated. Static output feedback control is shown to achieve the desired goal of reflexive obstacle avoidance in such environments in simulation and experimentation. The proposed approach is computationally inexpensive and readily implementable on a small scale underwater vehicle, making underwater autonomous navigation feasible in real-time.
NASA Technical Reports Server (NTRS)
Bishop, Robert H.; DeMars, Kyle; Trawny, Nikolas; Crain, Tim; Hanak, Chad; Carson, John M.; Christian, John
2016-01-01
The navigation filter architecture successfully deployed on the Morpheus flight vehicle is presented. The filter was developed as a key element of the NASA Autonomous Landing and Hazard Avoidance Technology (ALHAT) project and over the course of 15 free flights was integrated into the Morpheus vehicle, operations, and flight control loop. Flight testing was completed by demonstrating autonomous hazard detection and avoidance, integration of altimeter, surface-relative velocity (velocimeter), and hazard relative navigation (HRN) measurements into the onboard dual-state inertial estimator Kalman filter software, and landing within 2 meters of the vertical testbed GPS-based navigation solution at the safe landing site target. Morpheus followed a trajectory that included an ascent phase followed by a partial descent-to-landing, although the proposed filter architecture is applicable to more general planetary precision entry, descent, and landings. The main new contribution is the incorporation of a sophisticated hazard relative navigation sensor, originally intended to locate safe landing sites, into the navigation system and its employment as a navigation sensor. The formulation of a dual-state inertial extended Kalman filter was designed to address the precision planetary landing problem when viewed as a rendezvous problem with an intended landing site. For the required precision navigation system that is capable of navigating along a descent-to-landing trajectory to a precise landing, the impact of attitude errors on the translational state estimation is included in a fully integrated navigation structure in which translation state estimation is combined with attitude state estimation. The map tie errors are estimated as part of the process, thereby creating a dual-state filter implementation. Also, the filter is implemented using inertial states rather than states relative to the target. External measurements include an altimeter, a velocimeter, a star camera, a terrain relative navigation sensor, and a hazard relative navigation sensor providing information regarding hazards on a map generated on-the-fly.
Insect-Based Vision for Autonomous Vehicles: A Feasibility Study
NASA Technical Reports Server (NTRS)
Srinivasan, Mandyam V.
1999-01-01
The aims of the project were to use a high-speed digital video camera to pursue two questions: (i) to explore the influence of temporal imaging constraints on the performance of vision systems for autonomous mobile robots; (ii) to study the fine structure of insect flight trajectories in order to better understand the characteristics of flight control, orientation and navigation.
Insect-Based Vision for Autonomous Vehicles: A Feasibility Study
NASA Technical Reports Server (NTRS)
Srinivasan, Mandyam V.
1999-01-01
The aims of the project were to use a high-speed digital video camera to pursue two questions: (1) To explore the influence of temporal imaging constraints on the performance of vision systems for autonomous mobile robots; (2) To study the fine structure of insect flight trajectories in order to better understand the characteristics of flight control, orientation and navigation.
Terrain discovery and navigation of a multi-articulated linear robot using map-seeking circuits
NASA Astrophysics Data System (ADS)
Snider, Ross K.; Arathorn, David W.
2006-05-01
A significant challenge in robotics is providing a robot with the ability to sense its environment and then autonomously move while accommodating obstacles. The DARPA Grand Challenge, one of the most visible examples, set the goal of driving a vehicle autonomously for over a hundred miles avoiding obstacles along a predetermined path. Map-Seeking Circuits have shown their biomimetic capability in both vision and inverse kinematics and here we demonstrate their potential usefulness for intelligent exploration of unknown terrain using a multi-articulated linear robot. A robot that could handle any degree of terrain complexity would be useful for exploring inaccessible crowded spaces such as rubble piles in emergency situations, patrolling/intelligence gathering in tough terrain, tunnel exploration, and possibly even planetary exploration. Here we simulate autonomous exploratory navigation by an interaction of terrain discovery using the multi-articulated linear robot to build a local terrain map and exploitation of that growing terrain map to solve the propulsion problem of the robot.
On-Orbit Autonomous Assembly from Nanosatellites
NASA Technical Reports Server (NTRS)
Murchison, Luke S.; Martinez, Andres; Petro, Andrew
2015-01-01
The On-Orbit Autonomous Assembly from Nanosatellites (OAAN) project will demonstrate autonomous control algorithms for rendezvous and docking maneuvers; low-power reconfigurable magnetic docking technology; and compact, lightweight and inexpensive precision relative navigation using carrier-phase differential GPS (CDGPS) with a three-degree-of-freedom ground demonstration. CDGPS is a relative position determination method that measures the phase of the GPS carrier wave to yield relative position data accurate to 0.4 inch (1 centimeter). CDGPS is a technology commonly found in the surveying industry. The development and demonstration of these technologies will fill a current gap in the availability of proven autonomous rendezvous and docking systems for small satellites.
Linear FMCW Laser Radar for Precision Range and Vector Velocity Measurements
NASA Technical Reports Server (NTRS)
Pierrottet, Diego; Amzajerdian, Farzin; Petway, Larry; Barnes, Bruce; Lockhard, George; Rubio, Manuel
2008-01-01
An all-fiber linear frequency modulated continuous wave (FMCW) coherent laser radar system is under development with a goal of aiding NASA's new Space Exploration initiative for manned and robotic missions to the Moon and Mars. By employing a combination of optical heterodyne and linear frequency modulation techniques and utilizing state-of-the-art fiber optic technologies, a highly efficient, compact and reliable laser radar suitable for operation in a space environment is being developed. Linear FMCW lidar has the capability of high-resolution range measurements, and when configured into a multi-channel receiver system it can obtain high-precision horizontal and vertical velocity measurements. Precision range and vector velocity data are beneficial for navigating planetary landing pods to the preselected site and achieving an autonomous, safe soft landing. The all-fiber coherent laser radar has several important advantages over more conventional pulsed laser altimeters or range finders. One advantage of the coherent laser radar is its ability to measure the platform velocity directly by extracting the Doppler shift generated from the motion, as opposed to time-of-flight range finders, where terrain features such as hills, cliffs, or slopes add error to the velocity measurement. Doppler measurements are about two orders of magnitude more accurate than the velocity estimates obtained by pulsed laser altimeters. In addition, most of the components of the device are efficient and reliable commercial off-the-shelf fiber optic telecommunication components. This paper discusses the design and performance of a second-generation brassboard system under development at NASA Langley Research Center as part of the Autonomous Landing and Hazard Avoidance (ALHAT) project.
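The range and line-of-sight velocity in a linear FMCW lidar follow from the up-chirp and down-chirp beat frequencies. The sketch below applies the standard triangular-modulation relations; the waveform parameters are illustrative assumptions, not those of the NASA Langley brassboard.

C = 299_792_458.0          # speed of light, m/s

def fmcw_range_velocity(f_beat_up, f_beat_down, sweep_bw_hz, sweep_time_s, carrier_hz):
    """Standard triangular-FMCW relations.

    The range-proportional beat frequency is the average of the up/down beats;
    the Doppler shift is half their difference.
    """
    f_range = 0.5 * (f_beat_up + f_beat_down)
    f_doppler = 0.5 * (f_beat_down - f_beat_up)
    rng = C * f_range * sweep_time_s / (2.0 * sweep_bw_hz)
    vel = C * f_doppler / (2.0 * carrier_hz)   # sign convention (closing positive) assumed
    return rng, vel

# Example with assumed waveform parameters (1 GHz sweep over 1 ms, 1550 nm carrier)
rng, vel = fmcw_range_velocity(3.34e6, 3.36e6, 1e9, 1e-3, C / 1550e-9)
print(f"range ~ {rng:.1f} m, line-of-sight velocity ~ {vel:.4f} m/s")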
NASA Technical Reports Server (NTRS)
Carson, John M., III; Johnson, Andrew E.; Anderson, F. Scott; Condon, Gerald L.; Nguyen, Louis H.; Olansen, Jon B.; Devolites, Jennifer L.; Harris, William J.; Hines, Glenn D.; Lee, David E.;
2016-01-01
The Lunar MARE (Moon Age and Regolith Explorer) Discovery Mission concept targets delivery of a science payload to the lunar surface for sample collection and dating. The mission science is within a 100-meter radius region of smooth lunar maria terrain near Aristarchus crater. The location has several small, sharp craters and rocks that present landing hazards to the spacecraft. For successful delivery of the science payload to the surface, the vehicle Guidance, Navigation and Control (GN&C) subsystem requires safe and precise landing capability, so the design infuses the NASA Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) and a gimbaled, throttleable LOX/LCH4 main engine. The ALHAT system implemented for Lunar MARE is a specialization of prototype technologies in work within NASA for the past two decades, including a passive optical Terrain Relative Navigation (TRN) sensor, a Navigation Doppler Lidar (NDL) velocity and range sensor, and a lidar-based Hazard Detection (HD) sensor. The landing descent profile is from a retrograde orbit over lighted terrain with landing near lunar dawn. The GN&C subsystem with ALHAT capabilities will deliver the science payload to the lunar surface within a 20-meter landing ellipse of the target location and at a site having greater than 99% safety probability, which minimizes risk to safe landing and delivery of the MARE science payload to the intended terrain region.
Autonomous Flight Rules - A Concept for Self-Separation in U.S. Domestic Airspace
NASA Technical Reports Server (NTRS)
Wing, David J.; Cotton, William B.
2011-01-01
Autonomous Flight Rules (AFR) are proposed as a new set of operating regulations in which aircraft navigate on tracks of their choice while self-separating from traffic and weather. AFR would exist alongside Instrument and Visual Flight Rules (IFR and VFR) as one of three available flight options for any appropriately trained and qualified operator with the necessary certified equipment. Historically, ground-based separation services evolved by necessity as aircraft began operating in the clouds and were unable to see each other. Today, technologies for global navigation, airborne surveillance, and onboard computing enable the functions of traffic conflict management to be fully integrated with navigation procedures onboard the aircraft. By self-separating, aircraft can operate with more flexibility and fewer restrictions than are required when using ground-based separation. The AFR concept is described in detail and provides practical means by which self-separating aircraft could share the same airspace as IFR and VFR aircraft without disrupting the ongoing processes of Air Traffic Control.
Improving CAR Navigation with a Vision-Based System
NASA Astrophysics Data System (ADS)
Kim, H.; Choi, K.; Lee, I.
2015-08-01
The real-time acquisition of accurate positions is very important for the proper operation of driver assistance systems or autonomous vehicles. Since the current systems mostly depend on GPS and map-matching techniques, they show poor and unreliable performance in areas where GPS signals are blocked or weak. In this study, we propose a vision-oriented car navigation method based on sensor fusion of a GPS and in-vehicle sensors. We employ a single photo resection process to derive the position and attitude of the camera and thus those of the car. These image georeferencing results are combined with other sensory data in a sensor fusion framework for more accurate estimation of the positions using an extended Kalman filter. The proposed system estimated the positions with an accuracy of 15 m even though GPS signals were unavailable during the entire 15-minute test drive. The proposed vision-based system can be effectively utilized for the low-cost yet accurate and reliable navigation systems required for intelligent or autonomous vehicles.
Improving Car Navigation with a Vision-Based System
NASA Astrophysics Data System (ADS)
Kim, H.; Choi, K.; Lee, I.
2015-08-01
The real-time acquisition of accurate positions is very important for the proper operation of driver assistance systems or autonomous vehicles. Since the current systems mostly depend on GPS and map-matching techniques, they show poor and unreliable performance in areas where GPS signals are blocked or weak. In this study, we propose a vision-oriented car navigation method based on sensor fusion of a GPS and in-vehicle sensors. We employ a single photo resection process to derive the position and attitude of the camera and thus those of the car. These image georeferencing results are combined with other sensory data in a sensor fusion framework for more accurate estimation of the positions using an extended Kalman filter. The proposed system estimated the positions with an accuracy of 15 m even though GPS signals were unavailable during the entire 15-minute test drive. The proposed vision-based system can be effectively utilized for the low-cost yet accurate and reliable navigation systems required for intelligent or autonomous vehicles.
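The single photo resection step described above recovers camera position and attitude from known ground points seen in one image. Assuming OpenCV is acceptable for illustration, the sketch below performs that recovery with solvePnP; the control-point coordinates and camera intrinsics are made-up placeholders, not the paper's calibration.

import cv2
import numpy as np

# Known 3-D control points in a local map frame (placeholders) and their pixel observations
object_pts = np.array([[0, 0, 0], [10, 0, 0], [10, 5, 0], [0, 5, 2]], dtype=np.float64)
image_pts = np.array([[320, 400], [610, 390], [600, 240], [330, 220]], dtype=np.float64)

# Assumed pinhole intrinsics (fx, fy, cx, cy) with no lens distortion
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
R, _ = cv2.Rodrigues(rvec)            # rotation from map frame to camera frame
cam_pos_map = (-R.T @ tvec).ravel()   # camera (and thus car) position in the map frame
print("estimated camera position:", cam_pos_map)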
NASA Technical Reports Server (NTRS)
1976-01-01
The six themes identified by the Workshop have many common navigation guidance and control needs. All the earth orbit themes have a strong requirement for attitude, figure and stabilization control of large space structures, a requirement not currently being supported. All but the space transportation theme have need for precision pointing of spacecraft and instruments. In addition all the themes have requirements for increasing autonomous operations for such activities as spacecraft and experiment operations, onboard mission modification, rendezvous and docking, spacecraft assembly and maintenance, navigation and guidance, and self-checkout, test and repair. Major new efforts are required to conceptualize new approaches to large space antennas and arrays that are lightweight, readily deployable, and capable of precise attitude and figure control. Conventional approaches offer little hope of meeting these requirements. Functions that can benefit from increasing automation or autonomous operations are listed.
A method of real-time detection for distant moving obstacles by monocular vision
NASA Astrophysics Data System (ADS)
Jia, Bao-zhi; Zhu, Ming
2013-12-01
In this paper, we propose an approach for detecting distant moving obstacles such as cars and bicycles with a monocular camera, to cooperate with ultrasonic sensors under low-cost conditions. We aim at detecting distant obstacles that move toward our autonomous navigation car in order to raise an alarm and keep away from them. A frame-differencing method is applied to find obstacles after compensation for the camera's ego-motion. Meanwhile, each obstacle is separated from the others in an independent region and given a confidence level indicating whether it is coming closer. Results on an open dataset and on our own autonomous navigation car show that the method is effective for real-time detection of distant moving obstacles.
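A common way to realize the ego-motion compensation plus frame differencing described above is to register consecutive frames with a homography estimated from feature matches, then difference the aligned images. The OpenCV sketch below shows that generic pipeline under the assumption of a roughly planar distant scene; the thresholds and parameters are illustrative, not the paper's values.

import cv2
import numpy as np

def moving_obstacle_regions(prev_gray, curr_gray, diff_thresh=30, min_area=200):
    """Warp the previous frame onto the current one (ego-motion compensation),
    then difference the aligned frames and return bounding boxes of moving blobs."""
    orb = cv2.ORB_create(1000)
    k1, d1 = orb.detectAndCompute(prev_gray, None)
    k2, d2 = orb.detectAndCompute(curr_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)   # dominant (background) motion

    h, w = curr_gray.shape
    prev_warped = cv2.warpPerspective(prev_gray, H, (w, h))
    diff = cv2.absdiff(curr_gray, prev_warped)
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]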
Learning for autonomous navigation : extrapolating from underfoot to the far field
NASA Technical Reports Server (NTRS)
Matthies, Larry; Turmon, Michael; Howard, Andrew; Angelova, Anelia; Tang, Benyang; Mjolsness, Eric
2005-01-01
Autonomous off-road navigation of robotic ground vehicles has important applications on Earth and in space exploration. Progress in this domain has been retarded by the limited lookahead range of 3-D sensors and by the difficulty of preprogramming systems to understand the traversability of the wide variety of terrain they can encounter. Enabling robots to learn from experience may alleviate both of these problems. We define two paradigms for this, learning from 3-D geometry and learning from proprioception, and describe initial instantiations of them we have developed under DARPA and NASA programs. Field test results show promise for learning traversability of vegetated terrain, learning to extend the lookahead range of the vision system, and learning how slip varies with slope.
NASA Astrophysics Data System (ADS)
Martínez, Fredy; Martínez, Fernando; Jacinto, Edwar
2017-02-01
In this paper we propose an on-line motion planning strategy for autonomous robots in dynamic and locally observable environments. In this approach, we first visually identify geometric shapes in the environment by filtering images. Then, an ART-2 network is used to establish the similarity between patterns. The proposed algorithm allows a robot to establish its relative location in the environment and to define its navigation path based on images of the environment and their similarity to reference images. This is an efficient and minimalist method that uses the similarity of landmark view patterns to navigate to the desired destination. Laboratory tests on real prototypes demonstrate the performance of the algorithm.
Fast and reliable obstacle detection and segmentation for cross-country navigation
NASA Technical Reports Server (NTRS)
Talukder, A.; Manduchi, R.; Rankin, A.; Matthies, L.
2002-01-01
Obstacle detection is one of the main components of the control system of autonomous vehicles. In the case of indoor/urban navigation, obstacles are typically defined as surface points that are higher than the ground plane. This characterization, however, cannot be used in cross-country and unstructured environments, where the notion of ground plane is often not meaningful.
AUV SLAM and Experiments Using a Mechanical Scanning Forward-Looking Sonar
He, Bo; Liang, Yan; Feng, Xiao; Nian, Rui; Yan, Tianhong; Li, Minghui; Zhang, Shujing
2012-01-01
Navigation technology is one of the most important challenges in the applications of autonomous underwater vehicles (AUVs) which navigate in the complex undersea environment. The ability of localizing a robot and accurately mapping its surroundings simultaneously, namely the simultaneous localization and mapping (SLAM) problem, is a key prerequisite of truly autonomous robots. In this paper, a modified-FastSLAM algorithm is proposed and used in the navigation for our C-Ranger research platform, an open-frame AUV. A mechanical scanning imaging sonar is chosen as the active sensor for the AUV. The modified-FastSLAM implements the update relying on the on-board sensors of C-Ranger. On the other hand, the algorithm employs the data association which combines the single particle maximum likelihood method with modified negative evidence method, and uses the rank-based resampling to overcome the particle depletion problem. In order to verify the feasibility of the proposed methods, both simulation experiments and sea trials for C-Ranger are conducted. The experimental results show the modified-FastSLAM employed for the navigation of the C-Ranger AUV is much more effective and accurate compared with the traditional methods. PMID:23012549
AUV SLAM and experiments using a mechanical scanning forward-looking sonar.
He, Bo; Liang, Yan; Feng, Xiao; Nian, Rui; Yan, Tianhong; Li, Minghui; Zhang, Shujing
2012-01-01
Navigation technology is one of the most important challenges in the applications of autonomous underwater vehicles (AUVs) which navigate in the complex undersea environment. The ability of localizing a robot and accurately mapping its surroundings simultaneously, namely the simultaneous localization and mapping (SLAM) problem, is a key prerequisite of truly autonomous robots. In this paper, a modified-FastSLAM algorithm is proposed and used in the navigation for our C-Ranger research platform, an open-frame AUV. A mechanical scanning imaging sonar is chosen as the active sensor for the AUV. The modified-FastSLAM implements the update relying on the on-board sensors of C-Ranger. On the other hand, the algorithm employs the data association which combines the single particle maximum likelihood method with modified negative evidence method, and uses the rank-based resampling to overcome the particle depletion problem. In order to verify the feasibility of the proposed methods, both simulation experiments and sea trials for C-Ranger are conducted. The experimental results show the modified-FastSLAM employed for the navigation of the C-Ranger AUV is much more effective and accurate compared with the traditional methods.
Relative optical navigation around small bodies via Extreme Learning Machine
NASA Astrophysics Data System (ADS)
Law, Andrew M.
To perform close proximity operations in a low-gravity environment, relative and absolute positions are vital information for the maneuver; hence navigation is inseparably integrated into space travel. Extreme Learning Machine (ELM) is presented as an optical navigation method around small celestial bodies. Optical navigation uses visual observation instruments such as a camera to acquire useful data and determine spacecraft position. The required input data for operation are merely a single image strip and a nadir image. ELM is a machine learning method built on a single-hidden-layer feedforward network (SLFN), a type of neural network (NN). The algorithm is developed on the premise that input weights and biases can be randomly assigned and back-propagation is not required. The learned model consists of the output-layer weights, which are used to calculate a prediction. Together, Extreme Learning Machine Optical Navigation (ELM OpNav) utilizes optical images and the ELM algorithm to train the machine to navigate around a target body. In this thesis the asteroid Vesta is the designated celestial body. The trained ELMs estimate the position of the spacecraft during operation from a single data set. The results show the approach is promising and potentially suitable for on-board navigation.
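The essence of the ELM described above is that the hidden-layer weights are random and only the output-layer weights are solved for, typically with a pseudoinverse. A minimal training sketch is given below; the feature dimensions and targets are placeholders, not the Vesta image data used in the thesis.

import numpy as np

class ELM:
    """Single-hidden-layer feedforward network trained in one shot (Extreme Learning Machine)."""

    def __init__(self, n_inputs, n_hidden, rng=np.random.default_rng(0)):
        self.W = rng.normal(size=(n_inputs, n_hidden))   # random input weights, never trained
        self.b = rng.normal(size=n_hidden)                # random biases
        self.beta = None                                   # output weights (learned)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, T):
        # Solve H @ beta = T for beta via the Moore-Penrose pseudoinverse
        self.beta = np.linalg.pinv(self._hidden(X)) @ T
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Toy usage: map image-derived feature vectors to 3-D spacecraft positions
X_train = np.random.rand(200, 64)        # stand-in for image strip features
T_train = np.random.rand(200, 3)         # stand-in for known positions
model = ELM(64, 100).fit(X_train, T_train)
print(model.predict(X_train[:2]))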
Precision Landing and Hazard Avoidance Doman
NASA Technical Reports Server (NTRS)
Robertson, Edward A.; Carson, John M., III
2016-01-01
The Precision Landing and Hazard Avoidance (PL&HA) domain addresses the development, integration, testing, and spaceflight infusion of sensing, processing, and GN&C functions critical to the success and safety of future human and robotic exploration missions. PL&HA sensors also have applications to other mission events, such as rendezvous and docking. Autonomous PL&HA builds upon the core GN&C capabilities developed to enable soft, controlled landings on the Moon, Mars, and other solar system bodies. Through the addition of a Terrain Relative Navigation (TRN) function, precision landing within tens of meters of a map-based target is possible. The addition of a 3-D terrain mapping lidar sensor improves the probability of a safe landing via autonomous, real-time Hazard Detection and Avoidance (HDA). PL&HA significantly improves the probability of mission success and enhances access to sites of scientific interest located in challenging terrain. PL&HA can also utilize external navigation aids, such as navigation satellites and surface beacons. Advanced lidar sensors: high-precision ranging, velocimetry, and 3-D terrain mapping. Terrain Relative Navigation (TRN): compares onboard reconnaissance data with real-time terrain imaging data to update the spacecraft position estimate. Hazard Detection and Avoidance (HDA): generates a high-resolution, 3-D terrain map in real time during the approach trajectory to identify safe landing targets. Inertial navigation during terminal descent: high-precision surface-relative sensors enable accurate inertial navigation during terminal descent and a tightly controlled touchdown within meters of the selected safe landing target.
Navigation Architecture for a Space Mobile Network
NASA Technical Reports Server (NTRS)
Valdez, Jennifer E.; Ashman, Benjamin; Gramling, Cheryl; Heckler, Gregory W.; Carpenter, Russell
2016-01-01
The Tracking and Data Relay Satellite System (TDRSS) Augmentation Service for Satellites (TASS) is a proposed beacon service to provide a global, space based GPS augmentation service based on the NASA Global Differential GPS (GDGPS) System. The TASS signal will be tied to the GPS time system and usable as an additional ranging and Doppler radiometric source. Additionally, it will provide data vital to autonomous navigation in the near Earth regime, including space weather information, TDRS ephemerides, Earth Orientation Parameters (EOP), and forward commanding capability. TASS benefits include enhancing situational awareness, enabling increased autonomy, and providing near real-time command access for user platforms. As NASA Headquarters' Space Communication and Navigation Office (SCaN) begins to move away from a centralized network architecture and towards a Space Mobile Network (SMN) that allows for user initiated services, autonomous navigation will be a key part of such a system. This paper explores how a TASS beacon service enables the Space Mobile Networking paradigm, what a typical user platform would require, and provides an in-depth analysis of several navigation scenarios and operations concepts. This paper provides an overview of the TASS beacon and its role within the SMN and user community. Supporting navigation analysis is presented for two user mission scenarios: an Earth observing spacecraft in low earth orbit (LEO), and a highly elliptical spacecraft in a lunar resonance orbit. These diverse flight scenarios indicate the breadth of applicability of the TASS beacon for upcoming users within the current network architecture and in the SMN.
Preliminary Operational Results of the TDRSS Onboard Navigation System (TONS) for the Terra Mission
NASA Technical Reports Server (NTRS)
Gramling, Cheryl; Lorah, John; Santoro, Ernest; Work, Kevin; Chambers, Robert; Bauer, Frank H. (Technical Monitor)
2000-01-01
The Earth Observing System Terra spacecraft was launched on December 18, 1999, to provide data for the characterization of the terrestrial and oceanic surfaces, clouds, radiation, aerosols, and radiative balance. The Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (ONS) (TONS) flying on Terra provides the spacecraft with an operational real-time navigation solution. TONS is a passive system that makes judicious use of Terra's communication and computer subsystems. An objective of the ONS developed by NASA's Goddard Space Flight Center (GSFC) Guidance, Navigation and Control Center is to provide autonomous navigation with minimal power, weight, and volume impact on the user spacecraft. TONS relies on extracting tracking measurements onboard from a TDRSS forward-link communication signal and processing these measurements in an onboard extended Kalman filter to estimate Terra's current state. Terra is the first NASA low-Earth-orbiting mission to fly autonomous navigation that produces accurate results. The science orbital accuracy requirements for Terra are 150 meters (m) (3 sigma) per axis, with a goal of 5 m (1 sigma) RSS, which TONS is expected to meet. The TONS solutions are telemetered in real time to the mission scientists along with their science data for immediate processing. Once set in the operational mode, TONS eliminates the need for ground orbit determination and allows for a smooth flow from the spacecraft telemetry to planning products for the mission team. This paper presents the preliminary results of the operational TONS solution available from Terra.
Design of all-weather celestial navigation system
NASA Astrophysics Data System (ADS)
Sun, Hongchi; Mu, Rongjun; Du, Huajun; Wu, Peng
2018-03-01
In order to realize autonomous navigation in the atmosphere, an all-weather celestial navigation system is designed. The research on the celestial navigation system includes a discrimination method based on comentropy (information entropy) and an adaptive navigation algorithm based on the P value. The comentropy discrimination method is studied to realize independent switching between the two celestial navigation modes, starlight and radio. Finally, an adaptive filtering algorithm based on the P value is proposed, which can greatly improve the disturbance rejection capability of the system. The experimental results show that the accuracy of the three-axis attitude is better than 10″, and the system can work in all weather conditions. In a perturbation environment, the position accuracy of the integrated navigation system can be increased by 20% compared with the traditional method. It basically meets the requirements of an all-weather celestial navigation system, with stability, reliability, high accuracy and strong anti-interference capability.
Insect-Inspired Optical-Flow Navigation Sensors
NASA Technical Reports Server (NTRS)
Thakoor, Sarita; Morookian, John M.; Chahl, Javan; Soccol, Dean; Hines, Butler; Zornetzer, Steven
2005-01-01
Integrated circuits that exploit optical flow to sense motions of computer mice on or near surfaces ('optical mouse chips') are used as navigation sensors in a class of small flying robots now undergoing development for potential use in such applications as exploration, search, and surveillance. The basic principles of these robots were described briefly in 'Insect-Inspired Flight Control for Small Flying Robots' (NPO-30545), NASA Tech Briefs, Vol. 29, No. 1 (January 2005), page 61. To recapitulate from the cited prior article: the concept of optical flow can be defined, loosely, as the use of texture in images as a source of motion cues. The flight-control and navigation systems of these robots are inspired largely by the designs and functions of the vision systems and brains of insects, which have been demonstrated to utilize optical flow (as detected by their eyes and brains) resulting from their own motion in the environment. Optical flow has been shown to be very effective as a means of avoiding obstacles and controlling speed and altitude in robotic navigation. Prior systems used in experiments on navigating by means of optical flow have involved the use of panoramic optics, high-resolution image sensors, and programmable image-data-processing computers.
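As a concrete illustration of using optical flow as a motion cue, the sketch below computes dense flow between two frames with OpenCV's Farneback method and reduces it to simple translation and expansion cues of the kind used for centering, speed, and obstacle-proximity control. The parameters are typical defaults and the cue definitions are assumptions for illustration, not the sensor chips' internal algorithm.

import cv2
import numpy as np

def flow_cues(prev_gray, curr_gray):
    """Return (mean horizontal flow, mean vertical flow, mean expansion) in pixels/frame.

    A lateral flow imbalance can be used for centering between obstacles;
    net expansion (divergence) grows as an obstacle is approached.
    """
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    fx, fy = flow[..., 0], flow[..., 1]
    # Crude divergence estimate from flow gradients
    dfx_dx = np.gradient(fx, axis=1)
    dfy_dy = np.gradient(fy, axis=0)
    expansion = float(np.mean(dfx_dx + dfy_dy))
    return float(np.mean(fx)), float(np.mean(fy)), expansion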
NASA Astrophysics Data System (ADS)
Lu, Jiazhen; Yang, Lie
2018-05-01
To achieve accurate and completely autonomous navigation for spacecraft, inertial/celestial integrated navigation is receiving increasing attention. In this study, a missile-borne inertial/stellar refraction integrated navigation scheme is proposed. Position Dilution of Precision (PDOP) for stellar refraction is introduced and the corresponding equation is derived. Based on the condition under which PDOP reaches its minimum value, an optimized observation scheme is proposed. To verify the feasibility of the proposed scheme, numerical simulation is conducted. The results of the Extended Kalman Filter (EKF) and Unscented Kalman Filter (UKF) are compared, and factors affecting navigation accuracy are studied in the simulation. The simulation results indicate that the proposed observation scheme provides accurate positioning performance, and the results of the EKF and UKF are similar.
Lu, Jiazhen; Yang, Lie
2018-05-01
To achieve accurate and completely autonomous navigation for spacecraft, inertial/celestial integrated navigation is receiving increasing attention. In this study, a missile-borne inertial/stellar refraction integrated navigation scheme is proposed. Position Dilution of Precision (PDOP) for stellar refraction is introduced and the corresponding equation is derived. Based on the condition under which PDOP reaches its minimum value, an optimized observation scheme is proposed. To verify the feasibility of the proposed scheme, numerical simulation is conducted. The results of the Extended Kalman Filter (EKF) and Unscented Kalman Filter (UKF) are compared, and factors affecting navigation accuracy are studied in the simulation. The simulation results indicate that the proposed observation scheme provides accurate positioning performance, and the results of the EKF and UKF are similar.
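The PDOP quantity used above is the usual dilution-of-precision scalar computed from the measurement geometry matrix. The sketch below evaluates it for a set of unit observation directions; it is the generic definition, not the paper's stellar-refraction-specific derivation.

import numpy as np

def pdop(unit_los_vectors):
    """Position dilution of precision from unit line-of-sight (observation) directions.

    H has one row per observation direction; PDOP = sqrt(trace((H^T H)^-1)).
    Smaller PDOP means the geometry amplifies measurement noise less.
    """
    H = np.atleast_2d(np.asarray(unit_los_vectors, dtype=float))
    return float(np.sqrt(np.trace(np.linalg.inv(H.T @ H))))

# Three nearly orthogonal observation directions give good geometry
print(pdop([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))          # ~1.73
# Nearly coplanar directions give a much larger PDOP
print(pdop([[1, 0, 0.01], [0, 1, 0.01], [0.7, 0.7, 0.02]]))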
NASA Technical Reports Server (NTRS)
Smith, David D.
2015-01-01
Next-generation space missions are currently constrained by existing spacecraft navigation systems which are not fully autonomous. These systems suffer from accumulated dead-reckoning errors and must therefore rely on periodic corrections provided by supplementary technologies that depend on line-of-sight signals from Earth, satellites, or other celestial bodies for absolute attitude and position determination, which can be spoofed, incorrectly identified, occluded, obscured, attenuated, or insufficiently available. These dead-reckoning errors originate in the ring laser gyros themselves, which constitute the inertial measurement units. Increasing the time for standalone spacecraft navigation therefore requires fundamental improvements in gyroscope technologies. One promising solution to enhance gyro sensitivity is to place an anomalous dispersion or fast light material inside the gyro cavity. The fast light essentially provides a positive feedback to the gyro response, resulting in a larger measured beat frequency for a given rotation rate, as shown in figure 1. Game Changing Development has been investing in this idea through the Fast Light Optical Gyros (FLOG) project, a collaborative effort which began in FY 2013 between NASA Marshall Space Flight Center (MSFC), the U.S. Army Aviation and Missile Research, Development, and Engineering Center (AMRDEC), and Northwestern University. MSFC and AMRDEC are working on the development of a passive FLOG (PFLOG), while Northwestern is developing an active FLOG (AFLOG). The project has demonstrated new benchmarks in the state of the art for scale factor sensitivity enhancement. Recent results show cavity scale factor enhancements of approximately 100 for passive cavities.
Multiple Integrated Navigation Sensors for Improved Occupancy Grid FastSLAM
2011-03-01
...autonomous vehicle exploration with applications to search and rescue. To current knowledge, this research presents the first SLAM solution to... ...solution is a key component of an autonomous vehicle, especially one whose mission involves gaining knowledge of unknown areas. It provides the ability...
Autonomous interplanetary constellation design
NASA Astrophysics Data System (ADS)
Chow, Cornelius Channing, II
According to NASA's integrated space technology roadmaps, space-based infrastructures are envisioned as necessary ingredients to a sustained effort in continuing space exploration. Whether it be for extra-terrestrial habitats, roving/cargo vehicles, or space tourism, autonomous space networks will provide a vital communications lifeline for both future robotic and human missions alike. Projecting that the Moon will be a bustling hub of activity within a few decades, a near-term opportunity for in-situ infrastructure development is within reach. This dissertation addresses the anticipated need for in-space infrastructure by investigating a general design methodology for autonomous interplanetary constellations; to illustrate the theory, this manuscript presents results from an application to the Earth-Moon neighborhood. The constellation design methodology is formulated as an optimization problem, involving a trajectory design step followed by a spacecraft placement sequence. Modeling the dynamics as a restricted 3-body problem, the investigated design space consists of families of periodic orbits which play host to the constellations, punctuated by arrangements of spacecraft autonomously guided by a navigation strategy called LiAISON (Linked Autonomous Interplanetary Satellite Orbit Navigation). Instead of more traditional exhaustive search methods, a numerical continuation approach is implemented to map the admissible configuration space. In particular, Keller's pseudo-arclength technique is used to follow folding/bifurcating solution manifolds, which are otherwise inaccessible with other parameter continuation schemes. A succinct characterization of the underlying structure of the local, as well as global, extrema is thus achievable with little a priori intuition of the solution space. Furthermore, the proposed design methodology offers benefits in computation speed plus the ability to handle mildly stochastic systems. An application of the constellation design methodology to the restricted Earth-Moon system reveals optimal pairwise configurations for various L1, L2, and L5 (halo, axial, and vertical) periodic orbit families. Navigation accuracies on the order of 10^(±1) meters in position space are obtained for the optimal Earth-Moon constellations, given measurement noise on the order of 1 meter.
A simplified satellite navigation system for an autonomous Mars roving vehicle.
NASA Technical Reports Server (NTRS)
Janosko, R. E.; Shen, C. N.
1972-01-01
The use of a retroreflecting satellite and a laser rangefinder to navigate a Martian roving vehicle is considered in this paper. It is shown that a simple system can be employed to perform this task. An error analysis is performed on the navigation equations, and it is shown that the error inherent in the proposed scheme can be minimized by the proper choice of measurement geometry. A nonlinear programming approach is used to minimize the navigation error subject to constraints due to geometric and laser requirements. The problem is solved for a particular set of laser parameters and the optimal solution is presented.
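To make the range-based navigation concrete, the sketch below solves for a rover position from laser ranges to the satellite at several known ephemeris points using nonlinear least squares. The geometry, noise level, and solver choice (SciPy) are illustrative assumptions rather than the paper's formulation, which additionally optimizes the measurement geometry itself.

import numpy as np
from scipy.optimize import least_squares

# Known satellite positions (km) at the measurement epochs -- placeholder ephemeris
sat_positions = np.array([[1200.0, 300.0, 800.0],
                          [1100.0, 500.0, 850.0],
                          [ 950.0, 700.0, 900.0]])
true_rover = np.array([10.0, 20.0, 0.0])
ranges = np.linalg.norm(sat_positions - true_rover, axis=1)               # noiseless ranges
ranges += np.random.default_rng(1).normal(scale=0.01, size=len(ranges))   # ~10 m noise

def residuals(rover_xy):
    # Rover assumed on the local reference surface (z = 0) for this 2-D position fix
    pos = np.array([rover_xy[0], rover_xy[1], 0.0])
    return np.linalg.norm(sat_positions - pos, axis=1) - ranges

sol = least_squares(residuals, x0=np.zeros(2))
print("estimated rover position (km):", sol.x)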
Simulating the Liaison Navigation Concept in a Geo + Earth-Moon Halo Constellation
NASA Technical Reports Server (NTRS)
Fujimoto, K.; Leonard, J. M.; McGranaghan, R. M.; Parker, J. S.; Anderson, R. L.; Born, G. H.
2012-01-01
Linked Autonomous Interplanetary Satellite Orbit Navigation, or LiAISON, is a novel satellite navigation technique where relative radiometric measurements between two or more spacecraft in a constellation are processed to obtain the absolute state of all spacecraft. The method leverages the asymmetry of the gravity field that the constellation exists in. This paper takes a step forward in developing a high fidelity navigation simulation for the LiAISON concept in an Earth-Moon constellation. In particular, we aim to process two-way Doppler measurements between a satellite in GEO orbit and another in a halo orbit about the Earth-Moon L1 point.
NASA Technical Reports Server (NTRS)
Alt, Shannon
2016-01-01
Electronic integrated circuits are considered one of the most significant technological advances of the 20th century, with demonstrated impact in their ability to incorporate successively higher numbers of transistors and construct electronic devices onto a single CMOS chip. Photonic integrated circuits (PICs) exist as the optical analog to integrated circuits; however, in place of transistors, PICs consist of numerous scaled optical components, including such "building-block" structures as waveguides, MMIs, lasers, and optical ring resonators. The ability to construct electronic and photonic components on a single microsystems platform offers transformative potential for the development of technologies in fields including communications, biomedical device development, autonomous navigation, and chemical and atmospheric sensing. Developing on-chip systems that provide new avenues for integration and replacement of bulk optical and electro-optic components also reduces size, weight, power and cost (SWaP-C) limitations, which are important in the selection of instrumentation for specific flight projects. The number of applications currently emerging for complex photonics systems, particularly in data communications, warrants additional investigations when considering reliability for space systems development. This Body of Knowledge document seeks to provide an overview of existing integrated photonics architectures; the current state of design, development, and fabrication ecosystems in the United States and Europe; and potential space applications, with emphasis given to associated radiation effects and reliability.
Autonomous Navigation Performance During The Hartley 2 Comet Flyby
NASA Technical Reports Server (NTRS)
Abrahamson, Matthew J; Kennedy, Brian A.; Bhaskaran, Shyam
2012-01-01
On November 4, 2010, the EPOXI spacecraft performed a 700-km flyby of the comet Hartley 2 as follow-on to the successful 2005 Deep Impact prime mission. EPOXI, an extended mission for the Deep Impact Flyby spacecraft, returned a wealth of visual and infrared data from Hartley 2, marking the fifth time that high-resolution images of a cometary nucleus have been captured by a spacecraft. The highest resolution science return, captured at closest approach to the comet nucleus, was enabled by use of an onboard autonomous navigation system called AutoNav. AutoNav estimates the comet-relative spacecraft trajectory using optical measurements from the Medium Resolution Imager (MRI) and provides this relative position information to the Attitude Determination and Control System (ADCS) for maintaining instrument pointing on the comet. For the EPOXI mission, AutoNav was tasked to enable continuous tracking of a smaller, more active Hartley 2, as compared to Tempel 1, through the full encounter while traveling at a higher velocity. To meet the mission goal of capturing the comet in all MRI science images, position knowledge accuracies of +/- 3.5 km (3-sigma) cross track and +/- 0.3 seconds (3-sigma) time of flight were required. A flight-code-in-the-loop Monte Carlo simulation assessed AutoNav's statistical performance under the Hartley 2 flyby dynamics and determined the optimal configuration. The AutoNav performance at Hartley 2 was successful, capturing the comet in all of the MRI images. The maximum residual between observed and predicted comet locations was 20 MRI pixels, primarily influenced by the center of brightness offset from the center of mass in the observations and attitude knowledge errors. This paper discusses the Monte Carlo-based analysis that led to the final AutoNav configuration and a comparison of the predicted performance with the flyby performance.
Learning for Autonomous Navigation
NASA Technical Reports Server (NTRS)
Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric
2005-01-01
Robotic ground vehicles for outdoor applications have achieved some remarkable successes, notably in autonomous highway following (Dickmanns, 1987), planetary exploration (1), and off-road navigation on Earth (1). Nevertheless, major challenges remain to enable reliable, high-speed, autonomous navigation in a wide variety of complex, off-road terrain. 3-D perception of terrain geometry with imaging range sensors is the mainstay of off-road driving systems. However, the stopping distance at high speed exceeds the effective lookahead distance of existing range sensors. Prospects for extending the range of 3-D sensors are strongly limited by sensor physics, eye safety of lasers, and related issues. Range sensor limitations also allow vehicles to enter large cul-de-sacs even at low speed, leading to long detours. Moreover, sensing only terrain geometry fails to reveal mechanical properties of terrain that are critical to assessing its traversability, such as potential for slippage, sinkage, and the degree of compliance of potential obstacles. Rovers in the Mars Exploration Rover (MER) mission have gotten stuck in sand dunes and experienced significant downhill slippage in the vicinity of large rock hazards. Earth-based off-road robots today have very limited ability to discriminate traversable vegetation from non-traversable vegetation or rough ground. It is impossible today to preprogram a system with knowledge of these properties for all types of terrain and weather conditions that might be encountered.
Modular Software for Spacecraft Navigation Using the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Truong, S. H.; Hartman, K. R.; Weidow, D. A.; Berry, D. L.; Oza, D. H.; Long, A. C.; Joyce, E.; Steger, W. L.
1996-01-01
The Goddard Space Flight Center Flight Dynamics and Mission Operations Divisions have jointly investigated the feasibility of engineering modular Global Positioning System (GPS) navigation software to support both real-time flight and ground postprocessing configurations. The goals of this effort are to define standard GPS data interfaces and to engineer standard, reusable navigation software components that can be used to build a broad range of GPS navigation support applications. The paper discusses the GPS modular software (GMOD) system and operations concepts, major requirements, candidate software architecture, feasibility assessment and recommended software interface standards. In addition, ongoing efforts to broaden the scope of the initial study and to develop modular software to support autonomous navigation using GPS are addressed.
Optical Flow Experiments for Small-Body Navigation
NASA Astrophysics Data System (ADS)
Schmidt, A.; Kueppers, M.
2012-09-01
Optical Flow algorithms [1, 2] have been successfully used and robustly implemented in many application domains, from motion estimation to video compression. We argue that they also show potential for autonomous spacecraft payload operation around small solar system bodies, such as comets or asteroids. Operating spacecraft around small bodies at close distance provides numerous challenges, many of which are related to uncertainties in spacecraft position and velocity relative to a body. To make the best use of usually scarce resources, it would be good to grant a certain amount of autonomy to a spacecraft, for example, to make time-critical decisions when to operate the payload. The Optical Flow describes the apparent velocities of common, usually brightness-related features in at least two images. From it, one can make estimates about the spacecraft velocity and direction relative to the last manoeuvre or known state. The authors have conducted experiments with readily-available optical imagery using the relatively robust and well-known Lucas-Kanade method [3]; it was found to be applicable in a large number of cases. Since one of the assumptions is that the brightness of corresponding points in subsequent images does not change greatly, it is important that imagery is acquired at sensible intervals, during which illumination conditions can be assumed constant and the spacecraft does not move too far, so that there is significant overlap. Full-frame Optical Flow can be computationally more expensive than image compression and usually focuses on movements of regions with significant brightness gradients. However, given that missions which explore small bodies move at low relative velocities, computation time is not expected to be a limiting resource. Since there are now several missions which either have flown to small bodies or are planned to visit small bodies and stay there for some time, it is promising to explore how instrument operations can benefit from the additional knowledge that is gained from analysing readily available data on board. The algorithms for Optical Flow show the maturity that is necessary to be considered in safety-critical systems; their use can be complemented with shape models, pattern matching, housekeeping data and navigation techniques to obtain even more accurate information.
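A minimal sketch of the pyramidal Lucas-Kanade tracking step referred to above, written with OpenCV; the file names and parameter values are placeholders, and the median-displacement summary at the end is only a crude stand-in for the velocity and direction estimation described in the abstract.

import cv2
import numpy as np

# Two images of the target body taken a short time apart (placeholder file names).
prev_gray = cv2.imread("frame_t0.png", cv2.IMREAD_GRAYSCALE)
next_gray = cv2.imread("frame_t1.png", cv2.IMREAD_GRAYSCALE)

# Detect brightness-gradient features in the first frame.
p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300, qualityLevel=0.01, minDistance=7)

# Pyramidal Lucas-Kanade: find where those features moved in the second frame.
p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, p0, None,
                                         winSize=(21, 21), maxLevel=3)
good_old = p0[status.flatten() == 1].reshape(-1, 2)
good_new = p1[status.flatten() == 1].reshape(-1, 2)

# The median image-plane displacement is a crude proxy for the apparent motion
# of the body relative to the camera between the two exposures.
flow = good_new - good_old
print("median flow (pixels):", np.median(flow, axis=0))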
The Ship Movement Trajectory Prediction Algorithm Using Navigational Data Fusion.
Borkowski, Piotr
2017-06-20
It is essential for the marine navigator conducting maneuvers of his ship at sea to know future positions of himself and target ships in a specific time span to effectively solve collision situations. This article presents an algorithm of ship movement trajectory prediction, which, through data fusion, takes into account measurements of the ship's current position from a number of doubled autonomous devices. This increases the reliability and accuracy of prediction. The algorithm has been implemented in NAVDEC, a navigation decision support system and practically used on board ships.
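A hedged sketch of the general idea, inverse-variance fusion of two independent ("doubled") position fixes followed by a simple kinematic extrapolation; NAVDEC's actual prediction algorithm is not reproduced here, and all numbers are placeholders.

import numpy as np

def fuse_positions(p1, var1, p2, var2):
    """Inverse-variance fusion of two independent position fixes (e.g. doubled GNSS receivers)."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    return (w1 * np.asarray(p1) + w2 * np.asarray(p2)) / (w1 + w2)

def predict_track(pos, vel, horizon_s, step_s=10.0):
    """Constant-velocity extrapolation of the fused position over a time span."""
    times = np.arange(step_s, horizon_s + step_s, step_s)
    return [pos + vel * t for t in times]

# Placeholder fixes from two receivers (metres in a local ENU frame) and their variances.
fused = fuse_positions([100.0, 250.0], 4.0, [103.0, 248.0], 9.0)
track = predict_track(fused, np.array([2.5, -1.0]), horizon_s=300.0)
print("fused fix:", fused, " predicted position after 5 min:", track[-1])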
The Ship Movement Trajectory Prediction Algorithm Using Navigational Data Fusion
Borkowski, Piotr
2017-01-01
It is essential for the marine navigator conducting maneuvers of his ship at sea to know future positions of himself and target ships in a specific time span to effectively solve collision situations. This article presents an algorithm of ship movement trajectory prediction, which, through data fusion, takes into account measurements of the ship’s current position from a number of doubled autonomous devices. This increases the reliability and accuracy of prediction. The algorithm has been implemented in NAVDEC, a navigation decision support system and practically used on board ships. PMID:28632176
The Design of a Navigator for a Testbed Autonomous Underwater Vehicle
1989-12-01
Naval Postgraduate School, Monterey, California. Thesis: The Design of a Navigator for a Testbed Autonomous Underwater Vehicle.
Intelligent Behavioral Action Aiding for Improved Autonomous Image Navigation
2012-09-13
odometry, SICK laser scanning unit (Lidar), Inertial Measurement Unit (IMU) and ultrasonic distance measurement system (Figure 32). The Lidar, IMU... (2010, July) GPS World. [Online]. http://www.gpsworld.com/tech-talk-blog/gnss-independent-navigation-solution-using-integrated-lidar-data-11378 [4] ...Milford, David McKinnon, Michael Warren, Gordon Wyeth, and Ben Upcroft, "Feature-based Visual Odometry and Featureless Place Recognition for SLAM in...
NASA Astrophysics Data System (ADS)
Um, Jaeyong
2001-08-01
The Space Integrated GPS/INS (SIGI) sensor is the primary navigation and attitude determination source for the International Space Station (ISS). The SIGI was successfully demonstrated on-orbit for the first time in the SIGI Orbital Attitude Readiness (SOAR) demonstration on the Space Shuttle Atlantis in May 2000. Numerous proximity operations near the ISS have been and will be performed over the lifetime of the Station. The development of an autonomous relative navigation system is needed to improve the safety and efficiency of vehicle operations near the ISS. A hardware simulation study was performed for the GPS-based relative navigation using the state vector difference approach and the interferometric approach in the absence of multipath. The interferometric approach, where the relative states are estimated directly, showed comparable results for a 1 km baseline. One of the most pressing current technical issues is the design of an autonomous relative navigation system in the proximity of the ISS, where GPS signals are blocked and maneuvers happen frequently. An integrated GPS/INS system is investigated for the possibility of a fully autonomous relative navigation system. Another application of GPS measurements is determination of the vehicle's orientation in space. This study used the SOAR experiment data to characterize the SIGI's on-orbit performance for attitude determination. A cold start initialization algorithm was developed for integer ambiguity resolution in any initial orientation. The original integer ambiguity resolution algorithm used in the SIGI was developed for terrestrial applications, which limited its effectiveness in space. The new algorithm was tested using the SOAR data and has been incorporated in the current SIGI flight software. The attitude estimation performance was examined using two different GPS/INS integration algorithms. The GPS/INS attitude solution using the SOAR data was as accurate as 0.06 deg (RMS) in 3-axis with multipath mitigation. Other improvements to the attitude determination algorithm were the development of a faster integer ambiguity resolution method and the incorporation of line bias modeling.
Wind-based navigation of a hot-air balloon on Titan: a feasibility study
NASA Astrophysics Data System (ADS)
Furfaro, Roberto; Lunine, Jonathan I.; Elfes, Alberto; Reh, Kim
2008-04-01
Current analysis of data streamed back to Earth by the Cassini spacecraft features Titan as one of the most exciting places in the solar system. NASA centers and universities around the US, as well as the European Space Agency, are studying the possibility of sending, as part of the next mission to this giant moon of Saturn, a hot-air balloon (Montgolfier-type) for further and more in-depth exploration. The basic idea would be to design a reliable, semi-autonomous, and yet cheap Montgolfier capable of using continuous flow of waste heat from a power source to lift the balloon and sustain its altitude in the Titan environment. In this paper we study the problem of locally navigating a hot-air balloon in the nitrogen-based Titan atmosphere. The basic idea is to define a strategy (i.e. design of a suitable guidance system) that allows autonomous and semi-autonomous navigation of the balloon using the available (and partial) knowledge of the wind structure blowing on the saturnian satellite surface. Starting from first principles we determined the appropriate thermal and dynamical models describing (a) the vertical dynamics of the balloon and (b) the dynamics of the balloon moving on a vertical plane (2-D motion). Next, various non-linear fuzzy-based control strategies have been evaluated, analyzed and implemented in MATLAB to numerically simulate the capability of the system to simultaneously maintain altitude, as well as a scientifically desirable trajectory. We also looked at the ability of the balloon to perform station keeping. The results of the simulation are encouraging and show the effectiveness of such a system to cheaply and effectively perform semi-autonomous exploration of Titan.
NASA Astrophysics Data System (ADS)
Rohde, Mitchell M.; Crawford, Justin; Toschlog, Matthew; Iagnemma, Karl D.; Kewlani, Guarav; Cummins, Christopher L.; Jones, Randolph A.; Horner, David A.
2009-05-01
It is widely recognized that simulation is pivotal to vehicle development, whether manned or unmanned. There are few dedicated choices, however, for those wishing to perform realistic, end-to-end simulations of unmanned ground vehicles (UGVs). The Virtual Autonomous Navigation Environment (VANE), under development by US Army Engineer Research and Development Center (ERDC), provides such capabilities but utilizes a High Performance Computing (HPC) Computational Testbed (CTB) and is not intended for on-line, real-time performance. A product of the VANE HPC research is a real-time desktop simulation application under development by the authors that provides a portal into the HPC environment as well as interaction with wider-scope semi-automated force simulations (e.g. OneSAF). This VANE desktop application, dubbed the Autonomous Navigation Virtual Environment Laboratory (ANVEL), enables analysis and testing of autonomous vehicle dynamics and terrain/obstacle interaction in real-time with the capability to interact within the HPC constructive geo-environmental CTB for high fidelity sensor evaluations. ANVEL leverages rigorous physics-based vehicle and vehicle-terrain interaction models in conjunction with high-quality, multimedia visualization techniques to form an intuitive, accurate engineering tool. The system provides an adaptable and customizable simulation platform that allows developers a controlled, repeatable testbed for advanced simulations. ANVEL leverages several key technologies not common to traditional engineering simulators, including techniques from the commercial video-game industry. These enable ANVEL to run on inexpensive commercial, off-the-shelf (COTS) hardware. In this paper, the authors describe key aspects of ANVEL and its development, as well as several initial applications of the system.
Immune systems are not just for making you feel better: they are for controlling autonomous robots
NASA Astrophysics Data System (ADS)
Rosenblum, Mark
2005-05-01
The typical algorithm for robot autonomous navigation in off-road complex environments involves building a 3D map of the robot's surrounding environment using a 3D sensing modality such as stereo vision or active laser scanning, and generating an instantaneous plan to navigate around hazards. Although there has been steady progress using these methods, these systems suffer from several limitations that cannot be overcome with 3D sensing and planning alone. Geometric sensing alone has no ability to distinguish between compressible and non-compressible materials. As a result, these systems have difficulty in heavily vegetated environments and require sensitivity adjustments across different terrain types. On the planning side, these systems have no ability to learn from their mistakes and avoid problematic environmental situations on subsequent encounters. We have implemented an adaptive terrain classification system based on the Artificial Immune System (AIS) computational model, which is loosely based on the biological immune system, that combines various forms of imaging sensor inputs to produce a "feature labeled" image of the scene categorizing areas as benign or detrimental for autonomous robot navigation. Because of the qualities of the AIS computation model, the resulting system will be able to learn and adapt on its own through interaction with the environment by modifying its interpretation of the sensor data. The feature labeled results from the AIS analysis are inserted into a map and can then be used by a planner to generate a safe route to a goal point. The coupling of diverse visual cues with the malleable AIS computational model will lead to autonomous robotic ground vehicles that require less human intervention for deployment in novel environments and more robust operation as a result of the system's ability to improve its performance through interaction with the environment.
Linked Autonomous Interplanetary Satellite Orbit Navigation
NASA Technical Reports Server (NTRS)
Parker, Jeffrey S.; Anderson, Rodney L.; Born, George H.; Leonard, Jason M.; McGranaghan, Ryan M.; Fujimoto, Kohei
2013-01-01
A navigation technology known as LiAISON (Linked Autonomous Interplanetary Satellite Orbit Navigation) has been known to produce very impressive navigation results for scenarios involving two or more cooperative satellites near the Moon, such that at least one satellite must be in an orbit significantly perturbed by the Earth, such as a lunar halo orbit. The two (or more) satellites track each other using satellite-to-satellite range and/or range-rate measurements. These relative measurements yield absolute orbit navigation when one of the satellites is in a lunar halo orbit, or the like. The geometry between a lunar halo orbiter and a GEO satellite continuously changes, which dramatically improves the information content of a satellite-to-satellite tracking signal. The geometrical variations include significant out-of-plane shifts, as well as in-plane shifts. Further, the GEO satellite is almost continuously in view of a lunar halo orbiter. High-fidelity simulations demonstrate that LiAISON technology improves the navigation of GEO orbiters by an order of magnitude, relative to standard ground tracking. If a GEO satellite is navigated using LiAISON-only tracking measurements, its position is typically known to better than 10 meters. If LiAISON measurements are combined with simple radiometric ground observations, then the satellite's position is typically known to better than 3 meters, which is substantially better than the current state of GEO navigation. There are two features of LiAISON that are novel and advantageous compared with conventional satellite navigation. First, ordinary satellite-to-satellite tracking data only provides relative navigation of each satellite. The novelty is the placement of one navigation satellite in an orbit that is significantly perturbed by both the Earth and the Moon. A navigation satellite can track other satellites elsewhere in the Earth-Moon system and acquire knowledge about both satellites' absolute positions and velocities, as well as relative positions and velocities in space. The second novelty is that ordinarily one requires many satellites in order to achieve full navigation of any given customer's position and velocity over time. With LiAISON navigation, only a single navigation satellite is needed, provided that the satellite is significantly affected by the gravity of the Earth and the Moon. That single satellite can track another satellite elsewhere in the Earth-Moon system and obtain absolute knowledge of both satellites' states.
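As a toy illustration of the measurement type LiAISON processes (not the flight filter itself), the sketch below gives an inter-satellite range model and its Jacobian with respect to the stacked absolute states of two spacecraft. In a full implementation these partials would feed a sequential estimator propagated with Earth-Moon three-body dynamics; it is that asymmetric dynamical environment, not the measurement alone, that renders the absolute states observable. All state values are placeholders.

import numpy as np

def range_measurement(x):
    """Inter-satellite range from the stacked state x = [r1(3), v1(3), r2(3), v2(3)]."""
    r1, r2 = x[0:3], x[6:9]
    return np.linalg.norm(r2 - r1)

def range_jacobian(x):
    """Partial of the range with respect to the 12-element stacked state."""
    r1, r2 = x[0:3], x[6:9]
    u = (r2 - r1) / np.linalg.norm(r2 - r1)   # unit line-of-sight vector
    H = np.zeros((1, 12))
    H[0, 0:3] = -u      # sensitivity to satellite 1 position
    H[0, 6:9] = u       # sensitivity to satellite 2 position
    return H

# Placeholder states (km, km/s): a GEO-like orbiter and an Earth-Moon L1 halo orbiter.
x = np.hstack([[42164.0, 0.0, 0.0], [0.0, 3.07, 0.0],
               [326400.0, 50000.0, 20000.0], [0.0, 0.2, 0.1]])
print("range (km):", range_measurement(x))
print("Jacobian row:", range_jacobian(x))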
NASA Technical Reports Server (NTRS)
Winternitz, Luke B.; Bamford, William A.; Price, Samuel R.
2017-01-01
As reported in a companion work, in its first phase, NASA's 2015 highly elliptic Magnetospheric Multiscale (MMS) mission set a record for the highest altitude operational use of on-board GPS-based navigation, returning state estimates at 12 Earth radii. In early 2017 MMS transitioned to its second phase, which doubled the apogee distance to 25 Earth radii, approaching halfway to the Moon. This paper will present results for GPS observability and navigation performance achieved in MMS Phase 2. Additionally, it will provide simulation results predicting the performance of the MMS navigation system applied to a pair of concept missions at lunar distances. These studies will demonstrate how high-sensitivity GPS (or GNSS) receivers paired with onboard navigation software, as in the MMS navigation system, can extend the envelope of autonomous onboard GPS navigation far from the Earth.
Experiments in teleoperator and autonomous control of space robotic vehicles
NASA Technical Reports Server (NTRS)
Alexander, Harold L.
1991-01-01
A program of research embracing teleoperator and automatic navigational control of freely flying satellite robots is presented. Current research goals include: (1) developing visual operator interfaces for improved vehicle teleoperation; (2) determining the effects of different visual interface system designs on operator performance; and (3) achieving autonomous vision-based vehicle navigation and control. This research program combines virtual-environment teleoperation studies and neutral-buoyancy experiments using a space-robot simulator vehicle currently under development. Visual-interface design options under investigation include monoscopic versus stereoscopic displays and cameras, helmet-mounted versus panel-mounted display monitors, head-tracking versus fixed or manually steerable remote cameras, and the provision of vehicle-fixed visual cues, or markers, in the remote scene for improved sensing of vehicle position, orientation, and motion.
Autonomous navigation of structured city roads
NASA Astrophysics Data System (ADS)
Aubert, Didier; Kluge, Karl C.; Thorpe, Chuck E.
1991-03-01
Autonomous road following is a domain which spans a range of complexity, from poorly defined, unmarked dirt roads to well defined, well marked, highly structured highways. The YARF system (for Yet Another Road Follower) is designed to operate in the middle of this range of complexity, driving on urban streets. Our research program has focused on the use of feature- and situation-specific segmentation techniques driven by an explicit model of the appearance and geometry of the road features in the environment. We report results in robust detection of white and yellow painted stripes, fitting a road model to detected feature locations to determine vehicle position and local road geometry, and automatic location of road features in an initial image. We also describe our planned extensions to include intersection navigation.
NASA Technical Reports Server (NTRS)
Crain, Timothy P.; Bishop, Robert H.; Carson, John M., III; Trawny, Nikolas; Hanak, Chad; Sullivan, Jacob; Christian, John; DeMars, Kyle; Campbell, Tom; Getchius, Joel
2016-01-01
The Morpheus Project began in late 2009 as an ambitious effort code-named Project M to integrate three ongoing multi-center NASA technology developments: humanoid robotics, liquid oxygen/liquid methane (LOX/LCH4) propulsion and Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) into a single engineering demonstration mission to be flown to the Moon by 2013. The humanoid robot effort was redirected to a deployment of Robonaut 2 on the International Space Station in February of 2011, while Morpheus continued as a terrestrial field test project integrating the existing ALHAT Project's technologies into a sub-orbital flight system using the world's first LOX/LCH4 main propulsion and reaction control system fed from the same blowdown tanks. A series of 33 tethered tests with the Morpheus 1.0 vehicle and Morpheus 1.5 vehicle were conducted from April 2011 - December 2013 before successful, sustained free flights with the primary Vertical Testbed (VTB) navigation configuration began with Free Flight 3 on December 10, 2013. Over the course of the following 12 free flights and 3 tethered flights, components of the ALHAT navigation system were integrated into the Morpheus vehicle, operations, and flight control loop. The ALHAT navigation system was integrated and run concurrently with the VTB navigation system as a reference and fail-safe option in flight (see touchdown position estimate comparisons in Fig. 1). Flight testing completed with Free Flight 15 on December 15, 2014 with a completely autonomous Hazard Detection and Avoidance (HDA), integration of surface relative and Hazard Relative Navigation (HRN) measurements into the onboard dual-state inertial estimator Kalman filter software, and landing within 2 meters of the VTB GPS-based navigation solution at the safe landing site target. This paper describes the Morpheus joint VTB/ALHAT navigation architecture, the sensors utilized during the terrestrial flight campaign, issues resolved during testing, and the navigation results from the flight tests.
NASA Astrophysics Data System (ADS)
Nittler, L. R.; Hong, J.; Kenter, A.; Romaine, S.; Allen, B.; Kraft, R.; Masterson, R.; Elvis, M.; Gendreau, K.; Crawford, I.; Binzel, R.; Boynton, W. V.; Grindlay, J.; Ramsey, B.
2017-12-01
The surface elemental composition of a planetary body provides crucial information about its origin, geological evolution, and surface processing, all of which can in turn provide information about solar system evolution as a whole. Remote sensing X-ray fluorescence (XRF) spectroscopy has been used successfully to probe the major-element compositions of airless bodies in the inner solar system, including the Moon, near-Earth asteroids, and Mercury. The CubeSAT X-ray Telescope (CubeX) is a concept for a 6U planetary X-ray telescope (36U with S/C), which utilizes Miniature Wolter-I X-ray optics (MiXO), monolithic CMOS and SDD X-ray sensors for the focal plane, and a Solar X-ray Monitor (heritage from the REXIS XRF instrument on NASA's OSIRIS-REx mission). CubeX will map the surface elemental composition of diverse airless bodies by spectral measurement of XRF excited by solar X-rays. The lightweight (~1 kg) MiXO optics provide sub-arcminute resolution with low background, while the inherently rad-hard CMOS detectors provide improved spectral resolution (~150 eV) at 0 °C. CubeX will also demonstrate X-ray pulsar timing based deep space navigation (XNAV). Successful XNAV will enable autonomous deep-space navigation with little to no support from the Deep Space Network, hence lowering the operation cost for many more planetary missions. Recently selected by NASA Planetary Science Deep Space SmallSat Studies, the first CubeX concept, designed to rideshare to the Moon as a secondary spacecraft on a primary mission, is under study in collaboration with the Mission Design Center at NASA Ames Research Center. From high altitude (~6,000 km) frozen polar circular orbits, CubeX will study >8 regions (~110 km) of geological interest on the Moon over one year to produce a high resolution (~2-3 km) elemental abundance map of each region. The novel focal plane design of CubeX also allows us to evaluate the performance of absolute navigation by sequential observations of several millisecond pulsars without moving parts.
Deep-space navigation applications of improved ground-based optical astrometry
NASA Technical Reports Server (NTRS)
Null, G. W.; Owen, W. M., Jr.; Synnott, S. P.
1992-01-01
Improvements in ground-based optical astrometry will eventually be required for navigation of interplanetary spacecraft when these spacecraft communicate at optical wavelengths. Although such spacecraft may be some years off, preliminary versions of the astrometric technology can also be used to obtain navigational improvements for the Galileo and Cassini missions. This article describes a technology-development and observational program to accomplish this, including a cooperative effort with U.S. Naval Observatory Flagstaff Station. For Galileo, Earth-based astrometry of Jupiter's Galilean satellites may improve their ephemeris accuracy by a factor of 3 to 6. This would reduce the requirements for onboard optical navigation pictures, so that more of the data transmission capability (currently limited by high-gain antenna deployment problems) can be used for science data. Also, observations of European Space Agency (ESA) Hipparcos stars with asteroid 243 Ida may provide significantly improved navigation accuracy for a planned August 1993 Galileo spacecraft encounter.
Terminal navigation analysis for the 1980 comet Encke slow flyby mission
NASA Technical Reports Server (NTRS)
Jacobson, R. A.; Mcdanell, J. P.; Rinker, G. C.
1973-01-01
The initial results of a terminal navigation analysis for the proposed 1980 solar electric slow flyby mission to the comet Encke are presented. The navigation technique employs onboard optical measurements with the scientific television camera, groundbased observations of the spacecraft and comet, and groundbased orbit determination and thrust vector update computation. The knowledge and delivery accuracies of the spacecraft are evaluated as a function of the important parameters affecting the terminal navigation. These include optical measurement accuracy, thruster noise level, duration of the planned terminal coast period, comet ephemeris uncertainty, guidance initiation time, guidance update frequency, and optical data rate.
Sensor fusion V; Proceedings of the Meeting, Boston, MA, Nov. 15-17, 1992
NASA Technical Reports Server (NTRS)
Schenker, Paul S. (Editor)
1992-01-01
Topics addressed include 3D object perception, human-machine interface in multisensor systems, sensor fusion architecture, fusion of multiple and distributed sensors, interface and decision models for sensor fusion, computational networks, simple sensing for complex action, multisensor-based control, and metrology and calibration of multisensor systems. Particular attention is given to controlling 3D objects by sketching 2D views, the graphical simulation and animation environment for flexible structure robots, designing robotic systems from sensorimotor modules, cylindrical object reconstruction from a sequence of images, an accurate estimation of surface properties by integrating information using Bayesian networks, an adaptive fusion model for a distributed detection system, multiple concurrent object descriptions in support of autonomous navigation, robot control with multiple sensors and heuristic knowledge, and optical array detectors for image sensors calibration. (No individual items are abstracted in this volume)
Tick, David; Satici, Aykut C; Shen, Jinglin; Gans, Nicholas
2013-08-01
This paper presents a novel navigation and control system for autonomous mobile robots that includes path planning, localization, and control. A unique vision-based pose and velocity estimation scheme utilizing both the continuous and discrete forms of the Euclidean homography matrix is fused with inertial and optical encoder measurements to estimate the pose, orientation, and velocity of the robot and ensure accurate localization and control signals. A depth estimation system is integrated in order to overcome the loss of scale inherent in vision-based estimation. A path following control system is introduced that is capable of guiding the robot along a designated curve. Stability analysis is provided for the control system and experimental results are presented that prove the combined localization and control system performs with high accuracy.
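A minimal sketch of the discrete-homography step described above, using OpenCV; the camera intrinsics and matched points are placeholders, and the paper's continuous-homography velocity estimate, depth-based scale recovery, and encoder/IMU fusion are not shown. The decomposition returns the translation only up to scale, which is exactly why the abstract's depth estimation system is needed.

import cv2
import numpy as np

# Placeholder camera intrinsics and matched feature points between two views
# of the same planar scene patch (pixel coordinates).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
pts_prev = np.array([[100, 120], [400, 110], [390, 300], [110, 310]], dtype=np.float32)
pts_curr = np.array([[120, 125], [420, 118], [405, 305], [128, 318]], dtype=np.float32)

# Estimate the Euclidean homography between the two views (RANSAC rejects outliers).
H, mask = cv2.findHomography(pts_prev, pts_curr, cv2.RANSAC, 3.0)

# Decompose into candidate rotations, translations (up to scale), and plane normals.
n_solutions, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
print("candidate motions:", n_solutions)
for R, t in zip(rotations, translations):
    print("R =", R.ravel(), " t (up to scale) =", t.ravel())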
Localization from Visual Landmarks on a Free-Flying Robot
NASA Technical Reports Server (NTRS)
Coltin, Brian; Fusco, Jesse; Moratto, Zack; Alexandrov, Oleg; Nakamura, Robert
2016-01-01
We present the localization approach for Astrobee, a new free-flying robot designed to navigate autonomously on board the International Space Station (ISS). Astrobee will conduct experiments in microgravity, as well as assist astronauts and ground controllers. Astrobee replaces the SPHERES robots, which currently operate on the ISS and were limited to operating in a small cube since their localization system relied on triangulation from ultrasonic transmitters. Astrobee localizes with only monocular vision and an IMU, enabling it to traverse the entire US segment of the station. Features detected on a previously-built map, optical flow information, and IMU readings are all integrated into an extended Kalman filter (EKF) to estimate the robot pose. We introduce several modifications to the filter to make it more robust to noise. Finally, we extensively evaluate the behavior of the filter on a two-dimensional testing surface.
Localization from Visual Landmarks on a Free-Flying Robot
NASA Technical Reports Server (NTRS)
Coltin, Brian; Fusco, Jesse; Moratto, Zack; Alexandrov, Oleg; Nakamura, Robert
2016-01-01
We present the localization approach for Astrobee, a new free-flying robot designed to navigate autonomously on the International Space Station (ISS). Astrobee will accommodate a variety of payloads and enable guest scientists to run experiments in zero-g, as well as assist astronauts and ground controllers. Astrobee will replace the SPHERES robots which currently operate on the ISS, whose use of fixed ultrasonic beacons for localization limits them to work in a 2 meter cube. Astrobee localizes with monocular vision and an IMU, without any environmental modifications. Visual features detected on a pre-built map, optical flow information, and IMU readings are all integrated into an extended Kalman filter (EKF) to estimate the robot pose. We introduce several modifications to the filter to make it more robust to noise, and extensively evaluate the localization algorithm.
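A schematic EKF measurement update of the kind both Astrobee abstracts describe; the state layout, noise values, and the position-only landmark measurement model below are illustrative assumptions, not the flight filter.

import numpy as np

def ekf_update(x, P, z, h, H, R):
    """Generic EKF measurement update: state x, covariance P, measurement z,
    predicted measurement h, measurement Jacobian H, measurement noise R."""
    y = z - h                                   # innovation
    S = H @ P @ H.T + R                         # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Toy 6-state [position(3), velocity(3)] filter observing position from map landmarks.
x = np.zeros(6)
P = np.eye(6) * 0.5
H = np.hstack([np.eye(3), np.zeros((3, 3))])    # landmark fixes constrain position only
R = np.eye(3) * 0.01                            # placeholder visual measurement noise
z = np.array([0.12, -0.05, 0.30])               # placeholder map-relative position fix
x, P = ekf_update(x, P, z, H @ x, H, R)
print("updated position estimate:", x[:3])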
Orion Optical Navigation for Loss of Communication Lunar Return Contingencies
NASA Technical Reports Server (NTRS)
Getchius, Joel; Hanak, Chad; Kubitschek, Daniel G.
2010-01-01
The Orion Crew Exploration Vehicle (CEV) will replace the Space Shuttle and serve as the next-generation spaceship to carry humans back to the Moon for the first time since the Apollo program. For nominal lunar mission operations, the Mission Control Navigation team will utilize radiometric measurements to determine the position and velocity of Orion and uplink state information to support Lunar return. However, in the loss of communications contingency return scenario, Orion must safely return the crew to the Earth's surface. The navigation design solution for this loss of communications scenario is optical navigation consisting of lunar landmark tracking in low lunar orbit and star- horizon angular measurements coupled with apparent planetary diameter for Earth return trajectories. This paper describes the optical measurement errors and the navigation filter that will process those measurements to support navigation for safe crew return.
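The apparent-diameter measurement mentioned above reduces to simple geometry: if a body of known radius R subtends a full angle theta at the camera, the range to the body center is rho = R / sin(theta/2). The snippet below is just that relation with placeholder numbers, not Orion's filter.

import math

def range_from_apparent_diameter(body_radius_km, apparent_diameter_rad):
    """Camera-to-body-center range from the full angular diameter of the disc."""
    return body_radius_km / math.sin(apparent_diameter_rad / 2.0)

# Placeholder example: Earth (6378 km radius) subtending 2 degrees on a lunar-return leg.
rho = range_from_apparent_diameter(6378.0, math.radians(2.0))
print(f"slant range to Earth's center: {rho:.0f} km")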
Error Analysis System for Spacecraft Navigation Using the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Truong, S. H.; Hart, R. C.; Hartman, K. R.; Tomcsik, T. L.; Searl, J. E.; Bernstein, A.
1997-01-01
The Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) is currently developing improved space-navigation filtering algorithms to use the Global Positioning System (GPS) for autonomous real-time onboard orbit determination. In connection with a GPS technology demonstration on the Small Satellite Technology Initiative (SSTI)/Lewis spacecraft, FDD analysts and programmers have teamed with the GSFC Guidance, Navigation, and Control Branch to develop the GPS Enhanced Orbit Determination Experiment (GEODE) system. The GEODE system consists of a Kalman filter operating as a navigation tool for estimating the position, velocity, and additional states required to accurately navigate the orbiting Lewis spacecraft by using astrodynamic modeling and GPS measurements from the receiver. A parallel effort at the FDD is the development of a GPS Error Analysis System (GEAS) that will be used to analyze and improve navigation filtering algorithms during development phases and during in-flight calibration. For GEAS, the Kalman filter theory is extended to estimate the errors in position, velocity, and other error states of interest. The estimation of errors in physical variables at regular intervals will allow the time, cause, and effect of navigation system weaknesses to be identified. In addition, by modeling a sufficient set of navigation system errors, a system failure that causes an observed error anomaly can be traced and accounted for. The GEAS software is formulated using Object Oriented Design (OOD) techniques implemented in the C++ programming language on a Sun SPARC workstation. Phase 1 of this effort is the development of a basic system to be used to evaluate navigation algorithms implemented in the GEODE system. This paper presents the GEAS mathematical methodology, systems and operations concepts, and software design and implementation. Results from the use of the basic system to evaluate navigation algorithms implemented on GEODE are also discussed. In addition, recommendations for generalization of GEAS functions and for new techniques to optimize the accuracy and control of the GPS autonomous onboard navigation are presented.
An integrated autonomous rendezvous and docking system architecture using Centaur modern avionics
NASA Technical Reports Server (NTRS)
Nelson, Kurt
1991-01-01
The avionics system for the Centaur upper stage is in the process of being modernized with the current state-of-the-art in strapdown inertial guidance equipment. This equipment includes an integrated flight control processor with a ring laser gyro based inertial guidance system. This inertial navigation unit (INU) uses two MIL-STD-1750A processors and communicates over the MIL-STD-1553B data bus. Commands are translated into load activation through a Remote Control Unit (RCU) which incorporates the use of solid state relays. Also, a programmable data acquisition system replaces separate multiplexer and signal conditioning units. This modern avionics suite is currently being enhanced through independent research and development programs to provide autonomous rendezvous and docking capability using advanced cruise missile image processing technology and integrated GPS navigational aids. A system concept was developed to combine these technologies in order to achieve a fully autonomous rendezvous, docking, and autoland capability. The current system architecture and the evolution of this architecture using advanced modular avionics concepts being pursued for the National Launch System are discussed.
PRIMUS: autonomous navigation in open terrain with a tracked vehicle
NASA Astrophysics Data System (ADS)
Schaub, Guenter W.; Pfaendner, Alfred H.; Schaefer, Christoph
2004-09-01
The German experimental robotics program PRIMUS (PRogram for Intelligent Mobile Unmanned Systems) is focused on solutions for autonomous driving in unknown open terrain, over several project phases under specific realization aspects for more than 12 years. The main task of the program is to develop algorithms for a high degree of autonomous navigation skills with off-the-shelf available hardware/sensor technology and to integrate this into military vehicles. For obstacle detection a Dornier-3D-LADAR is integrated on a tracked vehicle "Digitized WIESEL 2". For road-following a digital video camera and a visual perception module from the Universitaet der Bundeswehr Munchen (UBM) has been integrated. This paper gives an overview of the PRIMUS program with a focus on the last program phase D (2001 - 2003). This includes the system architecture, the description of the modes of operation and the technology development with the focus on obstacle avoidance and obstacle classification using a 3-D LADAR. A collection of experimental results and a short look at the next steps in the German robotics program will conclude the paper.
Autonomous Navigation by a Mobile Robot
NASA Technical Reports Server (NTRS)
Huntsberger, Terrance; Aghazarian, Hrand
2005-01-01
ROAMAN is a computer program for autonomous navigation of a mobile robot on a long (as much as hundreds of meters) traversal of terrain. Developed for use aboard a robotic vehicle (rover) exploring the surface of a remote planet, ROAMAN could also be adapted to similar use on terrestrial mobile robots. ROAMAN implements a combination of algorithms for (1) long-range path planning based on images acquired by mast-mounted, wide-baseline stereoscopic cameras, and (2) local path planning based on images acquired by body-mounted, narrow-baseline stereoscopic cameras. The long-range path-planning algorithm autonomously generates a series of waypoints that are passed to the local path-planning algorithm, which plans obstacle-avoiding legs between the waypoints. Both the long- and short-range algorithms use an occupancy-grid representation in computations to detect obstacles and plan paths. Maps that are maintained by the long- and short-range portions of the software are not shared because substantial localization errors can accumulate during any long traverse. ROAMAN is not guaranteed to generate an optimal shortest path, but does maintain the safety of the rover.
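ROAMAN's planners are not reproduced here; the sketch below only illustrates planning over an occupancy-grid representation in general, using a plain A* search with occupied cells treated as obstacles. The grid contents, 4-connected neighborhood, and Manhattan heuristic are placeholder choices.

import heapq
import numpy as np

def astar(grid, start, goal):
    """A* over a 2-D occupancy grid (0 = free, 1 = occupied), 4-connected."""
    rows, cols = grid.shape
    open_set = [(0, start)]
    came_from, g = {}, {start: 0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt] == 0:
                tentative = g[cur] + 1
                if tentative < g.get(nxt, np.inf):
                    g[nxt] = tentative
                    came_from[nxt] = cur
                    h = abs(goal[0] - nxt[0]) + abs(goal[1] - nxt[1])  # Manhattan heuristic
                    heapq.heappush(open_set, (tentative + h, nxt))
    return None  # no safe path found

grid = np.zeros((20, 20), dtype=int)
grid[5:15, 10] = 1                       # a wall of "obstacle" cells
print(astar(grid, (0, 0), (19, 19)))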
Image Dependent Relative Formation Navigation for Autonomous Aerial Refueling
2011-03-01
and local variations of the Earth's surface make a mathematical model difficult to create and use. The definition of an equipotential surface... controlled with flight control surfaces attached to it. To refuel using this method, the receiver pilot flies the aircraft to within a defined refueling... I-frame would unnecessarily complicate aircraft navigation that, by definition, is limited to altitudes relatively close to the surface of the Earth.
Bio-Inspired Navigation of Chemical Plumes
2006-07-01
Bio-Inspired Navigation of Chemical Plumes. Maynard J. Porter III, Captain, USAF, Department of Electrical and Computer Engineering, Air Force Institute... Li. "Chemical plume tracing via an autonomous underwater vehicle". IEEE Journal of Ocean Engineering, 30(2):428-442, 2005. [6] G. A. Nevitt... Electrical and Computer Engineering, Air Force Institute of Technology, Dayton, OH 45433-7765, U.S.A. juan.vasquez@afit.edu, May 31, 2006. Abstract - The...
2009-09-01
b. Hazard Detection and Avoidance (HDA)... c. Hazard Relative Navigation (HRN)... Navigation (HRN) and Hazard Detection and Avoidance (HDA). In addition to the TRN and HDA sensors used during these phases, which will be discussed... and Avoidance (HDA): During the HDA phase, the expected landing site is examined and evaluated, and a new site may be selected. Using the HDA...
Autonomous Robot Navigation in Human-Centered Environments Based on 3D Data Fusion
NASA Astrophysics Data System (ADS)
Steinhaus, Peter; Strand, Marcus; Dillmann, Rüdiger
2007-12-01
Efficient navigation of mobile platforms in dynamic human-centered environments is still an open research topic. We have already proposed an architecture (MEPHISTO) for a navigation system that is able to fulfill the main requirements of efficient navigation: fast and reliable sensor processing, extensive global world modeling, and distributed path planning. Our architecture uses a distributed system of sensor processing, world modeling, and path planning units. In this article, we present the implemented methods in the context of data fusion algorithms for 3D world modeling and real-time path planning. We also show results of the prototypic application of the system at the museum ZKM (center for art and media) in Karlsruhe.
Autonomous Spacecraft Navigation Using Above-the-Constellation GPS Signals
NASA Technical Reports Server (NTRS)
Winternitz, Luke
2017-01-01
GPS-based spacecraft navigation offers many performance and cost benefits, and GPS receivers are now standard GNC components for LEO missions. Recently, more and more high-altitude missions are taking advantage of the benefits of GPS navigation as well. High-altitude applications pose challenges, however, because receivers operating above the GPS constellations are subject to reduced signal strength and availability, and uncertain signal quality. This presentation covers the history and state of the art in high-altitude GPS spacecraft navigation, including early experiments, current missions and receivers, and efforts to characterize and protect signals available to high-altitude users. Recent results from the very-high-altitude MMS mission are also provided.
Autonomous detection of indoor and outdoor signs
NASA Astrophysics Data System (ADS)
Holden, Steven; Snorrason, Magnus; Goodsell, Thomas; Stevens, Mark R.
2005-05-01
Most goal-oriented mobile robot tasks involve navigation to one or more known locations. This is generally done using GPS coordinates and landmarks outdoors, or wall-following and fiducial marks indoors. Such approaches ignore the rich source of navigation information that is already in place for human navigation in all man-made environments: signs. A mobile robot capable of detecting and reading arbitrary signs could be tasked using directions that are intuitive to humans, and it could report its location relative to intuitive landmarks (a street corner, a person's office, etc.). Such ability would not require active marking of the environment and would be functional in the absence of GPS. In this paper we present an updated version of a system we call Sign Understanding in Support of Autonomous Navigation (SUSAN). This system relies on cues common to most signs: the presence of text, vivid color, and compact shape. By not relying on templates, SUSAN can detect a wide variety of signs: traffic signs, street signs, store-name signs, building directories, room signs, etc. In this paper we focus on the text detection capability. We present results summarizing probability of detection and false alarm rate across many scenes containing signs of very different designs and in a variety of lighting conditions.
Autonomous Wheeled Robot Platform Testbed for Navigation and Mapping Using Low-Cost Sensors
NASA Astrophysics Data System (ADS)
Calero, D.; Fernandez, E.; Parés, M. E.
2017-11-01
This paper presents the concept of an architecture for a wheeled robot system that helps researchers in the field of geomatics speed up their daily research in the kinematic geodesy, indoor navigation and indoor positioning fields. The presented ideas correspond to an extensible and modular hardware and software system aimed at the development of new low-cost mapping algorithms as well as at the evaluation of sensor performance. The concept, already implemented in the CTTC's system ARAS (Autonomous Rover for Automatic Surveying), is generic and extensible. This means that it is possible to incorporate new navigation algorithms or sensors at no maintenance cost; only the development effort required to create such algorithms needs to be taken into account. As a consequence, change poses a much smaller problem for research activities in this specific area. This system includes several standalone sensors that may be combined in different ways to accomplish several goals; that is, the system may be used to perform a variety of tasks, such as evaluating the performance of positioning algorithms or mapping algorithms.
Sample Return Robot Centennial Challenge
2012-06-16
NASA Deputy Administrator Lori Garver, left, listens as Worcester Polytechnic Institute (WPI) Robotics Resource Center Director and NASA-WPI Sample Return Robot Centennial Challenge Judge Ken Stafford points out how the robots navigate the playing field during the challenge on Saturday, June 16, 2012 in Worcester, Mass. Teams were challenged to build autonomous robots that can identify, collect and return samples. NASA needs autonomous robotic capability for future planetary exploration. Photo Credit: (NASA/Bill Ingalls)
Sample Return Robot Centennial Challenge
2012-06-16
NASA Deputy Administrator Lori Garver, right, listens as Worcester Polytechnic Institute (WPI) Robotics Resource Center Director and NASA-WPI Sample Return Robot Centennial Challenge Judge Ken Stafford points out how the robots navigate the playing field during the challenge on Saturday, June 16, 2012 in Worcester, Mass. Teams were challenged to build autonomous robots that can identify, collect and return samples. NASA needs autonomous robotic capability for future planetary exploration. Photo Credit: (NASA/Bill Ingalls)
Autonomous Underwater Vehicle Navigation
2008-02-01
three standard deviations are ignored as indicated by the × marker. REFERENCES: [1] R. G. Brown and P. Y. C. Hwang, Introduction to Random Signals... autonomous underwater vehicle with six degrees of freedom. We approach this problem using an error state formulation of the Kalman filter. Integration... each position fix, but is this ad-hoc method optimal? Here, we present an approach using an error state formulation of the Kalman filter to provide an...
On-Line Point Positioning with Single Frame Camera Data
1992-03-15
...tion algorithms and methods will be found in robotics and industrial quality control. 1. Project data: The project has been defined as "On-line point... development and use of the OLT algorithms and methods for applications in robotics, industrial quality control and autonomous vehicle navigation... Of particular interest in robotics and autonomous vehicle navigation is, for example, the task of determining the position and orientation of a mobile...
EnEx-RANGE - Robust autonomous Acoustic Navigation in Glacial icE
NASA Astrophysics Data System (ADS)
Heinen, Dirk; Eliseev, Dmitry; Henke, Christoph; Jeschke, Sabina; Linder, Peter; Reuter, Sebastian; Schönitz, Sebastian; Scholz, Franziska; Weinstock, Lars Steffen; Wickmann, Stefan; Wiebusch, Christopher; Zierke, Simon
2017-03-01
Within the Enceladus Explorer Initiative of the DLR Space Administration, navigation technologies for a future space mission are in development. Those technologies are the basis for the search for extraterrestrial life on the Saturn moon Enceladus. An autonomous melting probe, the EnEx probe, aims to extract a liquid sample from a water reservoir below the icy crust. A first EnEx probe was developed and demonstrated in a terrestrial scenario at Blood Falls, Taylor Glacier, Antarctica in November 2014. To enable navigation in glacier ice, two acoustic systems were integrated into the probe in addition to conventional navigation technologies. The first acoustic system determines the position of the probe during the run based on propagation times of acoustic signals from emitters at reference positions at the glacier surface to receivers in the probe. The second system provides information about the forefield of the probe. It is based on sonographic principles, with phased array technology integrated in the probe's melting head. Information about obstacles or sampling regions in the probe's forefield can be acquired. The development of both systems is now continued in the project EnEx-RANGE. The emitters of the localization system are replaced by a network of intelligent acoustic enabled melting probes. These localize each other by means of acoustic signals and create the reference system for the EnEx probe. This presentation includes the discussion of the intelligent acoustic network, the acoustic navigation systems of the EnEx probe and results of terrestrial tests.
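A toy version of the localization principle described above: with an assumed sound speed in ice, propagation times from emitters at surveyed positions act as pseudo-ranges, and the probe position follows from nonlinear least squares. Positions, sound speed, and noise levels are placeholders, not EnEx-RANGE system values.

import numpy as np
from scipy.optimize import least_squares

C_ICE = 3800.0  # assumed acoustic speed in glacial ice, m/s (placeholder value)

emitters = np.array([[0.0, 0.0, 0.0],      # surveyed surface transducer positions (m)
                     [200.0, 0.0, 0.0],
                     [0.0, 200.0, 0.0],
                     [200.0, 200.0, 5.0]])
true_probe = np.array([90.0, 120.0, -150.0])

# Simulated propagation times with a little timing noise.
times = np.linalg.norm(emitters - true_probe, axis=1) / C_ICE
times += np.random.normal(0.0, 1e-5, size=times.shape)

def residuals(p):
    """Difference between measured and predicted propagation times."""
    return np.linalg.norm(emitters - p, axis=1) / C_ICE - times

solution = least_squares(residuals, x0=np.array([100.0, 100.0, -100.0]))
print("estimated probe position (m):", solution.x)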
Performance Characterization of a Landmark Measurement System for ARRM Terrain Relative Navigation
NASA Technical Reports Server (NTRS)
Shoemaker, Michael A.; Wright, Cinnamon; Liounis, Andrew J.; Getzandanner, Kenneth M.; Van Eepoel, John M.; DeWeese, Keith D.
2016-01-01
This paper describes the landmark measurement system being developed for terrain relative navigation on NASA's Asteroid Redirect Robotic Mission (ARRM), and the results of a performance characterization study given realistic navigational and model errors. The system is called Retina, and is derived from the stereo-photoclinometry methods widely used on other small-body missions. The system is simulated using synthetic imagery of the asteroid surface and discussion is given on various algorithmic design choices. Unlike other missions, ARRM's Retina is the first planned autonomous use of these methods during the close-proximity and descent phase of the mission.
Performance Characterization of a Landmark Measurement System for ARRM Terrain Relative Navigation
NASA Technical Reports Server (NTRS)
Shoemaker, Michael; Wright, Cinnamon; Liounis, Andrew; Getzandanner, Kenneth; Van Eepoel, John; Deweese, Keith
2016-01-01
This paper describes the landmark measurement system being developed for terrain relative navigation on NASA's Asteroid Redirect Robotic Mission (ARRM), and the results of a performance characterization study given realistic navigational and model errors. The system is called Retina, and is derived from the stereophotoclinometry methods widely used on other small-body missions. The system is simulated using synthetic imagery of the asteroid surface and discussion is given on various algorithmic design choices. Unlike other missions, ARRM's Retina is the first planned autonomous use of these methods during the close-proximity and descent phase of the mission.
Xiao, Mengli; Zhang, Yongbo; Fu, Huimin; Wang, Zhihua
2018-05-01
A high-precision navigation algorithm is essential for the future Mars pinpoint landing mission. The unknown inputs caused by large uncertainties in atmospheric density and aerodynamic coefficients, as well as unknown measurement biases, may cause large estimation errors in conventional Kalman filters. This paper proposes a derivative-free version of the nonlinear unbiased minimum variance filter for Mars entry navigation. The filter solves this problem by estimating the state and the unknown measurement biases simultaneously, without requiring derivatives, leading to a high-precision algorithm for Mars entry navigation. An IMU/radio-beacon integrated navigation scheme is introduced in the simulation, and the results show that, with or without radio blackout, the proposed filter achieves accurate state estimation, much better than the conventional unscented Kalman filter, demonstrating its suitability as a high-precision Mars entry navigation algorithm.
Autonomous navigation method for substation inspection robot based on travelling deviation
NASA Astrophysics Data System (ADS)
Yang, Guoqing; Xu, Wei; Li, Jian; Fu, Chongguang; Zhou, Hao; Zhang, Chuanyou; Shao, Guangting
2017-06-01
A new method of edge detection is proposed for the substation environment, which enables autonomous navigation of the substation inspection robot. First, the road image and information are obtained using an image acquisition device. Second, noise in a region of interest selected within the road image is removed with a digital image processing algorithm, road edges are extracted with the Canny operator, and the road boundaries are extracted with the Hough transform. Finally, the distances between the robot and the left and right boundaries are calculated, and the travelling deviation is obtained. The robot's walking route is controlled according to the travel deviation and a preset threshold. Experimental results show that the proposed method can detect the road area in real time, and that the algorithm has high accuracy and stable performance.
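A minimal sketch of this kind of road-boundary pipeline, assuming OpenCV and using illustrative ROI bounds and thresholds rather than the paper's values, might look as follows.

# Hedged sketch of the edge-based road-boundary pipeline described above.
# ROI bounds, Canny/Hough thresholds and the deviation convention are assumptions.
import cv2
import numpy as np

def travel_deviation(frame):
    roi = frame[frame.shape[0] // 2:, :]               # region of interest: lower half of the image
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)        # remove noise before edge extraction
    edges = cv2.Canny(blurred, 50, 150)                # road edges via the Canny operator
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                            minLineLength=40, maxLineGap=10)   # boundaries via the Hough transform
    if lines is None:
        return None
    left, right = [], []                               # split segments into left/right by slope sign
    for x1, y1, x2, y2 in lines[:, 0]:
        if x2 == x1:
            continue
        (left if (y2 - y1) / (x2 - x1) < 0 else right).append((x1 + x2) / 2.0)
    if not left or not right:
        return None
    center = roi.shape[1] / 2.0
    # Positive deviation: the robot sits closer to the right boundary than to the left.
    return (center - np.mean(left)) - (np.mean(right) - center)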
Sign detection for autonomous navigation
NASA Astrophysics Data System (ADS)
Goodsell, Thomas G.; Snorrason, Magnus S.; Cartwright, Dustin; Stube, Brian; Stevens, Mark R.; Ablavsky, Vitaly X.
2003-09-01
Mobile robots currently cannot detect and read arbitrary signs. This is a major hindrance to mobile robot usability, since they cannot be tasked using directions that are intuitive to humans. It also limits their ability to report their position relative to intuitive landmarks. Other researchers have demonstrated some success on traffic sign recognition, but using template-based methods limits the set of recognizable signs. There is a clear need for a sign detection and recognition system that can process a much wider variety of signs: traffic signs, street signs, store-name signs, building directories, room signs, etc. We are developing a system for Sign Understanding in Support of Autonomous Navigation (SUSAN) that detects signs from various cues common to most signs: vivid colors, compact shape, and text. We have demonstrated the feasibility of our approach on a variety of signs in both indoor and outdoor locations.
Vision-based mapping with cooperative robots
NASA Astrophysics Data System (ADS)
Little, James J.; Jennings, Cullen; Murray, Don
1998-10-01
Two stereo-vision-based mobile robots navigate and autonomously explore their environment safely while building occupancy grid maps of the environment. The robots maintain position estimates within a global coordinate frame using landmark recognition. This allows them to build a common map by sharing position information and stereo data. Stereo vision processing and map updates are done at 3 Hz and the robots move at speeds of 200 cm/s. Cooperative mapping is achieved through autonomous exploration of unstructured and dynamic environments. The map is constructed conservatively, so as to be useful for collision-free path planning. Each robot maintains a separate copy of a shared map, and then posts updates to the common map when it returns to observe a landmark at home base. Issues include synchronization, mutual localization, navigation, exploration, registration of maps, merging repeated views (fusion), centralized vs decentralized maps.
Autonomous navigation system and method
Bruemmer, David J [Idaho Falls, ID; Few, Douglas A [Idaho Falls, ID
2009-09-08
A robot platform includes perceptors, locomotors, and a system controller, which executes instructions for autonomously navigating a robot. The instructions repeat, on each iteration through an event timing loop, the acts of defining an event horizon based on the robot's current velocity, detecting a range to obstacles around the robot, testing for an event horizon intrusion by determining if any range to the obstacles is within the event horizon, and adjusting rotational and translational velocity of the robot accordingly. If the event horizon intrusion occurs, rotational velocity is modified by a proportion of the current rotational velocity reduced by a proportion of the range to the nearest obstacle and translational velocity is modified by a proportion of the range to the nearest obstacle. If no event horizon intrusion occurs, translational velocity is set as a ratio of a speed factor relative to a maximum speed.
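A minimal sketch of one iteration of this event-horizon logic, with an assumed robot interface, horizon scale factor and velocity gains that are not taken from the patent, might look like this.

# Hedged sketch of the event-horizon check described in the patent abstract.
# The robot attributes, horizon_time scale and the 0.5 gains are assumptions.
def navigation_step(robot, horizon_time=1.5):
    event_horizon = robot.translational_velocity * horizon_time   # horizon grows with current speed
    ranges = robot.detect_obstacle_ranges()                       # ranges to obstacles around the robot
    nearest = min(ranges) if ranges else float("inf")
    if nearest < event_horizon:                                    # event-horizon intrusion
        # Reduce turning by a proportion of the range to the nearest obstacle.
        robot.rotational_velocity *= 1.0 - 0.5 * (nearest / event_horizon)
        # Scale forward speed by a proportion of the range to the nearest obstacle.
        robot.translational_velocity = 0.5 * nearest / horizon_time
    else:                                                          # clear path: speed set by speed factor
        robot.translational_velocity = robot.speed_factor * robot.max_speed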
The Theseus Autonomous Underwater Vehicle: A Canadian Success Story
1997-04-01
...autonomous underwater vehicle, named Theseus, for laying optical fiber cables in ice-covered waters. In trials and missions conducted in 1996, this... stations. An acoustic telemetry system enables communication with Theseus from surface stations, and an optical telemetry system is used for system...
NASA Technical Reports Server (NTRS)
1975-01-01
User technology requirements are identified in relation to needed technology advancement for future space missions in the areas of navigation, guidance, and control. Emphasis is placed on: reduction of mission support cost by 50% through autonomous operation, a ten-fold increase in mission output through improved pointing and control, and a hundred-fold increase in human productivity in space through large-scale teleoperator applications.
A Neural Model of How the Brain Computes Heading from Optic Flow in Realistic Scenes
ERIC Educational Resources Information Center
Browning, N. Andrew; Grossberg, Stephen; Mingolla, Ennio
2009-01-01
Visually-based navigation is a key competence during spatial cognition. Animals avoid obstacles and approach goals in novel cluttered environments using optic flow to compute heading with respect to the environment. Most navigation models try either to explain data or to demonstrate navigational competence in real-world environments without regard…
Analysis of key technologies in geomagnetic navigation
NASA Astrophysics Data System (ADS)
Zhang, Xiaoming; Zhao, Yan
2008-10-01
Because of the high cost and error accumulation of precise Inertial Navigation Systems (INS) and the vulnerability of Global Navigation Satellite Systems (GNSS), geomagnetic navigation, a passive autonomous navigation method, is receiving renewed attention. The geomagnetic field is a natural spatial physical field and is a function of position and time in near-Earth space. Navigation based on the geomagnetic field is being researched for a wide range of commercial and military applications. This paper presents the main features and the state of the art of the Geomagnetic Navigation System (GMNS). Geomagnetic field models and reference maps are described. Obtaining, modeling and updating accurate magnetic anomaly field information is an important step for high-precision geomagnetic navigation. In addition, the errors of geomagnetic measurement using strapdown magnetometers are analyzed. Precise geomagnetic data are obtained by means of magnetometer calibration and vehicle magnetic field compensation. From the measurement data and a reference map or model of the geomagnetic field, the vehicle's position and attitude can be obtained using a matching algorithm or a state-estimation method. Trends in geomagnetic navigation for the near future are discussed at the end of this paper.
Assurance Technology Challenges of Advanced Space Systems
NASA Technical Reports Server (NTRS)
Chern, E. James
2004-01-01
The initiative to explore space and extend a human presence across our solar system to revisit the Moon and Mars poses enormous technological challenges to the nation's space agency and aerospace industry. Key technology development needs to enable the endeavor include advanced materials, structures and mechanisms; micro/nano sensors and detectors; power generation, storage and management; advanced thermal and cryogenic control; guidance, navigation and control; command and data handling; advanced propulsion; advanced communication; on-board processing; advanced information technology systems; modular and reconfigurable systems; precision formation flying; solar sails; distributed observing systems; space robotics; etc. Quality assurance concerns such as functional performance, structural integrity, radiation tolerance, health monitoring, diagnosis, maintenance, calibration, and initialization can affect the performance of systems and subsystems. It is thus imperative to employ innovative nondestructive evaluation methodologies to ensure the quality and integrity of advanced space systems. Advancements in integrated multi-functional sensor systems, autonomous inspection approaches, distributed embedded sensors, roaming inspectors, and shape-adaptive sensors are sought. Concepts in computational models for signal processing and data interpretation to establish quantitative characterization and event determination are also of interest. Prospective evaluation technologies include ultrasonics, laser ultrasonics, optics and fiber optics, shearography, video optics and metrology, thermography, electromagnetics, acoustic emission, x-ray, data management, biomimetics, and nano-scale sensing approaches for structural health monitoring.
COBALT CoOperative Blending of Autonomous Landing Technology
NASA Technical Reports Server (NTRS)
Carson, John M. III; Restrepo, Carolina I.; Robertson, Edward A.; Seubert, Carl R.; Amzajerdian, Farzin
2016-01-01
COBALT is a terrestrial test platform for development and maturation of GN&C (Guidance, Navigation and Control) technologies for PL&HA (Precision Landing and Hazard Avoidance). The project is developing a third generation, Langley Navigation Doppler Lidar (NDL) for ultra-precise velocity and range measurements, which will be integrated and tested with the JPL Lander Vision System (LVS) for Terrain Relative Navigation (TRN) position estimates. These technologies together provide navigation that enables controlled precision landing. The COBALT hardware will be integrated in 2017 into the GN&C subsystem of the Xodiac rocket-propulsive Vertical Test Bed (VTB) developed by Masten Space Systems (MSS), and two terrestrial flight campaigns will be conducted: one open-loop (i.e., passive) and one closed-loop (i.e., active).
Integrating Terrain Maps Into a Reactive Navigation Strategy
NASA Technical Reports Server (NTRS)
Howard, Ayanna; Werger, Barry; Seraji, Homayoun
2006-01-01
An improved method of processing information for autonomous navigation of a robotic vehicle across rough terrain involves the integration of terrain maps into a reactive navigation strategy. Somewhat more precisely, the method involves the incorporation, into navigation logic, of data equivalent to regional traversability maps. The terrain characteristic is mapped using a fuzzy-logic representation of the difficulty of traversing the terrain. The method is robust in that it integrates a global path-planning strategy with sensor-based regional and local navigation strategies to ensure a high probability of success in reaching a destination and avoiding obstacles along the way. The sensor-based strategies use cameras aboard the vehicle to observe the regional terrain, defined as the area of the terrain that covers the immediate vicinity near the vehicle to a specified distance a few meters away.
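As a rough illustration of a fuzzy-logic traversability index of the kind described, the sketch below maps assumed slope and roughness inputs to a value in [0, 1]; the membership breakpoints are invented for the example, not taken from the article.

# Illustrative fuzzy traversability sketch in the spirit of the abstract above;
# the slope/roughness inputs and membership breakpoints are assumptions.
def ramp_down(x, lo, hi):
    """1 below lo, 0 above hi, linear in between."""
    if x <= lo:
        return 1.0
    if x >= hi:
        return 0.0
    return (hi - x) / (hi - lo)

def traversability(slope_deg, roughness_cm):
    flat = ramp_down(slope_deg, 5.0, 25.0)       # degree to which the cell is "flat"
    smooth = ramp_down(roughness_cm, 2.0, 15.0)  # degree to which the cell is "smooth"
    # Conjunction (min) gives a conservative regional traversability index in [0, 1].
    return min(flat, smooth)

# Example: a 12-degree slope with 6 cm roughness maps to a moderate index (~0.65).
print(traversability(12.0, 6.0))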
An Autonomous Control System for an Intra-Vehicular Spacecraft Mobile Monitor Prototype
NASA Technical Reports Server (NTRS)
Dorais, Gregory A.; Desiano, Salvatore D.; Gawdiak, Yuri; Nicewarner, Keith
2003-01-01
This paper presents an overview of an ongoing research and development effort at the NASA Ames Research Center to create an autonomous control system for an internal spacecraft autonomous mobile monitor. Its primary functions are to provide crew support and perform intra-vehicular sensing activities by autonomously navigating onboard the International Space Station. We describe the mission roles and high-level functional requirements for an autonomous mobile monitor. The mobile monitor prototypes (two operational and one in active design), the physical test facilities used for ground testing, including a 3D micro-gravity test facility, and the simulators are briefly described. We provide an overview of the autonomy framework and describe each of its components, including those used for automated planning, goal-oriented task execution, diagnosis, and fault recovery. A sample mission test scenario is also described.
Autonomous landing and ingress of micro-air-vehicles in urban environments based on monocular vision
NASA Astrophysics Data System (ADS)
Brockers, Roland; Bouffard, Patrick; Ma, Jeremy; Matthies, Larry; Tomlin, Claire
2011-06-01
Unmanned micro air vehicles (MAVs) will play an important role in future reconnaissance and search and rescue applications. In order to conduct persistent surveillance and to conserve energy, MAVs need the ability to land, and they need the ability to enter (ingress) buildings and other structures to conduct reconnaissance. To be safe and practical under a wide range of environmental conditions, landing and ingress maneuvers must be autonomous, using real-time, onboard sensor feedback. To address these key behaviors, we present a novel method for vision-based autonomous MAV landing and ingress using a single camera for two urban scenarios: landing on an elevated surface, representative of a rooftop, and ingress through a rectangular opening, representative of a door or window. Real-world scenarios will not include special navigation markers, so we rely on tracking arbitrary scene features; however, we do currently exploit planarity of the scene. Our vision system uses a planar homography decomposition to detect navigation targets and to produce approach waypoints as inputs to the vehicle control algorithm. Scene perception, planning, and control run onboard in real-time; at present we obtain aircraft position knowledge from an external motion capture system, but we expect to replace this in the near future with a fully self-contained, onboard, vision-aided state estimation algorithm. We demonstrate autonomous vision-based landing and ingress target detection with two different quadrotor MAV platforms. To our knowledge, this is the first demonstration of onboard, vision-based autonomous landing and ingress algorithms that do not use special purpose scene markers to identify the destination.
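A hedged sketch of the planar-homography step, using OpenCV rather than the authors' own pipeline, with assumed camera intrinsics and placeholder feature arrays:

# Hedged sketch of homography estimation and decomposition between tracked
# features on a planar target; the intrinsics K and the point arrays are assumptions.
import cv2
import numpy as np

K = np.array([[520.0, 0.0, 320.0],      # assumed pinhole intrinsics
              [0.0, 520.0, 240.0],
              [0.0, 0.0, 1.0]])

def relative_motion_from_plane(pts_prev, pts_curr):
    """pts_prev, pts_curr: Nx2 arrays of features tracked on a planar surface."""
    H, inliers = cv2.findHomography(pts_prev, pts_curr, cv2.RANSAC, 3.0)
    # Decompose H into candidate rotations, translations and plane normals.
    n_solutions, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
    return rotations, translations, normals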
Autonomous Landing and Ingress of Micro-Air-Vehicles in Urban Environments Based on Monocular Vision
NASA Technical Reports Server (NTRS)
Brockers, Roland; Bouffard, Patrick; Ma, Jeremy; Matthies, Larry; Tomlin, Claire
2011-01-01
Unmanned micro air vehicles (MAVs) will play an important role in future reconnaissance and search and rescue applications. In order to conduct persistent surveillance and to conserve energy, MAVs need the ability to land, and they need the ability to enter (ingress) buildings and other structures to conduct reconnaissance. To be safe and practical under a wide range of environmental conditions, landing and ingress maneuvers must be autonomous, using real-time, onboard sensor feedback. To address these key behaviors, we present a novel method for vision-based autonomous MAV landing and ingress using a single camera for two urban scenarios: landing on an elevated surface, representative of a rooftop, and ingress through a rectangular opening, representative of a door or window. Real-world scenarios will not include special navigation markers, so we rely on tracking arbitrary scene features; however, we do currently exploit planarity of the scene. Our vision system uses a planar homography decomposition to detect navigation targets and to produce approach waypoints as inputs to the vehicle control algorithm. Scene perception, planning, and control run onboard in real-time; at present we obtain aircraft position knowledge from an external motion capture system, but we expect to replace this in the near future with a fully self-contained, onboard, vision-aided state estimation algorithm. We demonstrate autonomous vision-based landing and ingress target detection with two different quadrotor MAV platforms. To our knowledge, this is the first demonstration of onboard, vision-based autonomous landing and ingress algorithms that do not use special purpose scene markers to identify the destination.
Optical Navigation Preparations for New Horizons Pluto Flyby
NASA Technical Reports Server (NTRS)
Owen, William M., Jr.; Dumont, Philip J.; Jackman, Coralie D.
2012-01-01
The New Horizons spacecraft will encounter Pluto and its satellites in July 2015. As was the case for the Voyager encounters with Jupiter, Saturn, Uranus and Neptune, mission success will depend heavily on accurate spacecraft navigation, and accurate navigation will be impossible without the use of pictures of the Pluto system taken by the onboard cameras. We describe the preparations made by the New Horizons optical navigators: picture planning, image processing algorithms, software development and testing, and results from in-flight imaging.
Terrain matching image pre-process and its format transform in autonomous underwater navigation
NASA Astrophysics Data System (ADS)
Cao, Xuejun; Zhang, Feizhou; Yang, Dongkai; Yang, Bogang
2007-06-01
Underwater passive navigation technology is one of the important development directions in modern navigation. With its advantages of high autonomy, stealth at sea, anti-jamming capability and high precision, passive navigation fully meets actual navigation requirements and has therefore become a standard navigation method for underwater vehicles, attracting considerable attention from researchers in the field. Underwater passive navigation can provide accurate navigation information, such as position and speed, to the main Inertial Navigation System (INS) over long periods. With the development of micro-electronics technology, AUV navigation is based primarily on INS aided by other methods such as terrain matching navigation, which provides long-duration navigation capability, corrects INS errors and avoids the need for the AUV to surface periodically. With terrain matching navigation, assisted by digital charts and ocean geographical characteristics sensors, underwater image matching is performed to obtain higher positioning precision, satisfying the requirements of an underwater, long-duration, high-precision and all-weather navigation system for Autonomous Underwater Vehicles. Terrain-assisted navigation (TAN) relies directly on image (map) information to aid the primary navigation system along a path designated in advance. In TAN, a factor as important as system operation is the precision and practicality of the stored images and of the database from which the image data are produced; if the data used for the terrain characteristics are unsuitable, navigation precision will be low. Compared with terrain matching assisted navigation, image matching navigation is a high-precision, low-cost assisted navigation approach whose matching precision directly affects the final precision of the integrated navigation system. Image matching assisted navigation spatially registers two underwater scene images of the same area acquired by two different sensors in order to determine the relative displacement between them. In this way, the vehicle's location within a reference image of known geographic relation can be obtained, and the precise location from image matching is fed to the INS to eliminate its position error and greatly improve navigation precision. Digital image analysis and pre-processing for image matching in underwater passive navigation are therefore important. For underwater geographic data analysis, we focus on the acquisition, processing, analysis, expression and measurement of database information; these tasks form an important part of underwater terrain matching and help characterize the seabed terrain of the navigation area, so that the most favorable seabed districts and a reliable navigation algorithm can be selected and the precision and reliability of the terrain-assisted navigation system improved. This paper describes the pre-processing and format transformation of digital images for underwater image matching. The terrain in the navigation areas requires further study to provide reliable terrain-characteristic data for navigation.
Through sea-route selection, danger-area prediction and navigation-algorithm analysis, TAN can achieve higher positioning precision and reliability, providing technological support for image matching in underwater passive navigation.
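One common way to implement the underwater image-matching step, offered here only as an illustrative stand-in for the authors' algorithm, is normalized cross-correlation of the sensed image against the reference map:

# Illustrative normalized cross-correlation matching of a sensed image against a
# reference map using OpenCV; not the paper's own image-matching algorithm.
import cv2

def match_position(reference_map, sensed_image):
    """Return the (x, y) offset of the sensed image inside the reference map."""
    scores = cv2.matchTemplate(reference_map, sensed_image, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_loc = cv2.minMaxLoc(scores)
    return best_loc, best_score   # best_loc is the top-left corner of the best match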
Control of autonomous robot using neural networks
NASA Astrophysics Data System (ADS)
Barton, Adam; Volna, Eva
2017-07-01
The aim of the article is to design a method of control of an autonomous robot using artificial neural networks. The introductory part describes control issues from the perspective of autonomous robot navigation and the current mobile robots controlled by neural networks. The core of the article is the design of the controlling neural network, and generation and filtration of the training set using ART1 (Adaptive Resonance Theory). The outcome of the practical part is an assembled Lego Mindstorms EV3 robot solving the problem of avoiding obstacles in space. To verify models of an autonomous robot behavior, a set of experiments was created as well as evaluation criteria. The speed of each motor was adjusted by the controlling neural network with respect to the situation in which the robot was found.
Attenuating Stereo Pixel-Locking via Affine Window Adaptation
NASA Technical Reports Server (NTRS)
Stein, Andrew N.; Huertas, Andres; Matthies, Larry H.
2006-01-01
For real-time stereo vision systems, the standard method for estimating sub-pixel stereo disparity given an initial integer disparity map involves fitting parabolas to a matching cost function aggregated over rectangular windows. This results in a phenomenon known as 'pixel-locking,' which produces artificially-peaked histograms of sub-pixel disparity. These peaks correspond to the introduction of erroneous ripples or waves in the 3D reconstruction of truly flat surfaces. Since stereo vision is a common input modality for autonomous vehicles, these inaccuracies can pose a problem for safe, reliable navigation. This paper proposes a new method for sub-pixel stereo disparity estimation, based on ideas from Lucas-Kanade tracking and optical flow, which substantially reduces the pixel-locking effect. In addition, it has the ability to correct much larger initial disparity errors than previous approaches and is more general, as it applies not only to the ground plane.
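The standard three-point parabola refinement that the paper identifies as the source of pixel-locking can be sketched as follows; the cost values in the example are arbitrary.

# Standard parabola-fit sub-pixel refinement: given matching costs at the integer
# disparity d and its two neighbors, the sub-pixel offset is the parabola vertex.
def subpixel_disparity(d, cost_dm1, cost_d, cost_dp1):
    denom = cost_dm1 - 2.0 * cost_d + cost_dp1
    if denom == 0.0:
        return float(d)
    offset = 0.5 * (cost_dm1 - cost_dp1) / denom   # vertex of the fitted parabola
    return d + offset                              # offset lies in (-0.5, +0.5) at a cost minimum

# Example: costs 10, 4, 8 around d = 17 give a refined disparity of ~17.1.
print(subpixel_disparity(17, 10.0, 4.0, 8.0))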
Accurate estimation of object location in an image sequence using helicopter flight data
NASA Technical Reports Server (NTRS)
Tang, Yuan-Liang; Kasturi, Rangachar
1994-01-01
In autonomous navigation, it is essential to obtain a three-dimensional (3D) description of the static environment in which the vehicle is traveling. For a rotorcraft conducting low-altitude flight, this description is particularly useful for obstacle detection and avoidance. In this paper, we address the problem of 3D position estimation for static objects from a monocular sequence of images captured from a low-altitude flying helicopter. Since the environment is static, it is well known that the optical flow in the image will produce a radiating pattern from the focus of expansion. We propose a motion analysis system which utilizes the epipolar constraint to accurately estimate 3D positions of scene objects in a real-world image sequence taken from a low-altitude flying helicopter. Results show that this approach gives good estimates of object positions near the rotorcraft's intended flight path.
Autonomous Rovers for Polar Science Campaigns
NASA Astrophysics Data System (ADS)
Lever, J. H.; Ray, L. E.; Williams, R. M.; Morlock, A. M.; Burzynski, A. M.
2012-12-01
We have developed and deployed two over-snow autonomous rovers able to conduct remote science campaigns on Polar ice sheets. Yeti is an 80-kg, four-wheel-drive (4WD) battery-powered robot with 3 - 4 hr endurance, and Cool Robot is a 60-kg 4WD solar-powered robot with unlimited endurance during Polar summers. Both robots navigate using GPS waypoint-following to execute pre-planned courses autonomously, and they can each carry or tow 20 - 160 kg instrument payloads over typically firm Polar snowfields. In 2008 - 12, we deployed Yeti to conduct autonomous ground-penetrating radar (GPR) surveys to detect hidden crevasses to help establish safe routes for overland resupply of research stations at South Pole, Antarctica, and Summit, Greenland. We also deployed Yeti with GPR at South Pole in 2011 to identify the locations of potentially hazardous buried buildings from the original 1950's-era station. Autonomous surveys remove personnel from safety risks posed during manual GPR surveys by undetected crevasses or buried buildings. Furthermore, autonomous surveys can yield higher quality and more comprehensive data than manual ones: Yeti's low ground pressure (20 kPa) allows it to cross thinly bridged crevasses or other voids without interrupting a survey, and well-defined survey grids allow repeated detection of buried voids to improve detection reliability and map their extent. To improve survey efficiency, we have automated the mapping of detected hazards, currently identified via post-survey manual review of the GPR data. Additionally, we are developing machine-learning algorithms to detect crevasses autonomously in real time, with reliability potentially higher than manual real-time detection. These algorithms will enable the rover to relay crevasse locations to a base station for near real-time mapping and decision-making. We deployed Cool Robot at Summit Station in 2005 to verify its mobility and power budget over Polar snowfields. Using solar power, this zero-emissions rover could travel more than 500 km per week during Polar summers and provide 100 - 200 W to power instrument payloads to help investigate the atmosphere, magnetosphere, glaciology and sub-glacial geology in Antarctica and Greenland. We are currently upgrading Cool Robot's navigation and solar-power systems and will deploy it during 2013 to map the emissions footprint around Summit Station to demonstrate its potential to execute long-endurance Polar science campaigns. These rovers could assist science traverses to chart safe routes into the interior of Antarctica and Greenland or conduct autonomous, remote science campaigns to extend spatial and temporal coverage for data collection. Our goals include 1,000 - 2,000-km summertime traverses of Antarctica and Greenland, safe navigation through 0.5-m amplitude sastrugi fields, survival in blizzards, and rover-network adaptation to research events of opportunity. We are seeking Polar scientists interested in autonomous, mobile data collection and can adapt the rovers to meet their requirements.
Search Problems in Mission Planning and Navigation of Autonomous Aircraft. M.S. Thesis
NASA Technical Reports Server (NTRS)
Krozel, James A.
1988-01-01
An architecture for the control of an autonomous aircraft is presented. The architecture is a hierarchical system representing an anthropomorphic breakdown of the control problem into planner, navigator, and pilot systems. The planner system determines high-level global plans from overall mission objectives. This abstract mission planning is investigated by focusing on the Traveling Salesman Problem with variations on local and global constraints. Tree search techniques are applied, including the breadth-first, depth-first, and best-first algorithms. The minimum column and row entries of the Traveling Salesman Problem cost matrix provide a powerful heuristic to guide these search techniques. Mission planning subgoals are directed from the planner to the navigator for planning routes in mountainous terrain with threats. Terrain/threat information is abstracted into a graph of possible paths for which graph searches are performed. It is shown that paths can be well represented by a search graph based on the Voronoi diagram of points representing the vertices of mountain boundaries. A comparison of Dijkstra's dynamic programming algorithm and the A* graph search algorithm from artificial intelligence/operations research is performed for several navigation path planning examples. These examples illustrate paths that minimize a combination of distance and exposure to threats. Finally, the pilot system synthesizes the flight trajectory by creating the control commands to fly the aircraft.
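The row/column-minimum heuristic can be sketched as a simple lower bound used to guide best-first (branch-and-bound) search over the cost matrix; the 4-city matrix below is an illustrative assumption.

# Illustrative reduction bound for the Traveling Salesman cost matrix: subtract
# each row minimum and then each remaining column minimum; their sum is a lower
# bound on any tour, usable to order nodes in a best-first search.
import numpy as np

def reduction_bound(cost):
    c = cost.astype(float).copy()
    np.fill_diagonal(c, np.inf)          # a city never returns directly to itself
    row_min = c.min(axis=1)
    c -= row_min[:, None]
    col_min = c.min(axis=0)
    return row_min.sum() + col_min.sum()

cost = np.array([[0, 10, 15, 20],
                 [10, 0, 35, 25],
                 [15, 35, 0, 30],
                 [20, 25, 30, 0]])
print(reduction_bound(cost))             # no tour can be cheaper than this bound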
Study on polarized optical flow algorithm for imaging bionic polarization navigation micro sensor
NASA Astrophysics Data System (ADS)
Guan, Le; Liu, Sheng; Li, Shi-qi; Lin, Wei; Zhai, Li-yuan; Chu, Jin-kui
2018-05-01
At present, both point-source and imaging polarization navigation devices can only output angle information, which means that the velocity of the carrier cannot be extracted directly from the polarization field pattern. Optical flow is an image-based method for calculating the velocity of pixel movement in an image. However, for ordinary optical flow, differences in pixel values, and therefore the calculation accuracy, are reduced in weak light. Polarization imaging technology can improve both the detection accuracy and the recognition probability of a target because it acquires extra multi-dimensional polarization information about the target's radiation or reflection. In this paper, combining the polarization imaging technique with the traditional optical flow algorithm, a polarized optical flow algorithm is proposed, and it is verified that the algorithm adapts well to weak light and can extend the application range of polarization navigation sensors. This research lays the foundation for day-and-night, all-weather polarization navigation applications in the future.
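A hedged sketch of one way to combine polarization imagery with a conventional optical flow tracker: compute a degree-of-linear-polarization image from assumed 0/45/90/135-degree channels and feed it to OpenCV's pyramidal Lucas-Kanade tracker. This is not the authors' exact algorithm.

# Hedged sketch: DoLP image from four assumed polarizer channels, then standard
# Lucas-Kanade feature tracking on the DoLP frames instead of intensity frames.
import cv2
import numpy as np

def degree_of_linear_polarization(i0, i45, i90, i135):
    s0 = 0.5 * (i0 + i45 + i90 + i135)          # total intensity (averaged estimate)
    s1 = i0 - i90
    s2 = i45 - i135
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-6)
    return (255.0 * np.clip(dolp, 0.0, 1.0)).astype(np.uint8)

def polarized_flow(prev_dolp, curr_dolp):
    pts = cv2.goodFeaturesToTrack(prev_dolp, maxCorners=200, qualityLevel=0.01, minDistance=7)
    curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_dolp, curr_dolp, pts, None)
    good = status.ravel() == 1
    return curr_pts[good] - pts[good]            # per-feature pixel displacements between DoLP frames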
Autonomous Navigation and Control
1988-10-01
Ball Aerospace, Cincinnati Electronics, COMSAT, DSI, Harris Corporation, M/A-COM, Qualcomm, RCA, Rockwell and SPACECOM. The objective of the satellite... constellation was to provide global prioritized data-voice service during peacetime and essential communications during crises. This was to be...
Design considerations for imaging charge-coupled device
NASA Astrophysics Data System (ADS)
1981-04-01
The image dissector tube, which was formerly used as the detector in star trackers, will be replaced by solid-state imaging devices. Advances in charge transfer devices, such as the charge-coupled device (CCD) and the charge-injection device (CID), have made their application to star trackers an immediate reality. In 1979 the Air Force funded an American aerospace company to develop an imaging CCD (ICCD) star sensor for the Multimission Attitude Determination and Autonomous Navigation (MADAN) system. The MADAN system is a technology development for a strapdown attitude and navigation system which can be used on all Air Force 3-axis stabilized satellites. The system will be autonomous and will provide real-time satellite attitude and position information. The star sensor accuracy provides an overall MADAN attitude accuracy of 2 arcsec for star rates up to 300 arcsec/sec. The ICCD is basically an integrating device. Its pixel resolution is not yet satisfactory for precision applications.
Optimal Path Planning Program for Autonomous Speed Sprayer in Orchard Using Order-Picking Algorithm
NASA Astrophysics Data System (ADS)
Park, T. S.; Park, S. J.; Hwang, K. Y.; Cho, S. I.
This study was conducted to develop a software program which computes an optimal path for autonomous navigation in an orchard, especially for a speed sprayer. The feasibility of autonomous navigation in orchards has been shown by other research that minimized the distance error between the planned and executed paths, but research on planning an optimal path for a speed sprayer in an orchard is scarce. In this study, a digital map and a database for the orchard, containing GPS coordinate information (coordinates of trees and the orchard boundary) and entity information (heights and widths of trees, radius of the main stem of trees, tree diseases), were designed. An order-picking algorithm, which has been used for warehouse management, was used to calculate the optimal path based on the digital map. The database for the digital map was created using Microsoft Access, and the graphic interface for the database was built with Microsoft Visual C++ 6.0. Based on the digital map, it was possible to search and display information about the orchard boundary, tree locations and the daily chemical-spraying plan, and to plan an optimal path for different orchards and circumstances (starting the speed sprayer from different locations, spraying chemicals on only selected trees).
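The abstract does not spell out the order-picking algorithm, so the sketch below uses a simple nearest-neighbor ordering over tree coordinates as a stand-in; the coordinates and start point are illustrative.

# Hedged stand-in for the route computation: nearest-neighbor ordering over
# selected (x, y) tree positions from the orchard map; not the paper's algorithm.
import math

def plan_route(start, trees):
    """Visit every tree position, always moving to the closest unvisited tree."""
    route, current, remaining = [], start, list(trees)
    while remaining:
        nxt = min(remaining, key=lambda t: math.dist(current, t))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    return route

trees = [(2, 8), (5, 1), (9, 4), (3, 3)]
print(plan_route((0, 0), trees))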
Preliminary study of a millimeter wave FMCW InSAR for UAS indoor navigation.
Scannapieco, Antonio F; Renga, Alfredo; Moccia, Antonio
2015-01-22
Small autonomous unmanned aerial systems (UAS) could be used for indoor inspection in emergency missions, such as damage assessment or the search for survivors in dangerous environments, e.g., power plants, underground railways, mines and industrial warehouses. Two basic functions are required to carry out these tasks, that is, autonomous GPS-denied navigation with obstacle detection and high-resolution 3D mapping with moving target detection. State-of-the-art sensors for UAS are very sensitive to environmental conditions and often fail in the case of poor visibility caused by dust, fog, smoke, flames or other factors that are met as nominal mission scenarios when operating indoors. This paper is a preliminary study concerning an innovative radar sensor based on the interferometric Synthetic Aperture Radar (SAR) principle, which has the potential to satisfy stringent requirements set by indoor autonomous operation. An architectural solution based on a frequency-modulated continuous wave (FMCW) scheme is proposed after a detailed analysis of existing compact and lightweight SAR. A preliminary system design is obtained, and the main imaging peculiarities of the novel sensor are discussed, demonstrating that high-resolution, high-quality observation of an assigned control volume can be achieved.
Preliminary Study of a Millimeter Wave FMCW InSAR for UAS Indoor Navigation
Scannapieco, Antonio F.; Renga, Alfredo; Moccia, Antonio
2015-01-01
Small autonomous unmanned aerial systems (UAS) could be used for indoor inspection in emergency missions, such as damage assessment or the search for survivors in dangerous environments, e.g., power plants, underground railways, mines and industrial warehouses. Two basic functions are required to carry out these tasks, that is autonomous GPS-denied navigation with obstacle detection and high-resolution 3D mapping with moving target detection. State-of-the-art sensors for UAS are very sensitive to environmental conditions and often fail in the case of poor visibility caused by dust, fog, smoke, flames or other factors that are met as nominal mission scenarios when operating indoors. This paper is a preliminary study concerning an innovative radar sensor based on the interferometric Synthetic Aperture Radar (SAR) principle, which has the potential to satisfy stringent requirements set by indoor autonomous operation. An architectural solution based on a frequency-modulated continuous wave (FMCW) scheme is proposed after a detailed analysis of existing compact and lightweight SAR. A preliminary system design is obtained, and the main imaging peculiarities of the novel sensor are discussed, demonstrating that high-resolution, high-quality observation of an assigned control volume can be achieved. PMID:25621606
NASA Technical Reports Server (NTRS)
Hisamoto, Chuck (Inventor); Arzoumanian, Zaven (Inventor); Sheikh, Suneel I. (Inventor)
2015-01-01
A method and system for spacecraft navigation using distant celestial gamma-ray bursts which offer detectable, bright, high-energy events that provide well-defined characteristics conducive to accurate time-alignment among spatially separated spacecraft. Utilizing assemblages of photons from distant gamma-ray bursts, relative range between two spacecraft can be accurately computed along the direction to each burst's source based upon the difference in arrival time of the burst emission at each spacecraft's location. Correlation methods used to time-align the high-energy burst profiles are provided. The spacecraft navigation may be carried out autonomously or in a central control mode of operation.
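The relative-range idea reduces to delta_r = c * delta_t along the direction to the burst, with delta_t obtained by cross-correlating the burst light curves recorded at the two spacecraft. The sketch below uses a synthetic profile and an assumed sample interval.

# Sketch of the relative-range computation described above; the sample interval
# and synthetic burst profile are illustrative assumptions.
import numpy as np

C = 299_792_458.0          # speed of light, m/s
DT = 1e-3                  # detector sample interval, s (assumed)

def relative_range(profile_a, profile_b):
    """Range of spacecraft B relative to A along the burst direction, in meters.
    Positive means the burst arrives later at B, i.e., B is farther from the source."""
    corr = np.correlate(profile_b - profile_b.mean(),
                        profile_a - profile_a.mean(), mode="full")
    lag = np.argmax(corr) - (len(profile_a) - 1)   # samples by which the burst arrives later at B
    return C * lag * DT

# Synthetic example: the same burst profile shifted by 5 samples (5 ms, ~1499 km).
t = np.arange(2048)
burst = np.exp(-0.5 * ((t - 1000) / 20.0) ** 2)
print(relative_range(burst, np.roll(burst, 5)))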
Real-time adaptive off-road vehicle navigation and terrain classification
NASA Astrophysics Data System (ADS)
Muller, Urs A.; Jackel, Lawrence D.; LeCun, Yann; Flepp, Beat
2013-05-01
We are developing a complete, self-contained autonomous navigation system for mobile robots that learns quickly, uses commodity components, and has the added benefit of emitting no radiation signature. It builds on the autonomous navigation technology developed by Net-Scale and New York University during the Defense Advanced Research Projects Agency (DARPA) Learning Applied to Ground Robots (LAGR) program and takes advantage of recent scientific advancements achieved during the DARPA Deep Learning program. In this paper we will present our approach and algorithms, show results from our vision system, discuss lessons learned from the past, and present our plans for further advancing vehicle autonomy.
Using Bio-Optics to Reveal Phytoplankton Physiology from a Wirewalker Autonomous Platform
NASA Technical Reports Server (NTRS)
Omand, M. M.; Cetinic, I.; Lucas, A. J.
2017-01-01
Rapid, wave-powered profiling of bio-optical properties from an autonomous Wirewalker platform provides useful insights into phytoplankton physiology, including the patterns of diel growth, phytoplankton mortality, nonphotochemical quenching of chlorophyll a fluorescence, and natural (sun-induced) fluorescence of mixed communities. Methods are proposed to quantify each of these processes. Such autonomous measurements of phytoplankton physiological rates and responses open up new possibilities for studying phytoplankton in situ, over longer periods, and under a broader range of environmental conditions.
Integrity Analysis of Real-Time PPP Technique with IGS-RTS Service for Maritime Navigation
NASA Astrophysics Data System (ADS)
El-Diasty, M.
2017-10-01
Open sea and inland waterways are the most widely used mode for transporting goods worldwide. It is the International Maritime Organization (IMO) that defines the requirements for position fixing equipment for a worldwide radio-navigation system, in terms of accuracy, integrity, continuity, availability and coverage for the various phases of navigation. Satellite positioning systems can contribute to meeting these requirements, as well as optimize marine transportation. Marine navigation usually consists of three major phases identified as Ocean/Coastal/Port approach/Inland waterway, in-port navigation, and automatic docking, with alert limits ranging from 25 m to 0.25 m. GPS positioning is widely used for many applications and is currently recognized by IMO for future maritime navigation. With the advancement of autonomous GPS positioning techniques such as Precise Point Positioning (PPP) and with the advent of new real-time GNSS correction services such as the IGS Real-Time Service (RTS), it is necessary to investigate the integrity of the PPP-based positioning technique along with the IGS-RTS service in terms of availability and reliability for safe navigation in maritime applications. This paper monitors the integrity of an autonomous real-time PPP-based GPS positioning system using the IGS real-time service (RTS) for maritime applications that require a minimum availability of integrity of 99.8 % to fulfil the IMO integrity standards. To examine the integrity of the real-time IGS-RTS PPP-based technique for maritime applications, kinematic data from a dual-frequency GPS receiver were collected onboard a vessel and processed with the real-time IGS-RTS PPP-based GPS positioning technique. It is shown that the availability of integrity of the real-time IGS-RTS PPP-based GPS solution is 100 % for all navigation phases and therefore fulfils the IMO integrity standards (99.8 % availability) immediately (after 1 second), after 2 minutes and after 42 minutes of convergence time for Ocean/Coastal/Port approach/Inland waterway, in-port navigation and automatic docking, respectively. Moreover, misleading information occurs in about 2 % of epochs for all navigation phases; this is considered less safe but not immediately dangerous because the horizontal position error remains below the navigation alert limits.
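A hedged sketch of the availability-of-integrity bookkeeping: the fraction of epochs whose horizontal error stays inside the alert limit for each navigation phase. The outer alert limits follow the abstract's 25 m to 0.25 m range; the in-port value and the error series are assumptions.

# Hedged sketch of per-phase availability of integrity; the in-port alert limit
# (2.5 m) and the error time series are placeholder assumptions.
import numpy as np

ALERT_LIMITS_M = {
    "ocean/coastal/port approach/inland waterway": 25.0,
    "in port": 2.5,            # assumed intermediate value
    "automatic docking": 0.25,
}

def availability(horizontal_errors_m, phase):
    """Percentage of epochs meeting the alert limit for the given phase."""
    limit = ALERT_LIMITS_M[phase]
    errors = np.asarray(horizontal_errors_m)
    return 100.0 * np.count_nonzero(errors <= limit) / errors.size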
Navigation of an Autonomous Vehicle around an Asteroid
NASA Astrophysics Data System (ADS)
Dionne, Karine
Planetary exploration missions use spacecraft to acquire the scientific data that advance our knowledge of the solar system. Since the 1990s, these missions have targeted not only planets but also smaller celestial bodies such as asteroids. These bodies pose a particular challenge for navigation systems because their dynamic environment is complex. A space probe must react quickly to the gravitational perturbations present, otherwise its safety could be compromised. Since communication delays with Earth can often reach several tens of minutes, it is necessary to develop software allowing greater operational autonomy for this type of mission. This thesis presents an autonomous navigation system that determines the position and velocity of a satellite in orbit around an asteroid. It is an adaptive extended Kalman filter with three degrees of freedom. The proposed system relies on optical imagery to detect "landmarks" that have previously been mapped. These may be craters, boulders, or any physical feature discernible by the camera. The research work focuses on the state-estimation techniques specific to autonomous navigation; the existence of appropriate software performing the image-processing functions is therefore assumed. The main research contribution is the inclusion, at each estimation cycle, of a range measurement to improve navigation performance. An adaptive state estimator is required to process these measurements because their precision varies over time due to pointing error. The secondary research contributions relate to the observability analysis of the system and to a sensitivity analysis of six main design parameters. Simulation results show that adding one range measurement per update cycle leads to a significant improvement in navigation performance. This approach reduces the estimation error and the periods of non-observability, and counters the dilution of precision of the measurements. The sensitivity analyses confirm the contribution of the range measurements to the overall reduction of estimation error over a wide range of design parameters. They also indicate that mapping error is a critical parameter for the performance of the navigation system developed. Keywords: state estimation, adaptive Kalman filter, optical navigation, lidar, asteroid, numerical simulations
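A hedged sketch of the range-measurement update idea: a single Kalman measurement update in which the range-measurement variance is inflated with the pointing error. The state layout and noise model are illustrative assumptions, not the thesis implementation.

# Hedged sketch of a range-measurement update with pointing-error-adaptive noise;
# the 6-state layout, noise model and landmark handling are assumptions.
import numpy as np

def range_update(x, P, landmark, measured_range, sigma_range, pointing_error_rad):
    """x: [position(3), velocity(3)]; P: 6x6 covariance; landmark: mapped 3-vector."""
    rel = x[:3] - landmark
    predicted_range = np.linalg.norm(rel)
    H = np.zeros((1, 6))                     # measurement Jacobian: range depends only on position
    H[0, :3] = rel / predicted_range
    # Adaptive measurement variance: pointing error degrades the effective range accuracy.
    R = sigma_range ** 2 + (predicted_range * pointing_error_rad) ** 2
    S = H @ P @ H.T + R
    K = P @ H.T / S                          # Kalman gain
    x = x + (K * (measured_range - predicted_range)).ravel()
    P = (np.eye(6) - K @ H) @ P
    return x, P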
Conventional vs Biomimetic Approaches to the Exploration of Mars
NASA Astrophysics Data System (ADS)
Ellery, A.
It is unusual to speak of convention in planetary exploration missions, given the innovation required for such projects. Here the term conventional refers to the methodologies, tools and approaches typically adopted in engineering and applied to such missions. Presented is a "conventional" Mars rover mission in which the author was involved - ExoMars - into which are interspersed references to examples where biomimetic approaches may yield superior capabilities. Biomimetics is a relatively recent and active area of research which seeks to examine how biological systems solve the problem of survival in the natural environment. Biological organisms are autonomous entities that must survive in a hostile world, exhibiting both adaptivity and robustness. It is not surprising, then, that biomimetics is particularly useful when applied to the robotic elements of a Mars exploration mission. I present a number of areas in which biomimetics may yield new solutions to the problem of Mars exploration - optic flow navigation, potential field navigation, genetically-evolved neuro-controllers, legged locomotion, electric motors implementing muscular behaviour, and a biomimetic drill based on the wood wasp ovipositor. Each of these techniques offers an alternative approach to conventional ones. However, the perceptive hurdles are likely to dwarf the technical hurdles in implementing many of these methods in the near future.
Land, sea, and air unmanned systems research and development at SPAWAR Systems Center Pacific
NASA Astrophysics Data System (ADS)
Nguyen, Hoa G.; Laird, Robin; Kogut, Greg; Andrews, John; Fletcher, Barbara; Webber, Todd; Arrieta, Rich; Everett, H. R.
2009-05-01
The Space and Naval Warfare (SPAWAR) Systems Center Pacific (SSC Pacific) has a long and extensive history in unmanned systems research and development, starting with undersea applications in the 1960s and expanding into ground and air systems in the 1980s. In the ground domain, we are addressing force-protection scenarios using large unmanned ground vehicles (UGVs) and fixed sensors, and simultaneously pursuing tactical and explosive ordnance disposal (EOD) operations with small man-portable robots. Technology thrusts include improving robotic intelligence and functionality, autonomous navigation and world modeling in urban environments, extended operational range of small teleoperated UGVs, enhanced human-robot interaction, and incorporation of remotely operated weapon systems. On the sea surface, we are pushing the envelope on dynamic obstacle avoidance while conforming to established nautical rules-of-the-road. In the air, we are addressing cooperative behaviors between UGVs and small vertical-takeoff-and-landing unmanned air vehicles (UAVs). Underwater applications involve very shallow water mine countermeasures, ship hull inspection, oceanographic data collection, and deep ocean access. Specific technology thrusts include fiber-optic communications, adaptive mission controllers, advanced navigation techniques, and concepts of operations (CONOPs) development. This paper provides a review of recent accomplishments and current status of a number of projects in these areas.
Aspect-dependent radiated noise analysis of an underway autonomous underwater vehicle.
Gebbie, John; Siderius, Martin; Allen, John S
2012-11-01
This paper presents an analysis of the acoustic emissions emitted by an underway REMUS-100 autonomous underwater vehicle (AUV) that were obtained near Honolulu Harbor, HI using a fixed, bottom-mounted horizontal line array (HLA). Spectral analysis, beamforming, and cross-correlation facilitate identification of independent sources of noise originating from the AUV. Fusion of navigational records from the AUV with acoustic data from the HLA allows for an aspect-dependent presentation of calculated source levels of the strongest propulsion tone.
Autonomous Navigation, Dynamic Path and Work Flow Planning in Multi-Agent Robotic Swarms Project
NASA Technical Reports Server (NTRS)
Falker, John; Zeitlin, Nancy; Leucht, Kurt; Stolleis, Karl
2015-01-01
Kennedy Space Center has teamed up with the Biological Computation Lab at the University of New Mexico to create a swarm of small, low-cost, autonomous robots, called Swarmies, to be used as a ground-based research platform for in-situ resource utilization missions. The behavior of the robot swarm mimics the central-place foraging strategy of ants to find and collect resources in an unknown environment and return those resources to a central site.
Integration for navigation on the UMASS mobile perception lab
NASA Technical Reports Server (NTRS)
Draper, Bruce; Fennema, Claude; Rochwerger, Benny; Riseman, Edward; Hanson, Allen
1994-01-01
Integration of real-time visual procedures for use on the Mobile Perception Lab (MPL) was presented. The MPL is an autonomous vehicle designed for testing visually guided behavior. Two critical areas of focus in the system design were data storage/exchange and process control. The Intermediate Symbolic Representation (ISR3) supported data storage and exchange, and the MPL script monitor provided process control. Resource allocation, inter-process communication, and real-time control are difficult problems which must be solved in order to construct strong autonomous systems.
A Robust Mechanical Sensing System for Unmanned Sea Surface Vehicles
NASA Technical Reports Server (NTRS)
Kulczycki, Eric A.; Magnone, Lee J.; Huntsberger, Terrance; Aghazarian, Hrand; Padgett, Curtis W.; Trotz, David C.; Garrett, Michael S.
2009-01-01
The need for autonomous navigation and intelligent control of unmanned sea surface vehicles requires a mechanically robust sensing architecture that is watertight, durable, and insensitive to vibration and shock loading. The sensing system developed here comprises four black and white cameras and a single color camera. The cameras are rigidly mounted to a camera bar that can be reconfigured to mount multiple vehicles, and act as both navigational cameras and application cameras. The cameras are housed in watertight casings to protect them and their electronics from moisture and wave splashes. Two of the black and white cameras are positioned to provide lateral vision. They are angled away from the front of the vehicle at horizontal angles to provide ideal fields of view for mapping and autonomous navigation. The other two black and white cameras are positioned at an angle into the color camera's field of view to support vehicle applications. These two cameras provide an overlap, as well as a backup to the front camera. The color camera is positioned directly in the middle of the bar, aimed straight ahead. This system is applicable to any sea-going vehicle, both on Earth and in space.
The Integration, Testing and Flight of the EO-1 GPS
NASA Technical Reports Server (NTRS)
Quinn, David A.; Sanneman, Paul A.; Shulman, Seth E.; Sager, Jennifer A.
2001-01-01
The Global Positioning System has long been hailed as the wave of the future for autonomous on-board navigation of low Earth orbiting spacecraft, despite the fact that relatively few spacecraft have actually employed it for this purpose. While several missions operated out of the Goddard Space Flight Center have flown GPS receivers on board, the New Millennium Program (NMP) Earth Observing-1 (EO-1) spacecraft is the first to employ GPS for active, autonomous on-board navigation. Since EO-1 was designed to employ GPS as its primary source of the navigation ephemeris, special care had to be taken during the integration phase of spacecraft construction to assure proper performance. This paper is a discussion of that process: a brief overview of how the GPS works, how it fits into the design of the EO-1 Attitude Control System (ACS), the steps taken to integrate the system into the EO-1 spacecraft, the on-orbit performance during launch and early operations of the EO-1 mission, and the performance of the on-board GPS ephemeris versus the ground-based ephemeris. Conclusions include a discussion of the lessons learned.
Flight Analysis of an Autonomously Navigated Experimental Lander for High Altitude Recovery
NASA Technical Reports Server (NTRS)
Chin, Jeffrey; Niehaus, Justin; Goodenow, Debra; Dunker, Storm; Montague, David
2016-01-01
First steps have been taken to qualify a family of parafoil systems capable of increasing the survivability and reusability of high-altitude balloon payloads. The research is motivated by the common risk facing balloon payloads where expensive flight hardware can often land in inaccessible areas that make them difficult or impossible to recover. The Autonomously Navigated Experimental Lander (ANGEL) flight test introduced a commercial Guided Parachute Aerial Delivery System (GPADS) to a previously untested environment at 108,000 ft MSL to determine its high-altitude survivability and capabilities. Following release, ANGEL descended under a drogue until approximately 25,000 ft, at which point the drogue was jettisoned and the main parachute was deployed, commencing navigation. Multiple data acquisition platforms were used to characterize the return-to-point technology performance and help determine its suitability for returning future scientific payloads ranging from 180 to 10,000 lbs to safer and more convenient landing locations. This report describes the test vehicle design, and summarizes the captured sensor data. Various post-flight analyses are used to quantify the system's performance, gondola load data, and serve as a reference point for subsequent missions.
Flight Analysis of an Autonomously Navigated Experimental Lander
NASA Technical Reports Server (NTRS)
Chin, Jeffrey; Niehaus, Justin; Goodenow, Debra; Dunker, Storm; Montague, David
2016-01-01
First steps have been taken to qualify a family of parafoil systems capable of increasing the survivability and reusability of high-altitude balloon payloads. The research is motivated by the common risk facing balloon payloads where expensive flight hardware can often land in inaccessible areas that make them difficult or impossible to recover. The Autonomously Navigated Experimental Lander (ANGEL) flight test introduced a commercial Guided Parachute Aerial Delivery System (GPADS) to a previously untested environment at 108,000 feet Mean Sea Level (MSL) to determine its high-altitude survivability and capabilities. Following release, ANGEL descended under a drogue until approximately 25,000 feet, at which point the drogue was jettisoned and the main parachute was deployed, commencing navigation. Multiple data acquisition platforms were used to characterize the return-to-point technology performance and help determine its suitability for returning future scientific payloads ranging from 180 to 10,000 pounds to safer and more convenient landing locations. This report describes the test vehicle design, and summarizes the captured sensor data. Various post-flight analyses are used to quantify the system's performance, gondola load data, and serve as a reference point for subsequent missions.
Under-vehicle autonomous inspection through undercarriage signatures
NASA Astrophysics Data System (ADS)
Schoenherr, Edward; Smuda, Bill
2005-05-01
Increased threats to gate security have caused a recent need for improved vehicle inspection methods at security checkpoints in various fields of defense and security. A fast, reliable system of under-vehicle inspection that detects possibly harmful or unwanted materials hidden on vehicle undercarriages and notifies the user of the presence of these materials, while allowing the user a safe standoff distance from the inspection site, is desirable. An autonomous under-vehicle inspection system would provide for this. The proposed system would function as follows: a low-clearance tele-operated robotic platform would be equipped with sonar/laser range-finding sensors as well as a video camera. As a vehicle to be inspected enters a checkpoint, the robot would autonomously navigate under the vehicle, using algorithms to detect tire locations as waypoints. During this navigation, data would be collected from the sonar/laser range-finding hardware. This range data would be used to compile an impression of the vehicle undercarriage. Once this impression is complete, the system would compare it to a database of pre-scanned undercarriage impressions. Based on vehicle makes and models, any variance between the undercarriage being inspected and the impression compared against in the database would be marked as potentially threatening. If such variances exist, the robot would navigate to these locations and place the video camera in such a manner that the location in question can be viewed from a standoff position through a TV monitor. At this time, manual control of the robot navigation and camera can be taken to allow further, more detailed inspection of the area or materials in question. After-market vehicle modifications would present some difficulty, yet with enough pre-screening of such modifications, the system should still prove accurate. Also, impression scans taken in the field can be stored and tagged with a vehicle's license plate number, and future inspections of that vehicle can be compared to already screened and cleared impressions of the same vehicle in order to search for variance.
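A hedged sketch of the comparison step: the measured undercarriage range profile is differenced against the stored reference impression for the same make and model, and grid cells whose deviation exceeds a threshold are flagged for camera inspection. The grid resolution and threshold are assumptions.

# Hedged sketch of flagging undercarriage variances against a reference impression;
# the centimeter threshold and grid layout are placeholder assumptions.
import numpy as np

def flag_variances(measured, reference, threshold_cm=3.0):
    """Return (row, col) grid cells where the undercarriage profile deviates."""
    deviation = np.abs(np.asarray(measured) - np.asarray(reference))
    rows, cols = np.nonzero(deviation > threshold_cm)
    return list(zip(rows.tolist(), cols.tolist()))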
POSTMAN: Point of Sail Tacking for Maritime Autonomous Navigation
NASA Technical Reports Server (NTRS)
Huntsberger, Terrance L.; Reinhart, Felix
2012-01-01
Waves apply significant forces to small boats, in particular when such vessels are moving at high speed in severe sea conditions. In addition, small high-speed boats run the risk of diving with the bow into the next wave crest during operations in the wavelengths and wave speeds that are typical for shallow water. In order to mitigate the issues of autonomous navigation in rough water, a hybrid controller called POSTMAN combines the concept of POS (point of sail) tack planning from the sailing domain with a standard PID (proportional-integral-derivative) controller that implements reliable target reaching for the motorized small-boat control task. This is an embedded, adaptive software controller that uses look-ahead sensing in a closed-loop method to perform path planning for safer navigation in rough waters. State-of-the-art controllers for small boats are based on complex models of the vessel's kinematics and dynamics. They enable the vessel to follow preplanned paths accurately and can theoretically control all of the small boat's six degrees of freedom. However, the problems of bow diving and other undesirable incidents are not addressed, and it is questionable whether a six-DOF controller with essentially a single actuator is possible at all. POSTMAN builds an adaptive capability into the controller based on sensed wave characteristics. This software will bring a much-needed capability to unmanned small boats moving at high speeds. Previously, this class of boat was limited to wave heights of less than one meter in the sea states in which it could operate. POSTMAN is a major advance in autonomous safety for small maritime craft.
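As an illustration of the PID component named above (not the POSTMAN flight code; the gains and the wrap_angle helper are assumptions), a minimal heading-hold controller might be sketched as:

```python
import math

def wrap_angle(a):
    """Wrap an angle to [-pi, pi) (hypothetical helper)."""
    return (a + math.pi) % (2.0 * math.pi) - math.pi

class HeadingPID:
    """Minimal PID heading controller; gains are illustrative, not POSTMAN's."""
    def __init__(self, kp=1.0, ki=0.05, kd=0.2):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, desired_heading, measured_heading, dt):
        error = wrap_angle(desired_heading - measured_heading)
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        # Steering command in arbitrary units.
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

The integral term compensates steady disturbances such as a constant cross-current, while the derivative term damps overshoot when the boat swings onto a new tack heading.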
The Deep Space Atomic Clock: Ushering in a New Paradigm for Radio Navigation and Science
NASA Technical Reports Server (NTRS)
Ely, Todd; Seubert, Jill; Prestage, John; Tjoelker, Robert
2013-01-01
The Deep Space Atomic Clock (DSAC) mission will demonstrate the on-orbit performance of a high-accuracy, high-stability miniaturized mercury-ion atomic clock during a year-long experiment in Low Earth Orbit. DSAC's timing error requirement provides the frequency stability necessary to perform deep space navigation based solely on one-way radiometric tracking data. Compared to a two-way tracking paradigm, DSAC-enabled one-way tracking will benefit navigation and radio science by increasing the quantity and quality of tracking data. Additionally, DSAC enables fully autonomous onboard navigation useful for time-sensitive situations. The technology behind the mercury-ion atomic clock and a DSAC mission overview are presented. Example deep space applications of DSAC, including navigation of a Mars orbiter and Europa flyby gravity science, highlight the benefits of DSAC-enabled one-way Doppler tracking.
NASA Astrophysics Data System (ADS)
Thomas, Romain; Donikian, Stéphane
Many articles dealing with agent navigation in an urban environment rely on various heuristics. Among them, one is prevalent: the search for the shortest path between two points. This strategy impairs the realism of the resulting behaviour. Indeed, psychological studies state that such navigation behaviour is conditioned by the knowledge the subject has of the environment. Furthermore, the path a city dweller follows may be influenced by many factors, such as daily habits or path simplicity in terms of a minimum number of direction changes. We therefore investigate how to mimic human navigation behaviour with an autonomous agent. The solution we propose relies on an architecture based on a generic model of an informed environment and a spatial cognitive map merged with a human-like memory model, representing the temporal knowledge of the environment that the agent gains through its navigation experience.
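To make the contrast with pure shortest-path search concrete, the following sketch (our illustration, not the authors' architecture) runs Dijkstra over a grid while charging a hypothetical extra cost for every change of direction, so that simpler routes with fewer turns are preferred:

```python
import heapq, itertools

def plan_with_turn_penalty(grid, start, goal, turn_cost=2.0):
    """Dijkstra over a 4-connected grid; each direction change adds turn_cost.
    grid: 2D list, 0 = free, 1 = blocked. Illustrative only."""
    rows, cols = len(grid), len(grid[0])
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    tie = itertools.count()                      # tie-breaker so unlike states never compare
    frontier = [(0.0, next(tie), start, None)]   # (cost, tie, cell, last move)
    best = {}
    while frontier:
        cost, _, cell, last = heapq.heappop(frontier)
        if cell == goal:
            return cost
        if best.get((cell, last), float("inf")) <= cost:
            continue
        best[(cell, last)] = cost
        for move in moves:
            r, c = cell[0] + move[0], cell[1] + move[1]
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                step = 1.0 + (turn_cost if last is not None and move != last else 0.0)
                heapq.heappush(frontier, (cost + step, next(tie), (r, c), move))
    return None  # goal unreachable

# Example: a route with one extra step but fewer turns can win when turn_cost is high.
print(plan_with_turn_penalty([[0, 0, 0], [0, 1, 0], [0, 0, 0]], (0, 0), (2, 2)))
```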
Autonomous integrated GPS/INS navigation experiment for OMV. Phase 1: Feasibility study
NASA Technical Reports Server (NTRS)
Upadhyay, Triveni N.; Priovolos, George J.; Rhodehamel, Harley
1990-01-01
The phase 1 research focused on the experiment definition. A tightly integrated Global Positioning System/Inertial Navigation System (GPS/INS) navigation filter design was analyzed and was shown, via detailed computer simulation, to provide precise position, velocity, and attitude (alignment) data to support the navigation and attitude control requirements of future NASA missions. The integrated filter was also shown to provide the opportunity to calibrate inertial instrument errors, which is particularly useful in reducing INS error growth during GPS outages. While the Orbital Maneuvering Vehicle (OMV) provides a good target platform for demonstration and possible flight implementation, a successful proof-of-concept ground demonstration can be obtained using any simulated mission scenario data, such as Space Transfer Vehicle, Shuttle-C, or Space Station.
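The abstract does not reproduce the filter equations; as a generic illustration of how a Kalman measurement update blends an INS-propagated state with a GPS-derived observation, one might write the following (the state layout, matrices and noise levels are assumptions):

```python
import numpy as np

def kalman_update(x_pred, P_pred, z, H, R):
    """Standard Kalman measurement update: blend the INS prediction with a GPS measurement.
    x_pred, P_pred: predicted state and covariance; z: measurement; H, R: model and noise."""
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x_pred + K @ (z - H @ x_pred)        # corrected state
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x, P

# Toy example: state = [position, velocity]; GPS measures position only.
x_pred = np.array([100.0, 5.0])
P_pred = np.diag([25.0, 1.0])
H = np.array([[1.0, 0.0]])
R = np.array([[9.0]])
x, P = kalman_update(x_pred, P_pred, np.array([104.0]), H, R)
```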
Characteristic changes in the physiological components of cybersickness.
Kim, Young Youn; Kim, Hyun Ju; Kim, Eun Nam; Ko, Hee Dong; Kim, Hyun Taek
2005-09-01
We investigated the characteristic changes in the physiology of cybersickness when subjects were exposed to virtual reality. Sixty-one participants experienced a virtual navigation for a total of 9.5 min and were required to detect specific virtual objects. Three questionnaires on sickness susceptibility and immersive tendency were administered before the navigation. Sixteen electrophysiological signals were recorded before, during, and after the navigation. The severity of cybersickness experienced by participants was assessed with a simulator sickness questionnaire after the navigation. The total severity of cybersickness had a significant positive correlation with gastric tachyarrhythmia, eyeblink rate, heart period, and EEG delta wave, and a negative correlation with EEG beta wave. These results suggest that cybersickness is accompanied by pattern changes in the activity of the central and autonomic nervous systems.
Gaspra Optical Navigation Image
1996-02-08
This time-exposure picture of the asteroid Gaspra and background stars is one of four optical navigation images made by NASA's Galileo imaging system to improve knowledge of Gaspra's location for the spacecraft flyby. http://photojournal.jpl.nasa.gov/catalog/PIA00229
78 FR 23226 - 36(b)(1) Arms Sales Notification
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-18
..., Communication, Computer and Intelligence/Communication, Navigational and Identification (C4I/CNI); Autonomic.../ integration, aircraft ferry and tanker support, support equipment, tools and test equipment, communication... aircraft equipment includes: Electronic Warfare Systems; Command, Control, Communication, Computer and...
Autonomous formation flying sensor for the Star Light Mission
NASA Technical Reports Server (NTRS)
Aung, M.; Purcell, G.; Tien, J.; Young, L.; Srinivasan, J.; Ciminera, M. A.; Chong, Y. J.; Amaro, L. R.; Young, L. E.
2002-01-01
The StarLight Mission, an element of NASA's Origins Program, was designed for first-time demonstration of two technologies: formation flying optical interferometry between spacecraft and autonomous precise formation flying of an array of spacecraft to support optical interferometry. The design overview and results of the technology effort are presented in this paper.
2001 Flight Mechanics Symposium
NASA Technical Reports Server (NTRS)
Lynch, John P. (Editor)
2001-01-01
This conference publication includes papers and abstracts presented at the Flight Mechanics Symposium held on June 19-21, 2001. Sponsored by the Guidance, Navigation and Control Center of Goddard Space Flight Center, this symposium featured technical papers on a wide range of issues related to attitude/orbit determination, prediction and control; attitude simulation; attitude sensor calibration; theoretical foundation of attitude computation; dynamics model improvements; autonomous navigation; constellation design and formation flying; estimation theory and computational techniques; Earth environment mission analysis and design; and, spacecraft re-entry mission design and operations.
2013-05-29
… not necessarily express the views of and should not be attributed to ESA. … and visual navigation to maneuver autonomously to reduce the size of the … successful orbit and three-dimensional imaging of an RSO, using passive visual-only navigation and real-time near-optimal guidance. The mission design … Kit (STK) in the Earth-centered Earth-fixed (ECF) coordinate system, loaded to Simulink and transformed to the BFF for calculation of the SRP
Real-Time and High-Fidelity Simulation Environment for Autonomous Ground Vehicle Dynamics
NASA Technical Reports Server (NTRS)
Cameron, Jonathan; Myint, Steven; Kuo, Calvin; Jain, Abhi; Grip, Havard; Jayakumar, Paramsothy; Overholt, Jim
2013-01-01
This paper reports on a collaborative project between U.S. Army TARDEC and the Jet Propulsion Laboratory (JPL) to develop an unmanned ground vehicle (UGV) simulation model using the ROAMS vehicle modeling framework. Besides modeling the physical suspension of the vehicle, the sensing and navigation of the HMMWV vehicle are simulated. Using models of urban and off-road environments, the HMMWV simulation was tested in several ways, including navigation in an urban environment with obstacle avoidance and the performance of a lane change maneuver.
Price, Richard; Marsh, Abbie J; Fisher, Marisa H
2018-03-01
Facilitating the use of public transportation enhances opportunities for independent living and competitive, community-based employment for individuals with intellectual and developmental disabilities (IDD). Four young adults with IDD were taught through total-task chaining to use the Google Maps application, a self-prompting, visual navigation system, to take the bus to locations around a college campus and the community. Three of four participants learned to use Google Maps to independently navigate public transportation. Google Maps may be helpful in supporting independent travel, highlighting the importance of future research in teaching navigation skills. Learning to independently use public transportation increases access to autonomous activities, such as opportunities to work and to attend postsecondary education programs on large college campuses. Individuals with IDD can be taught through chaining procedures to use the Google Maps application to navigate public transportation. Mobile map applications are an effective and functional modern tool that can be used to teach community navigation.
Precision analysis of autonomous orbit determination using star sensor for Beidou MEO satellite
NASA Astrophysics Data System (ADS)
Shang, Lin; Chang, Jiachao; Zhang, Jun; Li, Guotong
2018-04-01
This paper focuses on the autonomous orbit determination accuracy of the Beidou MEO satellite using onboard observations from the star sensors and infrared horizon sensor. A polynomial fitting method is proposed to calibrate the periodic error in the observation of the infrared horizon sensor, which greatly influences the accuracy of autonomous orbit determination. Test results show that the periodic error can be eliminated using the polynomial fitting method. The User Range Error (URE) of the Beidou MEO satellite is less than 2 km when the observations of the star sensors and infrared horizon sensor are used for autonomous orbit determination. The error of the Right Ascension of the Ascending Node (RAAN) is less than 60 μrad, and the observations of the star sensors can serve as a spatial basis for the Beidou MEO navigation constellation.
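The calibration step removes a slowly varying periodic error by fitting and subtracting a polynomial from the horizon-sensor observations; a minimal numpy sketch of that idea (the data, units and polynomial order are assumptions, not the paper's values) is:

```python
import numpy as np

def calibrate_periodic_error(times, raw_obs, order=4):
    """Fit a low-order polynomial to the observation series and subtract it,
    leaving the calibrated residual. Order 4 is an illustrative choice."""
    coeffs = np.polyfit(times, raw_obs, order)
    fitted = np.polyval(coeffs, times)
    return raw_obs - fitted, coeffs

# Hypothetical horizon-sensor series with a slow sinusoidal drift plus noise.
t = np.linspace(0.0, 1.0, 200)
obs = 1e-4 * np.sin(2 * np.pi * t) + 1e-5 * np.random.randn(200)
calibrated, coeffs = calibrate_periodic_error(t, obs)
```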
Mariner Mars 1971 optical navigation demonstration
NASA Technical Reports Server (NTRS)
Born, G. H.; Duxbury, T. C.; Breckenridge, W. G.; Acton, C. H.; Mohan, S.; Jerath, N.; Ohtakay, H.
1974-01-01
The feasibility of using a combination of spacecraft-based optical data and earth-based Doppler data to perform near-real-time approach navigation was demonstrated by the Mariner Mars 71 Project. The important findings, conclusions, and recommendations are documented. A summary, along with publications and papers giving additional details on the objectives of the demonstration, is provided. Instrument calibration and performance as well as navigation and science results are reported.
1999-08-01
Electro-Optic Sensor Integration Technology (NEOSIT) software application. The design is highly modular and based on COTS tools to facilitate integration with sensors, navigation and digital data sources already installed on different host
Optical Navigation Image of Ganymede
1996-06-06
NASA's Galileo spacecraft, now in orbit around Jupiter, returned this optical navigation image on June 3, 1996, showing that the spacecraft is accurately targeted for its first flyby of the giant moon Ganymede on June 27. http://photojournal.jpl.nasa.gov/catalog/PIA00273
Image navigation as a means to expand the boundaries of fluorescence-guided surgery
NASA Astrophysics Data System (ADS)
Brouwer, Oscar R.; Buckle, Tessa; Bunschoten, Anton; Kuil, Joeri; Vahrmeijer, Alexander L.; Wendler, Thomas; Valdés-Olmos, Renato A.; van der Poel, Henk G.; van Leeuwen, Fijs W. B.
2012-05-01
Hybrid tracers that are both radioactive and fluorescent help extend the use of fluorescence-guided surgery to deeper structures. Such hybrid tracers facilitate preoperative surgical planning using (3D) scintigraphic images and enable synchronous intraoperative radio- and fluorescence guidance. Nevertheless, we previously found that improved orientation during laparoscopic surgery remains desirable. Here we illustrate how intraoperative navigation based on optical tracking of a fluorescence endoscope may help further improve the accuracy of hybrid surgical guidance. After feeding SPECT/CT images with an optical fiducial as a reference target to the navigation system, optical tracking could be used to position the tip of the fluorescence endoscope relative to the preoperative 3D imaging data. This hybrid navigation approach allowed us to accurately identify marker seeds in a phantom setup. The multispectral nature of the fluorescence endoscope enabled stepwise visualization of the two clinically approved fluorescent dyes, fluorescein and indocyanine green. In addition, the approach was used to navigate toward the prostate in a patient undergoing robot-assisted prostatectomy. Navigation of the tracked fluorescence endoscope toward the target identified on SPECT/CT resulted in real-time gradual visualization of the fluorescent signal in the prostate, thus providing an intraoperative confirmation of the navigation accuracy.
Piao, Jin-Chun; Kim, Shin-Dug
2017-11-07
Simultaneous localization and mapping (SLAM) is emerging as a prominent issue in computer vision and next-generation core technology for robots, autonomous navigation and augmented reality. In augmented reality applications, fast camera pose estimation and true scale are important. In this paper, we present an adaptive monocular visual-inertial SLAM method for real-time augmented reality applications in mobile devices. First, the SLAM system is implemented based on the visual-inertial odometry method that combines data from a mobile device camera and inertial measurement unit sensor. Second, we present an optical-flow-based fast visual odometry method for real-time camera pose estimation. Finally, an adaptive monocular visual-inertial SLAM is implemented by presenting an adaptive execution module that dynamically selects visual-inertial odometry or optical-flow-based fast visual odometry. Experimental results show that the average translation root-mean-square error of keyframe trajectory is approximately 0.0617 m with the EuRoC dataset. The average tracking time is reduced by 7.8%, 12.9%, and 18.8% when different level-set adaptive policies are applied. Moreover, we conducted experiments with real mobile device sensors, and the results demonstrate the effectiveness of performance improvement using the proposed method.
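The adaptive execution module described above dynamically selects between full visual-inertial odometry and the faster optical-flow-based odometry. A schematic selector is sketched below; the time budget, decay factors and callable names are hypothetical and do not come from the paper:

```python
import time

class AdaptiveTracker:
    """Schematic selector between full visual-inertial odometry (VIO) and a faster
    optical-flow visual odometry fallback. Illustrative only: run_vio and
    run_flow_vo are user-supplied callables returning a pose estimate."""
    def __init__(self, run_vio, run_flow_vo, budget_ms=33.0):
        self.run_vio = run_vio
        self.run_flow_vo = run_flow_vo
        self.budget_ms = budget_ms      # per-frame time budget (about 30 fps)
        self.avg_vio_ms = 0.0           # running average of VIO cost

    def step(self, frame, imu_batch):
        if self.avg_vio_ms <= self.budget_ms:
            start = time.perf_counter()
            pose = self.run_vio(frame, imu_batch)          # accurate but slower
            cost_ms = (time.perf_counter() - start) * 1e3
            self.avg_vio_ms = 0.9 * self.avg_vio_ms + 0.1 * cost_ms
        else:
            pose = self.run_flow_vo(frame)                 # fast fallback path
            self.avg_vio_ms *= 0.95                        # decay so VIO is retried later
        return pose
```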
Design Considerations For Imaging Charge-Coupled Device (ICCD) Star Sensors
NASA Astrophysics Data System (ADS)
McAloon, K. J.
1981-04-01
A development program is currently underway to produce a precision star sensor using imaging charge coupled device (ICCD) technology. The effort is the critical component development phase for the Air Force Multi-Mission Attitude Determination and Autonomous Navigation System (MADAN). A number of unique considerations have evolved in designing an arcsecond accuracy sensor around an ICCD detector. Three tiers of performance criteria are involved: at the spacecraft attitude determination system level, at the star sensor level, and at the detector level. Optimum attitude determination system performance involves a tradeoff between Kalman filter iteration time and sensor ICCD integration time. The ICCD star sensor lends itself to the use of a new approach in the functional interface between the attitude determination system and the sensor. At the sensor level image data processing tradeoffs are important for optimum sensor performance. These tradeoffs involve the sensor optic configuration, the optical point spread function (PSF) size and shape, the PSF position locator, and the microprocessor locator algorithm. Performance modelling of the sensor mandates the use of computer simulation programs. Five key performance parameters at the ICCD detector level are defined. ICCD error characteristics have also been isolated to five key parameters.
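One common realization of the PSF position locator mentioned above is a windowed center-of-mass (centroid) estimate; the sketch below is a generic illustration, not the MADAN locator algorithm (window size and background handling are assumptions):

```python
import numpy as np

def psf_centroid(image, window_center, half_size=3, background=None):
    """Center-of-mass estimate of a star's point-spread function inside a small
    window; sub-pixel accuracy improves when the PSF spans a few pixels."""
    r0, c0 = window_center
    win = image[r0 - half_size:r0 + half_size + 1,
                c0 - half_size:c0 + half_size + 1].astype(float)
    if background is None:
        background = np.median(win)          # crude local background estimate
    win = np.clip(win - background, 0.0, None)
    total = win.sum()
    if total == 0.0:
        return float(r0), float(c0)          # no signal above background
    rows, cols = np.indices(win.shape)
    r = (rows * win).sum() / total + (r0 - half_size)
    c = (cols * win).sum() / total + (c0 - half_size)
    return r, c
```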
VCSELs in short-pulse operation for time-of-flight applications
NASA Astrophysics Data System (ADS)
Moench, Holger; Gronenborn, Stephan; Gu, Xi; Gudde, Ralph; Herper, Markus; Kolb, Johanna; Miller, Michael; Smeets, Michael; Weigl, Alexander
2018-02-01
VCSEL arrays are the ideal light source for 3D imaging applications. The narrow emission spectrum and the ability to produce short pulses make them superior to LEDs. Combined with fast photodiodes or special camera chips, spatial information can be obtained, which is needed in diverse applications like camera autofocus, indoor navigation, 3D object recognition, augmented reality, and autonomously driving vehicles. Pulsed operation on the nanosecond scale and at low duty cycle can work with significantly higher current than traditionally used for VCSELs in continuous-wave operation. With reduced thermal limitations at low average heat dissipation, very high currents become feasible, and tens of watts of output power have been realized with small VCSEL chips. The optical emission pattern of VCSELs can be tailored to the desired field of view using beam-shaping elements. Such optical elements also enable laser-safe Class 1 products. A detailed analysis of the complete system and the operation mode is required to calculate the maximum permitted power for a safe system. The good VCSEL properties, such as robustness, stability over temperature, and the potential for integrated solutions, open a huge potential for VCSELs in new mass applications in the consumer and automotive markets.
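For a pulsed direct time-of-flight measurement, the range is simply half the round-trip time multiplied by the speed of light; a tiny sketch (the timing value is invented):

```python
C = 299_792_458.0  # speed of light, m/s

def tof_range_m(round_trip_time_s):
    """Distance to the target for a pulsed direct time-of-flight measurement."""
    return 0.5 * C * round_trip_time_s

# A 20 ns round trip corresponds to roughly 3 m.
print(tof_range_m(20e-9))
```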
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harber, K.S.; Pin, F.G.
1990-03-01
The US DOE Center for Engineering Systems Advanced Research (CESAR) at the Oak Ridge National Laboratory (ORNL) and the Commissariat a l'Energie Atomique's (CEA) Office de Robotique et Productique within the Directorat a la Valorization are working toward a long-term cooperative agreement and relationship in the area of Intelligent Systems Research (ISR). This report presents the proceedings of the first CESAR/CEA Workshop on Autonomous Mobile Robots, which took place at ORNL on May 30, 31 and June 1, 1989. The purpose of the workshop was to present and discuss methodologies and algorithms under development at the two facilities in the area of perception and navigation for autonomous mobile robots in unstructured environments. Experimental demonstration of the algorithms and comparison of some of their features were proposed to take place within the framework of a previously mutually agreed-upon demonstration scenario, or "base-case." The base-case scenario, described in detail in Appendix A, involved autonomous navigation by the robot in an a priori unknown environment with dynamic obstacles, in order to reach a predetermined goal. From the intermediate goal location, the robot had to search for and locate a control panel, move toward it, and dock in front of the panel face. The CESAR demonstration was successfully accomplished using the HERMIES-IIB robot, while subsets of the CEA demonstration performed using the ARES robot simulation and animation system were presented. The first session of the workshop focused on these experimental demonstrations and on the needs and considerations for establishing "benchmarks" for testing autonomous robot control algorithms.
Autonomous sensor-transponder RFID with supply energy conditioning for object navigation systems
NASA Astrophysics Data System (ADS)
Skoczylas, M.; Kamuda, K.; Jankowski-Mihułowicz, P.; Kalita, W.; Weglarski, Mariusz
2014-08-01
The properties of energy-conditioning electrical circuits developed for powering additional functional blocks of autonomous RFID transponders working in the HF band are analyzed and presented in the paper. The concept of autonomy is realized by implementing extra functions in the typical transponder. First of all, the autonomous system should harvest energy, e.g. from the electromagnetic field of read/write devices, but it should also be able to gather information about the environment, e.g. by measuring different kinds of physical quantities. In such an electrical device, the crucial problem is energy conditioning, because the output voltage-current characteristic of the front-end (antenna with matching and harvesting circuit), as well as the total and instantaneous power load generated by the internal circuits, depend strongly on the realized function and also on the energy and communication conditions in the RFID interface. A properly designed solution should improve the harvesting efficiency, the current leakage of the supply storage, and the matching between the antenna and input circuits, in order to save energy and increase operating time in such a battery-free system. The authors present methods to increase the autonomous operation time even with advanced measuring algorithms. A measuring system with a wide spectrum of sensors dedicated to different quantities (physical, chemical, etc.) has also been presented. The results of model calculations and experimental verifications are also discussed on the basis of investigations conducted in the unique laboratory stand for object navigation systems.
NASA Technical Reports Server (NTRS)
Brockers, Roland; Susca, Sara; Zhu, David; Matthies, Larry
2012-01-01
Direct-lift micro air vehicles have important applications in reconnaissance. In order to conduct persistent surveillance in urban environments, it is essential that these systems can perform autonomous landing maneuvers on elevated surfaces that provide high vantage points, without the help of any external sensor and with a fully contained onboard software solution. In this paper, we present a micro air vehicle that uses vision feedback from a single down-looking camera to navigate autonomously and detect an elevated landing platform as a surrogate for a rooftop. Our method requires no special preparation (labels or markers) of the landing location. Rather, leveraging the planar character of urban structure, the landing platform detection system uses a planar homography decomposition to detect landing targets and produce approach waypoints for autonomous landing. The vehicle control algorithm uses a Kalman-filter-based approach for pose estimation to fuse visual SLAM (PTAM) position estimates with IMU data, correcting for high-latency SLAM inputs and increasing the position estimate update rate in order to improve control stability. Scale recovery is achieved using inputs from a sonar altimeter. In experimental runs, we demonstrate a real-time implementation running onboard a micro aerial vehicle that is fully self-contained and independent of any external sensor information. With this method, the vehicle is able to search autonomously for a landing location and perform precision landing maneuvers on the detected targets.
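The scale-recovery step can be illustrated by a least-squares ratio between the metric sonar altitude and the scale-free monocular SLAM altitude; this is our simplification, not the flight software (the sample values are invented):

```python
def recover_scale(slam_altitudes, sonar_altitudes_m, eps=1e-6):
    """Least-squares scale factor mapping unscaled monocular SLAM altitude to
    metric sonar altitude (assumes both series are time-aligned)."""
    num = sum(s * z for s, z in zip(slam_altitudes, sonar_altitudes_m))
    den = sum(s * s for s in slam_altitudes)
    return num / max(den, eps)

scale = recover_scale([0.8, 1.1, 1.5], [1.6, 2.2, 3.0])     # ~2.0
metric_position = [scale * p for p in (0.4, 0.9, 1.5)]       # scale a SLAM position
```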
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Team KuuKulgur waits to begin the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
NASA Astrophysics Data System (ADS)
Jankovic, Marko; Paul, Jan; Kirchner, Frank
2016-04-01
Recent studies of the space debris population in low Earth orbit (LEO) have concluded that certain regions have already reached a critical density of objects. This will eventually lead to a cascading process called the Kessler syndrome. The time may have come to seriously consider active debris removal (ADR) missions as the only viable way of preserving the space environment for future generations. Among all objects in the current environment, the SL-8 (Kosmos 3M second stage) rocket bodies (R/Bs) are some of the most suitable targets for future robotic ADR missions. However, to date, autonomous relative navigation to and capture of a non-cooperative target has never been performed. Therefore, there is a need for more advanced, autonomous and modular systems that can cope with uncontrolled, tumbling objects. The guidance, navigation and control (GNC) system is one of the most critical ones. The main objective of this paper is to present a preliminary concept of a modular GNC architecture that should enable a safe and fuel-efficient capture of a known but uncooperative target, such as a Kosmos 3M R/B. In particular, the concept was developed with the most critical part of an ADR mission in mind, i.e. close-range proximity operations, and with state-of-the-art algorithms in the field of autonomous rendezvous and docking. In the end, a brief description is given of the hardware-in-the-loop (HIL) testing facility foreseen for the practical evaluation of the developed architecture.
NASA Astrophysics Data System (ADS)
Lu, Jiazhen; Lei, Chaohua; Yang, Yanqiang; Liu, Ming
2017-06-01
Many countries have been paying great attention to space exploration, especially of the Moon and Mars. Autonomous, high-accuracy navigation systems are needed for probes and rovers to accomplish their missions. Navigation systems based on an inertial navigation system (INS) and a celestial navigation system (CNS) have been widely used on lunar rovers. Initialization is a particularly important step for navigation. This paper presents an in-motion alignment and positioning method for lunar rovers using INS/CNS/odometer integrated navigation. The method can estimate not only the position and attitude errors, but also the biases of the accelerometers and gyros, using a standard Kalman filter. The differences between the platform star azimuth and elevation angles and the computed star azimuth and elevation angles, together with the difference between the velocity measured by the odometer and the velocity measured by the inertial sensors, are taken as measurements. Semi-physical experiments demonstrate that the position error can be reduced to 10 m and the attitude error remains within 2″ over 5 min. The experimental results show that this is an effective and attractive initialization approach for lunar rovers.
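The measurement vector described above stacks the star azimuth/elevation differences with the odometer-minus-INS velocity difference; a schematic construction (frames, units and sample numbers are assumptions) is:

```python
import numpy as np

def build_measurement(platform_az_el, computed_az_el, v_odometer, v_ins):
    """Stack the star-angle differences and the velocity difference into the
    Kalman filter measurement vector z (angles in rad, velocities in m/s)."""
    d_angles = np.asarray(platform_az_el, float) - np.asarray(computed_az_el, float)
    d_vel = np.asarray(v_odometer, float) - np.asarray(v_ins, float)
    return np.concatenate([d_angles, d_vel])

# Hypothetical numbers: small angle and velocity mismatches drive the filter corrections.
z = build_measurement([1.2001, 0.5003], [1.2000, 0.5000],
                      [0.31, 0.02, 0.00], [0.30, 0.02, 0.01])
```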
MSR Fetch Rover Capability Development at the Canadian Space Agency
NASA Astrophysics Data System (ADS)
Picard, M.; Hipkin, V.; Gingras, D.; Allard, P.; Lamarche, T.; Rocheleau, S. G.; Gemme, S.
2018-04-01
Describes Fetch Rover technology testing during CSA's 2016 Mars Sample Return Analogue Deployment, which demonstrated autonomous navigation to 'cache depots' of M-2020-like sample tubes, acquisition of six such tubes, and transfer to a MAV mock-up.
Semi-autonomous parking for enhanced safety and efficiency.
DOT National Transportation Integrated Search
2017-06-01
This project focuses on the use of tools from a combination of computer vision and localization based navigation schemes to aid the process of efficient and safe parking of vehicles in high density parking spaces. The principles of collision avoidanc...
Autonomous terrain characterization and modelling for dynamic control of unmanned vehicles
NASA Technical Reports Server (NTRS)
Talukder, A.; Manduchi, R.; Castano, R.; Owens, K.; Matthies, L.; Castano, A.; Hogg, R.
2002-01-01
This end-to-end obstacle negotiation system is envisioned to be useful in optimized path planning and vehicle navigation in terrain conditions cluttered with vegetation, bushes, rocks, etc. Results on natural terrain with various natural materials are presented.
GPS/GLONASS RAIM augmentation to WAAS for CAT 1 precision approach
DOT National Transportation Integrated Search
1997-06-30
This paper deals with the potential use of Receiver Autonomous Integrity Monitoring (RAIM) to supplement the FAA's Wide Area Augmentation System (WAAS). Integrity refers to the capability of a navigation or landing system to provide a timely warning...
First Results from a Hardware-in-the-Loop Demonstration of Closed-Loop Autonomous Formation Flying
NASA Technical Reports Server (NTRS)
Gill, E.; Naasz, Bo; Ebinuma, T.
2003-01-01
A closed-loop system for the demonstration of autonomous satellite formation flying technologies using hardware-in-the-loop has been developed. Making use of a GPS signal simulator with a dual radio-frequency outlet, the system includes two GPS space receivers as well as a powerful onboard navigation processor dedicated to the GPS-based guidance, navigation, and control of a satellite formation in real time. The closed-loop system allows realistic simulations of autonomous formation flying scenarios, enabling research in the fields of tracking and orbit control strategies for a wide range of applications. The autonomous closed-loop formation acquisition and keeping strategy is based on Lyapunov's direct control method as applied to the standard set of Keplerian elements. This approach not only assures global and asymptotic stability of the control but also maintains valuable physical insight into the applied control vectors. Furthermore, the approach can account for system uncertainties and effectively avoids a computationally expensive solution of the two-point boundary value problem, which renders the concept particularly attractive for implementation in onboard processors. A guidance law has been developed which strictly separates the relative from the absolute motion, thus avoiding the numerical integration of a target trajectory in the onboard processor. Moreover, by using precise kinematic relative GPS solutions, dynamical modeling or filtering is avoided, which provides for an efficient implementation of the process on an onboard processor. A sample formation flying scenario has been created, aiming at the autonomous transition of a Low Earth Orbit satellite formation from an initial along-track separation of 800 m to a target distance of 100 m. Assuming a low-thrust actuator which may be accommodated on a small satellite, a typical control accuracy of less than 5 m has been achieved, which proves the applicability of autonomous formation flying techniques to formations of satellites as close as 50 m.
Interaction dynamics of multiple mobile robots with simple navigation strategies
NASA Technical Reports Server (NTRS)
Wang, P. K. C.
1989-01-01
The global dynamic behavior of multiple interacting autonomous mobile robots with simple navigation strategies is studied. Here, the effective spatial domain of each robot is taken to be a closed ball about its mass center. It is assumed that each robot has a specified cone of visibility such that interaction with other robots takes place only when they enter its visibility cone. Based on a particle model for the robots, various simple homing and collision-avoidance navigation strategies are derived. Then, an analysis of the dynamical behavior of the interacting robots in unbounded spatial domains is made. The article concludes with the results of computer simulation studies of two or more interacting robots.
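As a toy illustration of the particle-model strategies described (not the paper's derivation), each robot can steer toward its home position and add a repulsive term only for robots that fall inside its visibility cone; gains, radii and the field of view below are arbitrary:

```python
import numpy as np

def steering(pos, heading, home, others, fov_deg=90.0, avoid_radius=2.0,
             k_home=1.0, k_avoid=3.0):
    """Homing plus cone-limited collision avoidance for a point robot.
    pos, home: 2D positions; heading: unit vector; others: list of 2D positions."""
    cmd = k_home * (home - pos)                      # homing term
    half_fov = np.deg2rad(fov_deg) / 2.0
    for other in others:
        rel = other - pos
        dist = np.linalg.norm(rel)
        if dist < 1e-9 or dist > avoid_radius:
            continue
        angle = np.arccos(np.clip(np.dot(rel / dist, heading), -1.0, 1.0))
        if angle <= half_fov:                        # only react to visible robots
            cmd += -k_avoid * rel / dist**2          # push away, stronger when closer
    return cmd

v = steering(np.array([0.0, 0.0]), np.array([1.0, 0.0]),
             np.array([10.0, 0.0]), [np.array([1.0, 0.2])])
```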
NASA Astrophysics Data System (ADS)
Heck, Martijn J. R.
2017-01-01
Technologies for efficient generation and fast scanning of narrow free-space laser beams find major applications in three-dimensional (3D) imaging and mapping, like Lidar for remote sensing and navigation, and secure free-space optical communications. The ultimate goal for such a system is to reduce its size, weight, and power consumption, so that it can be mounted on, e.g. drones and autonomous cars. Moreover, beam scanning should ideally be done at video frame rates, something that is beyond the capabilities of current opto-mechanical systems. Photonic integrated circuit (PIC) technology holds the promise of achieving low-cost, compact, robust and energy-efficient complex optical systems. PICs integrate, for example, lasers, modulators, detectors, and filters on a single piece of semiconductor, typically silicon or indium phosphide, much like electronic integrated circuits. This technology is maturing fast, driven by high-bandwidth communications applications, and mature fabrication facilities. State-of-the-art commercial PICs integrate hundreds of elements, and the integration of thousands of elements has been shown in the laboratory. Over the last few years, there has been a considerable research effort to integrate beam steering systems on a PIC, and various beam steering demonstrators based on optical phased arrays have been realized. Arrays of up to thousands of coherent emitters, including their phase and amplitude control, have been integrated, and various applications have been explored. In this review paper, I will present an overview of the state of the art of this technology and its opportunities, illustrated by recent breakthroughs.
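For a uniform one-dimensional optical phased array, the per-emitter phase needed to steer the main lobe to an angle θ follows the grating relation φ_n = 2π n d sin θ / λ. The sketch below simply evaluates that relation; the element count, pitch and wavelength are illustrative, not taken from any particular demonstrator:

```python
import numpy as np

def steering_phases(num_emitters, pitch_m, wavelength_m, steer_angle_rad):
    """Per-emitter phase (wrapped to [0, 2*pi)) for a uniform linear optical
    phased array steering its main lobe to steer_angle_rad."""
    n = np.arange(num_emitters)
    phi = 2.0 * np.pi * n * pitch_m * np.sin(steer_angle_rad) / wavelength_m
    return np.mod(phi, 2.0 * np.pi)

# 64 emitters at 2 um pitch, 1550 nm light, steering to 5 degrees.
phases = steering_phases(64, 2e-6, 1550e-9, np.deg2rad(5.0))
```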
NASA Astrophysics Data System (ADS)
Maki, Toshihiro; Ura, Tamaki; Singh, Hanumant; Sakamaki, Takashi
Large-area seafloor imaging will bring significant benefits to various fields such as academic research, resource survey, marine development, security, and search-and-rescue. The authors have proposed a navigation method for an autonomous underwater vehicle for seafloor imaging, and verified its performance by mapping tubeworm colonies over an area of 3,000 square meters using the AUV Tri-Dog 1 at the Tagiri vent field, Kagoshima Bay, Japan (Maki et al., 2008, 2009). This paper proposes a post-processing method to build a natural photo mosaic from a number of pictures taken by an underwater platform. The method first removes lens distortion and non-uniformities of color and lighting from each image, and then performs ortho-rectification based on the camera pose and seafloor geometry estimated from navigation data. The image alignment is based on both navigation data and visual characteristics, implemented as an extension of the image-based method (Pizarro et al., 2003). Using the two types of information yields an image alignment that is consistent both globally and locally, and makes the method applicable to data sets with few visual cues. The method was evaluated using a data set obtained by the AUV Tri-Dog 1 at the vent field in September 2009. A seamless, uniformly illuminated photo mosaic covering an area of around 500 square meters was created from 391 pictures, capturing unique features of the field such as bacteria mats and tubeworm colonies.
Advancing Navigation, Timing, and Science with the Deep Space Atomic Clock
NASA Technical Reports Server (NTRS)
Ely, Todd A.; Seubert, Jill; Bell, Julia
2014-01-01
NASA's Deep Space Atomic Clock mission is developing a small, highly stable mercury ion atomic clock with an Allan deviation of at most 1e-14 at one day, and with current estimates near 3e-15. This stability enables one-way radiometric tracking data with accuracy equivalent to and, in certain conditions, better than current two-way deep space tracking data; allowing a shift to a more efficient and flexible one-way deep space navigation architecture. DSAC-enabled one-way tracking will benefit navigation and radio science by increasing the quantity and quality of tracking data. Additionally, DSAC would be a key component to fully-autonomous onboard radio navigation useful for time-sensitive situations. Potential deep space applications of DSAC are presented, including orbit determination of a Mars orbiter and gravity science on a Europa flyby mission.
Cobalt: Development and Maturation of GN&C Technologies for Precision Landing
NASA Technical Reports Server (NTRS)
Carson, John M.; Restrepo, Carolina; Seubert, Carl; Amzajerdian, Farzin
2016-01-01
The CoOperative Blending of Autonomous Landing Technologies (COBALT) instrument is a terrestrial test platform for development and maturation of guidance, navigation and control (GN&C) technologies for precision landing. The project is developing a third-generation Langley Research Center (LaRC) navigation doppler lidar (NDL) for ultra-precise velocity and range measurements, which will be integrated and tested with the Jet Propulsion Laboratory (JPL) lander vision system (LVS) for terrain relative navigation (TRN) position estimates. These technologies together provide precise navigation knowledge that is critical for a controlled and precise touchdown. The COBALT hardware will be integrated in 2017 into the GN&C subsystem of the Xodiac rocket-propulsive vertical test bed (VTB) developed by Masten Space Systems, and two terrestrial flight campaigns will be conducted: one open-loop (i.e., passive) and one closed-loop (i.e., active).
A Rover Mobility Platform with Autonomous Capability to Enable Mars Sample Return
NASA Astrophysics Data System (ADS)
Fulford, P.; Langley, C.; Shaw, A.
2018-04-01
The next step in understanding Mars is sample return. In Fall 2016, the CSA conducted an analogue deployment using the Mars Exploration Science Rover. An objective was to demonstrate the maturity of the rover's guidance, navigation, and control.
CERTAIN: City Environment Range Testing for Autonomous Integrated Navigation
NASA Technical Reports Server (NTRS)
Brown, Jill
2016-01-01
This is an informational presentation to the DOI UAS Interagency group, sharing publicly available and previously released information on UAS regulations and on some of the UAS operations at Langley Research Center, to create awareness among other government agencies.
Titan Aerial Daughtercraft (TAD) for Surface Studies from a Lander or Balloon
NASA Astrophysics Data System (ADS)
Matthies, L.; Tokumaru, P.; Sherrit, S.; Beauchamp, P.
2014-06-01
Recent rapid progress on autonomous navigation of micro air vehicles for terrestrial applications opens new possibilities for a small aerial vehicle that could deploy from a Titan lander or balloon to acquire samples for analysis on the mothership.
NASA Tech Briefs, October 2011
NASA Technical Reports Server (NTRS)
2011-01-01
Topics covered include: Laser Truss Sensor for Segmented Telescope Phasing; Qualifications of Bonding Process of Temperature Sensors to Deep-Space Missions; Optical Sensors for Monitoring Gamma and Neutron Radiation; Compliant Tactile Sensors; Cytometer on a Chip; Measuring Input Thresholds on an Existing Board; Scanning and Defocusing Properties of Microstrip Reflectarray Antennas; Cable Tester Box; Programmable Oscillator; Fault-Tolerant, Radiation-Hard DSP; Sub-Shot Noise Power Source for Microelectronics; Asynchronous Message Service Reference Implementation; Zero-Copy Objects System; Delay and Disruption Tolerant Networking MACHETE Model; Contact Graph Routing; Parallel Eclipse Project Checkout; Technique for Configuring an Actively Cooled Thermal Shield in a Flight System; Use of Additives to Improve Performance of Methyl Butyrate-Based Lithium-Ion Electrolytes; Li-Ion Cells Employing Electrolytes with Methyl Propionate and Ethyl Butyrate Co-Solvents; Improved Devices for Collecting Sweat for Chemical Analysis; Tissue Photolithography; Method for Impeding Degradation of Porous Silicon Structures; External Cooling Coupled to Reduced Extremity Pressure Device; A Zero-Gravity Cup for Drinking Beverages in Microgravity; Co-Flow Hollow Cathode Technology; Programmable Aperture with MEMS Microshutter Arrays; Polished Panel Optical Receiver for Simultaneous RF/Optical Telemetry with Large DSN Antennas; Adaptive System Modeling for Spacecraft Simulation; Lidar-Based Navigation Algorithm for Safe Lunar Landing; Tracking Object Existence From an Autonomous Patrol Vehicle; Rad-Hard, Miniaturized, Scalable, High-Voltage Switching Module for Power Applications; and Architecture for a 1-GHz Digital RADAR.
Application of parallelized software architecture to an autonomous ground vehicle
NASA Astrophysics Data System (ADS)
Shakya, Rahul; Wright, Adam; Shin, Young Ho; Momin, Orko; Petkovsek, Steven; Wortman, Paul; Gautam, Prasanna; Norton, Adam
2011-01-01
This paper presents improvements made to Q, an autonomous ground vehicle designed to participate in the Intelligent Ground Vehicle Competition (IGVC). For the 2010 IGVC, Q was upgraded with a new parallelized software architecture and a new vision processor. Improvements were made to the power system, reducing the number of batteries required for operation from six to one. In previous years, a single state machine was used to execute the bulk of processing activities, including sensor interfacing, data processing, path planning, navigation algorithms and motor control. This inefficient approach led to poor software performance and made the software difficult to maintain or modify. For IGVC 2010, the team implemented a modular parallel architecture using the National Instruments (NI) LabVIEW programming language. The new architecture divides all the necessary tasks (motor control, navigation, sensor data collection, etc.) into well-organized components that execute in parallel, providing considerable flexibility and facilitating efficient use of processing power. Computer vision is used to detect white lines on the ground and determine their location relative to the robot. With the new vision processor and some optimization of the image processing algorithm used the previous year, two frames can be acquired and processed in 70 ms. With all these improvements, Q placed 2nd in the autonomous challenge.
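The white-line detection step described above is commonly built from a brightness threshold followed by edge detection and a probabilistic Hough transform; this OpenCV sketch is a generic illustration, not team Q's LabVIEW vision code (the thresholds are assumptions):

```python
import cv2
import numpy as np

def detect_white_lines(bgr_image):
    """Return line segments (x1, y1, x2, y2) for bright, line-like features."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)   # keep bright pixels
    mask = cv2.GaussianBlur(mask, (5, 5), 0)
    edges = cv2.Canny(mask, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=40, minLineLength=30, maxLineGap=10)
    return [] if lines is None else [tuple(l[0]) for l in lines]
```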
Constrained navigation for unmanned systems
NASA Astrophysics Data System (ADS)
Vasseur, Laurent; Gosset, Philippe; Carpentier, Luc; Marion, Vincent; Morillon, Joel G.; Ropars, Patrice
2005-05-01
The French Military Robotic Study Program (introduced at Aerosense 2003), sponsored by the French Defense Procurement Agency and managed by Thales as the prime contractor, focuses on about 15 robotic themes which can provide an immediate "operational add-on value". The paper details the "constrained navigation" study (named TEL2), whose main goal is to identify and test a well-balanced task sharing between man and machine to accomplish a robotic task that cannot yet be performed autonomously because of technological limitations. The chosen function is "obstacle avoidance" on rough ground at fairly high speed (40 km/h). State-of-the-art algorithms have been implemented to perform autonomous obstacle avoidance and following of forest borders, using a scanning laser sensor and standard localization functions. Such an "obstacle avoidance" function works well most of the time, but sometimes fails. The study analyzed how the remote operator can manage such failures so that the system remains fully operationally reliable; the operator can act in two ways: (a) finely adjust the vehicle's current heading; (b) take control of the vehicle "on the fly" (without stopping) and return it to autonomous behavior when motion is secured again. The paper also presents the results obtained from the military acceptance tests performed on the French 4x4 DARDS ATD.
Localization system for use in GPS denied environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trueblood, J. J.
The military uses autonomous platforms to complete missions and provide standoff for warfighters. However, autonomous platforms rely on GPS for their global position. In many mission spaces the autonomous platforms may encounter GPS-denied environments, which limits where the platform can operate and requires the warfighters to take its place. GPS-denied environments can occur due to tall buildings, trees, or canyon walls blocking the GPS satellite signals, or due to a lack of coverage. An Inertial Navigation System (INS) uses sensors to detect the vehicle's movement and direction of travel in order to calculate the vehicle's position. One of the biggest challenges with an INS is the accuracy of its sensors and the accumulation of their errors over time. If these challenges can be overcome, the INS would provide accurate positioning information to the autonomous vehicle in GPS-denied environments and allow it to provide the desired standoff for the warfighters.
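A strapdown INS propagates position by integrating sensed acceleration and turn rate between GPS fixes; the simplified planar sketch below shows how a small, uncorrected sensor bias accumulates into position error over time (all values are illustrative):

```python
import math

def dead_reckon(imu_samples, dt, x=0.0, y=0.0, vx=0.0, vy=0.0, heading=0.0):
    """Very simple planar dead reckoning: each sample is (forward_accel, yaw_rate).
    A constant bias in either channel grows into position error roughly as t**2."""
    for accel, yaw_rate in imu_samples:
        heading += yaw_rate * dt
        vx += accel * math.cos(heading) * dt
        vy += accel * math.sin(heading) * dt
        x += vx * dt
        y += vy * dt
    return x, y, heading

# 60 s of nominally stationary data with a 0.02 m/s^2 accelerometer bias
# already produces tens of meters of position error.
samples = [(0.02, 0.0)] * 600
print(dead_reckon(samples, dt=0.1))
```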
GPS/DR Error Estimation for Autonomous Vehicle Localization.
Lee, Byung-Hyun; Song, Jong-Hwa; Im, Jun-Hyuck; Im, Sung-Hyuck; Heo, Moon-Beom; Jee, Gyu-In
2015-08-21
Autonomous vehicles require highly reliable navigation capabilities. For example, a lane-following method cannot be applied in an intersection without lanes, and since typical lane detection is performed using a straight-line model, errors can occur when the lateral distance is estimated in curved sections due to a model mismatch. Therefore, this paper proposes a localization method that uses GPS/DR error estimation based on a lane detection method with curved lane models, stop line detection, and curve matching in order to improve performance during waypoint-following procedures. The advantage of the proposed method is that position information can be provided for autonomous driving through intersections, in sections with sharp curves, and in curved sections following a straight section. The proposed method was applied in autonomous vehicles at an experimental site to evaluate its performance, and the results indicate that sub-meter positioning accuracy was achieved.
GPS/DR Error Estimation for Autonomous Vehicle Localization
Lee, Byung-Hyun; Song, Jong-Hwa; Im, Jun-Hyuck; Im, Sung-Hyuck; Heo, Moon-Beom; Jee, Gyu-In
2015-01-01
Autonomous vehicles require highly reliable navigation capabilities. For example, a lane-following method cannot be applied in an intersection without lanes, and since typical lane detection is performed using a straight-line model, errors can occur when the lateral distance is estimated in curved sections due to a model mismatch. Therefore, this paper proposes a localization method that uses GPS/DR error estimation based on a lane detection method with curved lane models, stop line detection, and curve matching in order to improve performance during waypoint-following procedures. The advantage of the proposed method is that position information can be provided for autonomous driving through intersections, in sections with sharp curves, and in curved sections following a straight section. The proposed method was applied in autonomous vehicles at an experimental site to evaluate its performance, and the results indicate that sub-meter positioning accuracy was achieved. PMID:26307997
Wind-Based Navigation of a Hot-air Balloon on Titan: A Feasibility Study
NASA Technical Reports Server (NTRS)
Furfaro, Roberto; Lunine, Jonathan I.; Elfes, Alberto; Reh, Kim
2008-01-01
Current analysis of data streamed back to Earth by the Cassini spacecraft features Titan as one of the most exciting places in the solar system. NASA centers and universities around the US, as well as the European Space Agency, are studying the possibility of sending, as part of the next mission to this giant moon of Saturn, a hot-air balloon (Montgolfier-type) for further and more in-depth exploration. The basic idea would be to design a reliable, semi-autonomous, and yet cheap Montgolfier capable of using a continuous flow of waste heat from a power source to lift the balloon and sustain its altitude in the Titan environment. In this paper we study the problem of locally navigating a hot-air balloon in the nitrogen-based Titan atmosphere. The basic idea is to define a strategy (i.e. the design of a suitable guidance system) that allows autonomous and semi-autonomous navigation of the balloon using the available (and partial) knowledge of the wind structure blowing over the Saturnian satellite's surface. Starting from first principles, we determined the appropriate thermal and dynamical models describing (a) the vertical dynamics of the balloon and (b) the dynamics of the balloon moving in a vertical plane (2-D motion). Next, various non-linear fuzzy-based control strategies were evaluated, analyzed and implemented in MATLAB to numerically simulate the capability of the system to simultaneously maintain altitude as well as a scientifically desirable trajectory. We also examined the ability of the balloon to perform station-keeping. The results of the simulation are encouraging and show the effectiveness of such a system for cheap and effective semi-autonomous exploration of Titan.
Autonomous spacecraft maintenance study group
NASA Technical Reports Server (NTRS)
Marshall, M. H.; Low, G. D.
1981-01-01
A plan to incorporate autonomous spacecraft maintenance (ASM) capabilities into Air Force spacecraft by 1989 is outlined. It includes the successful operation of the spacecraft without ground operator intervention for extended periods of time. Mechanisms, along with a fault-tolerant data processing system (including a nonvolatile backup memory) and an autonomous navigation capability, are needed to replace the routine servicing that is presently performed by the ground system. The state-of-the-art fault handling capabilities of various spacecraft and computers are described, and a set of conceptual design requirements needed to achieve ASM is established. Near-term technology developments needed for an ASM proof-of-concept demonstration by 1985, and a research agenda addressing long-range academic research for an advanced ASM system for the 1990s, are also established.
Theseus: tethered distributed robotics (TDR)
NASA Astrophysics Data System (ADS)
Digney, Bruce L.; Penzes, Steven G.
2003-09-01
Defence Research and Development Canada's (DRDC) Autonomous Intelligent Systems program conducts research to increase the independence and effectiveness of military vehicles and systems. DRDC-Suffield's Autonomous Land Systems (ALS) group is creating new concept vehicles and autonomous control systems for use in outdoor areas, urban streets, urban interiors and urban subspaces. This paper first gives an overview of the ALS program and then a specific description of the work being done on mobility in urban subspaces. Discussed is the Theseus: Tethered Distributed Robotics (TDR) system, which will not only manage an unavoidable tether but exploit it for mobility and navigation. Also discussed is the prototype robot called the Hedgehog, which uses conformal 3D mobility in ducts, sewer pipes, collapsed rubble voids and chimneys.
Efforts toward an autonomous wheelchair - biomed 2011.
Barrett, Steven; Streeter, Robert
2011-01-01
An autonomous wheelchair is in development to provide mobility to those with significant physical challenges. The overall goal of the project is to develop a wheelchair that is fully autonomous, with the ability to navigate an environment and negotiate obstacles. As a starting point for the project, we have reverse engineered the joystick control system of an off-the-shelf, commercially available wheelchair. The joystick control has been replaced with a microcontroller-based system. The microcontroller has the capability to interface with a number of subsystems currently under development, including wheel odometers, obstacle avoidance sensors, and ultrasonic-based wall sensors. This paper discusses the microcontroller-based system and provides a detailed system description. Results of this study may be adapted to commercial or military robot control.
Development for SSV on a parallel processing system (PARAGON)
NASA Astrophysics Data System (ADS)
Gothard, Benny M.; Allmen, Mark; Carroll, Michael J.; Rich, Dan
1995-12-01
A goal of the surrogate semi-autonomous vehicle (SSV) program is to have multiple vehicles navigate autonomously and cooperatively with other vehicles. This paper describes the process and tools used in porting UGV/SSV (unmanned ground vehicle) autonomous mobility and target recognition algorithms from a SISD (single instruction single data) processor architecture (i.e., a Sun SPARC workstation running C/UNIX) to a MIMD (multiple instruction multiple data) parallel processor architecture (i.e., PARAGON-a parallel set of i860 processors running C/UNIX). It discusses the gains in performance and the pitfalls of such a venture. It also examines the merits of this processor architecture (based on this conceptual prototyping effort) and programming paradigm to meet the final SSV demonstration requirements.
Cheng, Xuemin; Yang, Yikang; Hao, Qun
2016-01-01
The thermal environment is an important factor in the design of optical systems. This study investigated the thermal analysis technology of optical systems for navigation guidance and control in supersonic aircraft by developing empirical equations for the front temperature gradient and rear thermal diffusion distance, and for basic factors such as flying parameters and the structure of the optical system. Finite element analysis (FEA) was used to study the relationship between flying and front dome parameters and the system temperature field. Systematic deduction was then conducted based on the effects of the temperature field on the physical geometry and ray tracing performance of the front dome and rear optical lenses, by deriving the relational expressions between the system temperature field and the spot size and positioning precision of the rear optical lens. The optical systems used for navigation guidance and control in supersonic aircraft when the flight speed is in the range of 1–5 Ma were analysed using the derived equations. Using this new method it was possible to control the precision within 10% when considering the light spot received by the four-quadrant detector, and computation time was reduced compared with the traditional method of separately analysing the temperature field of the front dome and rear optical lens using FEA. Thus, the method can effectively increase the efficiency of parameter analysis and computation in an airborne optical system, facilitating the systematic, effective and integrated thermal analysis of airborne optical systems for navigation guidance and control. PMID:27763515
Cheng, Xuemin; Yang, Yikang; Hao, Qun
2016-10-17
The thermal environment is an important factor in the design of optical systems. This study investigated the thermal analysis technology of optical systems for navigation guidance and control in supersonic aircraft by developing empirical equations for the front temperature gradient and rear thermal diffusion distance, and for basic factors such as flying parameters and the structure of the optical system. Finite element analysis (FEA) was used to study the relationship between flying and front dome parameters and the system temperature field. Systematic deduction was then conducted based on the effects of the temperature field on the physical geometry and ray tracing performance of the front dome and rear optical lenses, by deriving the relational expressions between the system temperature field and the spot size and positioning precision of the rear optical lens. The optical systems used for navigation guidance and control in supersonic aircraft when the flight speed is in the range of 1-5 Ma were analysed using the derived equations. Using this new method it was possible to control the precision within 10% when considering the light spot received by the four-quadrant detector, and computation time was reduced compared with the traditional method of separately analysing the temperature field of the front dome and rear optical lens using FEA. Thus, the method can effectively increase the efficiency of parameter analysis and computation in an airborne optical system, facilitating the systematic, effective and integrated thermal analysis of airborne optical systems for navigation guidance and control.
Low-cost compact MEMS scanning ladar system for robotic applications
NASA Astrophysics Data System (ADS)
Moss, Robert; Yuan, Ping; Bai, Xiaogang; Quesada, Emilio; Sudharsanan, Rengarajan; Stann, Barry L.; Dammann, John F.; Giza, Mark M.; Lawler, William B.
2012-06-01
Future robots and autonomous vehicles require compact low-cost Laser Detection and Ranging (LADAR) systems for autonomous navigation. The Army Research Laboratory (ARL) recently demonstrated a brass-board short-range eye-safe MEMS scanning LADAR system for robotic applications. Boeing Spectrolab is performing a technology transfer (CRADA) of this system and has built a compact MEMS scanning LADAR system with additional improvements to the receiver sensitivity, laser system, and data processing system. Improved system sensitivity, low cost, miniaturization, and low power consumption are the main goals for the commercialization of this LADAR system. The receiver sensitivity has been improved by 2x using large-area InGaAs PIN detectors with low-noise amplifiers. The FPGA code has been updated to extend the range to 50 meters and detect up to 3 targets per pixel. Range accuracy has been improved through the implementation of an optical T-Zero input line. A compact commercially available erbium fiber laser operating at 1550 nm wavelength is used as a transmitter, thus reducing the size of the LADAR system considerably from the ARL brassboard system. The computer interface has been consolidated to allow image data and configuration data (configuration settings and system status) to pass through a single Ethernet port. In this presentation we will discuss the system architecture and future improvements to receiver sensitivity using avalanche photodiodes.
For Spacious Skies: Self-Separation with "Autonomous Flight Rules" in US Domestic Airspace
NASA Technical Reports Server (NTRS)
Wing, David J.; Cotton, William B.
2011-01-01
Autonomous Flight Rules (AFR) are proposed as a new set of operating regulations in which aircraft navigate on tracks of their choice while self-separating from traffic and weather. AFR would exist alongside Instrument and Visual Flight Rules (IFR and VFR) as one of three available flight options for any appropriately trained and qualified operator with the necessary certified equipment. Historically, ground-based separation services evolved by necessity as aircraft began operating in the clouds and were unable to see each other. Today, technologies for global precision navigation, emerging airborne surveillance, and onboard computing enable traffic conflict management to be fully integrated with navigation procedures onboard the aircraft. By self-separating, aircraft can operate with more flexibility and fewer flight restrictions than are required when using ground-based separation. The AFR concept proposes a practical means in which self-separating aircraft could share the same airspace as IFR and VFR aircraft without disrupting the ongoing processes of Air Traffic Control. The paper discusses the context and motivation for implementing self-separation in US domestic airspace. It presents a historical perspective on separation, the proposed way forward in AFR, the rationale behind mixed operations, and the expected benefits of AFR for the airspace user community.
A positional estimation technique for an autonomous land vehicle in an unstructured environment
NASA Technical Reports Server (NTRS)
Talluri, Raj; Aggarwal, J. K.
1990-01-01
This paper presents a solution to the positional estimation problem of an autonomous land vehicle navigating in an unstructured mountainous terrain. A Digital Elevation Map (DEM) of the area in which the robot is to navigate is assumed to be given. It is also assumed that the robot is equipped with a camera that can be panned and tilted, and a device to measure the elevation of the robot above the ground surface. No recognizable landmarks are assumed to be present in the environment in which the robot is to navigate. The solution presented makes use of the DEM information, and structures the problem as a heuristic search in the DEM for the possible robot location. The shape and position of the horizon line in the image plane and the known camera geometry of the perspective projection are used as parameters to search the DEM. Various heuristics drawn from the geometric constraints are used to prune the search space significantly. The algorithm is made robust to errors in the imaging process by accounting for the worst-case errors. The approach is tested using DEM data of areas in Colorado and Texas. The method is suitable for use in outdoor mobile robots and planetary rovers.
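As an illustration of the horizon-matching search described above, one can score candidate DEM cells by comparing a predicted horizon elevation profile against the horizon extracted from the image. This is only a sketch under simplifying assumptions (regular-grid DEM, precomputed observed profile); the function names, data layout, and heuristics below are hypothetical and are not the paper's implementation.

```python
import numpy as np

def predicted_horizon(dem, cell, x, y, h_eye, azimuths, max_range=5000.0):
    """Elevation angle of the highest visible terrain along each azimuth,
    as seen from candidate DEM cell (x, y) at eye height h_eye."""
    z0 = dem[y, x] + h_eye
    angles = np.full(len(azimuths), -np.pi / 2)
    for i, az in enumerate(azimuths):
        for r in np.arange(cell, max_range, cell):
            xi = int(round(x + r * np.sin(az) / cell))
            yi = int(round(y + r * np.cos(az) / cell))
            if not (0 <= xi < dem.shape[1] and 0 <= yi < dem.shape[0]):
                break
            angles[i] = max(angles[i], np.arctan2(dem[yi, xi] - z0, r))
    return angles

def best_cell(dem, cell, candidates, h_eye, azimuths, observed):
    """Keep the candidate whose predicted horizon best matches the observed one."""
    scores = [np.sum((predicted_horizon(dem, cell, x, y, h_eye, azimuths) - observed) ** 2)
              for (x, y) in candidates]
    return candidates[int(np.argmin(scores))]
```

In practice the geometric heuristics mentioned in the abstract would prune the candidate list before this exhaustive scoring loop is ever run.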
Biologically-inspired navigation and flight control for Mars flyer missions
NASA Technical Reports Server (NTRS)
Thakoor, S.; Chahl, J.; Hine, B.; Zornetzer, S.
2003-01-01
Bioinspired Engineering Exploration Systems (BEES) is enabling new bioinspired sensors for autonomous exploration of Mars. The steps toward autonomy in the development of these BEES flyers are described. A set of future Mars missions that are uniquely enabled by such flyers is described in closing.
A Robot to Help Make the Rounds
NASA Technical Reports Server (NTRS)
2003-01-01
This paper presents a discussion on the Pyxis HelpMate SecurePak (SP) trackless robotic courier designed by Transitions Research Corporation, to navigate autonomously throughout medical facilities, transporting pharmaceuticals, laboratory specimens, equipment, supplies, meals, medical records, and radiology films between support departments and nursing floors.
Loosely Coupled GPS-Aided Inertial Navigation System for Range Safety
NASA Technical Reports Server (NTRS)
Heatwole, Scott; Lanzi, Raymond J.
2010-01-01
The Autonomous Flight Safety System (AFSS) aims to replace the human element of range safety operations, as well as reduce reliance on expensive, downrange assets for launches of expendable launch vehicles (ELVs). The system consists of multiple navigation sensors and flight computers that provide a highly reliable platform. It is designed to ensure that single-event failures in a flight computer or sensor will not bring down the whole system. The flight computer uses a rules-based structure derived from range safety requirements to make decisions whether or not to destroy the rocket.
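A minimal sketch of what a rules-based termination decision might look like; the rule names, state fields, and thresholds here are hypothetical and are not taken from the AFSS range safety requirements.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class VehicleState:
    in_destruct_zone: bool   # hypothetical geofence test result
    nav_valid: bool          # navigation solution health flag

# A rule maps the current state to True when it is violated.
Rule = Callable[[VehicleState], bool]

def destruct_decision(state: VehicleState, rules: List[Rule]) -> bool:
    """Trigger flight termination if any rule in the list is violated."""
    return any(rule(state) for rule in rules)

# Hypothetical example rules (illustrative only)
example_rules: List[Rule] = [
    lambda s: s.in_destruct_zone,   # vehicle crossed a destruct boundary
    lambda s: not s.nav_valid,      # navigation solution no longer trusted
]
```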
NASA Technical Reports Server (NTRS)
Zuraski, G. D.
1972-01-01
The functions of a laser rangefinder on board an autonomous Martian roving vehicle are discussed. The functions are: (1) navigation by means of a passive satellite and (2) mid-range path selection and obstacle avoidance. The feasibility of using a laser to make the necessary range measurements is explored and a preliminary design is presented. The two uses of the rangefinder dictate widely different operating parameters making it impossible to use the same system for both functions.
Multiple estimation channel decoupling and optimization method based on inverse system
NASA Astrophysics Data System (ADS)
Wu, Peng; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng
2018-03-01
This paper addresses the autonomous navigation requirements of an intelligent deformation missile. Building on dynamics and kinematics modeling of the missile, the navigation subsystem solution method, and error modeling, it focuses on the corresponding data fusion and decision fusion technology. The sensitive channels of the filter input are decoupled through an inverse system designed from the dynamics, reducing the influence of sudden changes in the measurement information on the filter input. A series of simulation experiments verifies the feasibility and effectiveness of the inverse-system decoupling algorithm.
Terrain classification in navigation of an autonomous mobile robot
NASA Astrophysics Data System (ADS)
Dodds, David R.
1991-03-01
In this paper we describe a method of path planning that integrates terrain classification (by means of fractals), the certainty-grid method of spatial representation, Kehtarnavaz-Griswold collision zones, Dubois-Prade fuzzy temporal and spatial knowledge, and non-point-sized qualitative navigational planning. An initially planned ("end-to-end") path is piece-wise modified to accommodate known and inferred moving obstacles, and includes attention to time-varying multiple subgoals which may influence a section of path at a time after the robot has begun traversing that planned path.
A Long Range Science Rover For Future Mars Missions
NASA Technical Reports Server (NTRS)
Hayati, Samad
1997-01-01
This paper describes the design and implementation currently underway at the Jet Propulsion Laboratory of a long range science rover for future missions to Mars. The small rover prototype, called Rocky 7, is capable of long traverses, autonomous navigation, and science instrument control; carries three science instruments; and can be commanded from any computer platform and any location using the World Wide Web. In this paper we describe the mobility system, the sampling system, the sensor suite, navigation and control, onboard science instruments, and the ground command and control system.
Visual control of navigation in insects and its relevance for robotics.
Srinivasan, Mandyam V
2011-08-01
Flying insects display remarkable agility, despite their diminutive eyes and brains. This review describes our growing understanding of how these creatures use visual information to stabilize flight, avoid collisions with objects, regulate flight speed, detect and intercept other flying insects such as mates or prey, navigate to a distant food source, and orchestrate flawless landings. It also outlines the ways in which these insights are now being used to develop novel, biologically inspired strategies for the guidance of autonomous, airborne vehicles.
Dual RF Astrodynamic GPS Orbital Navigator Satellite
NASA Technical Reports Server (NTRS)
Kanipe, David B.; Provence, Robert Steve; Straube, Timothy M.; Reed, Helen; Bishop, Robert; Lightsey, Glenn
2009-01-01
Dual RF Astrodynamic GPS Orbital Navigator Satellite (DRAGONSat) will demonstrate autonomous rendezvous and docking (ARD) in low Earth orbit (LEO) and gather flight data with a global positioning system (GPS) receiver strictly designed for space applications. ARD is the capability of two independent spacecraft to rendezvous in orbit and dock without crew intervention. DRAGONSat consists of two picosatellites (one built by the University of Texas and one built by Texas A and M University) and the Space Shuttle Payload Launcher (SSPL); this project will ultimately demonstrate ARD in LEO.
Design of a wheeled articulating land rover
NASA Technical Reports Server (NTRS)
Stauffer, Larry; Dilorenzo, Mathew; Yandle, Barbara
1994-01-01
The WALRUS is a wheeled articulating land rover that will provide Ames Research Center with a reliable, autonomous vehicle for demonstrating and evaluating advanced technologies. The vehicle is one component of the Ames Research Center's on-going Human Exploration Demonstration Project. Ames Research Center requested a system capable of traversing a broad spectrum of surface types and obstacles. In addition, this vehicle must have an autonomous navigation and control system on board and its own source of power. The resulting design is a rover that articulates in two planes of motion to allow for increased mobility and stability. The rover is driven by six conical shaped aluminum wheels, each with an independent, internally coupled motor. Mounted on the rover are two housings and a removable remote control system. In the housings, the motor controller board, tilt sensor, navigation circuitry, and QED board are mounted. Finally, the rover's motors and electronics are powered by thirty C-cell rechargeable batteries, which are located in the rover wheels and recharged by a specially designed battery charger.
A Long Distance Laser Altimeter for Terrain Relative Navigation and Spacecraft Landing
NASA Technical Reports Server (NTRS)
Pierrottet, Diego F.; Amzajerdian, Farzin; Barnes, Bruce W.
2014-01-01
A high precision laser altimeter was developed under the Autonomous Landing and Hazard Avoidance (ALHAT) project at NASA Langley Research Center. The laser altimeter provides slant-path range measurements from operational ranges exceeding 30 km that will be used to support surface-relative state estimation and navigation during planetary descent and precision landing. The altimeter uses an advanced time-of-arrival receiver, which produces multiple signal-return range measurements from tens of kilometers with 5 cm precision. The transmitter is eye-safe, simplifying operations and testing on earth. The prototype is fully autonomous, and able to withstand the thermal and mechanical stresses experienced during test flights conducted aboard helicopters, fixed-wing aircraft, and Morpheus, a terrestrial rocket-powered vehicle developed by NASA Johnson Space Center. This paper provides an overview of the sensor and presents results obtained during recent field experiments including a helicopter flight test conducted in December 2012 and Morpheus flight tests conducted during March of 2014.
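As a rough consistency check (not stated in the paper): since the slant range follows R = c·Δt/2, the quoted 5 cm range precision corresponds to a time-of-arrival precision of Δt = 2·ΔR/c ≈ 2 × 0.05 m / (3 × 10^8 m/s) ≈ 0.33 ns, which indicates the timing resolution the receiver must resolve on each return.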
NASA Astrophysics Data System (ADS)
Tang, Chengpan; Hu, Xiaogong; Zhou, Shanshi; Liu, Li; Pan, Junyang; Chen, Liucheng; Guo, Rui; Zhu, Lingfeng; Hu, Guangming; Li, Xiaojie; He, Feng; Chang, Zhiqiao
2018-01-01
Autonomous orbit determination is the ability of navigation satellites to estimate the orbit parameters on-board using inter-satellite link (ISL) measurements. This study mainly focuses on data processing of the ISL measurements as a new measurement type and its application to the centralized autonomous orbit determination of the new-generation Beidou navigation satellite system satellites for the first time. The ISL measurements are dual one-way measurements that follow a time division multiple access (TDMA) structure. The ranging error of the ISL measurements is less than 0.25 ns. This paper proposes a derivation approach to the satellite clock offsets and the geometric distances from TDMA dual one-way measurements without a loss of accuracy. The derived clock offsets are used for time synchronization, and the derived geometric distances are used for autonomous orbit determination. The clock offsets from the ISL measurements are consistent with the L-band two-way satellite time and frequency transfer clock measurements, and the detrended residuals vary within 0.5 ns. The centralized autonomous orbit determination is conducted in a batch mode on a ground-capable server for the feasibility study. Constant hardware delays are present in the geometric distances and become the largest source of error in the autonomous orbit determination. Therefore, the hardware delays are estimated simultaneously with the satellite orbits. To avoid uncertainties in the constellation orientation, a ground anchor station that "observes" the satellites with on-board ISL payloads is introduced into the orbit determination. The root-mean-square values of orbit determination residuals are within 10.0 cm, and the standard deviation of the estimated ISL hardware delays is within 0.2 ns. The accuracy of the autonomous orbits is evaluated by analysis of overlap comparison and the satellite laser ranging (SLR) residuals and is compared with the accuracy of the L-band orbits. The results indicate that the radial overlap differences between the autonomous orbits are less than 15.0 cm for the inclined geosynchronous orbit (IGSO) satellites and less than 10.0 cm for the MEO satellites. The SLR residuals are approximately 15.0 cm for the IGSO satellites and approximately 10.0 cm for the MEO satellites, representing an improvement over the L-band orbits.
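A minimal sketch of how dual one-way ISL measurements separate clocks from geometry, under the usual assumptions (both pseudoranges already reduced to a common epoch; hardware delays left in the half-sum, consistent with their being estimated together with the orbits). Sign conventions and names are illustrative, not the authors' processing scheme.

```python
C = 299_792_458.0  # speed of light, m/s

def split_dual_one_way(rho_ab: float, rho_ba: float):
    """Combine dual one-way pseudoranges into clock and geometry terms.

    rho_ab : pseudorange measured at satellite A of the signal from B (m)
    rho_ba : pseudorange measured at satellite B of the signal from A (m)

    Returns (clock_A_minus_clock_B in seconds, geometric distance in metres).
    The half-sum still contains the sum of transmit/receive hardware delays,
    which is why those delays must be estimated with the orbit parameters.
    """
    clock_offset = (rho_ab - rho_ba) / (2.0 * C)   # half-difference -> clocks
    distance = (rho_ab + rho_ba) / 2.0             # half-sum -> geometry
    return clock_offset, distance
```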
Advanced Integration of WiFi and Inertial Navigation Systems for Indoor Mobile Positioning
NASA Astrophysics Data System (ADS)
Evennou, Frédéric; Marx, François
2006-12-01
This paper presents an aided dead-reckoning navigation structure and signal processing algorithms for self localization of an autonomous mobile device by fusing pedestrian dead reckoning and WiFi signal strength measurements. WiFi and inertial navigation systems (INS) are used for positioning and attitude determination in a wide range of applications. Over the last few years, a number of low-cost inertial sensors have become available. Although these sensors exhibit large errors, WiFi measurements can be used to correct the drift that would otherwise degrade navigation based on this technology. In turn, the INS sensors can aid the WiFi positioning system, as they provide high-accuracy real-time navigation. A structure based on a Kalman filter and a particle filter is proposed. It fuses the heterogeneous information coming from those two independent technologies. Finally, the benefits of the proposed architecture are evaluated and compared with the pure WiFi and INS positioning systems.
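The fusion idea can be illustrated with a bare-bones position-only Kalman filter: a sketch under simplifying assumptions, not the Kalman/particle-filter architecture of the paper, and with hypothetical noise values.

```python
import numpy as np

def kf_predict(x, P, pdr_step, q=0.05):
    """Dead-reckoning prediction: move the 2D position estimate by the
    pedestrian-dead-reckoning step vector (m) and inflate the covariance."""
    return x + pdr_step, P + q * np.eye(2)

def kf_update(x, P, wifi_fix, sigma_wifi=3.0):
    """Correct accumulated drift with a WiFi position fix (m)."""
    R = (sigma_wifi ** 2) * np.eye(2)   # WiFi fix covariance (assumed 3 m sigma)
    S = P + R                           # innovation covariance (H = identity)
    K = P @ np.linalg.inv(S)            # Kalman gain
    x = x + K @ (wifi_fix - x)
    P = (np.eye(2) - K) @ P
    return x, P
```

In use, the prediction runs at every detected step and the update runs whenever a WiFi fix arrives, so dead reckoning carries the solution between fixes while WiFi bounds its drift.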
Sextant X-Ray Pulsar Navigation Demonstration: Initial On-Orbit Results
NASA Technical Reports Server (NTRS)
Mitchell, Jason W.; Winternitz, Luke M.; Hassouneh, Munther A.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wolff, Michael T.; Kerr, Matthew; Wood, Kent S.;
2018-01-01
The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a technology demonstration enhancement to the Neutron-star Interior Composition Explorer (NICER) mission. SEXTANT will be a first demonstration of in-space, autonomous, X-ray pulsar navigation (XNAV), which navigates using millisecond X-ray pulsars and could provide a GPS-like navigation capability available throughout our Solar System and beyond. NICER is a NASA Astrophysics Explorer Mission of Opportunity to the International Space Station that was launched and installed in June of 2017. During NICER's nominal 18-month base mission, SEXTANT will perform a number of experiments to demonstrate XNAV and advance the technology on a number of fronts. In this work, we review SEXTANT and its goals, and present early results from SEXTANT experiments conducted in the first six months of operation. With these results, SEXTANT has made significant progress toward meeting its primary and secondary mission goals. We also describe the SEXTANT flight operations, calibration activities, and initial results.
Neural Network Based Sensory Fusion for Landmark Detection
NASA Technical Reports Server (NTRS)
Kumbla, Kishan -K.; Akbarzadeh, Mohammad R.
1997-01-01
NASA is planning to send numerous unmanned planetary missions to explore space. This requires autonomous robotic vehicles that can navigate in an unstructured, unknown, and uncertain environment. Landmark-based navigation is a new area of research that differs from traditional goal-oriented navigation, where a mobile robot starts from an initial point and reaches a destination in accordance with a pre-planned path. Landmark-based navigation has the advantage of allowing the robot to find its way without communication with the mission control station and without exact knowledge of its coordinates. Current landmark-navigation algorithms, however, pose several constraints. First, they require large memories to store the images. Second, the task of comparing the images using traditional methods is computationally intensive, and consequently real-time implementation is difficult. The method proposed here consists of three stages. The first stage utilizes a heuristic-based algorithm to identify significant objects. The second stage utilizes a neural network (NN) to efficiently classify images of the identified objects. The third stage combines distance information with the classification results of the neural networks for efficient and intelligent navigation.
Galileo: The Added Value for Integrity in Harsh Environments.
Borio, Daniele; Gioia, Ciro
2016-01-16
Global navigation satellite system (GNSS)-based navigation is a challenging task in signal-degraded environments, where GNSS signals are distorted by multipath and attenuated by fading effects: the navigation solution may be inaccurate or unavailable. A possible approach to improve accuracy and availability is the joint use of measurements from different GNSSs and quality check algorithms; this approach is investigated here using live GPS and Galileo signals. A modified receiver autonomous integrity monitoring (RAIM) algorithm, including geometry and separability checks, is proposed to detect and exclude erroneous measurements: the multi-constellation approach provides redundant measurements, and RAIM exploits them to exclude distorted observations. The synergy between combined GPS/Galileo navigation and RAIM is analyzed using live data; the performance is compared to the accuracy and availability of a GPS-only solution. The tests performed demonstrate that the methods developed are effective techniques for GNSS-based navigation in signal-degraded environments. The joint use of the multi-constellation approach and of modified RAIM algorithms improves the performance of the navigation system in terms of both accuracy and availability.
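For orientation, a sketch of the classical residual-based RAIM detection test that such modified algorithms build on; the geometry and separability checks of the paper are not reproduced, and the thresholds and function names below are illustrative only.

```python
import numpy as np
from scipy.stats import chi2

def raim_detect(residuals, sigma=1.0, n_states=4, p_fa=1e-3):
    """Global test: compare the sum of squared pseudorange residuals against a
    chi-square threshold with (n_measurements - n_states) degrees of freedom."""
    dof = len(residuals) - n_states
    if dof <= 0:
        return None                          # not enough redundancy to test
    test_stat = float(np.dot(residuals, residuals)) / sigma ** 2
    return test_stat > chi2.ppf(1.0 - p_fa, dof)

def raim_exclude_candidate(residuals):
    """Naive exclusion step: flag the measurement with the largest residual."""
    return int(np.argmax(np.abs(residuals)))
```

The multi-constellation benefit shows up in the degrees of freedom: more redundant measurements make the test both available more often and more sensitive to a single faulty observation.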
Galileo: The Added Value for Integrity in Harsh Environments
Borio, Daniele; Gioia, Ciro
2016-01-01
Global navigation satellite system (GNSS)-based navigation is a challenging task in signal-degraded environments, where GNSS signals are distorted by multipath and attenuated by fading effects: the navigation solution may be inaccurate or unavailable. A possible approach to improve accuracy and availability is the joint use of measurements from different GNSSs and quality check algorithms; this approach is investigated here using live GPS and Galileo signals. A modified receiver autonomous integrity monitoring (RAIM) algorithm, including geometry and separability checks, is proposed to detect and exclude erroneous measurements: the multi-constellation approach provides redundant measurements, and RAIM exploits them to exclude distorted observations. The synergy between combined GPS/Galileo navigation and RAIM is analyzed using live data; the performance is compared to the accuracy and availability of a GPS-only solution. The tests performed demonstrate that the methods developed are effective techniques for GNSS-based navigation in signal-degraded environments. The joint use of the multi-constellation approach and of modified RAIM algorithms improves the performance of the navigation system in terms of both accuracy and availability. PMID:26784205
NASA Technical Reports Server (NTRS)
Acton, C. H., Jr.; Ohtakay, H.
1975-01-01
Optical navigation uses spacecraft television pictures of a target body against a known star background in a process which relates the spacecraft trajectory to the target body. This technology was used in the Mariner-Venus-Mercury mission, with the optical data processed in near-real-time, simulating a mission critical environment. Optical data error sources were identified, and a star location error analysis was carried out. Several methods for selecting limb crossing coordinates were used, and a limb smear compensation was introduced. Omission of planetary aberration corrections was the source of large optical residuals.
A light-driven artificial flytrap
Wani, Owies M.; Zeng, Hao; Priimagi, Arri
2017-01-01
The sophistication, complexity and intelligence of biological systems is a continuous source of inspiration for mankind. Mimicking the natural intelligence to devise tiny systems that are capable of self-regulated, autonomous action to, for example, distinguish different targets, remains among the grand challenges in biomimetic micro-robotics. Herein, we demonstrate an autonomous soft device, a light-driven flytrap, that uses optical feedback to trigger photomechanical actuation. The design is based on light-responsive liquid-crystal elastomer, fabricated onto the tip of an optical fibre, which acts as a power source and serves as a contactless probe that senses the environment. Mimicking natural flytraps, this artificial flytrap is capable of autonomous closure and object recognition. It enables self-regulated actuation within the fibre-sized architecture, thus opening up avenues towards soft, autonomous small-scale devices. PMID:28534872
Crew-Aided Autonomous Navigation Project
NASA Technical Reports Server (NTRS)
Holt, Greg
2015-01-01
Manual capability to perform star/planet-limb sightings provides a cheap, simple, and robust backup navigation source for exploration missions independent from the ground. Sextant sightings from spacecraft were first exercised in Gemini and flew as the loss-of-communications backup for all Apollo missions. This study seeks to procure and characterize error sources of navigation-grade sextants for feasibility of taking star and planetary limb sightings from inside a spacecraft. A series of similar studies was performed in the early/mid-1960s in preparation for Apollo missions, and one goal of this study is to modernize and update those findings. This technique has the potential to deliver significant risk mitigation, validation, and backup to more complex low-TRL automated systems under development involving cameras.
Autonomous self-navigating drug-delivery vehicles: from science fiction to reality.
Petrenko, Valery A
2017-12-01
Low efficacy of targeted nanomedicines in biological experiments forced us to challenge the traditional concept of drug targeting and suggest a paradigm of 'addressed self-navigating drug-delivery vehicles,' in which affinity selection of targeting peptides and vasculature-directed in vivo phage screening is replaced by migration selection, which exploits the ability of 'promiscuous' phages and their proteins to migrate through the tumor-surrounding cellular barriers, using a 'hub and spoke' delivery strategy, and penetrate into the tumor, affecting the diverse tumor cell population. The 'self-navigating' drug-delivery paradigm can be used as a theoretical and technical platform in the design of a novel generation of molecular medications and imaging probes for precise and personal medicine.
NASA Technical Reports Server (NTRS)
Bell, Jerome A.; Stephens, Elaine; Barton, Gregg
1991-01-01
An overview is provided of the Space Exploration Initiative (SEI) concepts for telecommunications, information systems, and navigation (TISN), and engineering and architecture issues are discussed. The SEI program data system is reviewed to identify mission TISN interfaces, and reference TISN concepts are described for nominal, degraded, and mission-critical data services. The infrastructures reviewed include telecommunications for robotics support, autonomous navigation without earth-based support, and information networks for tracking and data acquisition. Four options for TISN support architectures are examined which relate to unique SEI exploration strategies. Detailed support estimates are given for: (1) a manned stay on Mars; (2) permanent lunar and Martian settlements; (3) short-duration missions; and (4) systematic exploration of the moon and Mars.
Beacons for supporting lunar landing navigation
NASA Astrophysics Data System (ADS)
Theil, Stephan; Bora, Leonardo
2017-03-01
Current and future planetary exploration missions involve a landing on the target celestial body. Almost all of these landing missions are currently relying on a combination of inertial and optical sensor measurements to determine the current flight state with respect to the target body and the desired landing site. As soon as an infrastructure at the landing site exists, the requirements as well as conditions change for vehicles landing close to this existing infrastructure. This paper investigates the options for ground-based infrastructure supporting the onboard navigation system and analyzes the impact on the achievable navigation accuracy. For that purpose, the paper starts with an existing navigation architecture based on optical navigation and extends it with measurements to support navigation with ground infrastructure. A scenario of lunar landing is simulated and the provided functions of the ground infrastructure as well as the location with respect to the landing site are evaluated. The results are analyzed and discussed.
Systems analysis for ground-based optical navigation
NASA Technical Reports Server (NTRS)
Null, G. W.; Owen, W. M., Jr.; Synnott, S. P.
1992-01-01
Deep-space telecommunications systems will eventually operate at visible or near-infrared wavelengths to provide increased information return from interplanetary spacecraft. This would require an onboard laser transponder in place of (or in addition to) the usual microwave transponder, as well as a network of ground-based and/or space-based optical observing stations. This article examines the navigation systems expected to meet these requirements. Special emphasis is given to optical astrometric (angular) measurements of stars, solar system target bodies, and (when available) laser-bearing spacecraft, since these observations can potentially provide the locations of both spacecraft and target bodies. The role of astrometry in the navigation system and the development options for astrometric observing systems are also discussed.
Single-Frequency GPS Relative Navigation in a High Ionosphere Orbital Environment
NASA Technical Reports Server (NTRS)
Conrad, Patrick R.; Naasz, Bo J.
2007-01-01
The Global Positioning System (GPS) provides a convenient source for space vehicle relative navigation measurements, especially for low Earth orbit formation flying and autonomous rendezvous mission concepts. For single-frequency GPS receivers, ionospheric path delay can be a significant error source if not properly mitigated. In particular, ionospheric effects are known to cause significant radial position error bias and add dramatically to relative state estimation error if the onboard navigation software does not force the use of measurements from common or shared GPS space vehicles. Results from GPS navigation simulations are presented for a pair of space vehicles flying in formation and using GPS pseudorange measurements to perform absolute and relative orbit determination. With careful measurement selection techniques, relative state estimation accuracy of less than 20 cm is shown with standard GPS pseudorange processing, and less than 10 cm with single-differenced pseudorange processing.
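A sketch of the common-satellite selection and single differencing that removes errors shared by the two receivers; the data layout and names are hypothetical and this is not the flight software.

```python
def single_differences(pr_chief, pr_deputy):
    """Form deputy-minus-chief pseudorange differences for satellites tracked by
    both vehicles; errors common to both receivers (satellite clock error and,
    over short baselines, much of the ionospheric delay) largely cancel.

    pr_chief, pr_deputy : dict mapping PRN -> pseudorange in metres
    """
    common = sorted(set(pr_chief) & set(pr_deputy))   # shared GPS SVs only
    return {prn: pr_deputy[prn] - pr_chief[prn] for prn in common}
```

Restricting the filter to these shared satellites is the "common or shared GPS space vehicles" selection the abstract refers to.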
Polarized skylight navigation.
Hamaoui, Moshe
2017-01-20
Vehicle state estimation is an essential prerequisite for navigation. The present approach seeks to use skylight polarization to facilitate state estimation under autonomous unconstrained flight conditions. Atmospheric scattering polarizes incident sunlight such that solar position is mathematically encoded in the resulting skylight polarization pattern. Indeed, several species of insects are able to sense skylight polarization and are believed to navigate polarimetrically. Sun-finding methodologies for polarized skylight navigation (PSN) have been proposed in the literature but typically rely on calibration updates to account for changing atmospheric conditions and/or are limited to 2D operation. To address this technology gap, a gradient-based PSN solution is developed based upon the Rayleigh sky model. The solution is validated in simulation, and effects of measurement error and changing atmospheric conditions are investigated. Finally, an experimental effort is described wherein polarimetric imagery is collected, ground-truth is established through independent imager-attitude measurement, the gradient-based PSN solution is applied, and results are analyzed.
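For reference, the single-scattering Rayleigh sky model that underlies such methods predicts the degree and direction of polarization from the sun-view scattering geometry. The sketch below shows the basic model only, not the paper's gradient-based estimator; d_max and the vector conventions are assumptions.

```python
import numpy as np

def rayleigh_dolp(view_dir, sun_dir, d_max=0.8):
    """Degree of linear polarization for unit view and sun direction vectors;
    maximal 90 degrees from the sun, zero toward and away from it."""
    cos_g = float(np.clip(np.dot(view_dir, sun_dir), -1.0, 1.0))
    return d_max * (1.0 - cos_g ** 2) / (1.0 + cos_g ** 2)

def rayleigh_evector(view_dir, sun_dir):
    """Polarization (E-vector) direction: perpendicular to the scattering plane
    spanned by the view and sun directions."""
    e = np.cross(view_dir, sun_dir)
    n = np.linalg.norm(e)
    return e / n if n > 0 else e
```

Because every E-vector lies perpendicular to the plane containing the sun, polarization measured over several sky directions constrains the sun vector, which is the geometric basis for sun-finding.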
3D Reconfigurable MPSoC for Unmanned Spacecraft Navigation
NASA Astrophysics Data System (ADS)
Dekoulis, George
2016-07-01
This paper describes the design of a new lightweight spacecraft navigation system for unmanned space missions. The system addresses the demands for more efficient autonomous navigation in the near-Earth environment or deep space. The proposed instrumentation is directly suitable for unmanned systems operation and testing of new airborne prototypes for remote sensing applications. The system features a new sensor technology and significant improvements over existing solutions. Fluxgate type sensors have been traditionally used in unmanned defense systems such as target drones, guided missiles, rockets and satellites, however, the guidance sensors' configurations exhibit lower specifications than the presented solution. The current implementation is based on a recently developed material in a reengineered optimum sensor configuration for unprecedented low-power consumption. The new sensor's performance characteristics qualify it for spacecraft navigation applications. A major advantage of the system is the efficiency in redundancy reduction achieved in terms of both hardware and software requirements.
An Outdoor Navigation Platform with a 3D Scanner and Gyro-assisted Odometry
NASA Astrophysics Data System (ADS)
Yoshida, Tomoaki; Irie, Kiyoshi; Koyanagi, Eiji; Tomono, Masahiro
This paper proposes a light-weight navigation platform that consists of gyro-assisted odometry, a 3D laser scanner and map-based localization for human-scale robots. The gyro-assisted odometry provides highly accurate positioning only by dead-reckoning. The 3D laser scanner has a wide field of view and uniform measuring-point distribution. The map-based localization is robust and computationally inexpensive by utilizing a particle filter on a 2D grid map generated by projecting 3D points on to the ground. The system uses small and low-cost sensors, and can be applied to a variety of mobile robots in human-scale environments. Outdoor navigation experiments were conducted at the Tsukuba Challenge held in 2009 and 2010, which is an open proving ground for human-scale robots. Our robot successfully navigated the assigned 1-km courses in a fully autonomous mode multiple times.
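The gyro-assisted odometry can be summarized as a dead-reckoning step in which wheel encoders supply distance and the gyro supplies heading change. This is a simplified sketch, not the authors' implementation; the midpoint-heading choice is an assumption.

```python
import math

def dead_reckon_step(x, y, heading, d_dist, d_heading_gyro):
    """Advance the 2D pose: distance from wheel odometry, heading change from
    the gyro (more accurate than differential wheel speeds); integrating at the
    midpoint heading reduces the error accumulated along an arc."""
    heading_mid = heading + 0.5 * d_heading_gyro
    x += d_dist * math.cos(heading_mid)
    y += d_dist * math.sin(heading_mid)
    return x, y, heading + d_heading_gyro
```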
Slime mold uses an externalized spatial “memory” to navigate in complex environments
Reid, Chris R.; Latty, Tanya; Dussutour, Audrey; Beekman, Madeleine
2012-01-01
Spatial memory enhances an organism’s navigational ability. Memory typically resides within the brain, but what if an organism has no brain? We show that the brainless slime mold Physarum polycephalum constructs a form of spatial memory by avoiding areas it has previously explored. This mechanism allows the slime mold to solve the U-shaped trap problem—a classic test of autonomous navigational ability commonly used in robotics—requiring the slime mold to reach a chemoattractive goal behind a U-shaped barrier. Drawn into the trap, the organism must rely on other methods than gradient-following to escape and reach the goal. Our data show that spatial memory enhances the organism’s ability to navigate in complex environments. We provide a unique demonstration of a spatial memory system in a nonneuronal organism, supporting the theory that an externalized spatial memory may be the functional precursor to the internal memory of higher organisms. PMID:23045640
Slime mold uses an externalized spatial "memory" to navigate in complex environments.
Reid, Chris R; Latty, Tanya; Dussutour, Audrey; Beekman, Madeleine
2012-10-23
Spatial memory enhances an organism's navigational ability. Memory typically resides within the brain, but what if an organism has no brain? We show that the brainless slime mold Physarum polycephalum constructs a form of spatial memory by avoiding areas it has previously explored. This mechanism allows the slime mold to solve the U-shaped trap problem--a classic test of autonomous navigational ability commonly used in robotics--requiring the slime mold to reach a chemoattractive goal behind a U-shaped barrier. Drawn into the trap, the organism must rely on other methods than gradient-following to escape and reach the goal. Our data show that spatial memory enhances the organism's ability to navigate in complex environments. We provide a unique demonstration of a spatial memory system in a nonneuronal organism, supporting the theory that an externalized spatial memory may be the functional precursor to the internal memory of higher organisms.
Meng, Zhijun; Yang, Jun; Guo, Xiye; Zhou, Yongbin
2017-01-01
Global Navigation Satellite System performance can be significantly enhanced by introducing inter-satellite links (ISLs) in a navigation constellation. The improvement in position, velocity, and time accuracy, as well as the realization of autonomous functions, requires ISL distance measurement data as the original input. To build a high-performance ISL, ranging consistency among navigation satellites is an urgent problem to be solved. In this study, we focus on the variation in the ranging delay caused by the sensitivity of the ISL payload equipment to the ambient temperature in space and propose a simple and low-power temperature-sensing ranging compensation sensor suitable for onboard equipment. The experimental results show that, after the temperature-sensing ranging compensation of the ISL payload equipment, the ranging consistency becomes less than 0.2 ns when the temperature change is 90 °C. PMID:28608809
Navigation system for a mobile robot with a visual sensor using a fish-eye lens
NASA Astrophysics Data System (ADS)
Kurata, Junichi; Grattan, Kenneth T. V.; Uchiyama, Hironobu
1998-02-01
Various position sensing and navigation systems have been proposed for the autonomous control of mobile robots. Some of these systems have been installed with an omnidirectional visual sensor system that proved very useful in obtaining information on the environment around the mobile robot for position reckoning. In this article, this type of navigation system is discussed. The sensor is composed of one TV camera with a fish-eye lens, using a reference target on a ceiling and hybrid image processing circuits. The position of the robot, with respect to the floor, is calculated by integrating the information obtained from a visual sensor and a gyroscope mounted in the mobile robot, and the use of a simple algorithm based on PTP control for guidance is discussed. An experimental trial showed that the proposed system was both valid and useful for the navigation of an indoor vehicle.
NASA Astrophysics Data System (ADS)
Yang, Zhixiao; Ito, Kazuyuki; Saijo, Kazuhiko; Hirotsune, Kazuyuki; Gofuku, Akio; Matsuno, Fumitoshi
This paper aims at constructing an efficient interface, similar to those widely used in daily life, to meet the needs of the many volunteer rescuers who operate rescue robots at large-scale disaster sites. The developed system includes a force-feedback steering-wheel interface and an artificial neural network (ANN) based mouse-screen interface. The former consists of a force-feedback steering control and a wall of six monitors; it provides manual, car-like driving operation for navigating a rescue robot. The latter consists of a mouse and a camera view displayed on a monitor; it provides semi-autonomous operation, navigating the rescue robot by mouse clicks. Experiments show that a novice volunteer can skillfully navigate a tank-type rescue robot through either interface after 20 to 30 minutes of learning its operation. The steering-wheel interface achieves high navigation speed in open areas, without restriction by the terrain and surface conditions of a disaster site. The mouse-screen interface is suited to precise navigation in complex structures while placing little strain on operators. The two interfaces are designed so that the operator can switch between them at any time, providing a combined, efficient navigation method.
Natural Models for Autonomous Control of Spatial Navigation, Sensing, and Guidance
2013-06-26
... a popular web comic on the mantis shrimp, inspired largely by our efforts, can be found at The Oatmeal: http://theoatmeal.com/comics/mantis_shrimp ... Ecology and Environmental Education (20-21 January, Tainan, Taiwan) ... M. How and N. J. Marshall (2012), Polarisation vision, an unexplored channel for ...
Cancellation of the Army’s Autonomous Navigation System
2012-08-02
Auto/truck vehicle leader/follower and road-following programs, including the Google Driverless Vehicle (source: GAO presentation of data from Red...), both of which are estimated to cost over $300,000 per system. However, Google's Driverless Vehicle and the Southwest Research Institute's Mobile ...
Natural Models for Autonomous Control of Spatial Navigation, Sensing, and Guidance, Part 1
2011-06-16
... polarization sensitivity ten times more acute than previously documented in cephalopods (cuttlefish and octopus) and crustaceans (mantis shrimps and crabs) ... responded to perceived looming stimuli in a number of ways, including swimming movements and changes in skin colour and texture, which were recorded using ...