Sample records for UGV autonomous navigation

  1. PointCom: semi-autonomous UGV control with intuitive interface

    NASA Astrophysics Data System (ADS)

    Rohde, Mitchell M.; Perlin, Victor E.; Iagnemma, Karl D.; Lupa, Robert M.; Rohde, Steven M.; Overholt, James; Fiorani, Graham

    2008-04-01

    Unmanned ground vehicles (UGVs) will play an important role in the nation's next-generation ground force. Advances in sensing, control, and computing have enabled a new generation of technologies that bridge the gap between manual UGV teleoperation and full autonomy. In this paper, we present current research on a unique command and control system for UGVs named PointCom (Point-and-Go Command). PointCom is a semi-autonomous command system for one or multiple UGVs. The system, when complete, will be easy to operate and will enable significant reduction in operator workload by utilizing an intuitive image-based control framework for UGV navigation and allowing a single operator to command multiple UGVs. The project leverages new image processing algorithms for monocular visual servoing and odometry to yield a unique, high-performance fused navigation system. Human Computer Interface (HCI) techniques from the entertainment software industry are being used to develop video-game style interfaces that require little training and build upon the navigation capabilities. By combining an advanced navigation system with an intuitive interface, a semi-autonomous control and navigation system is being created that is robust, user friendly, and less burdensome than many current generation systems.

  2. UGV navigation in wireless sensor and actuator network environments

    NASA Astrophysics Data System (ADS)

    Zhang, Guyu; Li, Jianfeng; Duncan, Christian A.; Kanno, Jinko; Selmic, Rastko R.

    2012-06-01

    We consider a navigation problem in a distributed, self-organized and coordinate-free Wireless Sensor and Actuator Network (WSAN). We first present navigation algorithms that are verified using simulation results. Considering more than one destination and multiple mobile Unmanned Ground Vehicles (UGVs), we introduce a distributed solution to the Multi-UGV, Multi-Destination navigation problem. The objective of the solution to this problem is to efficiently allocate UGVs to different destinations and carry out navigation in the network environment that minimizes total travel distance. The main contribution of this paper is to develop a solution that does not attempt to localize either the UGVs or the sensor and actuator nodes. Other than some connectivity assumptions about the communication graph, we consider that no prior information about the WSAN is available. The solution presented here is distributed, and the UGV navigation is solely based on feedback from neighboring sensor and actuator nodes. One special case discussed in the paper, the Single-UGV, Multi-Destination navigation problem, is essentially equivalent to the well-known and difficult Traveling Salesman Problem (TSP). Simulation results are presented that illustrate the navigation distance traveled through the network. We also introduce an experimental testbed for the realization of coordinate-free and localization-free UGV navigation. We use the Cricket platform as the sensor and actuator network and a Pioneer 3-DX robot as the UGV. The experiments illustrate the UGV navigation in a coordinate-free WSAN environment where the UGV successfully arrives at the assigned destinations.
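
    The single-UGV special case above is likened to the TSP, and the multi-UGV objective is to minimize total travel distance. As a rough, hedged illustration only, the Python sketch below solves the allocation with a centralized greedy rule over known coordinates; the paper's actual contribution is a distributed, localization-free solution, so every coordinate and rule here is an assumption made just for the demo.

      # Hypothetical, centralized illustration of the Multi-UGV, Multi-Destination
      # allocation objective (minimize total travel distance). The paper's solution
      # is distributed and localization-free; coordinates are assumed known here
      # purely for illustration.
      import math

      def dist(a, b):
          return math.hypot(a[0] - b[0], a[1] - b[1])

      def greedy_allocate(ugvs, destinations):
          """Assign each destination to the UGV whose current tour end is closest."""
          tours = {i: [pos] for i, pos in enumerate(ugvs)}  # each tour starts at a UGV position
          for d in destinations:
              best = min(tours, key=lambda i: dist(tours[i][-1], d))
              tours[best].append(d)
          return tours

      def tour_length(tour):
          return sum(dist(tour[i], tour[i + 1]) for i in range(len(tour) - 1))

      ugvs = [(0.0, 0.0), (10.0, 0.0)]
      dests = [(2.0, 3.0), (9.0, 4.0), (5.0, 8.0)]
      tours = greedy_allocate(ugvs, dests)
      for i, t in tours.items():
          print("UGV", i, "tour", t, "length", round(tour_length(t), 2))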

  3. Landmark-based robust navigation for tactical UGV control in GPS-denied communication-degraded environments

    NASA Astrophysics Data System (ADS)

    Endo, Yoichiro; Balloch, Jonathan C.; Grushin, Alexander; Lee, Mun Wai; Handelman, David

    2016-05-01

    Control of current tactical unmanned ground vehicles (UGVs) is typically accomplished through two alternative modes of operation, namely, low-level manual control using joysticks and high-level planning-based autonomous control. Each mode has its own merits as well as inherent mission-critical disadvantages. Low-level joystick control is vulnerable to communication delay and degradation, and high-level navigation often depends on uninterrupted GPS signals and/or energy-emissive (non-stealth) range sensors such as LIDAR for localization and mapping. To address these problems, we have developed a mid-level control technique where the operator semi-autonomously drives the robot relative to visible landmarks that are commonly recognizable by both humans and machines such as closed contours and structured lines. Our novel solution relies solely on optical and non-optical passive sensors and can be operated under GPS-denied, communication-degraded environments. To control the robot using these landmarks, we developed an interactive graphical user interface (GUI) that allows the operator to select landmarks in the robot's view and direct the robot relative to one or more of the landmarks. The integrated UGV control system was evaluated based on its ability to robustly navigate through indoor environments. The system was successfully field tested with QinetiQ North America's TALON UGV and Tactical Robot Controller (TRC), a ruggedized operator control unit (OCU). We found that the proposed system is indeed robust against communication delay and degradation, and provides the operator with steady and reliable control of the UGV in realistic tactical scenarios.
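
    The mid-level control described above drives the robot relative to operator-selected landmarks. The short Python sketch below is only a generic illustration of that idea, not the authors' controller: it turns a landmark's pixel column into an approximate bearing and issues a proportional steering command; the image width, field of view, gain, and speed are all assumed values.

      import math

      def bearing_from_pixel(u, image_width, hfov_rad):
          """Approximate bearing (rad) of a landmark centered at pixel column u."""
          return ((u - image_width / 2.0) / (image_width / 2.0)) * (hfov_rad / 2.0)

      def steer_toward_landmark(u, image_width=640, hfov_rad=math.radians(60),
                                k_turn=1.0, forward_speed=0.3):
          """Proportional heading controller: turn to center the landmark, drive forward."""
          bearing = bearing_from_pixel(u, image_width, hfov_rad)
          return forward_speed, -k_turn * bearing  # (linear velocity m/s, angular velocity rad/s)

      v, w = steer_toward_landmark(u=480)  # landmark to the right of image center
      print("v =", v, "m/s, w =", round(w, 3), "rad/s")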

  4. A Navigation and Decision Making Architecture for Unmanned Ground Vehicles: Implementation and Results with the Raptor UGV

    DTIC Science & Technology

    2007-12-01

    A Navigation and Decision Making Architecture for Unmanned Ground Vehicles: Implementation and Results with the Raptor UGV. J. Giesbrecht, J. Collier, G. Broten, S. Monckton, and D. Mackay, Defence R&D Canada...route following, obstacle avoidance, route planning, and decision-making modules. This report presents details concerning the

  5. Stereo-vision-based terrain mapping for off-road autonomous navigation

    NASA Astrophysics Data System (ADS)

    Rankin, Arturo L.; Huertas, Andres; Matthies, Larry H.

    2009-05-01

    Successful off-road autonomous navigation by an unmanned ground vehicle (UGV) requires reliable perception and representation of natural terrain. While perception algorithms are used to detect driving hazards, terrain mapping algorithms are used to represent the detected hazards in a world model a UGV can use to plan safe paths. There are two primary ways to detect driving hazards with perception sensors mounted to a UGV: binary obstacle detection and traversability cost analysis. Binary obstacle detectors label terrain as either traversable or non-traversable, whereas traversability cost analysis assigns a cost to driving over a discrete patch of terrain. In uncluttered environments where the non-obstacle terrain is equally traversable, binary obstacle detection is sufficient. However, in cluttered environments, some form of traversability cost analysis is necessary. The Jet Propulsion Laboratory (JPL) has explored both approaches using stereo vision systems. A set of binary detectors has been implemented that detect positive obstacles, negative obstacles, tree trunks, tree lines, excessive slope, low overhangs, and water bodies. A compact terrain map is built from each frame of stereo images. The mapping algorithm labels cells that contain obstacles as no-go regions, and encodes terrain elevation, terrain classification, terrain roughness, traversability cost, and a confidence value. The single frame maps are merged into a world map where temporal filtering is applied. In previous papers, we have described our perception algorithms that perform binary obstacle detection. In this paper, we summarize the terrain mapping capabilities that JPL has implemented during several UGV programs over the last decade and discuss some challenges to building terrain maps with stereo range data.
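
    The mapping paragraph above lists what each map cell encodes (elevation, classification, roughness, traversability cost, confidence) and notes that single-frame maps are merged into a world map with temporal filtering. The Python sketch below is a hypothetical cell record and merge rule in that spirit; the field names and the confidence-weighted blend are assumptions, not JPL's implementation.

      from dataclasses import dataclass

      @dataclass
      class TerrainCell:
          elevation: float = 0.0          # meters
          terrain_class: str = "unknown"  # e.g. "soil", "vegetation", "water"
          roughness: float = 0.0
          cost: float = 0.0               # traversability cost; no-go if is_obstacle
          confidence: float = 0.0         # 0..1
          is_obstacle: bool = False

      def merge_cell(world: TerrainCell, frame: TerrainCell) -> TerrainCell:
          """Temporal filtering: confidence-weighted blend; obstacles latch as no-go."""
          w_total = world.confidence + frame.confidence
          if w_total == 0.0:
              return frame
          blend = lambda a, b: (a * world.confidence + b * frame.confidence) / w_total
          return TerrainCell(
              elevation=blend(world.elevation, frame.elevation),
              terrain_class=frame.terrain_class if frame.confidence > world.confidence
                            else world.terrain_class,
              roughness=blend(world.roughness, frame.roughness),
              cost=blend(world.cost, frame.cost),
              confidence=min(1.0, w_total),
              is_obstacle=world.is_obstacle or frame.is_obstacle,
          )

      a = TerrainCell(elevation=1.0, cost=0.2, confidence=0.5)
      b = TerrainCell(elevation=1.2, cost=0.4, confidence=0.5, terrain_class="soil")
      print(merge_cell(a, b))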

  6. Stereo Vision Based Terrain Mapping for Off-Road Autonomous Navigation

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo L.; Huertas, Andres; Matthies, Larry H.

    2009-01-01

    Successful off-road autonomous navigation by an unmanned ground vehicle (UGV) requires reliable perception and representation of natural terrain. While perception algorithms are used to detect driving hazards, terrain mapping algorithms are used to represent the detected hazards in a world model a UGV can use to plan safe paths. There are two primary ways to detect driving hazards with perception sensors mounted to a UGV: binary obstacle detection and traversability cost analysis. Binary obstacle detectors label terrain as either traversable or non-traversable, whereas, traversability cost analysis assigns a cost to driving over a discrete patch of terrain. In uncluttered environments where the non-obstacle terrain is equally traversable, binary obstacle detection is sufficient. However, in cluttered environments, some form of traversability cost analysis is necessary. The Jet Propulsion Laboratory (JPL) has explored both approaches using stereo vision systems. A set of binary detectors has been implemented that detect positive obstacles, negative obstacles, tree trunks, tree lines, excessive slope, low overhangs, and water bodies. A compact terrain map is built from each frame of stereo images. The mapping algorithm labels cells that contain obstacles as no-go regions, and encodes terrain elevation, terrain classification, terrain roughness, traversability cost, and a confidence value. The single frame maps are merged into a world map where temporal filtering is applied. In previous papers, we have described our perception algorithms that perform binary obstacle detection. In this paper, we summarize the terrain mapping capabilities that JPL has implemented during several UGV programs over the last decade and discuss some challenges to building terrain maps with stereo range data.

  7. INL Autonomous Navigation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2005-03-30

    The INL Autonomous Navigation System provides instructions for autonomously navigating a robot. The system permits high-speed autonomous navigation including obstacle avoidance, waypoint navigation and path planning in both indoor and outdoor environments.

  8. Detecting personnel around UGVs using stereo vision

    NASA Astrophysics Data System (ADS)

    Bajracharya, Max; Moghaddam, Baback; Howard, Andrew; Matthies, Larry H.

    2008-04-01

    Detecting people around unmanned ground vehicles (UGVs) to facilitate safe operation of UGVs is one of the highest priority issues in the development of perception technology for autonomous navigation. Research to date has not achieved the detection ranges or reliability needed in deployed systems to detect upright pedestrians in flat, relatively uncluttered terrain, let alone in more complex environments and with people in postures that are more difficult to detect. Range data is essential to solve this problem. Combining range data with high resolution imagery may enable higher performance than range data alone because image appearance can complement shape information in range data and because cameras may offer higher angular resolution than typical range sensors. This makes stereo vision a promising approach for several reasons: image resolution is high and will continue to increase, the physical size and power dissipation of the cameras and computers will continue to decrease, and stereo cameras provide range data and imagery that are automatically spatially and temporally registered. We describe a stereo vision-based pedestrian detection system, focusing on recent improvements to a shape-based classifier applied to the range data, and present frame-level performance results that show great promise for the overall approach.

  9. Autonomous Navigation Using Celestial Objects

    NASA Technical Reports Server (NTRS)

    Folta, David; Gramling, Cheryl; Leung, Dominic; Belur, Sheela; Long, Anne

    1999-01-01

    In the twenty-first century, National Aeronautics and Space Administration (NASA) Enterprises envision frequent low-cost missions to explore the solar system, observe the universe, and study our planet. Satellite autonomy is a key technology required to reduce satellite operating costs. The Guidance, Navigation, and Control Center (GNCC) at the Goddard Space Flight Center (GSFC) currently sponsors several initiatives associated with the development of advanced spacecraft systems to provide autonomous navigation and control. Autonomous navigation has the potential both to increase spacecraft navigation system performance and to reduce total mission cost. By eliminating the need for routine ground-based orbit determination and special tracking services, autonomous navigation can streamline spacecraft ground systems. Autonomous navigation products can be included in the science telemetry and forwarded directly to the scientific investigators. In addition, autonomous navigation products are available onboard to enable other autonomous capabilities, such as attitude control, maneuver planning and orbit control, and communications signal acquisition. Autonomous navigation is required to support advanced mission concepts such as satellite formation flying. GNCC has successfully developed high-accuracy autonomous navigation systems for near-Earth spacecraft using NASA's space and ground communications systems and the Global Positioning System (GPS). Recently, GNCC has expanded its autonomous navigation initiative to include satellite orbits that are beyond the regime in which use of GPS is possible. Currently, GNCC is assessing the feasibility of using standard spacecraft attitude sensors and communication components to provide autonomous navigation for missions including: libration point, gravity assist, high-Earth, and interplanetary orbits. The concept being evaluated uses a combination of star, Sun, and Earth sensor measurements along with forward-link Doppler

  10. Soldier experiments and assessments using SPEAR speech control system for UGVs

    NASA Astrophysics Data System (ADS)

    Brown, Jonathan; Blanco, Chris; Czerniak, Jeffrey; Hoffman, Brian; Hoffman, Orin; Juneja, Amit; Ngia, Lester; Pruthi, Tarun; Liu, Dongqing

    2010-04-01

    This paper reports on a Soldier Experiment performed by the Army Research Lab's Human Research Engineering Directorate (HRED) Field Element located at the Maneuver Center of Excellence, Ft. Benning, and a Limited Use Assessment conducted by the Marine Corps Forces Pacific Command Experimentation Center (MEC) at Camp Pendleton evaluating the effectiveness of using speech commands to control an Unmanned Ground Vehicle. SPEAR, developed by Think-A-Move, Ltd., provides speech control of UGVs. SPEAR detects user speech in the ear canal with an earpiece containing an in-ear microphone. The system design provides up to 30 dB of passive noise reduction, enabling it to work well in high-noise environments, where traditional speech systems, using external microphones, fail; it also utilizes a proprietary speech recognition engine. SPEAR has been integrated with iRobot's PackBot 510 with FasTac Kit, and with Multi-Robot Operator Control Unit (MOCU), developed by SPAWAR Systems Center Pacific. These integrated systems allow speech to supplement the hand-controller for multi-modal control of different UGV functions simultaneously. HRED's experiment measured the impact of SPEAR on reducing the cognitive load placed on UGV Operators and the time to complete specific tasks. Army NCOs and Officer School Candidates participated in this experiment, which found that speech control was faster than manual control to complete tasks requiring menu navigation, as well as reducing the cognitive load on UGV Operators. The MEC assessment examined speech commands used for two different missions: Route Clearance and Cordon and Search; participants included Explosive Ordnance Disposal Technicians and Combat Engineers. The majority of the Marines thought it was easier to complete the mission scenarios with SPEAR than with only using manual controls, and that using SPEAR improved their situational awareness. Overall results of these Assessments are reported in the paper, along with possible

  11. Team VaCAS Design and Development of Cooperative UGV System

    DTIC Science & Technology

    2011-02-04

    Mapping (SLAM) [24]. Similar to such work, the technique to be used in the project will also (1) use the last reliably available data as the reference...Losada, D., Matia, F., Pedraza, L., Jimenez, A. and Galan, R., Consistency of SLAM-EKF Algorithms for Indoor Environments, Journal of Intelligent and...mounted on the UGV 1 include GPS for outdoor navigation, LiDAR for obstacle avoidance and mapping, and camera for OOI detection and localization. UGVs 2

  12. Compact autonomous navigation system (CANS)

    NASA Astrophysics Data System (ADS)

    Hao, Y. C.; Ying, L.; Xiong, K.; Cheng, H. Y.; Qiao, G. D.

    2017-11-01

    Autonomous navigation of satellites and constellations has a series of benefits, such as reducing operation cost and ground station workload, avoiding dependence on ground support in the event of war or natural disaster, increasing spacecraft autonomy, and so on. An autonomous navigation satellite is independent of ground station support. Many systems have been developed for autonomous navigation of satellites in the past 20 years. Among them, the American MANS (Microcosm Autonomous Navigation System) [1] of Microcosm Inc. and ERADS [2] [3] (Earth Reference Attitude Determination System) of Honeywell Inc. are well known. These systems anticipate a series of good features of autonomous navigation and aim at low cost, integrated structure, low power consumption and compact layout. The ERADS is an integrated small 3-axis attitude sensor system with low cost and small volume. It has an Earth-center measurement accuracy higher than that of a common IR sensor because the detected ultraviolet radiation zone of the atmosphere has a brightness gradient larger than that of the IR zone. But the ERADS is still a complex system because it has to eliminate many problems such as making of the sapphire sphere lens, birefringence effect of sapphire, high precision image transfer optical fiber flattener, ultraviolet intensifier noise, and so on. The marginal sphere FOV of the sphere lens of the ERADS is used for star imaging, which may bring some disadvantages, i.e., the image energy and attitude measurement accuracy may be reduced due to the tilted image acceptance end of the fiber flattener in the FOV. Besides, Japan, Germany and Russia developed visible Earth sensors for GEO [4] [5]. Do we have a way to develop a cheaper/easier and more accurate autonomous navigation system that can be used for all LEO spacecraft, especially for LEO small and micro satellites? To address this problem we provide a new type of system—CANS (Compact Autonomous Navigation System) [6].

  13. Applicability of Deep-Learning Technology for Relative Object-Based Navigation

    DTIC Science & Technology

    2017-09-01

    possible selections for navigating an unmanned ground vehicle (UGV) is through real-time visual odometry. To navigate in such an environment, the UGV needs to be able to detect, identify, and relate the static

  14. Terrain Navigation Concepts for Autonomous Vehicles,

    DTIC Science & Technology

    1984-06-01

    AD-A144 994 TERRAIN NAVIGATION CONCEPTS FOR AUTONOMOUS VEHICLES (U) ARMY ENGINEER TOPOGRAPHIC LABS FORT BELVOIR VA R D LEIGHTY JUN 84 ETL-R@65...FUNCTIONS The pacing problem for developing autonomous vehicles that can efficiently move to designated locations in the real world in the performance...autonomous functions can serve as general terrain navigation requirements for our discussion of autonomous vehicles. LEIGHTY Can we build a vehicular system

  15. Semi-autonomous unmanned ground vehicle control system

    NASA Astrophysics Data System (ADS)

    Anderson, Jonathan; Lee, Dah-Jye; Schoenberger, Robert; Wei, Zhaoyi; Archibald, James

    2006-05-01

    Unmanned Ground Vehicles (UGVs) have advantages over people in a number of different applications, ranging from sentry duty and scouting hazardous areas to convoying goods and supplies over long distances and exploring caves and tunnels. Despite recent advances in electronics, vision, artificial intelligence, and control technologies, fully autonomous UGVs are still far from being a reality. Currently, most UGVs are fielded using tele-operation with a human in the control loop. Using tele-operation, a user controls the UGV from the relative safety and comfort of a control station and sends commands to the UGV remotely. It is difficult for the user to issue higher-level commands such as "patrol this corridor" or "move to this position while avoiding obstacles." As computer vision algorithms are implemented in hardware, the UGV can easily become partially autonomous. As Field Programmable Gate Arrays (FPGAs) become larger and more powerful, vision algorithms can run at frame rate. With the rapid development of CMOS imagers for consumer electronics, frame rate can reach as high as 200 frames per second with a small region of interest. This increase in the speed of vision algorithm processing allows the UGVs to become more autonomous, as they are able to recognize and avoid obstacles in their path, track targets, or move to a recognized area. The user is able to focus on giving broad supervisory commands and goals to the UGVs, allowing the user to control multiple UGVs at once while still maintaining the convenience of working from a central base station. In this paper, we will describe a novel control system for the control of semi-autonomous UGVs. This control system combines a user interface similar to a simple tele-operation station along with a control package, including the FPGA and multiple cameras. The control package interfaces with the UGV and provides the necessary control to guide the UGV.

  16. Autonomous Deep-Space Optical Navigation Project

    NASA Technical Reports Server (NTRS)

    D'Souza, Christopher

    2014-01-01

    This project will advance the Autonomous Deep-space navigation capability applied to the Autonomous Rendezvous and Docking (AR&D) Guidance, Navigation and Control (GNC) system by testing it on hardware, particularly in a flight processor, with a goal of limited testing in the Integrated Power, Avionics and Software (IPAS) with the ARCM (Asteroid Retrieval Crewed Mission) DRO (Distant Retrograde Orbit) Autonomous Rendezvous and Docking (AR&D) scenario. The technology that will be harnessed is called 'optical flow', also known as 'visual odometry'. It is being matured in automotive and SLAM (Simultaneous Localization and Mapping) applications but has yet to be applied to spacecraft navigation. In light of the tremendous potential of this technique, we believe that NASA needs to design an optical navigation architecture that will use it. The approach is flexible enough to be applicable to navigating around planetary bodies, such as asteroids.

  17. LABRADOR: a learning autonomous behavior-based robot for adaptive detection and object retrieval

    NASA Astrophysics Data System (ADS)

    Yamauchi, Brian; Moseley, Mark; Brookshire, Jonathan

    2013-01-01

    As part of the TARDEC-funded CANINE (Cooperative Autonomous Navigation in a Networked Environment) Program, iRobot developed LABRADOR (Learning Autonomous Behavior-based Robot for Adaptive Detection and Object Retrieval). LABRADOR was based on the rugged, man-portable, iRobot PackBot unmanned ground vehicle (UGV) equipped with an explosives ordnance disposal (EOD) manipulator arm and a custom gripper. For LABRADOR, we developed a vision-based object learning and recognition system that combined a TLD (track-learn-detect) filter based on object shape features with a color-histogram-based object detector. Our vision system was able to learn in real-time to recognize objects presented to the robot. We also implemented a waypoint navigation system based on fused GPS, IMU (inertial measurement unit), and odometry data. We used this navigation capability to implement autonomous behaviors capable of searching a specified area using a variety of robust coverage strategies - including outward spiral, random bounce, random waypoint, and perimeter following behaviors. While the full system was not integrated in time to compete in the CANINE competition event, we developed useful perception, navigation, and behavior capabilities that may be applied to future autonomous robot systems.

  18. Autonomous Relative Navigation for Formation-Flying Satellites Using GPS

    NASA Technical Reports Server (NTRS)

    Gramling, Cheryl; Carpenter, J. Russell; Long, Anne; Kelbel, David; Lee, Taesul

    2000-01-01

    The Goddard Space Flight Center is currently developing advanced spacecraft systems to provide autonomous navigation and control of formation flyers. This paper discusses autonomous relative navigation performance for a formation of four eccentric, medium-altitude Earth-orbiting satellites using Global Positioning System (GPS) Standard Positioning Service (SPS) and "GPS-like" intersatellite measurements. The performance of several candidate relative navigation approaches is evaluated. These analyses indicate that an autonomous relative navigation position accuracy of 1 meter root-mean-square can be achieved by differencing high-accuracy filtered solutions if only measurements from common GPS space vehicles are used in the independently estimated solutions.
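
    The result above comes from differencing two independently filtered absolute solutions (restricted to common GPS space vehicles) to form the relative state. The Python sketch below only illustrates the differencing-and-RMS bookkeeping on made-up data; the filters themselves, and the common-SV restriction that makes the errors cancel, are outside its scope.

      import numpy as np

      def relative_solution(pos_a, pos_b):
          """Relative position time series from two absolute filtered solutions."""
          return np.asarray(pos_b) - np.asarray(pos_a)

      def rms_error(estimate, truth):
          """Root-mean-square 3D position error over all epochs."""
          err = np.linalg.norm(np.asarray(estimate) - np.asarray(truth), axis=1)
          return float(np.sqrt(np.mean(err ** 2)))

      rng = np.random.default_rng(0)
      truth_rel = np.tile([100.0, 0.0, 0.0], (1000, 1))         # 100 m along-track offset
      sol_a = rng.normal(0.0, 0.7, size=(1000, 3))               # filtered solution errors, sat A
      sol_b = truth_rel + rng.normal(0.0, 0.7, size=(1000, 3))   # with common SVs these errors would largely cancel
      print("relative RMS error (m):", round(rms_error(relative_solution(sol_a, sol_b), truth_rel), 2))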

  19. Using probabilistic model as feature descriptor on a smartphone device for autonomous navigation of unmanned ground vehicles

    NASA Astrophysics Data System (ADS)

    Desai, Alok; Lee, Dah-Jye

    2013-12-01

    There has been significant research on the development of feature descriptors in the past few years. Most of them do not emphasize real-time applications. This paper presents the development of an affine-invariant feature descriptor for low-resource applications such as UAVs and UGVs that are equipped with an embedded system with a small microprocessor, a field programmable gate array (FPGA), or a smart phone device. UAVs and UGVs have proven suitable for many promising applications such as unknown environment exploration and search and rescue operations. These applications require on-board image processing for obstacle detection, avoidance and navigation. All these real-time vision applications require a camera to grab images and match features using a feature descriptor. A good feature descriptor will uniquely describe a feature point, thus allowing it to be correctly identified and matched with its corresponding feature point in another image. A few feature description algorithms are available for a resource-limited system. They either require too much of the device's resources or simplify the algorithm too much, which results in reduced performance. This research is aimed at meeting the needs of these systems without sacrificing accuracy. This paper introduces a new feature descriptor called PRObabilistic model (PRO) for UGV navigation applications. It is a compact and efficient binary descriptor that is hardware-friendly and easy to implement.
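
    PRO is described as a compact, hardware-friendly binary descriptor. The Python sketch below shows how such binary descriptors are typically matched, by Hamming distance over packed bit strings; the 256-bit layout and threshold are placeholders, and this is not the PRO algorithm itself.

      import numpy as np

      def hamming(a: np.ndarray, b: np.ndarray) -> int:
          """Hamming distance between two uint8 descriptor vectors."""
          return int(np.unpackbits(np.bitwise_xor(a, b)).sum())

      def match(desc_img1: np.ndarray, desc_img2: np.ndarray, max_dist: int = 40):
          """Brute-force nearest-neighbour matching of binary descriptors."""
          matches = []
          for i, d1 in enumerate(desc_img1):
              dists = [hamming(d1, d2) for d2 in desc_img2]
              j = int(np.argmin(dists))
              if dists[j] <= max_dist:
                  matches.append((i, j, dists[j]))
          return matches

      rng = np.random.default_rng(1)
      a = rng.integers(0, 256, size=(5, 32), dtype=np.uint8)  # five 256-bit descriptors
      b = a.copy()
      b[0, 0] ^= 0b00000111                                    # flip 3 bits of one descriptor
      print(match(a, b))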

  20. Integrated polarization-dependent sensor for autonomous navigation

    NASA Astrophysics Data System (ADS)

    Liu, Ze; Zhang, Ran; Wang, Zhiwen; Guan, Le; Li, Bin; Chu, Jinkui

    2015-01-01

    Based on the navigation strategy of insects utilizing polarized skylight, an integrated polarization-dependent sensor for autonomous navigation is presented. The navigation sensor has the features of compact structure, high precision, strong robustness, and a simple manufacturing technique. The sensor is constructed by integrating a complementary-metal-oxide-semiconductor sensor with a multiorientation nanowire grid polarizer. By nanoimprint lithography, the multiorientation nanowire polarizer is fabricated in one step and the alignment error is eliminated. Statistical theory is added to the interval-division algorithm to calculate the polarization angle of the incident light. Laboratory and outdoor tests of the navigation sensor were carried out, and the errors of the measured angle are ±0.02 deg and ±1.3 deg, respectively. The results show that the proposed sensor has potential for application in autonomous navigation.
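
    The sensor computes the polarization angle of skylight from intensities behind a multi-orientation nanowire polarizer. As a hedged illustration of the underlying principle only (the paper's interval-division plus statistical algorithm is more involved), the Python sketch below recovers the angle of polarization from four analyzer orientations using the standard Stokes-parameter relations; the degree of polarization and angles in the demo are made up.

      import math

      def polarization_angle(i0, i45, i90, i135):
          """Angle of polarization (rad) from intensities at 0, 45, 90, 135 deg analyzers."""
          s1 = i0 - i90
          s2 = i45 - i135
          return 0.5 * math.atan2(s2, s1)

      # Simulated partially polarized skylight with a true AoP of 30 degrees
      aop_true = math.radians(30.0)
      intensities = []
      for ang_deg in (0, 45, 90, 135):
          ang = math.radians(ang_deg)
          intensities.append(0.5 * (1.0 + 0.6 * math.cos(2.0 * (aop_true - ang))))
      print("recovered AoP (deg):", round(math.degrees(polarization_angle(*intensities)), 1))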

  1. A System for Fast Navigation of Autonomous Vehicles

    DTIC Science & Technology

    1991-09-01

    AD-A243 523. A System for Fast Navigation of Autonomous Vehicles. Sanjiv Singh, Dai Feng, Paul Keller, Gary Shaffer, Wen Fan Shi, Dong Hun Shin, J. West...common in the control of autonomous vehicles to establish the necessary kinematic models but to ignore an explicit representation of the vehicle dynamics

  2. Autonomous Navigation Improvements for High-Earth Orbiters Using GPS

    NASA Technical Reports Server (NTRS)

    Long, Anne; Kelbel, David; Lee, Taesul; Garrison, James; Carpenter, J. Russell; Bauer, F. (Technical Monitor)

    2000-01-01

    The Goddard Space Flight Center is currently developing autonomous navigation systems for satellites in high-Earth orbits where acquisition of the GPS signals is severely limited. This paper discusses autonomous navigation improvements for high-Earth orbiters and assesses projected navigation performance for these satellites using Global Positioning System (GPS) Standard Positioning Service (SPS) measurements. Navigation performance is evaluated as a function of signal acquisition threshold, measurement errors, and dynamic modeling errors using realistic GPS signal strength and user antenna models. These analyses indicate that an autonomous navigation position accuracy of better than 30 meters root-mean-square (RMS) can be achieved for high-Earth orbiting satellites using a GPS receiver with a very stable oscillator. This accuracy improves to better than 15 meters RMS if the GPS receiver's signal acquisition threshold can be reduced by 5 dB-Hertz to track weaker signals.

  3. Autonomous navigation - The ARMMS concept. [Autonomous Redundancy and Maintenance Management Subsystem

    NASA Technical Reports Server (NTRS)

    Wood, L. J.; Jones, J. B.; Mease, K. D.; Kwok, J. H.; Goltz, G. L.; Kechichian, J. A.

    1984-01-01

    A conceptual design is outlined for the navigation subsystem of the Autonomous Redundancy and Maintenance Management Subsystem (ARMMS). The principal function of this navigation subsystem is to maintain the spacecraft over a specified equatorial longitude to within ±3 deg. In addition, the navigation subsystem must detect and correct internal faults. It comprises elements for a navigation executive and for orbit determination, trajectory, maneuver planning, and maneuver command. Each of these elements is described. The navigation subsystem is to be used in the DSCS III spacecraft.

  4. Mobile Robot Designed with Autonomous Navigation System

    NASA Astrophysics Data System (ADS)

    An, Feng; Chen, Qiang; Zha, Yanfang; Tao, Wenyin

    2017-10-01

    With the rapid development of robot technology, robots appear more and more in all aspects of life and social production, and people place ever more requirements on them; one is that a robot should be capable of autonomous navigation and able to recognize the road. Take the common household sweeping robot as an example, which can avoid obstacles, clean the ground and automatically find its charging station; another example is the AGV tracking car, which can follow a route and reach its destination successfully. This paper introduces a new type of robot navigation scheme: SLAM, which can build a map of a totally strange environment and, at the same time, locate the robot's own position, so as to achieve autonomous navigation.
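
    Since the abstract introduces SLAM only at a high level, the Python sketch below shows just the mapping half as a toy: a log-odds occupancy-grid update from range readings under the non-SLAM assumption that the robot pose is known. A real SLAM system would estimate that pose jointly with the map; the grid size, resolution, and log-odds increments are arbitrary assumptions.

      import math
      import numpy as np

      def update_grid(grid, pose, ranges, angles, resolution=0.1, hit=0.9, miss=-0.4):
          """Log-odds occupancy update for one scan taken at a known pose (x, y, theta)."""
          x, y, th = pose
          for r, a in zip(ranges, angles):
              # mark cells along the beam as free, the end cell as occupied
              steps = int(r / resolution)
              for s in range(steps + 1):
                  cx = int((x + s * resolution * math.cos(th + a)) / resolution)
                  cy = int((y + s * resolution * math.sin(th + a)) / resolution)
                  if 0 <= cx < grid.shape[0] and 0 <= cy < grid.shape[1]:
                      grid[cx, cy] += hit if s == steps else miss
          return grid

      grid = np.zeros((100, 100))                        # 10 m x 10 m map at 0.1 m cells
      angles = np.linspace(-math.pi / 4, math.pi / 4, 31)
      ranges = np.full_like(angles, 3.0)                 # a wall 3 m ahead of the robot
      update_grid(grid, (5.0, 5.0, 0.0), ranges, angles)
      print("occupied cells:", int((grid > 0).sum()))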

  5. Autonomous Navigation for Deep Space Missions

    NASA Technical Reports Server (NTRS)

    Bhaskaran, Shyam

    2012-01-01

    Navigation (determining where the spacecraft is at any given time, controlling its path to achieve desired targets), performed using ground-in-the-loop techniques: (1) Data includes 2-way radiometric (Doppler, range), interferometric (Delta-Differential One-way Range), and optical (images of natural bodies taken by onboard camera) (2) Data received on the ground, processed to determine orbit, commands sent to execute maneuvers to control orbit. A self-contained, onboard, autonomous navigation system can: (1) Eliminate delays due to round-trip light time (2) Eliminate the human factors in ground-based processing (3) Reduce turnaround time for a navigation update from minutes down to seconds (4) React to late-breaking data. At JPL, we have developed the framework and computational elements of an autonomous navigation system, called AutoNav. It was originally developed as one of the technologies for the Deep Space 1 mission, launched in 1998; it was subsequently used on three other spacecraft, for four different missions. The primary use has been on comet missions to track comets during flybys, and impact one comet.

  6. Angles-only navigation for autonomous orbital rendezvous

    NASA Astrophysics Data System (ADS)

    Woffinden, David C.

    The proposed thesis of this dissertation has both a practical element and theoretical component which aim to answer key questions related to the use of angles-only navigation for autonomous orbital rendezvous. The first and fundamental principle to this work argues that an angles-only navigation filter can determine the relative position and orientation (pose) between two spacecraft to perform the necessary maneuvers and close proximity operations for autonomous orbital rendezvous. Second, the implementation of angles-only navigation for on-orbit applications is looked upon with skeptical eyes because of its perceived limitation of determining the relative range between two vehicles. This assumed, yet little understood subtlety can be formally characterized with a closed-form analytical observability criteria which specifies the necessary and sufficient conditions for determining the relative position and velocity with only angular measurements. With a mathematical expression of the observability criteria, it can be used to (1) identify the orbital rendezvous trajectories and maneuvers that ensure the relative position and velocity are observable for angles-only navigation, (2) quantify the degree or level of observability and (3) compute optimal maneuvers that maximize observability. In summary, the objective of this dissertation is to provide both a practical and theoretical foundation for the advancement of autonomous orbital rendezvous through the use of angles-only navigation.
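
    A small numerical illustration of the observability question discussed above, under assumed orbit values: with linearized Clohessy-Wiltshire relative dynamics and bearing-only measurements, the stacked observability matrix is rank-deficient along a natural trajectory (the range ambiguity) and regains full rank after a known maneuver. This Python sketch is not the dissertation's closed-form criterion; the mean motion, sample interval, initial state, and burn are placeholders.

      import numpy as np
      from scipy.linalg import expm

      n = 0.0011                      # mean motion (rad/s), roughly LEO
      dt = 10.0                       # measurement interval (s)
      A = np.array([[0, 0, 1, 0],
                    [0, 0, 0, 1],
                    [3 * n**2, 0, 0, 2 * n],
                    [0, 0, -2 * n, 0]])
      Phi = expm(A * dt)              # discrete state transition matrix

      def bearing_jacobian(state):
          x, y = state[0], state[1]
          r2 = x * x + y * y
          return np.array([-y / r2, x / r2, 0.0, 0.0])

      def observability_rank(x0, dv_step=None, steps=60):
          """Rank of the stacked [H_k @ Phi^k]; optional impulsive delta-v partway through."""
          x, Phi_k, rows = x0.copy(), np.eye(4), []
          for k in range(steps):
              rows.append(bearing_jacobian(x) @ Phi_k)
              x = Phi @ x
              Phi_k = Phi @ Phi_k
              if dv_step is not None and k == dv_step:
                  x[2:] += np.array([0.1, 0.0])   # 0.1 m/s burn (known to the filter)
          return np.linalg.matrix_rank(np.array(rows))

      x0 = np.array([1000.0, 200.0, 0.0, -2 * n * 1000.0])  # near a closed relative orbit
      print("rank without maneuver:", observability_rank(x0))              # expected: 3 (unobservable range)
      print("rank with maneuver:   ", observability_rank(x0, dv_step=20))  # expected: 4 (fully observable)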

  7. Learning for autonomous navigation

    NASA Technical Reports Server (NTRS)

    Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric

    2005-01-01

    Autonomous off-road navigation of robotic ground vehicles has important applications on Earth and in space exploration. Progress in this domain has been retarded by the limited lookahead range of 3-D sensors and by the difficulty of preprogramming systems to understand the traversability of the wide variety of terrain they can encounter.

  8. Multisensor-Equipped UAV/UGV for Automated Exploration

    NASA Astrophysics Data System (ADS)

    Batzdorfer, S.; Bobbe, M.; Becker, M.; Harms, H.; Bestmann, U.

    2017-08-01

    The usage of unmanned systems for exploring disaster scenarios has become more and more important in recent times as a supporting system for action forces. These systems have to offer a well-balanced relationship between the quality of support and additional workload. Therefore, within the joint research project ANKommEn - German acronym for Automated Navigation and Communication for Exploration - a system for exploration of disaster scenarios is built up using multiple UAVs and UGVs controlled via a central ground station. The ground station serves as user interface for defining missions and tasks conducted by the unmanned systems, equipped with different environmental sensors like cameras - RGB as well as IR - or LiDAR. Depending on the exploration task, results in the form of pictures, 2D stitched orthophotos or LiDAR point clouds will be transmitted via datalinks and displayed online at the ground station or will be processed shortly after a mission, e.g. 3D photogrammetry. For mission planning and its execution, UAV/UGV monitoring and georeferencing of environmental sensor data, reliable positioning and attitude information is required. This is gathered using an integrated GNSS/IMU positioning system. In order to increase availability of positioning information in GNSS-challenged scenarios, a GNSS multi-constellation based approach is used, amongst others. The present paper focuses on the overall system design including the ground station and sensor setups on the UAVs and UGVs, the underlying positioning techniques as well as 2D and 3D exploration based on an RGB camera mounted on board the UAV and its evaluation based on real-world field tests.

  9. Navigation for the new millennium: Autonomous navigation for Deep Space 1

    NASA Technical Reports Server (NTRS)

    Reidel, J. E.; Bhaskaran, S.; Synnott, S. P.; Desai, S. D.; Bollman, W. E.; Dumont, P. J.; Halsell, C. A.; Han, D.; Kennedy, B. M.; Null, G. W.

    1997-01-01

    The autonomous optical navigation system technology for the Deep Space 1 (DS1) mission is reported on. The DS1 navigation system will be the first to use autonomous navigation in deep space. The system's tasks are to: perform interplanetary cruise orbit determination using images of distant asteroids; control and maintain the orbit of the spacecraft with an ion propulsion system and conventional thrusters; and perform late knowledge updates of target position during close flybys in order to facilitate high quality data return from asteroid McAuliffe and comet West-Kohoutek-Ikemura. To accomplish these tasks, the following functions are required: picture planning; image processing; dynamical modeling and integration; planetary ephemeris and star catalog handling; orbit determination; data filtering and estimation; maneuver estimation; and spacecraft ephemeris updating. These systems and functions are described and preliminary performance data are presented.

  10. A SINS/SRS/GNS Autonomous Integrated Navigation System Based on Spectral Redshift Velocity Measurements.

    PubMed

    Wei, Wenhui; Gao, Zhaohui; Gao, Shesheng; Jia, Ke

    2018-04-09

    In order to meet the requirements of autonomy and reliability for the navigation system, combined with the method of measuring speed by using the spectral redshift information of the natural celestial bodies, a new scheme, consisting of Strapdown Inertial Navigation System (SINS)/Spectral Redshift (SRS)/Geomagnetic Navigation System (GNS), is designed for autonomous integrated navigation systems. The principle of this SINS/SRS/GNS autonomous integrated navigation system is explored, and the corresponding mathematical model is established. Furthermore, a robust adaptive central difference particle filtering algorithm is proposed for this autonomous integrated navigation system. The simulation experiments are conducted and the results show that the designed SINS/SRS/GNS autonomous integrated navigation system possesses good autonomy, strong robustness and high reliability, thus providing a new solution for autonomous navigation technology.
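
    The filter proposed above is a robust adaptive central difference particle filter; the Python sketch below is only a generic bootstrap particle filter on a toy one-dimensional problem, included to illustrate the predict/weight/resample cycle that such integrated-navigation filters build on. All models and noise levels are made up.

      import numpy as np

      def bootstrap_pf(measurements, n_particles=500, q=0.1, r=0.5):
          rng = np.random.default_rng(0)
          particles = rng.normal(0.0, 1.0, n_particles)    # initial state hypotheses
          estimates = []
          for z in measurements:
              particles += rng.normal(0.0, q, n_particles)              # predict (random-walk model)
              w = np.exp(-0.5 * ((z - particles) / r) ** 2)             # weight by measurement likelihood
              w /= w.sum()
              idx = rng.choice(n_particles, n_particles, p=w)           # resample
              particles = particles[idx]
              estimates.append(particles.mean())
          return np.array(estimates)

      rng = np.random.default_rng(1)
      truth = np.cumsum(rng.normal(0.0, 0.1, 100))          # random-walk truth
      meas = truth + rng.normal(0.0, 0.5, 100)
      est = bootstrap_pf(meas)
      print("RMS error:", round(float(np.sqrt(np.mean((est - truth) ** 2))), 3))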

  11. A SINS/SRS/GNS Autonomous Integrated Navigation System Based on Spectral Redshift Velocity Measurements

    PubMed Central

    Wei, Wenhui; Gao, Zhaohui; Gao, Shesheng; Jia, Ke

    2018-01-01

    In order to meet the requirements of autonomy and reliability for the navigation system, combined with the method of measuring speed by using the spectral redshift information of the natural celestial bodies, a new scheme, consisting of Strapdown Inertial Navigation System (SINS)/Spectral Redshift (SRS)/Geomagnetic Navigation System (GNS), is designed for autonomous integrated navigation systems. The principle of this SINS/SRS/GNS autonomous integrated navigation system is explored, and the corresponding mathematical model is established. Furthermore, a robust adaptive central difference particle filtering algorithm is proposed for this autonomous integrated navigation system. The simulation experiments are conducted and the results show that the designed SINS/SRS/GNS autonomous integrated navigation system possesses good autonomy, strong robustness and high reliability, thus providing a new solution for autonomous navigation technology. PMID:29642549

  12. Autonomous assistance navigation for robotic wheelchairs in confined spaces.

    PubMed

    Cheein, Fernando Auat; Carelli, Ricardo; De la Cruz, Celso; Muller, Sandra; Bastos Filho, Teodiano F

    2010-01-01

    In this work, a visual interface for the assistance of a robotic wheelchair's navigation is presented. The visual interface is developed for navigation in confined spaces such as narrow corridors or corridor ends. The interface offers two navigation modes: non-autonomous and autonomous. Non-autonomous driving of the robotic wheelchair is performed by means of a hand joystick. The joystick directs the motion of the vehicle within the environment. Autonomous driving is performed when the user of the wheelchair has to turn (90, 90 or 180 degrees) within the environment. The turning strategy is performed by a maneuverability algorithm compatible with the kinematics of the wheelchair and by the SLAM (Simultaneous Localization and Mapping) algorithm. The SLAM algorithm provides the interface with the information concerning the environment disposition and the pose (position and orientation) of the wheelchair within the environment. Experimental and statistical results of the interface are also shown in this work.

  13. Autonomous satellite navigation using starlight refraction angle measurements

    NASA Astrophysics Data System (ADS)

    Ning, Xiaolin; Wang, Longhua; Bai, Xinbei; Fang, Jiancheng

    2013-05-01

    An on-board autonomous navigation capability is required to reduce the operation costs and enhance the navigation performance of future satellites. Autonomous navigation by stellar refraction is a type of autonomous celestial navigation method that uses high-accuracy star sensors instead of Earth sensors to provide information regarding Earth's horizon. In previous studies, the refraction apparent height has typically been used for such navigation. However, the apparent height cannot be measured directly by a star sensor and can only be calculated by the refraction angle and an atmospheric refraction model. Therefore, additional errors are introduced by the uncertainty and nonlinearity of atmospheric refraction models, which result in reduced navigation accuracy and reliability. A new navigation method based on the direct measurement of the refraction angle is proposed to solve this problem. Techniques for the determination of the refraction angle are introduced, and a measurement model for the refraction angle is established. The method is tested and validated by simulations. When the starlight refraction height ranges from 20 to 50 km, a positioning accuracy of better than 100 m can be achieved for a low-Earth-orbit (LEO) satellite using the refraction angle, while the positioning accuracy of the traditional method using the apparent height is worse than 500 m under the same conditions. Furthermore, an analysis of the factors that affect navigation accuracy, including the measurement accuracy of the refraction angle, the number of visible refracted stars per orbit and the installation azimuth of star sensor, is presented. This method is highly recommended for small satellites in particular, as no additional hardware besides two star sensors is required.

  14. Autonomous navigation system based on GPS and magnetometer data

    NASA Technical Reports Server (NTRS)

    Thienel, Julie K. (Inventor); Harman, Richard R. (Inventor); Bar-Itzhack, Itzhack Y. (Inventor)

    2004-01-01

    This invention is drawn to an autonomous navigation system using Global Positioning System (GPS) and magnetometers for low Earth orbit satellites. As a magnetometer is reliable and always provides information on spacecraft attitude, rate, and orbit, the magnetometer-GPS configuration solves the GPS initialization problem, decreasing the convergence time for the navigation estimate and improving the overall accuracy. Eventually the magnetometer-GPS configuration enables the system to avoid a costly and inherently less reliable gyro for rate estimation. Being autonomous, this invention would provide for black-box spacecraft navigation, producing attitude, orbit, and rate estimates without any ground input, with high accuracy and reliability.

  15. Enabling Autonomous Navigation for Affordable Scooters.

    PubMed

    Liu, Kaikai; Mulky, Rajathswaroop

    2018-06-05

    Despite the technical success of existing assistive technologies, for example, electric wheelchairs and scooters, they are still far from effective enough in helping those in need navigate to their destinations in a hassle-free manner. In this paper, we propose to improve the safety and autonomy of navigation by designing a cutting-edge autonomous scooter, thus allowing people with mobility challenges to ambulate independently and safely in possibly unfamiliar surroundings. We focus on indoor navigation scenarios for the autonomous scooter where the current location, maps, and nearby obstacles are unknown. To achieve semi-LiDAR functionality, we leverage the gyros-based pose data to compensate the laser motion in real time and create synthetic mapping of simple environments with regular shapes and deep hallways. Laser range finders are suitable for long ranges with limited resolution. Stereo vision, on the other hand, provides 3D structural data of nearby complex objects. To achieve simultaneous fine-grained resolution and long range coverage in the mapping of cluttered and complex environments, we dynamically fuse the measurements from the stereo vision camera system, the synthetic laser scanner, and the LiDAR. We propose solutions to self-correct errors in data fusion and create a hybrid map to assist the scooter in achieving collision-free navigation in an indoor environment.

  16. Autonomous satellite navigation by stellar refraction

    NASA Technical Reports Server (NTRS)

    Gounley, R.; White, R.; Gai, E.

    1983-01-01

    This paper describes an error analysis of an autonomous navigator using refraction measurements of starlight passing through the upper atmosphere. The analysis is based on a discrete linear Kalman filter. The filter generated steady-state values of navigator performance for a variety of test cases. Results of these simulations show that in low-earth orbit position-error standard deviations of less than 0.100 km may be obtained using only 40 star sightings per orbit.

  17. Autonomous navigation using lunar beacons

    NASA Technical Reports Server (NTRS)

    Khatib, A. R.; Ellis, J.; French, J.; Null, G.; Yunck, T.; Wu, S.

    1983-01-01

    The concept of using lunar beacon signal transmission for on-board navigation for earth satellites and near-earth spacecraft is described. The system would require powerful transmitters on the earth-side of the moon's surface and black box receivers with antennae and microprocessors placed on board spacecraft for autonomous navigation. Spacecraft navigation requires three position and three velocity elements to establish location coordinates. Two beacons could be soft-landed on the lunar surface at the limits of allowable separation and each would transmit a wide-beam signal with cones reaching GEO heights and be strong enough to be received by small antennae in near-earth orbit. The black box processor would perform on-board computation with one-way Doppler/range data and dynamical models. Alternatively, GEO satellites such as the GPS or TDRSS spacecraft can be used with interferometric techniques to provide decimeter-level accuracy for aircraft navigation.

  18. Developing Autonomous Vehicles That Learn to Navigate by Mimicking Human Behavior

    DTIC Science & Technology

    2006-09-28

    navigate in an unstructured environment to a specific target or location. SUBJECT TERMS: autonomous vehicles, fuzzy logic, learning behavior...Developing Autonomous Vehicles That Learn to Navigate by Mimicking Human Behavior. FINAL REPORT, 9/28/2006. Dean B. Edwards, Department...In the future, as greater numbers of autonomous vehicles are employed, it is hoped that lower...LONG-TERM GOALS: Use LAGR (Learning Applied to Ground Robots

  19. Autonomous RPRV Navigation, Guidance and Control

    NASA Technical Reports Server (NTRS)

    Johnston, Donald E.; Myers, Thomas T.; Zellner, John W.

    1983-01-01

    Dryden Flight Research Center has the responsibility for flight testing of advanced remotely piloted research vehicles (RPRVs) to explore highly maneuverable aircraft technology and to test advanced structural concepts and related aeronautical technologies that can yield important research results with significant cost benefits. The primary purpose is to provide the preliminary design of an upgraded automatic approach and landing control system and flight director display to improve landing performance and reduce pilot workload. A secondary purpose is to determine the feasibility of an onboard autonomous navigation, orbit, and landing capability for safe vehicle recovery in the event of loss of telemetry uplink communication with the vehicles. The current RPRV approach and landing method, the proposed automatic and manual approach and autoland system, and an autonomous navigation, orbit, and landing system concept based on existing operational technology are described.

  20. Autonomous Navigation of Small UAVs Based on Vehicle Dynamic Model

    NASA Astrophysics Data System (ADS)

    Khaghani, M.; Skaloud, J.

    2016-03-01

    This paper presents a novel approach to autonomous navigation for small UAVs, in which the vehicle dynamic model (VDM) serves as the main process model within the navigation filter. The proposed method significantly increases the accuracy and reliability of autonomous navigation, especially for small UAVs with low-cost IMUs on-board. This is achieved with no extra sensor added to the conventional INS/GNSS setup. This improvement is of special interest in case of GNSS outages, where inertial coasting drifts very quickly. In the proposed architecture, the solution to VDM equations provides the estimate of position, velocity, and attitude, which is updated within the navigation filter based on available observations, such as IMU data or GNSS measurements. The VDM is also fed with the control input to the UAV, which is available within the control/autopilot system. The filter is capable of estimating wind velocity and dynamic model parameters, in addition to navigation states and IMU sensor errors. Monte Carlo simulations reveal major improvements in navigation accuracy compared to conventional INS/GNSS navigation system during the autonomous phase, when satellite signals are not available due to physical obstruction or electromagnetic interference for example. In case of GNSS outages of a few minutes, position and attitude accuracy experiences improvements of orders of magnitude compared to inertial coasting. It means that during such scenario, the position-velocity-attitude (PVA) determination is sufficiently accurate to navigate the UAV to a home position without any signal that depends on vehicle environment.
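
    The core idea above is to use the vehicle dynamic model, driven by the known control input, as the navigation filter's process model. The Python sketch below shows only that structural idea with a toy planar point-mass model propagated through a GNSS outage; the model form, coefficients, and control values are assumptions, not the paper's VDM.

      import numpy as np

      def vdm_derivatives(state, control, mass=2.0, drag_coeff=0.05, g=9.81):
          """Toy planar point-mass VDM: state = [x, z, vx, vz], control = (thrust, pitch)."""
          x, z, vx, vz = state
          thrust, pitch = control
          drag = drag_coeff * np.hypot(vx, vz)
          ax = (thrust * np.cos(pitch) - drag * vx) / mass
          az = (thrust * np.sin(pitch) - drag * vz) / mass - g
          return np.array([vx, vz, ax, az])

      def predict(state, control, dt):
          """Filter prediction step: propagate the state with the VDM (midpoint integration)."""
          k1 = vdm_derivatives(state, control)
          k2 = vdm_derivatives(state + 0.5 * dt * k1, control)
          return state + dt * k2

      state = np.array([0.0, 100.0, 20.0, 0.0])    # 20 m/s flight at 100 m altitude
      for _ in range(100):                          # 10 s of GNSS outage at dt = 0.1 s
          state = predict(state, control=(12.0, np.radians(5.0)), dt=0.1)
      print("predicted position after outage:", state[:2])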

  1. An interactive physics-based unmanned ground vehicle simulator leveraging open source gaming technology: progress in the development and application of the virtual autonomous navigation environment (VANE) desktop

    NASA Astrophysics Data System (ADS)

    Rohde, Mitchell M.; Crawford, Justin; Toschlog, Matthew; Iagnemma, Karl D.; Kewlani, Guarav; Cummins, Christopher L.; Jones, Randolph A.; Horner, David A.

    2009-05-01

    It is widely recognized that simulation is pivotal to vehicle development, whether manned or unmanned. There are few dedicated choices, however, for those wishing to perform realistic, end-to-end simulations of unmanned ground vehicles (UGVs). The Virtual Autonomous Navigation Environment (VANE), under development by US Army Engineer Research and Development Center (ERDC), provides such capabilities but utilizes a High Performance Computing (HPC) Computational Testbed (CTB) and is not intended for on-line, real-time performance. A product of the VANE HPC research is a real-time desktop simulation application under development by the authors that provides a portal into the HPC environment as well as interaction with wider-scope semi-automated force simulations (e.g. OneSAF). This VANE desktop application, dubbed the Autonomous Navigation Virtual Environment Laboratory (ANVEL), enables analysis and testing of autonomous vehicle dynamics and terrain/obstacle interaction in real-time with the capability to interact within the HPC constructive geo-environmental CTB for high fidelity sensor evaluations. ANVEL leverages rigorous physics-based vehicle and vehicle-terrain interaction models in conjunction with high-quality, multimedia visualization techniques to form an intuitive, accurate engineering tool. The system provides an adaptable and customizable simulation platform that allows developers a controlled, repeatable testbed for advanced simulations. ANVEL leverages several key technologies not common to traditional engineering simulators, including techniques from the commercial video-game industry. These enable ANVEL to run on inexpensive commercial, off-the-shelf (COTS) hardware. In this paper, the authors describe key aspects of ANVEL and its development, as well as several initial applications of the system.

  2. Autonomous navigation and obstacle avoidance for unmanned surface vehicles

    NASA Astrophysics Data System (ADS)

    Larson, Jacoby; Bruch, Michael; Ebken, John

    2006-05-01

    The US Navy and other Department of Defense (DoD) and Department of Homeland Security (DHS) organizations are increasingly interested in the use of unmanned surface vehicles (USVs) for a variety of missions and applications. In order for USVs to fill these roles, they must be capable of a relatively high degree of autonomous navigation. Space and Naval Warfare Systems Center, San Diego is developing core technologies required for robust USV operation in a real-world environment, primarily focusing on autonomous navigation, obstacle avoidance, and path planning.

  3. Autonomous satellite navigation with the Global Positioning System

    NASA Technical Reports Server (NTRS)

    Fuchs, A. J.; Wooden, W. H., II; Long, A. C.

    1977-01-01

    This paper discusses the potential of using the Global Positioning System (GPS) to provide autonomous navigation capability to NASA satellites in the 1980 era. Some of the driving forces motivating autonomous navigation are presented. These include such factors as advances in attitude control systems, onboard science annotation, and onboard gridding of imaging data. Simulation results which demonstrate baseline orbit determination accuracies using GPS data on Seasat, Landsat-D, and the Solar Maximum Mission are presented. Emphasis is placed on identifying error sources such as GPS time, GPS ephemeris, user timing biases, and user orbit dynamics, and in a parametric sense on evaluating their contribution to the orbit determination accuracies.

  4. Visual Odometry for Autonomous Deep-Space Navigation

    NASA Technical Reports Server (NTRS)

    Robinson, Shane; Pedrotty, Sam

    2016-01-01

    Visual Odometry fills two critical needs shared by all future exploration architectures considered by NASA: Autonomous Rendezvous and Docking (AR&D), and autonomous navigation during loss of comm. To do this, a camera is combined with cutting-edge algorithms (called Visual Odometry) into a unit that provides an accurate relative pose between the camera and the object in the imagery. Recent simulation analyses have demonstrated the ability of this new technology to reliably, accurately, and quickly compute a relative pose. This project advances this technology by both preparing the system to process flight imagery and creating an activity to capture said imagery. This technology can provide a pioneering optical navigation platform capable of supporting a wide variety of future mission scenarios: deep space rendezvous, asteroid exploration, loss-of-comm.
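
    At the heart of visual odometry is recovering the relative pose (rotation plus a scale-ambiguous translation direction) between two camera views from matched image points. The Python sketch below demonstrates that step on synthetic projections using OpenCV's findEssentialMat and recoverPose; the intrinsics, scene, and motion are invented for the demo, and a real pipeline would start from feature matches in actual imagery.

      import cv2
      import numpy as np

      rng = np.random.default_rng(0)
      K = np.array([[800.0, 0, 320.0], [0, 800.0, 240.0], [0, 0, 1.0]])  # assumed intrinsics

      # Synthetic scene: 3D points a few meters in front of camera 1
      pts3d = np.column_stack([rng.uniform(-2, 2, 100), rng.uniform(-2, 2, 100),
                               rng.uniform(4, 8, 100)])

      def project(points, R, t):
          cam = points @ R.T + t          # transform into the camera frame
          uv = cam @ K.T
          return uv[:, :2] / uv[:, 2:3]   # perspective division

      # Camera 2 is rotated 5 deg about Y and translated along X relative to camera 1
      angle = np.radians(5.0)
      R_true = np.array([[np.cos(angle), 0, np.sin(angle)],
                         [0, 1, 0],
                         [-np.sin(angle), 0, np.cos(angle)]])
      t_true = np.array([0.3, 0.0, 0.0])

      pts1 = project(pts3d, np.eye(3), np.zeros(3)).astype(np.float64)
      pts2 = project(pts3d, R_true, t_true).astype(np.float64)

      E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
      _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
      print("recovered rotation (deg):", np.degrees(np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0))))
      print("recovered translation direction:", t.ravel())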

  5. Learning for Autonomous Navigation

    NASA Technical Reports Server (NTRS)

    Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric

    2005-01-01

    Robotic ground vehicles for outdoor applications have achieved some remarkable successes, notably in autonomous highway following (Dickmanns, 1987), planetary exploration (1), and off-road navigation on Earth (1). Nevertheless, major challenges remain to enable reliable, high-speed, autonomous navigation in a wide variety of complex, off-road terrain. 3-D perception of terrain geometry with imaging range sensors is the mainstay of off-road driving systems. However, the stopping distance at high speed exceeds the effective lookahead distance of existing range sensors. Prospects for extending the range of 3-D sensors are strongly limited by sensor physics, eye safety of lasers, and related issues. Range sensor limitations also allow vehicles to enter large cul-de-sacs even at low speed, leading to long detours. Moreover, sensing only terrain geometry fails to reveal mechanical properties of terrain that are critical to assessing its traversability, such as potential for slippage, sinkage, and the degree of compliance of potential obstacles. Rovers in the Mars Exploration Rover (MER) mission have become stuck in sand dunes and experienced significant downhill slippage in the vicinity of large rock hazards. Earth-based off-road robots today have very limited ability to discriminate traversable vegetation from non-traversable vegetation or rough ground. It is impossible today to preprogram a system with knowledge of these properties for all types of terrain and weather conditions that might be encountered.

  6. Microcantilever sensor platform for UGV-based detection

    NASA Astrophysics Data System (ADS)

    Lawrence, Tyson T.; Halleck, A. E.; Schuler, Peter S.; Mahmud, K. K.; Hicks, David R.

    2010-04-01

    The increased use of Unmanned Ground Vehicles (UGVs) drives the need for new lightweight, low cost sensors. Microelectromechanical System (MEMS) based microcantilever sensors are a promising technology to meet this need, because they can be manufactured at low cost on a mass scale, and are easily integrated into a UGV platform for detection of explosives and other threat agents. While the technology is extremely sensitive, selectivity is a major challenge and the response modes are not well understood. This work summarizes advances in characterizing ultrasensitive microcantilever responses, sampling considerations, and sensor design and cantilever coating methodologies consistent with UGV point detector needs.

  7. Autonomous vision-based navigation for proximity operations around binary asteroids

    NASA Astrophysics Data System (ADS)

    Gil-Fernandez, Jesus; Ortega-Hernando, Guillermo

    2018-02-01

    Future missions to small bodies demand a higher level of autonomy in the Guidance, Navigation and Control system for higher scientific return and lower operational costs. Different navigation strategies have been assessed for ESA's asteroid impact mission (AIM). The main objective of AIM is the detailed characterization of binary asteroid Didymos. The trajectories for the proximity operations shall be intrinsically safe, i.e., no collision in presence of failures (e.g., spacecraft entering safe mode), perturbations (e.g., non-spherical gravity field), and errors (e.g., maneuver execution error). Hyperbolic arcs with sufficient hyperbolic excess velocity are designed to fulfil the safety, scientific, and operational requirements. The trajectory relative to the asteroid is determined using visual camera images. The ground-based trajectory prediction error at some points is comparable to the camera Field Of View (FOV). Therefore, some images do not contain the entire asteroid. Autonomous navigation can update the state of the spacecraft relative to the asteroid at higher frequency. The objective of the autonomous navigation is to improve the on-board knowledge compared to the ground prediction. The algorithms shall fit in off-the-shelf, space-qualified avionics. This note presents suitable image processing and relative-state filter algorithms for autonomous navigation in proximity operations around binary asteroids.

  8. Autonomous vision-based navigation for proximity operations around binary asteroids

    NASA Astrophysics Data System (ADS)

    Gil-Fernandez, Jesus; Ortega-Hernando, Guillermo

    2018-06-01

    Future missions to small bodies demand a higher level of autonomy in the Guidance, Navigation and Control system for higher scientific return and lower operational costs. Different navigation strategies have been assessed for ESA's asteroid impact mission (AIM). The main objective of AIM is the detailed characterization of binary asteroid Didymos. The trajectories for the proximity operations shall be intrinsically safe, i.e., no collision in presence of failures (e.g., spacecraft entering safe mode), perturbations (e.g., non-spherical gravity field), and errors (e.g., maneuver execution error). Hyperbolic arcs with sufficient hyperbolic excess velocity are designed to fulfil the safety, scientific, and operational requirements. The trajectory relative to the asteroid is determined using visual camera images. The ground-based trajectory prediction error at some points is comparable to the camera Field Of View (FOV). Therefore, some images do not contain the entire asteroid. Autonomous navigation can update the state of the spacecraft relative to the asteroid at higher frequency. The objective of the autonomous navigation is to improve the on-board knowledge compared to the ground prediction. The algorithms shall fit in off-the-shelf, space-qualified avionics. This note presents suitable image processing and relative-state filter algorithms for autonomous navigation in proximity operations around binary asteroids.

  9. Precise laser gyroscope for autonomous inertial navigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuznetsov, A G; Molchanov, A V; Izmailov, E A

    2015-01-31

    Requirements for gyroscopes of strapdown inertial navigation systems for aircraft application are formulated. The construction of a ring helium-neon laser designed for autonomous navigation is described. The processes that determine the laser service life and the relation between the random error of the angular velocity measurement and the surface relief features of the cavity mirrors are analysed. The results of modelling one of the promising approaches to processing the laser gyroscope signals are presented.

  10. Autonomous Navigation for Autonomous Underwater Vehicles Based on Information Filters and Active Sensing

    PubMed Central

    He, Bo; Zhang, Hongjin; Li, Chao; Zhang, Shujing; Liang, Yan; Yan, Tianhong

    2011-01-01

    This paper addresses an autonomous navigation method for the autonomous underwater vehicle (AUV) C-Ranger applying information-filter-based simultaneous localization and mapping (SLAM), and its sea trial experiments in Tuandao Bay (Shandong Province, P.R. China). Weak links in the information matrix in an extended information filter (EIF) can be pruned to achieve an efficient sparse EIF algorithm (SEIF-SLAM). All the basic update formulae can be implemented in constant time irrespective of the size of the map; hence the computational complexity is significantly reduced. The mechanical scanning imaging sonar is chosen as the active sensing device for the underwater vehicle, and a compensation method based on feedback of the AUV pose is presented to overcome distortion of the acoustic images due to the vehicle motion. In order to verify the feasibility of the navigation methods proposed for the C-Ranger, a sea trial was conducted in Tuandao Bay. Experimental results and analysis show that the proposed navigation approach based on SEIF-SLAM improves the accuracy of the navigation compared with the conventional method; moreover, the algorithm has a low computational cost when compared with EKF-SLAM. PMID:22346682
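
    The pruning of weak information links mentioned above is what keeps SEIF-SLAM updates constant-time. The fragment below is a minimal numerical sketch of that idea only, on a toy information matrix; real SEIF sparsification re-projects the posterior so it stays consistent rather than simply zeroing entries.

```python
import numpy as np

def sparsify_information_matrix(omega, threshold):
    """Zero out weak off-diagonal links in an information matrix.

    Simplified stand-in for SEIF sparsification: it only illustrates why
    pruning weak robot-landmark links keeps filter updates cheap.
    """
    omega_sparse = omega.copy()
    weak = np.abs(omega_sparse) < threshold
    np.fill_diagonal(weak, False)          # never touch the diagonal
    omega_sparse[weak] = 0.0
    return omega_sparse

# Toy 3-state information matrix (robot pose + 2 landmarks, 1-D each).
omega = np.array([[10.0, 0.8,  0.05],
                  [0.8,  5.0,  0.02],
                  [0.05, 0.02, 4.0]])
print(sparsify_information_matrix(omega, threshold=0.1))
```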

  11. Autonomous navigation for autonomous underwater vehicles based on information filters and active sensing.

    PubMed

    He, Bo; Zhang, Hongjin; Li, Chao; Zhang, Shujing; Liang, Yan; Yan, Tianhong

    2011-01-01

    This paper addresses an autonomous navigation method for the autonomous underwater vehicle (AUV) C-Ranger applying information-filter-based simultaneous localization and mapping (SLAM), and its sea trial experiments in Tuandao Bay (Shandong Province, P.R. China). Weak links in the information matrix in an extended information filter (EIF) can be pruned to achieve an efficient sparse EIF algorithm (SEIF-SLAM). All the basic update formulae can be implemented in constant time irrespective of the size of the map; hence the computational complexity is significantly reduced. The mechanical scanning imaging sonar is chosen as the active sensing device for the underwater vehicle, and a compensation method based on feedback of the AUV pose is presented to overcome distortion of the acoustic images due to the vehicle motion. In order to verify the feasibility of the navigation methods proposed for the C-Ranger, a sea trial was conducted in Tuandao Bay. Experimental results and analysis show that the proposed navigation approach based on SEIF-SLAM improves the accuracy of the navigation compared with the conventional method; moreover, the algorithm has a low computational cost when compared with EKF-SLAM.

  12. Fuzzy Behavior Modulation with Threshold Activation for Autonomous Vehicle Navigation

    NASA Technical Reports Server (NTRS)

    Tunstel, Edward

    2000-01-01

    This paper describes fuzzy logic techniques used in a hierarchical behavior-based architecture for robot navigation. An architectural feature for threshold activation of fuzzy-behaviors is emphasized, which is potentially useful for tuning navigation performance in real world applications. The target application is autonomous local navigation of a small planetary rover. Threshold activation of low-level navigation behaviors is the primary focus. A preliminary assessment of its impact on local navigation performance is provided based on computer simulations.
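
    As a rough illustration of threshold activation, the sketch below blends the steering recommendations of several hypothetical fuzzy behaviors and suppresses any behavior whose activation degree falls below the threshold. The behaviors, weights, and weighted-average defuzzification are illustrative assumptions, not the architecture described in the paper.

```python
def blend_behaviors(behaviors, threshold):
    """Combine fuzzy behavior recommendations with threshold activation.

    Each behavior supplies (activation_degree, steering_command); behaviors
    whose degree falls below the threshold are suppressed entirely, and the
    remainder are blended by weighted average (a simple defuzzification).
    """
    active = [(w, u) for (w, u) in behaviors if w >= threshold]
    if not active:
        return 0.0                      # no behavior fires: hold course
    total = sum(w for w, _ in active)
    return sum(w * u for w, u in active) / total

# Hypothetical activations: (degree, steering angle in degrees)
behaviors = [(0.8, -15.0),   # avoid obstacle on the right
             (0.3,   5.0),   # seek goal slightly left
             (0.05, 30.0)]   # weak wall-following suggestion
print(blend_behaviors(behaviors, threshold=0.1))   # third behavior suppressed
```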

  13. Autonomous Collision-Free Navigation of Microvehicles in Complex and Dynamically Changing Environments.

    PubMed

    Li, Tianlong; Chang, Xiaocong; Wu, Zhiguang; Li, Jinxing; Shao, Guangbin; Deng, Xinghong; Qiu, Jianbin; Guo, Bin; Zhang, Guangyu; He, Qiang; Li, Longqiu; Wang, Joseph

    2017-09-26

    Self-propelled micro- and nanoscale robots represent a rapidly emerging and fascinating robotics research area. However, designing autonomous and adaptive control systems for operating micro/nanorobotics in complex and dynamically changing environments, which is a highly demanding feature, is still an unmet challenge. Here we describe a smart microvehicle for precise autonomous navigation in complicated environments and traffic scenarios. The fully autonomous navigation system of the smart microvehicle is composed of a microscope-coupled CCD camera, an artificial intelligence planner, and a magnetic field generator. The microscope-coupled CCD camera provides real-time localization of the chemically powered Janus microsphere vehicle and environmental detection for path planning to generate optimal collision-free routes, while the moving direction of the microrobot toward a reference position is determined by the external electromagnetic torque. Real-time object detection offers adaptive path planning in response to dynamically changing environments. We demonstrate that the autonomous navigation system can guide the vehicle movement in complex patterns, in the presence of dynamically changing obstacles, and in complex biological environments. Such a navigation system for micro/nanoscale vehicles, relying on vision-based closed-loop control and path planning, is highly promising for their autonomous operation in the complex dynamic settings and unpredictable scenarios expected in a variety of realistic nanoscale applications.

  14. Development of autonomous grasping and navigating robot

    NASA Astrophysics Data System (ADS)

    Kudoh, Hiroyuki; Fujimoto, Keisuke; Nakayama, Yasuichi

    2015-01-01

    The ability to find and grasp target items in an unknown environment is important for working robots. We developed an autonomous navigating and grasping robot. The operations are locating a requested item, moving to where the item is placed, finding the item on a shelf or table, and picking the item up from the shelf or the table. To achieve these operations, we designed the robot with three functions: an autonomous navigating function that generates a map and a route in an unknown environment, an item position recognizing function, and a grasping function. We tested this robot in an unknown environment. It achieved a series of operations: moving to a destination, recognizing the positions of items on a shelf, picking up an item, placing it on a cart with its hand, and returning to the starting location. The results of this experiment show the potential of such robots to reduce workforce requirements.

  15. The development of a UGV-mounted automated refueling system for VTOL UAVs

    NASA Astrophysics Data System (ADS)

    Wills, Mike; Burmeister, Aaron; Nelson, Travis; Denewiler, Thomas; Mullens, Kathy

    2006-05-01

    Supported aircraft include VTOL UAVs such as the AAI iSTAR ducted-fan vehicle and small helicopter UAVs. Finally, a common command-and-control architecture which supports the UAV, UGV, and AUMS must be developed and interfaced with these systems to allow fully autonomous collaborative behaviors. Funded by the Joint Robotics Program, AUMS is part of a joint effort with the Air Force Research Laboratory and the Army Missile Research Development and Engineering Command. The objective is to develop and demonstrate UGV-UAV teaming concepts and work with the warfighter to ensure that future upgrades are focused on operational requirements. This paper describes the latest achievements in AUMS development and some of the military program and first responder situations that could benefit from this system.

  16. An Autonomous Navigation Algorithm for High Orbit Satellite Using Star Sensor and Ultraviolet Earth Sensor

    PubMed Central

    Baohua, Li; Wenjie, Lai; Yun, Chen; Zongming, Liu

    2013-01-01

    An autonomous navigation algorithm using a sensor that integrates a star sensor (FOV1) and an ultraviolet earth sensor (FOV2) is presented. The star images are sampled by FOV1, and the ultraviolet earth images are sampled by FOV2. The star identification algorithm and star tracking algorithm are executed at FOV1. Then, the optical axis direction of FOV1 in the J2000.0 coordinate system is calculated. The ultraviolet image of earth is sampled by FOV2. The center vector of earth in the FOV2 coordinate system is calculated with the coordinates of the ultraviolet earth. The autonomous navigation data of the satellite are calculated by the integrated sensor with the optical axis direction of FOV1 and the center vector of earth from FOV2. The position accuracy of the autonomous navigation for the satellite is improved from 1000 meters to 300 meters. The velocity accuracy of the autonomous navigation for the satellite is improved from 100 m/s to 20 m/s. At the same time, the periodic sine errors of the autonomous navigation for the satellite are eliminated. The autonomous navigation for the satellite with a sensor that integrates an ultraviolet earth sensor and a star sensor is highly robust. PMID:24250261

  17. An autonomous navigation algorithm for high orbit satellite using star sensor and ultraviolet earth sensor.

    PubMed

    Baohua, Li; Wenjie, Lai; Yun, Chen; Zongming, Liu

    2013-01-01

    An autonomous navigation algorithm using a sensor that integrates a star sensor (FOV1) and an ultraviolet earth sensor (FOV2) is presented. The star images are sampled by FOV1, and the ultraviolet earth images are sampled by FOV2. The star identification algorithm and star tracking algorithm are executed at FOV1. Then, the optical axis direction of FOV1 in the J2000.0 coordinate system is calculated. The ultraviolet image of earth is sampled by FOV2. The center vector of earth in the FOV2 coordinate system is calculated with the coordinates of the ultraviolet earth. The autonomous navigation data of the satellite are calculated by the integrated sensor with the optical axis direction of FOV1 and the center vector of earth from FOV2. The position accuracy of the autonomous navigation for the satellite is improved from 1000 meters to 300 meters. The velocity accuracy of the autonomous navigation for the satellite is improved from 100 m/s to 20 m/s. At the same time, the periodic sine errors of the autonomous navigation for the satellite are eliminated. The autonomous navigation for the satellite with a sensor that integrates an ultraviolet earth sensor and a star sensor is highly robust.

  18. New vision system and navigation algorithm for an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Tann, Hokchhay; Shakya, Bicky; Merchen, Alex C.; Williams, Benjamin C.; Khanal, Abhishek; Zhao, Jiajia; Ahlgren, David J.

    2013-12-01

    Improvements were made to the intelligence algorithms of an autonomously operating ground vehicle, Q, which competed in the 2013 Intelligent Ground Vehicle Competition (IGVC). The IGVC required the vehicle to first navigate between two white lines on a grassy obstacle course, then pass through eight GPS waypoints, and pass through a final obstacle field. Modifications to Q included a new vision system with a more effective image processing algorithm for white line extraction. The path-planning algorithm was adapted to the new vision system, creating smoother, more reliable navigation. With these improvements, Q successfully completed the basic autonomous navigation challenge, finishing tenth out of over 50 teams.
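
    A white-line extraction stage of the kind described can be sketched with standard OpenCV calls: threshold bright, low-saturation pixels (white paint on grass) and fit line segments with a probabilistic Hough transform. The color bounds and Hough parameters below are placeholder values, not those tuned on Q.

```python
import cv2
import numpy as np

def extract_white_lines(bgr_image):
    """Return line segments likely to be painted course boundaries."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Low saturation, high value -> white-ish pixels (tune for lighting).
    mask = cv2.inRange(hsv, (0, 0, 200), (180, 60, 255))
    mask = cv2.GaussianBlur(mask, (5, 5), 0)
    lines = cv2.HoughLinesP(mask, rho=1, theta=np.pi / 180,
                            threshold=60, minLineLength=40, maxLineGap=15)
    return [] if lines is None else [tuple(l[0]) for l in lines]

# Usage (assuming 'frame.jpg' is a camera image from the course):
# segments = extract_white_lines(cv2.imread("frame.jpg"))
```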

  19. Reactive navigation for autonomous guided vehicle using neuro-fuzzy techniques

    NASA Astrophysics Data System (ADS)

    Cao, Jin; Liao, Xiaoqun; Hall, Ernest L.

    1999-08-01

    A neuro-fuzzy control method for navigation of an Autonomous Guided Vehicle robot is described. Robot navigation is defined as the guiding of a mobile robot to a desired destination or along a desired path in an environment characterized by terrain and a set of distinct objects, such as obstacles and landmarks. The autonomous navigation ability and road-following precision are mainly influenced by the control strategy and real-time control performance. Neural network and fuzzy logic control techniques can improve real-time control performance for a mobile robot due to their high robustness and error tolerance. For a mobile robot to navigate automatically and rapidly, an important factor is to identify and classify the robot's current perceptual environment. In this paper, a new approach to identification and classification of the current perceptual environment's features, based on the analysis of a classifying neural network and a neuro-fuzzy algorithm, is presented. The significance of this work lies in the development of a new method for mobile robot navigation.

  20. Spatial abstraction for autonomous robot navigation.

    PubMed

    Epstein, Susan L; Aroor, Anoop; Evanusa, Matthew; Sklar, Elizabeth I; Parsons, Simon

    2015-09-01

    Optimal navigation for a simulated robot relies on a detailed map and explicit path planning, an approach problematic for real-world robots that are subject to noise and error. This paper reports on autonomous robots that rely on local spatial perception, learning, and commonsense rationales instead. Despite realistic actuator error, learned spatial abstractions form a model that supports effective travel.

  1. UGV: security analysis of subsystem control network

    NASA Astrophysics Data System (ADS)

    Abbott-McCune, Sam; Kobezak, Philip; Tront, Joseph; Marchany, Randy; Wicks, Al

    2013-05-01

    Unmanned Ground Vehicles (UGVs) are becoming prolific in the heterogeneous superset of robotic platforms. The sensors which provide odometry, localization, perception, and vehicle diagnostics are fused to give the robotic platform a sense of the environment it is traversing. The automotive-industry CAN bus has come to dominate due to its fault tolerance and a message structure that allows high-priority messages to reach the desired node in a real-time environment. UGVs are being researched and produced at an accelerated rate to perform arduous, repetitive, and dangerous missions that are associated with a military action in a protracted conflict. The technology and applications of the research will inevitably be turned into dual-use platforms to aid civil agencies in the performance of their various operations. Our motivation is security of the holistic system; however, as subsystems are outsourced in the design, the overall security of the system may be diminished. We will focus on the CAN bus topology and the vulnerabilities introduced in UGVs and recognizable security vulnerabilities that are inherent in the communications architecture. We will show how data can be extracted from an add-on CAN bus that can be customized to monitor subsystems. The information can be altered or spoofed to force the vehicle to exhibit unwanted actions or render the UGV unusable for the designed mission. The military relies heavily on technology to maintain information dominance, and the security of the information introduced onto the network by UGVs must be safeguarded from vulnerabilities that can be exploited.
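
    The add-on monitoring node described above can be approximated with the python-can library. The sketch below passively logs broadcast frames on an assumed SocketCAN interface named can0, which illustrates the underlying issue: any node on an unauthenticated CAN bus sees (and could equally inject) subsystem traffic. The interface name and watched IDs are assumptions, not values from the paper.

```python
import time
import can   # python-can; assumes a SocketCAN interface named 'can0'

def monitor_subsystem_traffic(duration_s=10.0, watch_ids=None):
    """Passively log CAN frames, optionally filtered to subsystem IDs."""
    bus = can.interface.Bus(channel="can0", bustype="socketcan")
    frames = []
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        msg = bus.recv(timeout=0.5)
        if msg is None:
            continue
        if watch_ids is None or msg.arbitration_id in watch_ids:
            frames.append((msg.timestamp, msg.arbitration_id, bytes(msg.data)))
    bus.shutdown()
    return frames
```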

  2. Developing UGVs for the FCS program

    NASA Astrophysics Data System (ADS)

    Kamsickas, Gary M.; Ward, John N.

    2003-09-01

    The FCS Operational Requirements Document (ORD) identifies unmanned systems as a key component of the FCS Unit of Action. FCS unmanned systems include Unmanned Aerial Vehicles (UAV), Unmanned Ground Vehicles (UGV), Unattended Ground Sensors (UGS) and Unattended Munitions (UM). Unmanned systems are intended to enhance the Unit of Action across the full range of operations when integrated with manned platforms. Unmanned systems will provide the commander with tools to gather battlespace information while significantly reducing overall soldier risk. Unmanned systems will be used in some cases to augment or replace human intervention to perform many of the dirty, dull and dangerous missions presently performed by soldiers and to serve as a combat multiplier for mission performance, force protection and survivability. This paper focuses on the application of UGVs within the FCS Unit of Action. There are three different UGVs planned to support the FCS Unit of Action: the Soldier Unmanned Ground Vehicle (SUGV), the Multi-role Utility Logistics Equipment (MULE) platform, and the Armed Robotic Vehicle (ARV).

  3. Recursive Gradient Estimation Using Splines for Navigation of Autonomous Vehicles.

    DTIC Science & Technology

    1985-07-01

    Final report by C. N. Shen, US Army Armament Research and Development Center, Large Caliber Weapon Systems Laboratory, July 1985. The report addresses recursive gradient estimation using splines for navigation of autonomous vehicles; essential to such robotic vehicles is an adequate and efficient computer vision system.

  4. Guidance and Navigation Software Architecture Design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) Test Bed

    DTIC Science & Technology

    2006-12-01

    Engineer's degree thesis by Blake D. Eikenberry, December 2006: "Guidance and Navigation Software Architecture Design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) Test Bed." Approved for public release; distribution is unlimited.

  5. Autonomous GPS/INS navigation experiment for Space Transfer Vehicle

    NASA Technical Reports Server (NTRS)

    Upadhyay, Triveni N.; Cotterill, Stephen; Deaton, A. W.

    1993-01-01

    An experiment to validate the concept of developing an autonomous integrated spacecraft navigation system using on board Global Positioning System (GPS) and Inertial Navigation System (INS) measurements is described. The feasibility of integrating GPS measurements with INS measurements to provide a total improvement in spacecraft navigation performance, i.e. improvement in position, velocity and attitude information, was previously demonstrated. An important aspect of this research is the automatic real time reconfiguration capability of the system designed to respond to changes in a spacecraft mission under the control of an expert system.

  6. Autonomous GPS/INS navigation experiment for Space Transfer Vehicle

    NASA Astrophysics Data System (ADS)

    Upadhyay, Triveni N.; Cotterill, Stephen; Deaton, A. W.

    1993-07-01

    An experiment to validate the concept of developing an autonomous integrated spacecraft navigation system using on board Global Positioning System (GPS) and Inertial Navigation System (INS) measurements is described. The feasibility of integrating GPS measurements with INS measurements to provide a total improvement in spacecraft navigation performance, i.e. improvement in position, velocity and attitude information, was previously demonstrated. An important aspect of this research is the automatic real time reconfiguration capability of the system designed to respond to changes in a spacecraft mission under the control of an expert system.

  7. Unmanned Ground Vehicle Navigation and Coverage Hole Patching in Wireless Sensor Networks

    ERIC Educational Resources Information Center

    Zhang, Guyu

    2013-01-01

    This dissertation presents a study of an Unmanned Ground Vehicle (UGV) navigation and coverage hole patching in coordinate-free and localization-free Wireless Sensor Networks (WSNs). Navigation and coverage maintenance are related problems since coverage hole patching requires effective navigation in the sensor network environment. A…

  8. UGV technology for urban environments

    NASA Astrophysics Data System (ADS)

    Christensen, Henrik I.; Folkeson, John; Hedstrom, Andreas; Lundberg, Carl

    2004-09-01

    Deployment of humans in an urban setting for search and rescue type missions poses a major risk to the personnel. In rescue missions the risk can stem from debris, gas, etc., and in a strategic setting the risk can stem from snipers, mines, gas, etc. There is consequently a natural interest in studies of how UGV technology can be deployed for tasks such as reconnaissance and retrieval of objects (bombs, injured people, etc.). Today most vehicles used by the military and bomb squads are tele-operated and without any autonomy. This implies that operation of the vehicles is a stressful and demanding task. Part of this stress can be removed through introduction of autonomous functionality. Autonomy implicitly requires use of map information to allow the system to localize and traverse a particular area; in addition, autonomous mapping of an area is a valuable functionality as part of reconnaissance missions to provide an initial inventory of a new area. A host of different sensory modalities can be used for mapping. In general, however, no single modality is sufficient for robust and efficient mapping. In the present study GPS, inertial cues, laser ranging, and odometry are used for simultaneous mapping and localization in urban environments. The mapping is carried out autonomously using a coverage strategy to ensure full mapping of a particular area. In relation to mapping, another important issue is the design of an efficient user interface that allows a regular rescue worker, or a soldier, to operate the vehicle without detailed knowledge about robotics. A number of different designs for user interfaces will be presented, and results from studies with a range of end-users (soldiers) will also be reported. The complete system has been tested in an urban warfare facility outside of Stockholm. Detailed results will be reported from two different test facilities.

  9. Target Trailing With Safe Navigation With Colregs for Maritime Autonomous Surface Vehicles

    NASA Technical Reports Server (NTRS)

    Kuwata, Yoshiaki (Inventor); Aghazarian, Hrand (Inventor); Huntsberger, Terrance L. (Inventor); Howard, Andrew B. (Inventor); Wolf, Michael T. (Inventor); Zarzhitsky, Dimitri V. (Inventor)

    2014-01-01

    Systems and methods for operating autonomous waterborne vessels in a safe manner. The systems include hardware for identifying the locations and motions of other vessels, as well as the locations of stationary objects that represent navigation hazards. By applying a computational method that uses a maritime navigation algorithm for avoiding hazards and obeying COLREGS using Velocity Obstacles to the data obtained, the autonomous vessel computes a safe and effective path to be followed in order to accomplish a desired navigational end result, while operating in a manner so as to avoid hazards and to maintain compliance with standard navigational procedures defined by international agreement. The systems and methods have been successfully demonstrated on water with radar and stereo cameras as the perception sensors, and integrated with a higher level planner for trailing a maneuvering target.
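
    A core step of the Velocity Obstacles approach is testing whether a candidate own-ship velocity keeps the closest point of approach to another vessel outside a safety radius. The sketch below shows that test only, under a constant-velocity assumption; the COLREGS preference logic and the search over candidate velocities are omitted, and the numbers are illustrative.

```python
import numpy as np

def violates_velocity_obstacle(own_pos, own_vel, tgt_pos, tgt_vel,
                               safety_radius, horizon=300.0):
    """True if relative motion brings the vessels within safety_radius
    during the planning horizon (a basic velocity-obstacle test)."""
    rel_pos = np.asarray(tgt_pos, float) - np.asarray(own_pos, float)
    rel_vel = np.asarray(tgt_vel, float) - np.asarray(own_vel, float)
    speed2 = rel_vel @ rel_vel
    if speed2 < 1e-12:                       # no relative motion
        return np.linalg.norm(rel_pos) < safety_radius
    t_cpa = np.clip(-(rel_pos @ rel_vel) / speed2, 0.0, horizon)
    closest = rel_pos + rel_vel * t_cpa      # relative position at CPA
    return np.linalg.norm(closest) < safety_radius

# Candidate velocities can be screened with this test and, among the safe
# ones, a COLREGS-compliant choice (e.g. preferring starboard turns) made.
print(violates_velocity_obstacle((0, 0), (3, 0), (200, 10), (-3, 0), 25.0))
```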

  10. Fully autonomous navigation for the NASA cargo transfer vehicle

    NASA Technical Reports Server (NTRS)

    Wertz, James R.; Skulsky, E. David

    1991-01-01

    A great deal of attention has been paid to navigation during the close approach (less than or equal to 1 km) phase of spacecraft rendezvous. However, most spacecraft also require a navigation system which provides the necessary accuracy for placing both satellites within the range of the docking sensors. The Microcosm Autonomous Navigation System (MANS) is an on-board system which uses Earth-referenced attitude sensing hardware to provide precision orbit and attitude determination. The system is capable of functioning from LEO to GEO and beyond. Performance depends on the number of available sensors as well as mission geometry; however, extensive simulations have shown that MANS will provide 100 m to 400 m (3(sigma)) position accuracy and 0.03 to 0.07 deg (3(sigma)) attitude accuracy in low Earth orbit. The system is independent of any external source, including GPS. MANS is expected to have a significant impact on ground operations costs, mission definition and design, survivability, and the potential development of very low-cost, fully autonomous spacecraft.

  11. Integrated long-range UAV/UGV collaborative target tracking

    NASA Astrophysics Data System (ADS)

    Moseley, Mark B.; Grocholsky, Benjamin P.; Cheung, Carol; Singh, Sanjiv

    2009-05-01

    Coordinated operations between unmanned air and ground assets allow leveraging of multi-domain sensing and increase opportunities for improving line of sight communications. While numerous military missions would benefit from coordinated UAV-UGV operations, foundational capabilities that integrate stove-piped tactical systems and share available sensor data are required and not yet available. iRobot, AeroVironment, and Carnegie Mellon University are working together, partially SBIR-funded through ARDEC's small unit network lethality initiative, to develop collaborative capabilities for surveillance, targeting, and improved communications based on PackBot UGV and Raven UAV platforms. We integrate newly available technologies into computational, vision, and communications payloads and develop sensing algorithms to support vision-based target tracking. We first simulated and then applied onto real tactical platforms an implementation of Decentralized Data Fusion, a novel technique for fusing track estimates from PackBot and Raven platforms for a moving target in an open environment. In addition, system integration with AeroVironment's Digital Data Link onto both air and ground platforms has extended our capabilities in communications range to operate the PackBot as well as in increased video and data throughput. The system is brought together through a unified Operator Control Unit (OCU) for the PackBot and Raven that provides simultaneous waypoint navigation and traditional teleoperation. We also present several recent capability accomplishments toward PackBot-Raven coordinated operations, including single OCU display design and operation, early target track results, and Digital Data Link integration efforts, as well as our near-term capability goals.
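
    Decentralized Data Fusion combines track estimates whose cross-correlations are unknown; covariance intersection is one standard fusion rule for that situation and is used below purely as an illustrative stand-in (the abstract does not specify this exact rule). The PackBot and Raven target estimates shown are hypothetical numbers.

```python
import numpy as np

def covariance_intersection(x_a, P_a, x_b, P_b, omega=0.5):
    """Fuse two track estimates with unknown cross-correlation.

    omega in [0, 1] weights the two information contributions; in practice
    it is chosen to minimise e.g. the trace of the fused covariance.
    """
    info_a, info_b = np.linalg.inv(P_a), np.linalg.inv(P_b)
    P_fused = np.linalg.inv(omega * info_a + (1.0 - omega) * info_b)
    x_fused = P_fused @ (omega * info_a @ x_a + (1.0 - omega) * info_b @ x_b)
    return x_fused, P_fused

# Hypothetical 2-D target position tracks from the ground and air platforms.
x_ugv, P_ugv = np.array([12.0, 4.0]), np.diag([4.0, 4.0])   # PackBot estimate
x_uav, P_uav = np.array([13.5, 3.2]), np.diag([1.0, 9.0])   # Raven estimate
print(covariance_intersection(x_ugv, P_ugv, x_uav, P_uav))
```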

  12. Vision Based Navigation for Autonomous Cooperative Docking of CubeSats

    NASA Astrophysics Data System (ADS)

    Pirat, Camille; Ankersen, Finn; Walker, Roger; Gass, Volker

    2018-05-01

    A realistic rendezvous and docking navigation solution applicable to CubeSats is investigated. The scalability analysis of the ESA Autonomous Transfer Vehicle Guidance, Navigation & Control (GNC) performances and the Russian docking system shows that the docking of two CubeSats would require a lateral control performance of the order of 1 cm. Line of sight constraints and multipath effects affecting Global Navigation Satellite System (GNSS) measurements in close proximity prevent the use of this sensor for the final approach. This consideration and the high control accuracy requirement led to the use of vision sensors for the final 10 m of the rendezvous and docking sequence. A single monocular camera on the chaser satellite and various sets of Light-Emitting Diodes (LEDs) on the target vehicle ensure the observability of the system throughout the approach trajectory. The simple and novel formulation of the measurement equations allows unambiguously differentiating rotations from translations between the target and chaser docking ports and allows a navigation performance better than 1 mm at docking. Furthermore, the non-linear measurement equations can be solved in order to provide an analytic navigation solution. This solution can be used to monitor the navigation filter solution and ensure its stability, adding an extra layer of robustness for autonomous rendezvous and docking. The navigation filter initialization is addressed in detail. The proposed method is able to differentiate LED signals from Sun reflections, as demonstrated by experimental data. The navigation filter uses comprehensive linearised coupled rotation/translation dynamics, describing the chaser-to-target docking port motion. The handover between GNSS and vision sensor measurements is assessed. The performance of the navigation function along the approach trajectory is discussed.
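
    Recovering the chaser-to-target pose from detected LED image points and their known positions on the docking face is a perspective-n-point problem. The sketch below uses OpenCV's generic solvePnP as a stand-in for the paper's analytic formulation; the LED layout, the camera matrix, and the detection step are assumptions supplied by the caller.

```python
import cv2
import numpy as np

# Assumed LED positions on the target docking face, target frame, metres.
led_points_3d = np.array([[ 0.05,  0.05, 0.0],
                          [-0.05,  0.05, 0.0],
                          [-0.05, -0.05, 0.0],
                          [ 0.05, -0.05, 0.0],
                          [ 0.00,  0.00, 0.02]], dtype=np.float64)

def estimate_relative_pose(led_pixels, camera_matrix, dist_coeffs=None):
    """Recover target-to-camera rotation and translation from detected LEDs."""
    if dist_coeffs is None:
        dist_coeffs = np.zeros(5)
    ok, rvec, tvec = cv2.solvePnP(led_points_3d,
                                  np.asarray(led_pixels, dtype=np.float64),
                                  camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("PnP failed; check LED correspondences")
    rotation, _ = cv2.Rodrigues(rvec)      # 3x3 rotation matrix
    return rotation, tvec                  # relative attitude and position
```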

  13. INS integrated motion analysis for autonomous vehicle navigation

    NASA Technical Reports Server (NTRS)

    Roberts, Barry; Bazakos, Mike

    1991-01-01

    The use of inertial navigation system (INS) measurements to enhance the quality and robustness of motion analysis techniques used for obstacle detection is discussed with particular reference to autonomous vehicle navigation. The approach to obstacle detection used here employs motion analysis of imagery generated by a passive sensor. Motion analysis of imagery obtained during vehicle travel is used to generate range measurements to points within the field of view of the sensor, which can then be used to provide obstacle detection. Results obtained with an INS integrated motion analysis approach are reviewed.

  14. Autonomous Navigation Results from the Mars Exploration Rover (MER) Mission

    NASA Technical Reports Server (NTRS)

    Maimone, Mark; Johnson, Andrew; Cheng, Yang; Willson, Reg; Matthies, Larry H.

    2004-01-01

    In January 2004, the Mars Exploration Rover (MER) mission landed two rovers, Spirit and Opportunity, on the surface of Mars. Several autonomous navigation capabilities were employed in space for the first time in this mission. In the Entry, Descent, and Landing (EDL) phase, both landers used a vision system called the Descent Image Motion Estimation System (DIMES) to estimate horizontal velocity during the last 2000 meters (m) of descent, by tracking features on the ground with a downward-looking camera, in order to control retro-rocket firing to reduce horizontal velocity before impact. During surface operations, the rovers navigate autonomously using stereo vision for local terrain mapping and a local, reactive planning algorithm called Grid-based Estimation of Surface Traversability Applied to Local Terrain (GESTALT) for obstacle avoidance. In areas of high slip, stereo vision-based visual odometry has been used to estimate rover motion. As of mid-June, Spirit had traversed 3405 m, of which 1253 m were done autonomously; Opportunity had traversed 1264 m, of which 224 m were autonomous. These results have contributed substantially to the success of the mission and paved the way for increased levels of autonomy in future missions.

  15. Autonomous celestial navigation based on Earth ultraviolet radiance and fast gradient statistic feature extraction

    NASA Astrophysics Data System (ADS)

    Lu, Shan; Zhang, Hanmo

    2016-01-01

    To meet the requirement of autonomous orbit determination, this paper proposes a fast curve fitting method based on Earth ultraviolet features to obtain an accurate Earth vector direction, in order to achieve high-precision autonomous navigation. Firstly, combining the stable character of Earth ultraviolet radiance with atmospheric radiative transfer modelling software, the paper simulates the Earth ultraviolet radiation model at different times and chooses the proper observation band. Then a fast, improved edge extraction method combining the Sobel operator and local binary patterns (LBP) is utilized, which both eliminates noise efficiently and extracts Earth ultraviolet limb features accurately. The Earth's centroid location in the simulated images is then estimated via least-squares fitting using part of the limb edge. Taking advantage of the estimated Earth vector direction and Earth distance, an Extended Kalman Filter (EKF) is finally applied to realize the autonomous navigation. Experimental results indicate the proposed method achieves sub-pixel Earth centroid location estimation and greatly enhances autonomous celestial navigation precision.
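
    The limb-based centroid estimation can be illustrated with an algebraic (Kasa) least-squares circle fit to the extracted ultraviolet limb pixels. The edge-extraction and EKF stages are omitted here, and the synthetic arc stands in for real limb data; this is a generic fit, not necessarily the exact formulation used in the paper.

```python
import numpy as np

def fit_limb_circle(edge_points):
    """Algebraic least-squares circle fit (Kasa method) to limb edge pixels.

    Solves x^2 + y^2 + D*x + E*y + F = 0 for D, E, F; the circle centre
    (-D/2, -E/2) is the estimated Earth-centroid location in the image.
    """
    pts = np.asarray(edge_points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x**2 + y**2)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    centre = np.array([-D / 2.0, -E / 2.0])
    radius = np.sqrt(centre @ centre - F)
    return centre, radius

# Synthetic partial limb: a 90-degree arc of a circle centred at (320, 240).
theta = np.linspace(0.0, np.pi / 2, 50)
arc = np.column_stack([320 + 200 * np.cos(theta), 240 + 200 * np.sin(theta)])
print(fit_limb_circle(arc))   # ~ (array([320., 240.]), 200.0)
```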

  16. Autonomous GPS/INS navigation experiment for Space Transfer Vehicle (STV)

    NASA Technical Reports Server (NTRS)

    Upadhyay, Triveni N.; Cotterill, Stephen; Deaton, A. Wayne

    1991-01-01

    An experiment to validate the concept of developing an autonomous integrated spacecraft navigation system using on board Global Positioning System (GPS) and Inertial Navigation System (INS) measurements is described. The feasibility of integrating GPS measurements with INS measurements to provide a total improvement in spacecraft navigation performance, i.e. improvement in position, velocity and attitude information, was previously demonstrated. An important aspect of this research is the automatic real time reconfiguration capability of the system designed to respond to changes in a spacecraft mission under the control of an expert system.

  17. Linked Autonomous Interplanetary Satellite Orbit Navigation

    NASA Technical Reports Server (NTRS)

    Parker, Jeffrey S.; Anderson, Rodney L.; Born, George H.; Leonard, Jason M.; McGranaghan, Ryan M.; Fujimoto, Kohei

    2013-01-01

    A navigation technology known as LiAISON (Linked Autonomous Interplanetary Satellite Orbit Navigation) has been known to produce very impressive navigation results for scenarios involving two or more cooperative satellites near the Moon, such that at least one satellite must be in an orbit significantly perturbed by the Earth, such as a lunar halo orbit. The two (or more) satellites track each other using satellite-to-satellite range and/or range-rate measurements. These relative measurements yield absolute orbit navigation when one of the satellites is in a lunar halo orbit, or the like. The geometry between a lunar halo orbiter and a GEO satellite continuously changes, which dramatically improves the information content of a satellite-to-satellite tracking signal. The geometrical variations include significant out-of-plane shifts, as well as in-plane shifts. Further, the GEO satellite is almost continuously in view of a lunar halo orbiter. High-fidelity simulations demonstrate that LiAISON technology improves the navigation of GEO orbiters by an order of magnitude, relative to standard ground tracking. If a GEO satellite is navigated using LiAISON-only tracking measurements, its position is typically known to better than 10 meters. If LiAISON measurements are combined with simple radiometric ground observations, then the satellite's position is typically known to better than 3 meters, which is substantially better than the current state of GEO navigation. There are two features of LiAISON that are novel and advantageous compared with conventional satellite navigation. First, ordinary satellite-to-satellite tracking data only provides relative navigation of each satellite. The novelty is the placement of one navigation satellite in an orbit that is significantly perturbed by both the Earth and the Moon. A navigation satellite can track other satellites elsewhere in the Earth-Moon system and acquire knowledge about both satellites' absolute positions and velocities.

  18. Toward a generic UGV autopilot

    NASA Astrophysics Data System (ADS)

    Moore, Kevin L.; Whitehorn, Mark; Weinstein, Alejandro J.; Xia, Junjun

    2009-05-01

    Much of the success of small unmanned air vehicles (UAVs) has arguably been due to the widespread availability of low-cost, portable autopilots. While the development of unmanned ground vehicles (UGVs) has led to significant achievements, as typified by recent grand challenge events, to date the UGV equivalent of the UAV autopilot is not available. In this paper we describe our recent research aimed at the development of a generic UGV autopilot. Assuming we are given a drive-by-wire vehicle that accepts as inputs steering, brake, and throttle commands, we present a system that adds sonar ranging sensors, GPS/IMU/odometry, stereo camera, and scanning laser sensors, together with a variety of interfacing and communication hardware. The system also includes a finite state machine-based software architecture as well as a graphical user interface for the operator control unit (OCU). Algorithms are presented that enable an end-to-end scenario whereby an operator can view stereo images as seen by the vehicle and can input GPS waypoints either from a map or in the vehicle's scene-view image, at which point the system uses the environmental sensors as inputs to a Kalman filter for pose estimation and then computes control actions to move through the waypoint list, while avoiding obstacles. The long-term goal of the research is a system that is generically applicable to any drive-by-wire unmanned ground vehicle.
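
    The waypoint-following portion of such an autopilot reduces, at each control cycle, to steering toward the bearing of the next waypoint using the fused pose estimate. The sketch below shows one such cycle with placeholder gains and thresholds; obstacle avoidance and the Kalman filter pose estimator themselves are omitted, and none of the values come from the system described.

```python
import math

def waypoint_step(pose, waypoint, wp_tolerance=2.0, max_steer=0.5):
    """One control step of a simple waypoint follower.

    pose = (x, y, heading_rad) from the pose estimator (e.g. a Kalman
    filter fusing GPS/IMU/odometry); waypoint = (x, y) in the same frame.
    Returns (steer_cmd, throttle_cmd, reached).
    """
    x, y, heading = pose
    dx, dy = waypoint[0] - x, waypoint[1] - y
    distance = math.hypot(dx, dy)
    if distance < wp_tolerance:
        return 0.0, 0.0, True                        # waypoint reached
    bearing = math.atan2(dy, dx)
    error = math.atan2(math.sin(bearing - heading),  # wrap to [-pi, pi]
                       math.cos(bearing - heading))
    steer = max(-max_steer, min(max_steer, 0.8 * error))
    throttle = 0.4 if abs(error) < 0.5 else 0.2      # slow down for big turns
    return steer, throttle, False

print(waypoint_step((0.0, 0.0, 0.0), (10.0, 5.0)))
```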

  19. Vegetation Versus Man-Made Object Detection from Imagery for Unmanned Vehicles in Off-Road Environments

    DTIC Science & Technology

    2013-05-01

    Keywords: saliency, natural scene statistics. Research into autonomous navigation for unmanned ground vehicles (UGVs) has accelerated in recent years, partly due to the success of programs such as the DARPA Grand Challenge and the dream of driverless cars. There have been several major advances in autonomous navigation for unmanned ground vehicles in controlled urban environments.

  20. Improved obstacle avoidance and navigation for an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Giri, Binod; Cho, Hyunsu; Williams, Benjamin C.; Tann, Hokchhay; Shakya, Bicky; Bharam, Vishal; Ahlgren, David J.

    2015-01-01

    This paper presents improvements made to the intelligence algorithms employed on Q, an autonomous ground vehicle, for the 2014 Intelligent Ground Vehicle Competition (IGVC). In 2012, the IGVC committee combined the formerly separate autonomous and navigation challenges into a single AUT-NAV challenge. In this new challenge, the vehicle is required to navigate through a grassy obstacle course and stay within the course boundaries (a lane of two white painted lines) that guide it toward a given GPS waypoint. Once the vehicle reaches this waypoint, it enters an open course where it is required to navigate to another GPS waypoint while avoiding obstacles. After reaching the final waypoint, the vehicle is required to traverse another obstacle course before completing the run. Q uses a modular parallel software architecture in which image processing, navigation, and sensor control algorithms run concurrently. A tuned navigation algorithm allows Q to smoothly maneuver through obstacle fields. For the 2014 competition, most revisions occurred in the vision system, which detects white lines and informs the navigation component. Barrel obstacles of various colors presented a new challenge for image processing: the previous color plane extraction algorithm would not suffice. To overcome this difficulty, laser range sensor data were overlaid on visual data. Q also participates in the Joint Architecture for Unmanned Systems (JAUS) challenge at IGVC. For 2014, significant updates were implemented: the JAUS component accepted a greater variety of messages and showed better compliance with the JAUS technical standard. With these improvements, Q secured second place in the JAUS competition.

  1. Robust Planning for Autonomous Navigation of Mobile Robots in Unstructured, Dynamic Environments: An LDRD Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    EISLER, G. RICHARD

    This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled ''Robust Planning for Autonomous Navigation of Mobile Robots In Unstructured, Dynamic Environments (AutoNav)''. The project goal was to develop an algorithmic-driven, multi-spectral approach to point-to-point navigation characterized by: segmented on-board trajectory planning, self-contained operation without human support for mission duration, and the development of appropriate sensors and algorithms to navigate unattended. The project was partially successful in achieving gains in sensing, path planning, navigation, and guidance. One of three experimental platforms, the Minimalist Autonomous Testbed, used a repetitive sense-and-re-plan combination to demonstrate the majority of elements necessary for autonomous navigation. However, a critical goal for overall success in arbitrary terrain, that of developing a sensor that is able to distinguish true obstacles that need to be avoided as a function of vehicle scale, still needs substantial research to bring to fruition.

  2. A Feedforward Control Approach to the Local Navigation Problem for Autonomous Vehicles

    DTIC Science & Technology

    1994-05-02

    Report AD-A282 787: "A Feedforward Control Approach to the Local Navigation Problem for Autonomous Vehicles," Alonzo Kelly, CMU-RI-TR-94-17, The Robotics Institute, May 1994. Because the system must be given a path to follow, or a direction to prefer, and cannot generate its own strategic goals, it solves the local planning problem for autonomous vehicles. It is intelligent because it uses range images that are generated from either a laser rangefinder or stereo triangulation.

  3. Lunar far side surface navigation using Linked Autonomous Interplanetary Satellite Orbit Navigation (LiAISON)

    NASA Astrophysics Data System (ADS)

    Hesar, Siamak G.; Parker, Jeffrey S.; Leonard, Jason M.; McGranaghan, Ryan M.; Born, George H.

    2015-12-01

    We study the application of Linked Autonomous Interplanetary Satellite Orbit Navigation (LiAISON) to track vehicles on the far side of the lunar surface. The LiAISON architecture is demonstrated to achieve accurate orbit determination solutions for various mission scenarios in the Earth-Moon system. Given the proper description of the force field, LiAISON is capable of producing absolute orbit determination solutions using relative satellite-to-satellite tracking observations alone. The lack of direct communication between Earth-based tracking stations and the far side of the Moon provides an ideal opportunity for implementing LiAISON. This paper presents a novel approach to use the LiAISON architecture to perform autonomous navigation of assets on the lunar far side surface. Relative measurements between a spacecraft placed in an EML-2 halo orbit and lunar surface asset(s) are simulated and processed. Comprehensive simulation results show that absolute states of the surface assets are observable with an achieved accuracy of the position estimate on the order of tens of meters.

  4. Autonomous vehicle navigation utilizing fuzzy controls concepts for a next generation wheelchair.

    PubMed

    Hansen, J D; Barrett, S F; Wright, C H G; Wilcox, M

    2008-01-01

    Three different positioning techniques were investigated to create an autonomous vehicle that could accurately navigate towards a goal: Global Positioning System (GPS), compass dead reckoning, and Ackerman steering. Each technique utilized a fuzzy logic controller that maneuvered a four-wheel car towards a target. The reliability and the accuracy of the navigation methods were investigated by modeling the algorithms in software and implementing them in hardware. To implement the techniques in hardware, positioning sensors were interfaced to a remote control car and a microprocessor. The microprocessor utilized the sensor measurements to orient the car with respect to the target. Next, a fuzzy logic control algorithm adjusted the front wheel steering angle to minimize the difference between the heading and bearing. After minimizing the heading error, the car maintained a straight steering angle along its path to the final destination. The results of this research can be used to develop applications that require precise navigation. The design techniques can also be implemented on alternate platforms such as a wheelchair to assist with autonomous navigation.
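
    The navigation loop described (GPS-derived bearing to the target, compass heading, steering to shrink the heading error) can be sketched as below. The great-circle bearing formula is standard; the steering rule is a coarse, crisp-rule stand-in for the fuzzy controller, with made-up breakpoints and angles.

```python
import math

def bearing_to_target(lat, lon, tgt_lat, tgt_lon):
    """Great-circle initial bearing (degrees) from the current fix to the target."""
    phi1, phi2 = math.radians(lat), math.radians(tgt_lat)
    dlon = math.radians(tgt_lon - lon)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0

def steer_from_heading_error(heading_error_deg):
    """Map heading error to a front-wheel angle with coarse rule classes."""
    e = (heading_error_deg + 180.0) % 360.0 - 180.0       # wrap to [-180, 180)
    if abs(e) < 5.0:
        return 0.0                                        # "on course"
    if abs(e) < 30.0:
        return math.copysign(10.0, e)                     # "small correction"
    return math.copysign(25.0, e)                         # "large correction"

bearing = bearing_to_target(41.31, -105.58, 41.32, -105.57)
print(steer_from_heading_error(bearing - 90.0))   # compass heading assumed 90 deg
```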

  5. Autonomous Navigation Above the GNSS Constellations and Beyond: GPS Navigation for the Magnetospheric Multiscale Mission and SEXTANT Pulsar Navigation Demonstration

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke

    2017-01-01

    This talk will describe two first-of-their-kind technology demonstrations attached to ongoing NASA science missions, both of which aim to extend the range of autonomous spacecraft navigation far from the Earth. First, we will describe the onboard GPS navigation system for the Magnetospheric Multiscale (MMS) mission which is currently operating in elliptic orbits reaching nearly halfway to the Moon. The MMS navigation system is a key outgrowth of a larger effort at NASA Goddard Space Flight Center to advance high-altitude Global Navigation Satellite System (GNSS) navigation on multiple fronts, including developing Global Positioning System receivers and onboard navigation software, running simulation studies, and leading efforts to characterize and protect signals at high-altitude in the so-called GNSS Space-Service Volume (SSV). In the second part of the talk, we will describe the Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) mission that aims to make the first in-space demonstration of X-ray pulsar navigation (XNAV). SEXTANT is attached to the NASA astrophysics mission Neutron-star Interior Composition ExploreR (NICER) whose International Space Station mounted X-ray telescope is investigating the fundamental physics of extremes in gravity, material density, and electromagnetic fields found in neutron stars, and whose instrument provides a nearly ideal navigation sensor for XNAV.

  6. An Effective Terrain Aided Navigation for Low-Cost Autonomous Underwater Vehicles.

    PubMed

    Zhou, Ling; Cheng, Xianghong; Zhu, Yixian; Dai, Chenxi; Fu, Jinbo

    2017-03-25

    Terrain-aided navigation is a potentially powerful solution for obtaining submerged position fixes for autonomous underwater vehicles. The application of terrain-aided navigation with high-accuracy inertial navigation systems has demonstrated meter-level navigation accuracy in sea trials. However, available sensors may be limited depending on the type of the mission. Such limitations, especially for low-grade navigation sensors, not only degrade the accuracy of traditional navigation systems, but further impact the ability to successfully employ terrain-aided navigation. To address this problem, a tightly-coupled navigation approach is presented to successfully estimate the critical sensor errors by incorporating raw sensor data directly into an augmented navigation system. Furthermore, three-dimensional distance errors are calculated, providing measurement updates through the particle filter for absolute and bounded position error. The development of the terrain-aided navigation system is elaborated for a vehicle equipped with a non-inertial-grade strapdown inertial navigation system, a 4-beam Doppler Velocity Log range sensor and a sonar altimeter. Using experimental data for navigation performance evaluation in areas with different terrain characteristics, the experimental results further show that the proposed method can be successfully applied to low-cost AUVs and significantly improves navigation performance.
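
    The particle-filter measurement update in such a system can be sketched as follows: each particle's predicted altimeter clearance is looked up from the reference terrain map and compared with the sonar measurement, then the weights are updated and resampled when they degenerate. The toy terrain function, noise level, and resampling trigger below are illustrative assumptions only.

```python
import numpy as np

def terrain_measurement_update(particles, weights, measured_clearance,
                               depth_at, vehicle_depth, sigma=1.5):
    """Weight particles by agreement with a sonar-altimeter measurement.

    depth_at(x, y) returns seafloor depth from the reference terrain map;
    the predicted clearance for each particle is map depth minus the
    vehicle's own depth (e.g. from a pressure sensor).
    """
    predicted = np.array([depth_at(x, y) for x, y in particles]) - vehicle_depth
    residual = measured_clearance - predicted
    weights = weights * np.exp(-0.5 * (residual / sigma) ** 2)
    weights /= weights.sum()

    # Systematic resampling when the effective sample size collapses.
    n = len(weights)
    if 1.0 / np.sum(weights ** 2) < n / 2:
        positions = (np.arange(n) + np.random.random()) / n
        idx = np.searchsorted(np.cumsum(weights), positions)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights

# Toy terrain: a gentle east-west slope, depth in metres.
depth_at = lambda x, y: 50.0 + 0.02 * x
particles = np.random.uniform([0, 0], [1000, 1000], size=(500, 2))
weights = np.full(500, 1.0 / 500)
particles, weights = terrain_measurement_update(
    particles, weights, measured_clearance=40.0,
    depth_at=depth_at, vehicle_depth=20.0)
```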

  7. Autonomous Navigation Error Propagation Assessment for Lunar Surface Mobility Applications

    NASA Technical Reports Server (NTRS)

    Welch, Bryan W.; Connolly, Joseph W.

    2006-01-01

    The NASA Vision for Space Exploration is focused on the return of astronauts to the Moon. While navigation systems have already been proven in the Apollo missions to the moon, the current exploration campaign will involve more extensive and extended missions requiring new concepts for lunar navigation. In this document, the results of an autonomous navigation error propagation assessment are provided. The analysis is intended to be the baseline error propagation analysis for which Earth-based and Lunar-based radiometric data are added to compare these different architecture schemes, and quantify the benefits of an integrated approach, in how they can handle lunar surface mobility applications when near the Lunar South pole or on the Lunar Farside.

  8. An Effective Terrain Aided Navigation for Low-Cost Autonomous Underwater Vehicles

    PubMed Central

    Zhou, Ling; Cheng, Xianghong; Zhu, Yixian; Dai, Chenxi; Fu, Jinbo

    2017-01-01

    Terrain-aided navigation is a potentially powerful solution for obtaining submerged position fixes for autonomous underwater vehicles. The application of terrain-aided navigation with high-accuracy inertial navigation systems has demonstrated meter-level navigation accuracy in sea trials. However, available sensors may be limited depending on the type of the mission. Such limitations, especially for low-grade navigation sensors, not only degrade the accuracy of traditional navigation systems, but further impact the ability to successfully employ terrain-aided navigation. To address this problem, a tightly-coupled navigation approach is presented to successfully estimate the critical sensor errors by incorporating raw sensor data directly into an augmented navigation system. Furthermore, three-dimensional distance errors are calculated, providing measurement updates through the particle filter for absolute and bounded position error. The development of the terrain-aided navigation system is elaborated for a vehicle equipped with a non-inertial-grade strapdown inertial navigation system, a 4-beam Doppler Velocity Log range sensor and a sonar altimeter. Using experimental data for navigation performance evaluation in areas with different terrain characteristics, the experimental results further show that the proposed method can be successfully applied to low-cost AUVs and significantly improves navigation performance. PMID:28346346

  9. ATON (Autonomous Terrain-based Optical Navigation) for exploration missions: recent flight test results

    NASA Astrophysics Data System (ADS)

    Theil, S.; Ammann, N.; Andert, F.; Franz, T.; Krüger, H.; Lehner, H.; Lingenauber, M.; Lüdtke, D.; Maass, B.; Paproth, C.; Wohlfeil, J.

    2018-03-01

    Since 2010 the German Aerospace Center has been working on the project Autonomous Terrain-based Optical Navigation (ATON). Its objective is the development of technologies which allow autonomous navigation of spacecraft in orbit around and during landing on celestial bodies like the Moon, planets, asteroids and comets. The project developed different image processing techniques and optical navigation methods as well as sensor data fusion. The setup—which is applicable to many exploration missions—consists of an inertial measurement unit, a laser altimeter, a star tracker and one or multiple navigation cameras. In the past years, several milestones have been achieved. It started with the setup of a simulation environment including the detailed simulation of camera images. This was continued by hardware-in-the-loop tests in the Testbed for Robotic Optical Navigation (TRON) where images were generated by real cameras in a simulated downscaled lunar landing scene. Data were recorded in helicopter flight tests and post-processed in real-time to increase maturity of the algorithms and to optimize the software. Recently, two more milestones have been achieved. In late 2016, the whole navigation system setup was flying on an unmanned helicopter while processing all sensor information onboard in real time. For the latest milestone the navigation system was tested in closed-loop on the unmanned helicopter. For that purpose the ATON navigation system provided the navigation state for the guidance and control of the unmanned helicopter, replacing the GPS-based standard navigation system. The paper will give an introduction to the ATON project and its concept. The methods and algorithms of ATON are briefly described. The flight test results of the latest two milestones are presented and discussed.

  10. Structured Kernel Subspace Learning for Autonomous Robot Navigation.

    PubMed

    Kim, Eunwoo; Choi, Sungjoon; Oh, Songhwai

    2018-02-14

    This paper considers two important problems for autonomous robot navigation in a dynamic environment, where the goal is to predict pedestrian motion and control a robot with the prediction for safe navigation. While there are several methods for predicting the motion of a pedestrian and controlling a robot to avoid incoming pedestrians, it is still difficult to safely navigate in a dynamic environment due to challenges such as the varying quality and complexity of training data with unwanted noise. This paper addresses these challenges simultaneously by proposing a robust kernel subspace learning algorithm based on the recent advances in nuclear-norm and l1-norm minimization. We model the motion of a pedestrian and the robot controller using Gaussian processes. The proposed method efficiently approximates a kernel matrix used in Gaussian process regression by learning a low-rank structured matrix (with symmetric positive semi-definiteness) to find an orthogonal basis, which eliminates the effects of erroneous and inconsistent data. Based on structured kernel subspace learning, we propose a robust motion model and motion controller for safe navigation in dynamic environments. We evaluate the proposed robust kernel learning in various tasks, including regression, motion prediction, and motion control problems, and demonstrate that the proposed learning-based systems are robust against outliers and outperform existing regression and navigation methods.
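
    The idea of replacing a noisy kernel matrix with a low-rank, symmetric positive semi-definite approximation can be sketched very simply. The snippet below is only a stand-in for the structured nuclear-/l1-norm learning in the paper: it uses a plain truncated eigendecomposition, and the RBF kernel, noise level, and rank are assumptions.

```python
import numpy as np

def lowrank_psd_kernel(K, rank):
    """Best (Frobenius-norm) PSD approximation of K with the given rank:
    keep the top non-negative eigenpairs."""
    K = 0.5 * (K + K.T)                     # symmetrize against numerical noise
    vals, vecs = np.linalg.eigh(K)
    order = np.argsort(vals)[::-1][:rank]
    vals = np.clip(vals[order], 0.0, None)  # enforce positive semi-definiteness
    return (vecs[:, order] * vals) @ vecs[:, order].T

# Noisy RBF kernel on 1-D inputs; the low-rank PSD estimate keeps the smooth
# structure Gaussian process regression needs while staying a valid kernel.
x = np.linspace(0, 1, 50)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.1 ** 2)
K_noisy = K + 0.05 * np.random.default_rng(0).standard_normal(K.shape)
K_hat = lowrank_psd_kernel(K_noisy, rank=15)
print(np.linalg.matrix_rank(K_hat), np.linalg.eigvalsh(K_hat).min() >= -1e-10)
```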

  11. A Self-Tuning Kalman Filter for Autonomous Spacecraft Navigation

    NASA Technical Reports Server (NTRS)

    Truong, Son H.

    1998-01-01

    Most navigation systems currently operated by NASA are ground-based, and require extensive support to produce accurate results. Recently developed systems that use Kalman Filter and Global Positioning System (GPS) data for orbit determination greatly reduce dependency on ground support, and have potential to provide significant economies for NASA spacecraft navigation. Current techniques of Kalman filtering, however, still rely on manual tuning from analysts, and cannot help in optimizing autonomy without compromising accuracy and performance. This paper presents an approach to produce a high-accuracy autonomous navigation system fully integrated with the flight system. The resulting system performs real-time state estimation by using an Extended Kalman Filter (EKF) implemented with a high-fidelity state dynamics model, as does the GPS Enhanced Orbit Determination Experiment (GEODE) system developed by the NASA Goddard Space Flight Center. Augmented to the EKF is a sophisticated neural-fuzzy system, which combines the explicit knowledge representation of fuzzy logic with the learning power of neural networks. The fuzzy-neural system provides most of the self-tuning capability and helps the navigation system recover from estimation errors. The core requirement is a method of state estimation that handles uncertainties robustly, is capable of identifying estimation problems, is flexible enough to make decisions and adjustments to recover from these problems, and is compact enough to run on flight hardware. The resulting system can be extended to support geosynchronous spacecraft and high-eccentricity orbits. Mathematical methodology, systems and operations concepts, and implementation of a system prototype are presented in this paper. Results from using the prototype to evaluate the implemented optimal control algorithms are discussed. Test data and major control issues (e.g., how to define specific roles for fuzzy logic to support the self-learning capability) are also
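
    As a rough illustration of the self-tuning idea, the sketch below pairs a small Kalman filter with an innovation-based rule that adapts the process-noise covariance. The rule is only a crude stand-in for the neural-fuzzy tuner described in the record, and all models, gains, and thresholds are invented.

```python
import numpy as np

class SelfTuningKF:
    """Kalman filter whose process noise Q is adapted from innovation statistics
    (a hypothetical stand-in for the fuzzy-neural tuner, not GEODE's design)."""

    def __init__(self, x0, P0, Q, R):
        self.x, self.P, self.Q, self.R = x0, P0, Q, R
        self.nis_avg = 1.0                      # running normalized innovation squared

    def step(self, u, z, F, H):
        # Predict (linear models for brevity; a full EKF would linearize f and h)
        self.x = F @ self.x + u
        self.P = F @ self.P @ F.T + self.Q
        # Innovation and its covariance
        y = z - H @ self.x
        S = H @ self.P @ H.T + self.R
        nis = float(y @ np.linalg.solve(S, y)) / len(y)
        self.nis_avg = 0.95 * self.nis_avg + 0.05 * nis
        # "Self-tuning": inflate Q when innovations run hot, tighten when quiet
        if self.nis_avg > 3.0:
            self.Q = self.Q * 1.5
        elif self.nis_avg < 0.5:
            self.Q = self.Q * 0.9
        # Update
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(len(self.x)) - K @ H) @ self.P
        return self.x

kf = SelfTuningKF(x0=np.zeros(2), P0=np.eye(2), Q=0.01 * np.eye(2), R=0.5 * np.eye(1))
print(kf.step(u=np.zeros(2), z=np.array([1.2]), F=np.eye(2), H=np.array([[1.0, 0.0]])))
```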

  12. Autonomous Vision Navigation for Spacecraft in Lunar Orbit

    NASA Astrophysics Data System (ADS)

    Bader, Nolan A.

    NASA aims to achieve unprecedented navigational reliability for the first manned lunar mission of the Orion spacecraft in 2023. A technique for accomplishing this is to integrate autonomous feature tracking as an added means of improving position and velocity estimation. In this thesis, a template matching algorithm and optical sensor are tested onboard three simulated lunar trajectories using linear covariance techniques under various conditions. A preliminary characterization of the camera gives insight into its ability to determine azimuth and elevation angles to points on the surface of the Moon. A navigation performance analysis shows that an optical camera sensor can aid in decreasing position and velocity errors, particularly in a loss of communication scenario. Furthermore, it is found that camera quality and computational capability are driving factors affecting the performance of such a system.

  13. GPS navigation algorithms for Autonomous Airborne Refueling of Unmanned Air Vehicles

    NASA Astrophysics Data System (ADS)

    Khanafseh, Samer Mahmoud

    Unmanned Air Vehicles (UAVs) have recently generated great interest because of their potential to perform hazardous missions without risking loss of life. If autonomous airborne refueling is possible for UAVs, mission range and endurance will be greatly enhanced. However, concerns about UAV-tanker proximity, dynamic mobility and safety demand that the relative navigation system meets stringent requirements on accuracy, integrity, and continuity. In response, this research focuses on developing high-performance GPS-based navigation architectures for Autonomous Airborne Refueling (AAR) of UAVs. The AAR mission is unique because of the potentially severe sky blockage introduced by the tanker. To address this issue, a high-fidelity dynamic sky blockage model was developed and experimentally validated. In addition, robust carrier phase differential GPS navigation algorithms were derived, including a new method for high-integrity reacquisition of carrier cycle ambiguities for recently-blocked satellites. In order to evaluate navigation performance, world-wide global availability and sensitivity covariance analyses were conducted. The new navigation algorithms were shown to be sufficient for turn-free scenarios, but improvement in performance was necessary to meet the difficult requirements for a general refueling mission with banked turns. Therefore, several innovative methods were pursued to enhance navigation performance. First, a new theoretical approach was developed to quantify the position-domain integrity risk in cycle ambiguity resolution problems. A mechanism to implement this method with partially-fixed cycle ambiguity vectors was derived, and it was used to define tight upper bounds on AAR navigation integrity risk. A second method, where a new algorithm for optimal fusion of measurements from multiple antennas was developed, was used to improve satellite coverage in poor visibility environments such as in AAR. Finally, methods for using data-link extracted

  14. A Single RF Emitter-Based Indoor Navigation Method for Autonomous Service Robots.

    PubMed

    Sherwin, Tyrone; Easte, Mikala; Chen, Andrew Tzer-Yeu; Wang, Kevin I-Kai; Dai, Wenbin

    2018-02-14

    Location-aware services are one of the key elements of modern intelligent applications. Numerous real-world applications such as factory automation, indoor delivery, and even search and rescue scenarios require autonomous robots to have the ability to navigate in an unknown environment and reach mobile targets with minimal or no prior infrastructure deployment. This research investigates and proposes a novel approach of dynamic target localisation using a single RF emitter, which will be used as the basis of allowing autonomous robots to navigate towards and reach a target. Through the use of multiple directional antennae, Received Signal Strength (RSS) is compared to determine the most probable direction of the targeted emitter, which is combined with the distance estimates to improve the localisation performance. The accuracy of the position estimate is further improved using a particle filter to mitigate the fluctuating nature of real-time RSS data. Based on the direction information, a motion control algorithm is proposed, using Simultaneous Localisation and Mapping (SLAM) and A* path planning to enable navigation through unknown complex environments. A number of navigation scenarios were developed in the context of factory automation applications to demonstrate and evaluate the functionality and performance of the proposed system.
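
    The direction-and-range estimation step lends itself to a compact sketch. The antenna geometry, path-loss constants, and weighting scheme below are assumptions rather than the authors' parameters, and the particle filter, SLAM, and A* stages described above are omitted.

```python
import math

def bearing_from_rss(boresights_deg, rss_dbm):
    """Circular weighted mean of antenna boresight directions, weighted by
    linear (mW) received power; returns a bearing in degrees."""
    weights = [10 ** (r / 10.0) for r in rss_dbm]                 # dBm -> mW
    sx = sum(w * math.cos(math.radians(b)) for w, b in zip(weights, boresights_deg))
    sy = sum(w * math.sin(math.radians(b)) for w, b in zip(weights, boresights_deg))
    return math.degrees(math.atan2(sy, sx)) % 360.0

def range_from_rss(rss_dbm, tx_power_dbm=0.0, path_loss_exp=2.2, d0=1.0):
    """Invert a log-distance path-loss model; all constants are assumed values."""
    return d0 * 10 ** ((tx_power_dbm - rss_dbm) / (10.0 * path_loss_exp))

# Four antennae facing N/E/S/W; the emitter sits roughly north-east of the robot
print(bearing_from_rss([0, 90, 180, 270], [-42, -45, -70, -68]))  # roughly 27 degrees
print(range_from_rss(-55.0))                                       # ~316 distance units
```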

  15. A Single RF Emitter-Based Indoor Navigation Method for Autonomous Service Robots

    PubMed Central

    Sherwin, Tyrone; Easte, Mikala; Wang, Kevin I-Kai; Dai, Wenbin

    2018-01-01

    Location-aware services are one of the key elements of modern intelligent applications. Numerous real-world applications such as factory automation, indoor delivery, and even search and rescue scenarios require autonomous robots to have the ability to navigate in an unknown environment and reach mobile targets with minimal or no prior infrastructure deployment. This research investigates and proposes a novel approach of dynamic target localisation using a single RF emitter, which will be used as the basis of allowing autonomous robots to navigate towards and reach a target. Through the use of multiple directional antennae, Received Signal Strength (RSS) is compared to determine the most probable direction of the targeted emitter, which is combined with the distance estimates to improve the localisation performance. The accuracy of the position estimate is further improved using a particle filter to mitigate the fluctuating nature of real-time RSS data. Based on the direction information, a motion control algorithm is proposed, using Simultaneous Localisation and Mapping (SLAM) and A* path planning to enable navigation through unknown complex environments. A number of navigation scenarios were developed in the context of factory automation applications to demonstrate and evaluate the functionality and performance of the proposed system. PMID:29443906

  16. Navigation strategies for multiple autonomous mobile robots moving in formation

    NASA Technical Reports Server (NTRS)

    Wang, P. K. C.

    1991-01-01

    The problem of deriving navigation strategies for a fleet of autonomous mobile robots moving in formation is considered. Here, each robot is represented by a particle with a spherical effective spatial domain and a specified cone of visibility. The global motion of each robot in the world space is described by the equations of motion of the robot's center of mass. First, methods for formation generation are discussed. Then, simple navigation strategies for robots moving in formation are derived. A sufficient condition for the stability of a desired formation pattern for a fleet of robots each equipped with the navigation strategy based on nearest neighbor tracking is developed. The dynamic behavior of robot fleets consisting of three or more robots moving in formation in a plane is studied by means of computer simulation.
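
    A toy version of the neighbour-tracking strategy is sketched below. For simplicity the fleet is reduced to a leader-follower chain (each robot tracks the robot ahead of it at a fixed offset), which is a special case of nearest-neighbour tracking when the robots are indexed along the formation; the point-mass model, gains, and offset are illustrative only.

```python
import numpy as np

def formation_step(positions, desired_offset, gain=0.5, dt=0.1):
    """One control update; robot 0 is the leader, robot i tracks robot i-1."""
    new_positions = positions.copy()
    for i in range(1, len(positions)):
        target = positions[i - 1] + desired_offset      # neighbour plus offset
        new_positions[i] = positions[i] + dt * gain * (target - positions[i])
    return new_positions

positions = np.array([[0.0, 0.0], [3.0, 1.0], [1.0, 4.0]])
for _ in range(400):
    positions = formation_step(positions, desired_offset=np.array([1.0, 0.0]))
print(positions)   # followers end up one offset apart in a chain: near (1, 0) and (2, 0)
```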

  17. Autonomous Navigation by a Mobile Robot

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance; Aghazarian, Hrand

    2005-01-01

    ROAMAN is a computer program for autonomous navigation of a mobile robot on a long (as much as hundreds of meters) traversal of terrain. Developed for use aboard a robotic vehicle (rover) exploring the surface of a remote planet, ROAMAN could also be adapted to similar use on terrestrial mobile robots. ROAMAN implements a combination of algorithms for (1) long-range path planning based on images acquired by mast-mounted, wide-baseline stereoscopic cameras, and (2) local path planning based on images acquired by body-mounted, narrow-baseline stereoscopic cameras. The long-range path-planning algorithm autonomously generates a series of waypoints that are passed to the local path-planning algorithm, which plans obstacle-avoiding legs between the waypoints. Both the long- and short-range algorithms use an occupancy-grid representation in computations to detect obstacles and plan paths. Maps that are maintained by the long- and short-range portions of the software are not shared because substantial localization errors can accumulate during any long traverse. ROAMAN is not guaranteed to generate an optimal shortest path, but does maintain the safety of the rover.
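
    The occupancy-grid planning of a single obstacle-avoiding leg between two waypoints can be illustrated with a toy grid search. ROAMAN's actual planners are flight software, so the breadth-first search and hand-made grid below are only a stand-in for the idea, not the mission code.

```python
from collections import deque

def plan_leg(grid, start, goal):
    """grid[r][c] == 1 means occupied; returns a list of (row, col) cells."""
    rows, cols = len(grid), len(grid[0])
    frontier, came_from = deque([start]), {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            break
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    if goal not in came_from:
        return None                       # no safe leg between these waypoints
    path, cell = [], goal
    while cell is not None:               # walk the parent links back to start
        path.append(cell)
        cell = came_from[cell]
    return path[::-1]

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 1, 0],
        [0, 0, 0, 0]]
print(plan_leg(grid, (0, 0), (3, 3)))     # a 7-cell path detouring around the obstacles
```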

  18. High accuracy autonomous navigation using the global positioning system (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, Son H.; Hart, Roger C.; Shoan, Wendy C.; Wood, Terri; Long, Anne C.; Oza, Dipak H.; Lee, Taesul

    1997-01-01

    The application of global positioning system (GPS) technology to the improvement of the accuracy and economy of spacecraft navigation is reported. High-accuracy autonomous navigation algorithms are currently being qualified in conjunction with the GPS attitude determination flyer (GADFLY) experiment for the small satellite technology initiative Lewis spacecraft. Preflight performance assessments indicated that these algorithms are able to provide a real-time total position accuracy of better than 10 m and a velocity accuracy of better than 0.01 m/s, with selective availability at typical levels. It is expected that the position accuracy will be improved to 2 m if corrections are provided by the GPS wide area augmentation system.

  19. Navigation d'un vehicule autonome autour d'un asteroide

    NASA Astrophysics Data System (ADS)

    Dionne, Karine

    Planetary exploration missions use spacecraft to acquire the scientific data that advance our knowledge of the solar system. Since the 1990s, these missions have targeted not only planets but also smaller celestial bodies such as asteroids. These bodies represent a particular challenge for navigation systems because their dynamic environment is complex. A space probe must react quickly to the gravitational perturbations present, otherwise its safety could be compromised. Since communication delays with Earth can often reach several tens of minutes, it is necessary to develop software that allows greater operational autonomy for this type of mission. This thesis presents an autonomous navigation system that determines the position and velocity of a satellite in orbit around an asteroid. It consists of an adaptive extended Kalman filter with three degrees of freedom. The proposed system relies on optical imagery to detect "landmarks" that have been previously mapped; these may be craters, boulders, or any physical feature discernible by the camera. The research work focuses on the state estimation techniques specific to autonomous navigation, and the existence of suitable software performing the image processing functions is therefore assumed. The main research contribution is the inclusion, at each estimation cycle, of a range measurement in order to improve navigation performance. An adaptive state estimator is required to process these measurements because their accuracy varies over time due to pointing error. The secondary research contributions concern the observability analysis of the system as well as a sensitivity analysis of six main design parameters. The

  20. Sign detection for autonomous navigation

    NASA Astrophysics Data System (ADS)

    Goodsell, Thomas G.; Snorrason, Magnus S.; Cartwright, Dustin; Stube, Brian; Stevens, Mark R.; Ablavsky, Vitaly X.

    2003-09-01

    Mobile robots currently cannot detect and read arbitrary signs. This is a major hindrance to mobile robot usability, since they cannot be tasked using directions that are intuitive to humans. It also limits their ability to report their position relative to intuitive landmarks. Other researchers have demonstrated some success on traffic sign recognition, but using template-based methods limits the set of recognizable signs. There is a clear need for a sign detection and recognition system that can process a much wider variety of signs: traffic signs, street signs, store-name signs, building directories, room signs, etc. We are developing a system for Sign Understanding in Support of Autonomous Navigation (SUSAN) that detects signs from various cues common to most signs: vivid colors, compact shape, and text. We have demonstrated the feasibility of our approach on a variety of signs in both indoor and outdoor locations.

  1. Autonomous navigation system and method

    DOEpatents

    Bruemmer, David J [Idaho Falls, ID; Few, Douglas A [Idaho Falls, ID

    2009-09-08

    A robot platform includes perceptors, locomotors, and a system controller, which executes instructions for autonomously navigating a robot. The instructions repeat, on each iteration through an event timing loop, the acts of defining an event horizon based on the robot's current velocity, detecting a range to obstacles around the robot, testing for an event horizon intrusion by determining if any range to the obstacles is within the event horizon, and adjusting rotational and translational velocity of the robot accordingly. If the event horizon intrusion occurs, rotational velocity is modified by a proportion of the current rotational velocity reduced by a proportion of the range to the nearest obstacle and translational velocity is modified by a proportion of the range to the nearest obstacle. If no event horizon intrusion occurs, translational velocity is set as a ratio of a speed factor relative to a maximum speed.
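
    One plausible reading of the event-horizon rule is sketched below. The patent text only states that the adjustments are proportional, so the specific constants, the horizon scaling, and the exact form of the rotational adjustment are invented for illustration.

```python
def adjust_velocities(trans_vel, rot_vel, obstacle_ranges,
                      max_speed=1.0, speed_factor=0.8, horizon_gain=2.0):
    """One control-loop iteration: return (translational, rotational) velocity."""
    event_horizon = horizon_gain * trans_vel          # horizon grows with current speed
    nearest = min(obstacle_ranges) if obstacle_ranges else float("inf")
    if nearest < event_horizon:                       # event-horizon intrusion
        # Turn harder the deeper the intrusion, and slow down in proportion
        # to how close the nearest obstacle is (one reading of the rule above).
        rot_vel = 0.5 * rot_vel + 0.5 * (event_horizon - nearest) / event_horizon
        trans_vel = trans_vel * (nearest / event_horizon)
    else:                                             # no intrusion detected
        trans_vel = speed_factor * max_speed          # ratio of a speed factor to max speed
    return trans_vel, rot_vel

print(adjust_velocities(0.8, 0.1, [0.9, 2.5, 3.0]))   # intrusion: slows and adjusts turn rate
print(adjust_velocities(0.4, 0.0, [5.0, 6.0]))        # clear ahead: opens up toward cruise
```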

  2. Unmanned Ground Vehicle for Autonomous Non-Destructive Testing of FRP Bridge Decks

    NASA Astrophysics Data System (ADS)

    Klinkhachorn, P.; Mercer, A. Scott; Halabe, Udaya B.; GangaRao, Hota V. S.

    2007-03-01

    Current non-destructive techniques for defect analysis of FRP bridge decks have a narrow scope. These techniques are very good at detecting certain types of defects but are not robust enough to detect all defects by themselves. For example, infrared thermography (IRT) can detect air filled defects and Ground Penetrating Radar (GPR) is good at detecting water filled ones. These technologies can be combined to create a more robust defect detection scheme. To accomplish this, an Unmanned Ground Vehicle (UGV) has been designed that incorporates both IR and GPR analysis to create a comprehensive defect map of a bridge deck. The UGV autonomously surveys the deck surface and acquires data. The UGV has two 1.5 GHz ground coupled GPR antennas that are mounted on the front of the UGV to collect GPR data. It also incorporates an active heating source and a radiometric IR camera to capture IR images of the deck, even in less than ideal weather scenarios such as cold cloudy days. The UGV is designed so that it can collect data in an assembly line fashion. It moves in 1 foot increments. When moving, it collects GPR data from the two antennas. When it stops it heats a section of the deck. The next time it stops to heat a section, the IR camera is analyzing the preheated deck section while preparing for the next section. Because the data is being continually collected using this method, the UGV can survey the entire deck in an efficient and timely manner.

  3. Algorithm for covert convoy of a moving target using a group of autonomous robots

    NASA Astrophysics Data System (ADS)

    Polyakov, Igor; Shvets, Evgeny

    2018-04-01

    An important application of autonomous robot systems is to substitute for human personnel in dangerous environments, reducing their involvement and the subsequent risk to human lives. In this paper we solve the problem of covertly convoying a civilian in a dangerous area with a group of unmanned ground vehicles (UGVs) using social potential fields. The novelty of our work lies in the usage of UGVs as compared to the unmanned aerial vehicles typically employed for this task in the approaches described in the literature. Additionally, in our paper we assume that the group of UGVs should simultaneously solve the problem of patrolling the area to detect intruders. We develop a simulation system to test our algorithms, provide numerical results and give recommendations on how to tune the potentials governing robots' behaviour to prioritize between patrolling and convoying tasks.
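
    A minimal social-potential-field update in the spirit of this approach is sketched below. The particular potentials, gains, and standoff distance are assumptions rather than the authors' tuned values, and the patrolling/convoying trade-off is reduced to a ring attraction plus a mutual repulsion term.

```python
import numpy as np

def potential_step(ugvs, target, standoff=5.0, k_att=0.3, k_rep=4.0, dt=0.2):
    """ugvs: (N, 2) positions; target: (2,) position of the convoyed person."""
    new = ugvs.copy()
    for i, p in enumerate(ugvs):
        to_target = target - p
        dist = np.linalg.norm(to_target) + 1e-9
        # Attraction toward the standoff ring (zero force exactly on the ring)
        force = k_att * (dist - standoff) * (to_target / dist)
        # Pairwise repulsion spreads the UGVs out around the ring for patrolling
        for j, q in enumerate(ugvs):
            if i != j:
                sep = p - q
                d = np.linalg.norm(sep) + 1e-9
                force += k_rep * sep / (d ** 3)
        new[i] = p + dt * force
    return new

ugvs = np.array([[1.0, 0.0], [0.0, 2.0], [-1.5, -1.0]])
for _ in range(300):
    ugvs = potential_step(ugvs, target=np.array([0.0, 0.0]))
print(np.linalg.norm(ugvs, axis=1))   # all three settle on a ring just outside the 5 m standoff
```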

  4. Local navigation and fuzzy control realization for autonomous guided vehicle

    NASA Astrophysics Data System (ADS)

    El-Konyaly, El-Sayed H.; Saraya, Sabry F.; Shehata, Raef S.

    1996-10-01

    This paper addresses the problem of local navigation for an autonomous guided vehicle (AGV) in a structured environment that contains static and dynamic obstacles. Information about the environment is obtained via a CCD camera. The problem is formulated as a dynamic feedback control problem in which speed and steering decisions are made on the fly while the AGV is moving. A decision element (DE) that uses local information is proposed. The DE guides the vehicle in the environment by producing appropriate navigation decisions. Dynamic models of a three-wheeled vehicle for driving and steering mechanisms are derived. The interaction between them is performed via the local feedback DE. A controller, based on fuzzy logic, is designed to drive the vehicle safely in an intelligent and human-like manner. The effectiveness of the navigation and control strategies in driving the AGV is illustrated and evaluated.

  5. Unmanned Ground Vehicle (UGV) Lessons Learned

    DTIC Science & Technology

    2001-11-01

    1.1 PURPOSE. The purpose of this effort is to compile Lessons Learned from the unmanned ground vehicle (UGV) programs that could be relevant to ... introduction of gunpowder, this lesson was no longer valid. Castles crumbled and new lessons had to be learned. One such lesson was that the faster

  6. Autonomous navigation of structured city roads

    NASA Astrophysics Data System (ADS)

    Aubert, Didier; Kluge, Karl C.; Thorpe, Chuck E.

    1991-03-01

    Autonomous road following is a domain which spans a range of complexity from poorly defined, unmarked dirt roads to well-defined, well-marked, highly structured highways. The YARF system (for Yet Another Road Follower) is designed to operate in the middle of this range of complexity, driving on urban streets. Our research program has focused on the use of feature- and situation-specific segmentation techniques driven by an explicit model of the appearance and geometry of the road features in the environment. We report results in robust detection of white and yellow painted stripes, fitting a road model to detected feature locations to determine vehicle position and local road geometry, and automatic location of road features in an initial image. We also describe our planned extensions to include intersection navigation.

  7. Perception system and functions for autonomous navigation in a natural environment

    NASA Technical Reports Server (NTRS)

    Chatila, Raja; Devy, Michel; Lacroix, Simon; Herrb, Matthieu

    1994-01-01

    This paper presents the approach, algorithms, and processes we developed for the perception system of a cross-country autonomous robot. After a presentation of the tele-programming context we favor for intervention robots, we introduce an adaptive navigation approach, well suited for the characteristics of complex natural environments. This approach led us to develop a heterogeneous perception system that manages several different terrain representations. The perception functionalities required during navigation are listed, along with the corresponding representations we consider. The main perception processes we developed are presented. They are integrated within an on-board control architecture we developed. First results of an ambitious experiment currently underway at LAAS are then presented.

  8. Autonomous navigation and control of a Mars rover

    NASA Technical Reports Server (NTRS)

    Miller, D. P.; Atkinson, D. J.; Wilcox, B. H.; Mishkin, A. H.

    1990-01-01

    A Mars rover will need to be able to navigate autonomously for kilometers at a time. This paper outlines the sensing, perception, planning, and execution monitoring systems that are currently being designed for the rover. The sensing is based around stereo vision. The interpretation of the images uses a registration of the depth map with a global height map provided by an orbiting spacecraft. Safe, low-energy paths are then planned through the map, and expectations of what the rover's articulation sensors should sense are generated. These expectations are then used to ensure that the planned path is being executed correctly.

  9. A traffic priority language for collision-free navigation of autonomous mobile robots in dynamic environments.

    PubMed

    Bourbakis, N G

    1997-01-01

    This paper presents a generic traffic priority language, called KYKLOFORTA, used by autonomous robots for collision-free navigation in a dynamic unknown or known navigation space. In a previous work by X. Grossmman (1988), a set of traffic control rules was developed for the navigation of robots on the lines of a two-dimensional (2-D) grid, and a control center coordinated and synchronized their movements. In this work, the robots are considered autonomous: they move anywhere and in any direction inside the free space, and there is no need for a central control to coordinate and synchronize them. The requirements for each robot are i) visual perception, ii) range sensors, and iii) the ability to detect other moving objects in the same free navigation space and determine the other objects' perceived size, velocity and direction. Based on these assumptions, a traffic priority language is needed for each robot, enabling it to make decisions during navigation and avoid possible collisions with other moving objects. The traffic priority language proposed here is based on a primitive traffic priority alphabet and rules which compose patterns of corridors for the application of the traffic priority rules.

  10. Autonomous optical navigation using nanosatellite-class instruments: a Mars approach case study

    NASA Astrophysics Data System (ADS)

    Enright, John; Jovanovic, Ilija; Kazemi, Laila; Zhang, Harry; Dzamba, Tom

    2018-02-01

    This paper examines the effectiveness of small star trackers for orbital estimation. Autonomous optical navigation has been used for some time to provide local estimates of orbital parameters during close approach to celestial bodies. These techniques have been used extensively on spacecraft dating back to the Voyager missions, but often rely on long exposures and large instrument apertures. Using a hyperbolic Mars approach as a reference mission, we present an EKF-based navigation filter suitable for nanosatellite missions. Observations of Mars and its moons allow the estimator to correct initial errors in both position and velocity. Our results show that nanosatellite-class star trackers can produce good quality navigation solutions with low position (<300 m) and velocity (<0.15 m/s) errors as the spacecraft approaches periapse.

  11. Autonomous urban reconnaissance ingress system (AURIS): providing a tactically relevant autonomous door-opening kit for unmanned ground vehicles

    NASA Astrophysics Data System (ADS)

    Shane, David J.; Rufo, Michael A.; Berkemeier, Matthew D.; Alberts, Joel A.

    2012-06-01

    The Autonomous Urban Reconnaissance Ingress System (AURIS™) addresses a significant limitation of current military and first responder robotics technology: the inability of reconnaissance robots to open doors. Leveraging user testing as a baseline, the program has derived specifications necessary for military personnel to open doors with fielded UGVs (Unmanned Ground Vehicles), and evaluates the technology's impact on operational mission areas: duration, timing, and user patience in developing a tactically relevant, safe, and effective system. Funding is provided through the US ARMY Tank Automotive Research, Development and Engineering Center (TARDEC) and the project represents a leap forward in perception, autonomy, robotic implements, and coordinated payload operation in UGVs. This paper describes high level details of specification generation, status of the last phase of development, an advanced view of the system autonomy capability, and a short look ahead towards the ongoing work on this compelling and important technology.

  12. Autonomous Navigation With Ground Station One-Way Forward-Link Doppler Data

    NASA Technical Reports Server (NTRS)

    Horstkamp, G. M.; Niklewski, D. J.; Gramling, C. J.

    1996-01-01

    The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) has spent several years developing operational onboard navigation systems (ONS's) to provide real time autonomous, highly accurate navigation products for spacecraft using NASA's space and ground communication systems. The highly successful Tracking and Data Relay Satellite (TDRSS) ONS (TONS) experiment on the Explorer Platform/Extreme Ultraviolet (EP/EUV) spacecraft, launched on June 7, 1992, flight demonstrated the ONS for high accuracy navigation using TDRSS forward link communication services. In late 1994, a similar ONS experiment was performed using EP/EUV flight hardware (the ultrastable oscillator and Doppler extractor card in one of the TDRSS transponders) and ground system software to demonstrate the feasibility of using an ONS with ground station forward link communication services. This paper provides a detailed evaluation of ground station-based ONS performance of data collected over a 20 day period. The ground station ONS (GONS) experiment results are used to project the expected performance of an operational system. The GONS processes Doppler data derived from scheduled ground station forward link services using a sequential estimation algorithm enhanced by a sophisticated process noise model to provide onboard orbit and frequency determination. Analysis of the GONS experiment performance indicates that real time onboard position accuracies of better than 125 meters (1 sigma) are achievable with two or more 5-minute contacts per day for the EP/EUV 525 kilometer altitude, 28.5 degree inclination orbit. GONS accuracy is shown to be a function of the fidelity of the onboard propagation model, the frequency/geometry of the tracking contacts, and the quality of the tracking measurements. GONS provides a viable option for using autonomous navigation to reduce operational costs for upcoming spacecraft missions with moderate position accuracy requirements.

  13. Geomagnetic Navigation of Autonomous Underwater Vehicle Based on Multi-objective Evolutionary Algorithm.

    PubMed

    Li, Hong; Liu, Mingyong; Zhang, Feihu

    2017-01-01

    This paper presents a multi-objective evolutionary algorithm for bio-inspired geomagnetic navigation of an Autonomous Underwater Vehicle (AUV). Inspired by biological navigation behavior, the solution is proposed without using a priori information, relying simply on magnetotaxis searching. However, geomagnetic anomalies, which disrupt the distribution of the geomagnetic field, have a significant influence on the geomagnetic navigation system. An extreme-value region may easily appear in such abnormal regions, which can leave the AUV lost during the navigation phase. This paper proposes an improved bio-inspired algorithm with behavior constraints that allows the AUV to escape from the abnormal region. First, the navigation problem is formulated as an optimization problem. Second, an environmental monitoring operator is introduced to determine whether the algorithm has fallen into a geomagnetic anomaly region. Then, a behavior constraint operator is employed to get out of the abnormal region. Finally, the termination condition is triggered. Compared to the state-of-the-art, the proposed approach effectively overcomes the disturbance of the geomagnetic anomaly. The simulation results demonstrate the reliability and feasibility of the proposed approach in complex environments.
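
    The magnetotaxis search with an anomaly monitor and escape move can be caricatured as follows. The toy field, thresholds, and escape rule are invented, and the multi-objective evolutionary machinery of the paper is replaced by a greedy heading search for brevity, so this is only an illustration of the structure, not the published algorithm.

```python
import numpy as np

def magnetotaxis_step(pos, field_fn, target_field, step=1.0, n_headings=16):
    """Greedy magnetotaxis: try candidate headings, keep the one whose local
    field best matches the destination's field components."""
    best, best_cost = pos, np.inf
    for h in np.linspace(0.0, 2.0 * np.pi, n_headings, endpoint=False):
        cand = pos + step * np.array([np.cos(h), np.sin(h)])
        cost = np.linalg.norm(field_fn(cand) - target_field)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost

def navigate(start, field_fn, target_field, max_iter=600, seed=1):
    rng = np.random.default_rng(seed)
    pos = np.array(start, dtype=float)
    cost = np.linalg.norm(field_fn(pos) - target_field)
    for _ in range(max_iter):
        cand, cand_cost = magnetotaxis_step(pos, field_fn, target_field)
        if cand_cost < cost - 1e-9:
            pos, cost = cand, cand_cost            # normal magnetotaxis progress
        else:
            # "Environmental monitor": no heading improves the field match, so we
            # are presumably trapped (e.g. in an anomaly); a behaviour-constraint
            # style escape move is taken before searching again.
            pos = pos + 10.0 * rng.standard_normal(2)
            cost = np.linalg.norm(field_fn(pos) - target_field)
        if cost < 0.1:
            break
    return pos

# Toy, anomaly-free field that varies linearly with position; the vehicle homes
# in on the point whose field components equal those of the destination.
def field(p):
    return np.array([0.2 * p[0], -0.1 * p[1]])

destination = np.array([50.0, 30.0])
print(navigate([0.0, 0.0], field, field(destination)))   # ends within ~1 unit of (50, 30)
```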

  14. Geomagnetic Navigation of Autonomous Underwater Vehicle Based on Multi-objective Evolutionary Algorithm

    PubMed Central

    Li, Hong; Liu, Mingyong; Zhang, Feihu

    2017-01-01

    This paper presents a multi-objective evolutionary algorithm for bio-inspired geomagnetic navigation of an Autonomous Underwater Vehicle (AUV). Inspired by biological navigation behavior, the solution is proposed without using a priori information, relying simply on magnetotaxis searching. However, geomagnetic anomalies, which disrupt the distribution of the geomagnetic field, have a significant influence on the geomagnetic navigation system. An extreme-value region may easily appear in such abnormal regions, which can leave the AUV lost during the navigation phase. This paper proposes an improved bio-inspired algorithm with behavior constraints that allows the AUV to escape from the abnormal region. First, the navigation problem is formulated as an optimization problem. Second, an environmental monitoring operator is introduced to determine whether the algorithm has fallen into a geomagnetic anomaly region. Then, a behavior constraint operator is employed to get out of the abnormal region. Finally, the termination condition is triggered. Compared to the state-of-the-art, the proposed approach effectively overcomes the disturbance of the geomagnetic anomaly. The simulation results demonstrate the reliability and feasibility of the proposed approach in complex environments. PMID:28747884

  15. An unmanned ground vehicle for landmine remediation

    NASA Astrophysics Data System (ADS)

    Wasson, Steven R.; Guilberto, Jose; Ogg, Wade; Wedeward, Kevin; Bruder, Stephen; El-Osery, Aly

    2004-09-01

    Anti-tank (AT) landmines slow down and endanger military advances and present sizeable humanitarian problems. The remediation of these mines by direct human intervention is both dangerous and costly. The Intelligent Systems & Robotics Group (ISRG) at New Mexico Tech has provided a partial solution to this problem by developing an Unmanned Ground Vehicle (UGV) to remediate these mines without endangering human lives. This paper presents an overview of the design and operation of this UGV. Current results and future work are also described herein. To initiate the remediation process the UGV is given the GPS coordinates of previously detected landmines. Once the UGV autonomously navigates to an acceptable proximity of the landmine, a remote operator acquires control over a wireless network link using a joystick on a base station. Utilizing two cameras mounted on the UGV, the operator is able to accurately position the UGV directly over the landmine. The UGV houses a self-contained drill system equipped with its own processing resources, sensors, and actuators. The drill system deploys a neutralizing device over the landmine to neutralize it. One such device, developed by Science Applications International Corporation (SAIC), employs incendiary materials to melt through the container of the landmine and slowly burn the explosive material, thereby safely and remotely disabling the landmine.

  16. Water Detection Based on Object Reflections

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo L.; Matthies, Larry H.

    2012-01-01

    Water bodies are challenging terrain hazards for terrestrial unmanned ground vehicles (UGVs) for several reasons. Traversing through deep water bodies could cause costly damage to the electronics of UGVs. Additionally, a UGV that is either broken down due to water damage or becomes stuck in a water body during an autonomous operation will require rescue, potentially drawing critical resources away from the primary operation and increasing the operation cost. Thus, robust water detection is a critical perception requirement for UGV autonomous navigation. One of the properties useful for detecting still water bodies is that their surface acts as a horizontal mirror at high incidence angles. Still water bodies in wide-open areas can be detected by geometrically locating the exact pixels in the sky that are reflecting on candidate water pixels on the ground, predicting if ground pixels are water based on color similarity to the sky and local terrain features. But in cluttered areas where reflections of objects in the background dominate the appearance of the surface of still water bodies, detection based on sky reflections is of marginal value. Specifically, this software attempts to solve the problem of detecting still water bodies on cross-country terrain in cluttered areas at low cost.
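
    The geometric core of the sky-reflection cue can be sketched for the simplest case of a level camera and still, horizontal water: the sky pixel reflected into a below-horizon pixel is, approximately and for small angles, its mirror image about the horizon row. The camera model, colours, and similarity threshold below are made up for illustration and are not the flight software's values.

```python
import numpy as np

def reflected_sky_row(candidate_row, horizon_row):
    """Mirror a below-horizon image row about the horizon (level pinhole camera,
    small-angle approximation)."""
    return 2 * horizon_row - candidate_row

def looks_like_water(image, row, col, horizon_row, threshold=30.0):
    sky_row = reflected_sky_row(row, horizon_row)
    if row <= horizon_row or sky_row < 0:
        return False                      # not a ground pixel, or reflection leaves the frame
    diff = np.linalg.norm(image[row, col].astype(float) -
                          image[sky_row, col].astype(float))
    return diff < threshold               # ground pixel matches its mirrored sky pixel

# Synthetic 200x100 RGB frame: blue-grey sky above row 100, a "puddle" copying
# the sky colour at rows 150-160, and brown ground everywhere else.
img = np.zeros((200, 100, 3), dtype=np.uint8)
img[:100] = (135, 170, 200)
img[100:] = (90, 70, 50)
img[150:161] = (135, 170, 200)
print(looks_like_water(img, 155, 50, horizon_row=100))   # True  (puddle)
print(looks_like_water(img, 180, 50, horizon_row=100))   # False (dry ground)
```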

  17. Real-Time and High-Fidelity Simulation Environment for Autonomous Ground Vehicle Dynamics

    NASA Technical Reports Server (NTRS)

    Cameron, Jonathan; Myint, Steven; Kuo, Calvin; Jain, Abhi; Grip, Havard; Jayakumar, Paramsothy; Overholt, Jim

    2013-01-01

    This paper reports on a collaborative project between U.S. Army TARDEC and Jet Propulsion Laboratory (JPL) to develop an unmanned ground vehicle (UGV) simulation model using the ROAMS vehicle modeling framework. Besides modeling the physical suspension of the vehicle, the sensing and navigation of the HMMWV vehicle are simulated. Using models of urban and off-road environments, the HMMWV simulation was tested in several ways, including navigation in an urban environment with obstacle avoidance and the performance of a lane change maneuver.

  18. Autonomous Underwater Navigation and Optical Mapping in Unknown Natural Environments.

    PubMed

    Hernández, Juan David; Istenič, Klemen; Gracias, Nuno; Palomeras, Narcís; Campos, Ricard; Vidal, Eduard; García, Rafael; Carreras, Marc

    2016-07-26

    We present an approach for navigating in unknown environments while, simultaneously, gathering information for inspecting underwater structures using an autonomous underwater vehicle (AUV). To accomplish this, we first use our pipeline for mapping and planning collision-free paths online, which endows an AUV with the capability to autonomously acquire optical data in close proximity. With that information, we then propose a reconstruction pipeline to create a photo-realistic textured 3D model of the inspected area. These 3D models are also of particular interest to other fields of study in marine sciences, since they can serve as base maps for environmental monitoring, thus allowing change detection of biological communities and their environment over time. Finally, we evaluate our approach using the Sparus II, a torpedo-shaped AUV, conducting inspection missions in a challenging, real-world and natural scenario.

  19. Autonomous Underwater Navigation and Optical Mapping in Unknown Natural Environments

    PubMed Central

    Hernández, Juan David; Istenič, Klemen; Gracias, Nuno; Palomeras, Narcís; Campos, Ricard; Vidal, Eduard; García, Rafael; Carreras, Marc

    2016-01-01

    We present an approach for navigating in unknown environments while, simultaneously, gathering information for inspecting underwater structures using an autonomous underwater vehicle (AUV). To accomplish this, we first use our pipeline for mapping and planning collision-free paths online, which endows an AUV with the capability to autonomously acquire optical data in close proximity. With that information, we then propose a reconstruction pipeline to create a photo-realistic textured 3D model of the inspected area. These 3D models are also of particular interest to other fields of study in marine sciences, since they can serve as base maps for environmental monitoring, thus allowing change detection of biological communities and their environment over time. Finally, we evaluate our approach using the Sparus II, a torpedo-shaped AUV, conducting inspection missions in a challenging, real-world and natural scenario. PMID:27472337

  20. Embedded Relative Navigation Sensor Fusion Algorithms for Autonomous Rendezvous and Docking Missions

    NASA Technical Reports Server (NTRS)

    DeKock, Brandon K.; Betts, Kevin M.; McDuffie, James H.; Dreas, Christine B.

    2008-01-01

    bd Systems (a subsidiary of SAIC) has developed a suite of embedded relative navigation sensor fusion algorithms to enable NASA autonomous rendezvous and docking (AR&D) missions. Translational and rotational Extended Kalman Filters (EKFs) were developed for integrating measurements based on the vehicles' orbital mechanics and high-fidelity sensor error models, providing a solution with increased accuracy and robustness relative to any single relative navigation sensor. The filters were tested through stand-alone covariance analysis, closed-loop testing with a high-fidelity multi-body orbital simulation, and hardware-in-the-loop (HWIL) testing in the Marshall Space Flight Center (MSFC) Flight Robotics Laboratory (FRL).

  1. Flight Analysis of an Autonomously Navigated Experimental Lander

    NASA Technical Reports Server (NTRS)

    Chin, Jeffrey; Niehaus, Justin; Goodenow, Debra; Dunker, Storm; Montague, David

    2016-01-01

    First steps have been taken to qualify a family of parafoil systems capable of increasing the survivability and reusability of high-altitude balloon payloads. The research is motivated by the common risk facing balloon payloads, where expensive flight hardware can land in inaccessible areas, making it difficult or impossible to recover. The Autonomously Navigated Experimental Lander (ANGEL) flight test introduced a commercial Guided Parachute Aerial Delivery System (GPADS) to a previously untested environment at 108,000 feet Mean Sea Level (MSL) to determine its high-altitude survivability and capabilities. Following release, ANGEL descended under a drogue until approximately 25,000 feet, at which point the drogue was jettisoned and the main parachute was deployed, commencing navigation. Multiple data acquisition platforms were used to characterize the return-to-point technology performance and help determine its suitability for returning future scientific payloads ranging from 180 to 10,000 pounds to safer and more convenient landing locations. This report describes the test vehicle design and summarizes the captured sensor data. Various post-flight analyses are used to quantify the system's performance and gondola load data, and to serve as a reference point for subsequent missions.

  2. Scene Segmentation For Autonomous Robotic Navigation Using Sequential Laser Projected Structured Light

    NASA Astrophysics Data System (ADS)

    Brown, C. David; Ih, Charles S.; Arce, Gonzalo R.; Fertell, David A.

    1987-01-01

    Vision systems for mobile robots or autonomous vehicles navigating in an unknown terrain environment must provide a rapid and accurate method of segmenting the scene ahead into regions of pathway and background. A major distinguishing feature between the pathway and background is the three dimensional texture of these two regions. Typical methods of textural image segmentation are very computationally intensive, often lack the required robustness, and are incapable of sensing the three dimensional texture of various regions of the scene. A method is presented where scanned laser projected lines of structured light, viewed by a stereoscopically located single video camera, resulted in an image in which the three dimensional characteristics of the scene were represented by the discontinuity of the projected lines. This image was conducive to processing with simple regional operators to classify regions as pathway or background. Design of some operators and application methods, and demonstration on sample images are presented. This method provides rapid and robust scene segmentation capability that has been implemented on a microcomputer in near real time, and should result in higher speed and more reliable robotic or autonomous navigation in unstructured environments.

  3. Research of autonomous celestial navigation based on new measurement model of stellar refraction

    NASA Astrophysics Data System (ADS)

    Yu, Cong; Tian, Hong; Zhang, Hui; Xu, Bo

    2014-09-01

    Autonomous celestial navigation based on stellar refraction has attracted widespread attention for its high accuracy and full autonomy. In this navigation method, the establishment of an accurate stellar refraction measurement model is the foundation and key issue in achieving high-accuracy navigation. However, existing measurement models are limited by the uncertainty of atmospheric parameters. Temperature, pressure and other factors which affect stellar refraction within the height range of Earth's stratosphere are studied, and a model of atmospheric variation with altitude is derived on the basis of standard atmospheric data. Furthermore, a novel measurement model of stellar refraction over a continuous range of altitudes from 20 km to 50 km is produced by modifying the fixed-altitude (25 km) measurement model, and the equation of state with orbit perturbations is established; a simulation is then performed using the improved Extended Kalman Filter. The results show that the new model improves navigation accuracy and has practical application value.

  4. Navigation of autonomous vehicles for oil spill cleaning in dynamic and uncertain environments

    NASA Astrophysics Data System (ADS)

    Jin, Xin; Ray, Asok

    2014-04-01

    In the context of oil spill cleaning by autonomous vehicles in dynamic and uncertain environments, this paper presents a multi-resolution algorithm that seamlessly integrates the concepts of local navigation and global navigation based on the sensory information; the objective here is to enable adaptive decision making and online replanning of vehicle paths. The proposed algorithm provides a complete coverage of the search area for clean-up of the oil spills and does not suffer from the problem of having local minima, which is commonly encountered in potential-field-based methods. The efficacy of the algorithm is tested on a high-fidelity player/stage simulator for oil spill cleaning in a harbour, where the underlying oil weathering process is modelled as 2D random-walk particle tracking. A preliminary version of this paper was presented by X. Jin and A. Ray as 'Coverage Control of Autonomous Vehicles for Oil Spill Cleaning in Dynamic and Uncertain Environments', Proceedings of the American Control Conference, Washington, DC, June 2013, pp. 2600-2605.

  5. POSTMAN: Point of Sail Tacking for Maritime Autonomous Navigation

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance L.; Reinhart, Felix

    2012-01-01

    Waves apply significant forces to small boats, in particular when such vessels are moving at a high speed in severe sea conditions. In addition, small high-speed boats run the risk of diving with the bow into the next wave crest during operations in the wavelengths and wave speeds that are typical for shallow water. In order to mitigate the issues of autonomous navigation in rough water, a hybrid controller called POSTMAN combines the concept of POS (point of sail) tack planning from the sailing domain with a standard PID (proportional-integral-derivative) controller that implements reliable target reaching for the motorized small boat control task. This is an embedded, adaptive software controller that uses look-ahead sensing in a closed-loop method to perform path planning for safer navigation in rough waters. State-of-the-art controllers for small boats are based on complex models of the vessel's kinematics and dynamics. They enable the vessel to follow preplanned paths accurately and can theoretically control all of the small boat's six degrees of freedom. However, the problems of bow diving and other undesirable incidents are not addressed, and it is questionable if a six-DOF controller with basically a single actuator is possible at all. POSTMAN builds an adaptive capability into the controller based on sensed wave characteristics. This software will bring a much-needed capability to unmanned small boats moving at high speeds. Previously, this class of boat was limited to wave heights of less than one meter in the sea states in which it could operate. POSTMAN is a major advance in autonomous safety for small maritime craft.
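
    A hedged sketch of the hybrid point-of-sail-plus-PID idea: the commanded heading is pushed outside a "no-go" cone around the oncoming waves before an ordinary PID tracks it. The cone width, gains, and the assumption that look-ahead sensing supplies the wave direction are all illustrative, not POSTMAN's actual design values.

```python
import math

def tack_heading(goal_bearing_deg, wave_from_deg, no_go_deg=30.0):
    """Shift the commanded heading outside the no-go cone around the wave front."""
    diff = (goal_bearing_deg - wave_from_deg + 180.0) % 360.0 - 180.0
    if abs(diff) >= no_go_deg:
        return goal_bearing_deg                       # direct course is already safe
    side = 1.0 if diff >= 0 else -1.0
    return (wave_from_deg + side * no_go_deg) % 360.0

class HeadingPID:
    """Plain PID on heading error, standing in for the target-reaching controller."""
    def __init__(self, kp=0.8, ki=0.05, kd=0.3):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral, self.prev_err = 0.0, 0.0

    def rudder(self, commanded_deg, actual_deg, dt=0.1):
        err = (commanded_deg - actual_deg + 180.0) % 360.0 - 180.0
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

cmd = tack_heading(goal_bearing_deg=10.0, wave_from_deg=0.0)   # tacks out to 30 degrees
pid = HeadingPID()
print(cmd, pid.rudder(cmd, actual_deg=5.0))
```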

  6. Ribbon networks for modeling navigable paths of autonomous agents in virtual environments.

    PubMed

    Willemsen, Peter; Kearney, Joseph K; Wang, Hongling

    2006-01-01

    This paper presents the Environment Description Framework (EDF) for modeling complex networks of intersecting roads and pathways in virtual environments. EDF represents information about the layout of streets and sidewalks, the rules that govern behavior on roads and walkways, and the locations of agents with respect to navigable structures. The framework serves as the substrate on which behavior programs for autonomous vehicles and pedestrians are built. Pathways are modeled as ribbons in space. The ribbon structure provides a natural coordinate frame for defining the local geometry of navigable surfaces. EDF includes a powerful runtime interface supported by robust and efficient code for locating objects on the ribbon network, for mapping between Cartesian and ribbon coordinates, and for determining behavioral constraints imposed by the environment.

  7. Change detection on UGV patrols with respect to a reference tour using VIS imagery

    NASA Astrophysics Data System (ADS)

    Müller, Thomas

    2015-05-01

    Autonomous driving robots (UGVs, Unmanned Ground Vehicles) equipped with visual-optical (VIS) cameras offer a high potential to automatically detect suspicious occurrences and dangerous or threatening situations on patrol. In order to explore this potential, the scene of interest is recorded first on a reference tour representing the 'everything okay' situation. On further patrols changes are detected with respect to the reference in a two step processing scheme. In the first step, an image retrieval is done to find the reference images that are closest to the current camera image on patrol. This is done efficiently based on precalculated image-to-image registrations of the reference by optimizing image overlap in a local reference search (after a global search when that is needed). In the second step, a robust spatio-temporal change detection is performed that widely compensates 3-D parallax according to variations of the camera position. Various results document the performance of the presented approach.

  8. Testing of Unmanned Ground Vehicle (UGV) Systems

    DTIC Science & Technology

    2009-02-12

    Emissions - Intra-system EMC (TOP 1-2-512, TOP 1-2-511, TOP 2-2-613): Determines whether the item tested meets the electromagnetic radiation effects, static electricity, and lightning criteria and the maximum electromagnetic radiation environment to which the test item may be exposed ...

  9. UGV Interoperability Profile (IOP) Communications Profile, Version 0

    DTIC Science & Technology

    2011-12-21

    Some UGV systems employ Orthogonal Frequency Division Multiplexing (OFDM) or Coded Orthogonal Frequency Division Multiplexing (COFDM) waveforms which ... other portions of the IOP. Waveform attribute (paragraph 3.3, Air Interface/Waveform), values: OFDM, COFDM, DDL, CDL, None. OCU to Platform ...

  10. Intelligence algorithms for autonomous navigation in a ground vehicle

    NASA Astrophysics Data System (ADS)

    Petkovsek, Steve; Shakya, Rahul; Shin, Young Ho; Gautam, Prasanna; Norton, Adam; Ahlgren, David J.

    2012-01-01

    This paper will discuss the approach to autonomous navigation used by "Q," an unmanned ground vehicle designed by the Trinity College Robot Study Team to participate in the Intelligent Ground Vehicle Competition (IGVC). For the 2011 competition, Q's intelligence was upgraded in several different areas, resulting in a more robust decision-making process and a more reliable system. In 2010-2011, the software of Q was modified to operate in a modular parallel manner, with all subtasks (including motor control, data acquisition from sensors, image processing, and intelligence) running simultaneously in separate software processes using the National Instruments (NI) LabVIEW programming language. This eliminated processor bottlenecks and increased flexibility in the software architecture. Though overall throughput was increased, the long runtime of the image processing process (150 ms) reduced the precision of Q's realtime decisions. Q had slow reaction times to obstacles detected only by its cameras, such as white lines, and was limited to slow speeds on the course. To address this issue, the image processing software was simplified and also pipelined to increase the image processing throughput and minimize the robot's reaction times. The vision software was also modified to detect differences in the texture of the ground, so that specific surfaces (such as ramps and sand pits) could be identified. While previous iterations of Q failed to detect white lines that were not on a grassy surface, this new software allowed Q to dynamically alter its image processing state so that appropriate thresholds could be applied to detect white lines in changing conditions. In order to maintain an acceptable target heading, a path history algorithm was used to deal with local obstacle fields and GPS waypoints were added to provide a global target heading. These modifications resulted in Q placing 5th in the autonomous challenge and 4th in the navigation challenge at IGVC.

  11. Flight Mechanics/Estimation Theory Symposium. [with application to autonomous navigation and attitude/orbit determination

    NASA Technical Reports Server (NTRS)

    Fuchs, A. J. (Editor)

    1979-01-01

    Onboard and real time image processing to enhance geometric correction of the data is discussed with application to autonomous navigation and attitude and orbit determination. Specific topics covered include: (1) LANDSAT landmark data; (2) star sensing and pattern recognition; (3) filtering algorithms for Global Positioning System; and (4) determining orbital elements for geostationary satellites.

  12. TARDEC Annual Report 2010

    DTIC Science & Technology

    2011-06-15

    capable of engaging threats while interacting with system operators. Through autonomous perception and navigation, intelligent tactical behavior... systems integration approach. TARDEC’s role is to assess the best way to apply the VICTORY architecture to future tactical wheeled vehicles and...Track TOPS Thrown Object Protection System TRADOC U.S. Army Training and Doctrine Command TWVS Tactical Wheeled Vehicle Survivability UGV Unmanned

  13. GPS World, Innovation: Autonomous Navigation at High Earth Orbits

    NASA Technical Reports Server (NTRS)

    Bamford, William; Winternitz, Luke; Hay, Curtis

    2005-01-01

    Calculating a spacecraft's precise location at high orbital altitudes-22,000 miles (35,800 km) and beyond-is an important and challenging problem. New and exciting opportunities become possible if satellites are able to autonomously determine their own orbits. First, the repetitive task of periodically collecting range measurements from terrestrial antennas to high altitude spacecraft becomes less important-this lessens competition for control facilities and saves money by reducing operational costs. Also, autonomous navigation at high orbital altitudes introduces the possibility of autonomous station keeping. For example, if a geostationary satellite begins to drift outside of its designated slot it can make orbit adjustments without requiring commands from the ground. Finally, precise onboard orbit determination opens the door to satellites flying in formation-an emerging concept for many scientific space applications. The realization of these benefits is not a trivial task. While the navigation signals broadcast by GPS satellites are well suited for orbit and attitude determination at lower altitudes, acquiring and using these signals at geostationary (GEO) and highly elliptical (HEO) orbits is much more difficult. The GPS constellation orbits at approximately 12,550 miles (20,200 km) altitude. GPS satellites were designed to provide navigation signals to terrestrial users-consequently the antenna array points directly toward the Earth. GEO and HEO orbits, however, are well above the operational GPS constellation, making signal reception at these altitudes more challenging. The nominal beamwidth of a Block II/IIA GPS satellite antenna array is approximately 42.6 degrees. At GEO and HEO altitudes, most of these primary beam transmissions are blocked by the Earth, leaving only a narrow region of nominal signal visibility near opposing limbs of the Earth. If GPS receivers at GEO and HEO orbits were designed to use these
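
    As a rough illustration of the geometry quoted above (20,200 km GPS altitude, 42.6-degree main-beam width), the following sketch computes the Earth-blockage half-angle and the width of the spill-over annulus available to a GEO/HEO receiver. The figures are approximate and the arithmetic is ours, not taken from the article.

      import math

      # Illustrative geometry only, using figures quoted in the abstract:
      # GPS altitude ~20,200 km, Block II/IIA main-beam width ~42.6 degrees.
      EARTH_RADIUS_KM = 6378.0
      GPS_ALTITUDE_KM = 20200.0

      gps_orbit_radius = EARTH_RADIUS_KM + GPS_ALTITUDE_KM

      # Half-angle subtended by the Earth as seen from a GPS satellite.
      earth_half_angle = math.degrees(math.asin(EARTH_RADIUS_KM / gps_orbit_radius))

      # Half of the transmit antenna main-beam width.
      beam_half_angle = 42.6 / 2.0

      # A receiver on the far side of the Earth (GEO/HEO user) only sees the
      # narrow annulus of the main beam that is not blocked by the Earth.
      print(f"Earth half-angle from GPS orbit: {earth_half_angle:.1f} deg")
      print(f"Main-beam half-angle:            {beam_half_angle:.1f} deg")
      print(f"Usable spill-over annulus:       {beam_half_angle - earth_half_angle:.1f} deg wide")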

  14. Reactive Sequencing for Autonomous Navigation Evolving from Phoenix Entry, Descent, and Landing

    NASA Technical Reports Server (NTRS)

    Grasso, Christopher A.; Riedel, Joseph E.; Vaughan, Andrew T.

    2010-01-01

    Virtual Machine Language (VML) is an award-winning advanced procedural sequencing language in use on NASA deep-space missions since 1997, and was used for the successful entry, descent, and landing (EDL) of the Phoenix spacecraft onto the surface of Mars. Phoenix EDL utilized a state-oriented operations architecture which executed within the constraints of the existing VML 2.0 flight capability, compatible with the linear "land or die" nature of the mission. The intricacies of Phoenix EDL included the planned discarding of portions of the vehicle, the complex communications management for relay through on-orbit assets, the presence of temporally indeterminate physical events, and the need to rapidly catch up four days of sequencing should a reboot of the spacecraft flight computer occur shortly before atmospheric entry. These formidable operational challenges led to new techniques for packaging and coordinating reusable sequences called blocks using one-way synchronization via VML sequencing global variable events. The coordinated blocks acted as an ensemble to land the spacecraft, while individually managing various elements in as simple a fashion as possible. This paper outlines prototype VML 2.1 flight capabilities that have evolved from the one-way synchronization techniques in order to implement even more ambitious autonomous mission capabilities. Target missions for these new capabilities include autonomous touch-and-go sampling of cometary and asteroidal bodies, lunar landing of robotic missions, and ultimately landing of crewed lunar vehicles. Close proximity guidance, navigation, and control operations, on-orbit rendezvous, and descent and landing events featured in these missions require elaborate abort capability, manifesting highly non-linear scenarios that are so complex as to overtax traditional sequencing, or even the sort of one-way coordinated sequencing used during EDL. Foreseeing advanced command and control needs for small body and lunar landing

  15. Advocates and critics for tactical behaviors in UGV navigation

    NASA Astrophysics Data System (ADS)

    Hussain, Talib S.; Vidaver, Gordon; Berliner, Jeffrey

    2005-05-01

    Critical to the development of unmanned ground vehicle platforms is the incorporation of adaptive tactical behaviors for the planning of high-level navigation and tactical actions. BBN Technologies recently completed a simulation-based project for the Army Research Lab (ARL) in which we applied an evolutionary computation approach to navigating through a terrain to capture flag objectives while faced with one or more mobile enemies. Our Advocates and Critics for Tactical Behaviors (ACTB) system evolves plans for the vehicle that control its movement goals (in the form of waypoints), and its future actions (e.g., pointing cameras). We apply domain-specific, state-dependent genetic operators called advocates that promote specific tactical behaviors (e.g., adapt a plan to stay closer to walls). We define the fitness function as a weighted sum of a number of independent, domain-specific, state-dependent evaluation components called critics. Critics reward plans based upon specific tactical criteria, such as minimizing risk of exposure or time to the flags. Additionally, the ACTB system provides the capability for a human commander to specify the "rules of engagement" under which the vehicle will operate. The rules of engagement determine the planning emphasis required under different tactical situations (e.g., discovery of an enemy), and provide a mechanism for automatically adapting the relative selection probabilities of the advocates, the weights of the critics, and the depth of planning in response to tactical events. The ACTB system demonstrated highly effective performance in a head-to-head testing event, held by ARL, against two competing tactical behavior systems.

  16. PRIMUS: autonomous navigation in open terrain with a tracked vehicle

    NASA Astrophysics Data System (ADS)

    Schaub, Guenter W.; Pfaendner, Alfred H.; Schaefer, Christoph

    2004-09-01

    The German experimental robotics program PRIMUS (PRogram for Intelligent Mobile Unmanned Systems) has focused on solutions for autonomous driving in unknown open terrain over several project phases, each with specific realization aspects, for more than 12 years. The main task of the program is to develop algorithms for a high degree of autonomous navigation skill using off-the-shelf hardware and sensor technology and to integrate this into military vehicles. For obstacle detection a Dornier 3-D LADAR is integrated on a tracked vehicle, the "Digitized WIESEL 2". For road-following, a digital video camera and a visual perception module from the Universität der Bundeswehr München (UBM) have been integrated. This paper gives an overview of the PRIMUS program with a focus on the last program phase D (2001-2003). This includes the system architecture, the description of the modes of operation, and the technology development, with a focus on obstacle avoidance and obstacle classification using a 3-D LADAR. A collection of experimental results and a short look at the next steps in the German robotics program conclude the paper.

  17. Levels of Autonomy and Autonomous System Performance Assessment for Intelligent Unmanned Systems

    DTIC Science & Technology

    2014-04-01

    LIDAR and camera sensors that is driven entirely by teleoperation would be AL 0. If that same robot used its LIDAR and camera data to generate a...obstacle detection, mapping, path planning 3 CMMAD semi-autonomous counter-mine system (Few 2010) Talon UGV, camera, LIDAR, metal detector...NCAP framework are performed on individual UMS components and do not require mission level evaluations. For example, bench testing of camera, LIDAR

  18. Autonomous Navigation Apparatus With Neural Network for a Mobile Vehicle

    NASA Technical Reports Server (NTRS)

    Quraishi, Naveed (Inventor)

    1996-01-01

    An autonomous navigation system for a mobile vehicle arranged to move within an environment includes a plurality of sensors arranged on the vehicle and at least one neural network including an input layer coupled to the sensors, a hidden layer coupled to the input layer, and an output layer coupled to the hidden layer. The neural network produces output signals representing respective positions of the vehicle, such as the X coordinate, the Y coordinate, and the angular orientation of the vehicle. A plurality of patch locations within the environment are used to train the neural networks to produce the correct outputs in response to the distances sensed.
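
    A minimal sketch of the kind of network the patent describes (sensor readings in, X, Y, and heading out), assuming a hypothetical sensor count and untrained weights; the patent's training at known patch locations is not reproduced here.

      import numpy as np

      # Minimal sketch of the described architecture: range-sensor readings feed
      # an input layer, one hidden layer, and an output layer that estimates the
      # vehicle's X, Y position and heading. Sizes and weights are hypothetical.
      rng = np.random.default_rng(0)

      N_SENSORS, N_HIDDEN, N_OUTPUTS = 16, 32, 3   # outputs: x, y, heading

      W1 = rng.normal(scale=0.1, size=(N_HIDDEN, N_SENSORS))
      b1 = np.zeros(N_HIDDEN)
      W2 = rng.normal(scale=0.1, size=(N_OUTPUTS, N_HIDDEN))
      b2 = np.zeros(N_OUTPUTS)

      def estimate_pose(sensor_readings):
          """Feed-forward pass: sensor distances in, (x, y, heading) estimate out."""
          h = np.tanh(W1 @ sensor_readings + b1)
          return W2 @ h + b2

      # In the patent the weights would be trained on readings taken at known
      # "patch locations" in the environment; here we just run an untrained pass.
      pose = estimate_pose(rng.uniform(0.0, 5.0, size=N_SENSORS))
      print(pose)  # [x_estimate, y_estimate, heading_estimate]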

  19. Design, Development and Testing of the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam) Guidance, Navigation and Control System

    NASA Technical Reports Server (NTRS)

    Wagenknecht, J.; Fredrickson, S.; Manning, T.; Jones, B.

    2003-01-01

    Engineers at NASA Johnson Space Center have designed, developed, and tested a nanosatellite-class free-flyer intended for future external inspection and remote viewing of human spaceflight activities. The technology demonstration system, known as the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam), has been integrated into the approximate form and function of a flight system. The primary focus has been to develop a system capable of providing external views of the International Space Station. The Mini AERCam system is spherical-shaped and less than eight inches in diameter. It has a full suite of guidance, navigation, and control hardware and software, and is equipped with two digital video cameras and a high resolution still image camera. The vehicle is designed for either remotely piloted operations or supervised autonomous operations. Tests have been performed in both a six degree-of-freedom closed-loop orbital simulation and on an air-bearing table. The Mini AERCam system can also be used as a test platform for evaluating algorithms and relative navigation for autonomous proximity operations and docking around the Space Shuttle Orbiter or the ISS.

  20. Adaptive Control for Autonomous Navigation of Mobile Robots Considering Time Delay and Uncertainty

    NASA Astrophysics Data System (ADS)

    Armah, Stephen Kofi

    Autonomous control of mobile robots has attracted considerable attention from researchers in the areas of robotics and autonomous systems during the past decades. One of the goals in the field of mobile robotics is the development of platforms that robustly operate in given, partially unknown, or unpredictable environments and offer desired services to humans. Autonomous mobile robots need to be equipped with effective, robust, and/or adaptive navigation control systems. In spite of the enormous body of reported work on autonomous navigation control systems for mobile robots, achieving the goal above is still an open problem. Robustness and reliability of the controlled system can always be improved. The fundamental issues affecting the stability of the control systems include the undesired nonlinear effects introduced by actuator saturation, time delay in the controlled system, and uncertainty in the model. This research work develops robustly stabilizing control systems by investigating and addressing such nonlinear effects through analysis, simulations, and experiments. The control systems are designed to meet specified transient and steady-state specifications. The systems used for this research are ground (Dr Robot X80SV) and aerial (Parrot AR.Drone 2.0) mobile robots. Firstly, an effective autonomous navigation control system is developed for X80SV using logic control by combining 'go-to-goal', 'avoid-obstacle', and 'follow-wall' controllers. A MATLAB robot simulator is developed to implement this control algorithm and experiments are conducted in a typical office environment. The next stage of the research develops autonomous position (x, y, and z) and attitude (roll, pitch, and yaw) controllers for a quadrotor, and PD-feedback control is used to achieve stabilization. The quadrotor's nonlinear dynamics and kinematics are implemented using MATLAB S-function to generate the state output. Secondly, the white-box and black-box approaches are used to obtain a linearized
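
    A minimal sketch of the logic-based arbitration among the three named controllers, with hypothetical thresholds and a simple go-to-goal heading law; the dissertation does not give these numerical values.

      import math

      # Hypothetical thresholds; the abstract does not give numerical values.
      OBSTACLE_RANGE = 0.5   # metres: switch to avoid-obstacle below this
      WALL_RANGE     = 1.0   # metres: switch to follow-wall below this

      def select_behavior(min_obstacle_dist, wall_dist):
          """Simple logic-based arbitration among the three controllers named
          in the abstract (go-to-goal, avoid-obstacle, follow-wall)."""
          if min_obstacle_dist < OBSTACLE_RANGE:
              return "avoid_obstacle"
          if wall_dist < WALL_RANGE:
              return "follow_wall"
          return "go_to_goal"

      def go_to_goal_heading(pose, goal):
          """Heading reference for the go-to-goal controller."""
          x, y, _theta = pose
          return math.atan2(goal[1] - y, goal[0] - x)

      print(select_behavior(min_obstacle_dist=0.3, wall_dist=2.0))  # avoid_obstacle
      print(select_behavior(min_obstacle_dist=2.0, wall_dist=0.8))  # follow_wall
      print(go_to_goal_heading((0.0, 0.0, 0.0), (1.0, 1.0)))        # ~0.785 rad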

  1. H∞ robust fault-tolerant controller design for an autonomous underwater vehicle's navigation control system

    NASA Astrophysics Data System (ADS)

    Cheng, Xiang-Qin; Qu, Jing-Yuan; Yan, Zhe-Ping; Bian, Xin-Qian

    2010-03-01

    In order to improve the security and reliability for autonomous underwater vehicle (AUV) navigation, an H∞ robust fault-tolerant controller was designed after analyzing variations in state-feedback gain. Operating conditions and the design method were then analyzed so that the control problem could be expressed as a mathematical optimization problem. This permitted the use of linear matrix inequalities (LMI) to solve for the H∞ controller for the system. When considering different actuator failures, these conditions were then also mathematically expressed, allowing the H∞ robust controller to solve for these events and thus be fault-tolerant. Finally, simulation results showed that the H∞ robust fault-tolerant controller could provide precise AUV navigation control with strong robustness.
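
    For context, a generic statement of the bounded real lemma, the kind of LMI that underlies H∞ synthesis of this sort; this is a textbook formulation, not the paper's specific fault-tolerant design conditions.

      % Generic bounded real lemma (not the paper's fault-tolerant LMI):
      % for \dot{x} = A x + B w,  z = C x + D w, the gain from w to z satisfies
      % \|T_{zw}\|_\infty < \gamma  iff there exists  P = P^{\top} \succ 0  with
      \begin{bmatrix}
        A^{\top}P + P A & P B        & C^{\top} \\
        B^{\top}P       & -\gamma I  & D^{\top} \\
        C               & D          & -\gamma I
      \end{bmatrix} \prec 0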

  2. Technology initiatives for the autonomous guidance, navigation, and control of single and multiple satellites

    NASA Astrophysics Data System (ADS)

    Croft, John; Deily, John; Hartman, Kathy; Weidow, David

    1998-01-01

    In the twenty-first century, NASA envisions frequent low-cost missions to explore the solar system, observe the universe, and study our planet. To realize NASA's goal, the Guidance, Navigation, and Control Center (GNCC) at the Goddard Space Flight Center sponsors technology programs that enhance spacecraft performance, streamline processes and ultimately enable cheaper science. Our technology programs encompass control system architectures, sensor and actuator components, electronic systems, design and development of algorithms, embedded systems and space vehicle autonomy. Through collaboration with government, universities, non-profit organizations, and industry, the GNCC incrementally develops key technologies that conquer NASA's challenges. This paper presents an overview of several innovative technology initiatives for the autonomous guidance, navigation, and control (GN&C) of satellites.

  3. Land, sea, and air unmanned systems research and development at SPAWAR Systems Center Pacific

    NASA Astrophysics Data System (ADS)

    Nguyen, Hoa G.; Laird, Robin; Kogut, Greg; Andrews, John; Fletcher, Barbara; Webber, Todd; Arrieta, Rich; Everett, H. R.

    2009-05-01

    The Space and Naval Warfare (SPAWAR) Systems Center Pacific (SSC Pacific) has a long and extensive history in unmanned systems research and development, starting with undersea applications in the 1960s and expanding into ground and air systems in the 1980s. In the ground domain, we are addressing force-protection scenarios using large unmanned ground vehicles (UGVs) and fixed sensors, and simultaneously pursuing tactical and explosive ordnance disposal (EOD) operations with small man-portable robots. Technology thrusts include improving robotic intelligence and functionality, autonomous navigation and world modeling in urban environments, extended operational range of small teleoperated UGVs, enhanced human-robot interaction, and incorporation of remotely operated weapon systems. On the sea surface, we are pushing the envelope on dynamic obstacle avoidance while conforming to established nautical rules-of-the-road. In the air, we are addressing cooperative behaviors between UGVs and small vertical-takeoff- and-landing unmanned air vehicles (UAVs). Underwater applications involve very shallow water mine countermeasures, ship hull inspection, oceanographic data collection, and deep ocean access. Specific technology thrusts include fiber-optic communications, adaptive mission controllers, advanced navigation techniques, and concepts of operations (CONOPs) development. This paper provides a review of recent accomplishments and current status of a number of projects in these areas.

  4. EnEx-RANGE - Robust autonomous Acoustic Navigation in Glacial icE

    NASA Astrophysics Data System (ADS)

    Heinen, Dirk; Eliseev, Dmitry; Henke, Christoph; Jeschke, Sabina; Linder, Peter; Reuter, Sebastian; Schönitz, Sebastian; Scholz, Franziska; Weinstock, Lars Steffen; Wickmann, Stefan; Wiebusch, Christopher; Zierke, Simon

    2017-03-01

    Within the Enceladus Explorer Initiative of the DLR Space Administration, navigation technologies for a future space mission are in development. Those technologies are the basis for the search for extraterrestrial life on the Saturn moon Enceladus. An autonomous melting probe, the EnEx probe, aims to extract a liquid sample from a water reservoir below the icy crust. A first EnEx probe was developed and demonstrated in a terrestrial scenario at Blood Falls, Taylor Glacier, Antarctica, in November 2014. To enable navigation in glacier ice, two acoustic systems were integrated into the probe in addition to conventional navigation technologies. The first acoustic system determines the position of the probe during the run based on propagation times of acoustic signals from emitters at reference positions at the glacier surface to receivers in the probe. The second system provides information about the forefield of the probe. It is based on sonographic principles with phased array technology integrated in the probe's melting head. Information about obstacles or sampling regions in the probe's forefield can be acquired. The development of both systems is now continued in the project EnEx-RANGE. The emitters of the localization system are replaced by a network of intelligent, acoustically enabled melting probes. These localize each other by means of acoustic signals and create the reference system for the EnEx probe. This presentation includes the discussion of the intelligent acoustic network, the acoustic navigation systems of the EnEx probe, and results of terrestrial tests.
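
    A minimal sketch of propagation-time multilateration of the kind the first acoustic system performs, assuming hypothetical emitter positions and timings and a rough sound speed for glacial ice; the actual EnEx algorithms are not reproduced here.

      import numpy as np

      # Sketch of propagation-time multilateration: emitters at known reference
      # positions on the surface, receiver in the probe. Emitter layout, sound
      # speed, and timings below are hypothetical.
      SOUND_SPEED = 3800.0  # m/s, rough value assumed for glacial ice

      emitters = np.array([[0.0, 0.0, 0.0],
                           [50.0, 0.0, 2.0],
                           [0.0, 50.0, 4.0],
                           [50.0, 50.0, 6.0]])

      def locate(arrival_times, x0=(10.0, 10.0, -10.0), iters=30):
          """Gauss-Newton fit of the probe position to measured propagation times."""
          x = np.array(x0, dtype=float)
          for _ in range(iters):
              diffs = x - emitters                     # (N, 3)
              ranges = np.linalg.norm(diffs, axis=1)   # predicted distances
              residual = arrival_times * SOUND_SPEED - ranges
              J = diffs / ranges[:, None]              # Jacobian of ranges w.r.t. x
              dx, *_ = np.linalg.lstsq(J, residual, rcond=None)
              x += dx
          return x

      true_pos = np.array([20.0, 30.0, -40.0])
      times = np.linalg.norm(true_pos - emitters, axis=1) / SOUND_SPEED
      print(locate(times))  # should recover approximately [20, 30, -40]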

  5. Autonomous Navigation of the SSTI/Lewis Spacecraft Using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Hart, R. C.; Long, A. C.; Lee, T.

    1997-01-01

    The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) is pursuing the application of Global Positioning System (GPS) technology to improve the accuracy and economy of spacecraft navigation. High-accuracy autonomous navigation algorithms are being flight qualified in conjunction with GSFC's GPS Attitude Determination Flyer (GADFLY) experiment on the Small Satellite Technology Initiative (SSTI) Lewis spacecraft, which is scheduled for launch in 1997. Preflight performance assessments indicate that these algorithms can provide a real-time total position accuracy of better than 10 meters (1 sigma) and velocity accuracy of better than 0.01 meter per second (1 sigma), with selective availability at typical levels. This accuracy is projected to improve to the 2-meter level if corrections to be provided by the GPS Wide Area Augmentation System (WAAS) are included.

  6. A Vision-Based Relative Navigation Approach for Autonomous Multirotor Aircraft

    NASA Astrophysics Data System (ADS)

    Leishman, Robert C.

    Autonomous flight in unstructured, confined, and unknown GPS-denied environments is a challenging problem. Solutions could be tremendously beneficial for scenarios that require information about areas that are difficult to access and that present a great amount of risk. The goal of this research is to develop a new framework that enables improved solutions to this problem and to validate the approach with experiments using a hardware prototype. In Chapter 2 we examine the consequences and practical aspects of using an improved dynamic model for multirotor state estimation, using only IMU measurements. The improved model correctly explains the measurements available from the accelerometers on a multirotor. We provide hardware results demonstrating the improved attitude, velocity and even position estimates that can be achieved through the use of this model. We propose a new architecture to simplify some of the challenges that constrain GPS-denied aerial flight in Chapter 3. At the core, the approach combines visual graph-SLAM with a multiplicative extended Kalman filter (MEKF). More importantly, we depart from the common practice of estimating global states and instead keep the position and yaw states of the MEKF relative to the current node in the map. This relative navigation approach provides a tremendous benefit compared to maintaining estimates with respect to a single global coordinate frame. We discuss the architecture of this new system and provide important details for each component. We verify the approach with goal-directed autonomous flight-test results. The MEKF is the basis of the new relative navigation approach and is detailed in Chapter 4. We derive the relative filter and show how the states must be augmented and marginalized each time a new node is declared. The relative estimation approach is verified using hardware flight test results accompanied by comparisons to motion capture truth. Additionally, flight results with estimates in the control

  7. Terrain matching image pre-process and its format transform in autonomous underwater navigation

    NASA Astrophysics Data System (ADS)

    Cao, Xuejun; Zhang, Feizhou; Yang, Dongkai; Yang, Bogang

    2007-06-01

    Underwater passive navigation is one of the important development directions in modern navigation. With its advantages of high autonomy, stealth at sea, jamming resistance, and high precision, passive navigation meets practical navigation requirements well and has become a standard navigation method for underwater vehicles, attracting growing attention from researchers in the field. Passive navigation can provide accurate navigation information, such as position and speed, to the primary Inertial Navigation System (INS) over long periods. With the development of micro-electronics, AUV navigation is now based primarily on an INS aided by other navigation methods, such as terrain matching navigation, which can provide long-duration navigation capability, correct INS errors, and avoid the need for the AUV to surface periodically. Using terrain matching with the aid of digital charts and ocean geographic characteristic sensors, underwater image matching is used as a navigation aid to obtain higher positioning precision, satisfying the underwater, long-term, high-precision, all-weather requirements of navigation systems for Autonomous Underwater Vehicles. Terrain-assisted navigation (TAN) relies directly on image (map) information to assist the primary navigation system along a pre-planned path. In TAN, a factor as important as system operation is the precision and practicality of the stored images and of the database from which the image data are produced: if the data used to derive features are unsuitable, navigation precision will be low. Compared with a terrain matching assisted navigation system, an image matching navigation system is a high-precision, low-cost navigation aid, and its

  8. Daytime Water Detection by Fusing Multiple Cues for Autonomous Off-Road Navigation

    NASA Technical Reports Server (NTRS)

    Rankin, A. L.; Matthies, L. H.; Huertas, A.

    2004-01-01

    Detecting water hazards is a significant challenge to unmanned ground vehicle autonomous off-road navigation. This paper focuses on detecting the presence of water during the daytime using color cameras. A multi-cue approach is taken. Evidence of the presence of water is generated from color, texture, and the detection of reflections in stereo range data. A rule base for fusing water cues was developed by evaluating detection results from an extensive archive of data collection imagery containing water. This software has been implemented into a run-time passive perception subsystem and tested thus far under Linux on a Pentium based processor.
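
    A minimal sketch of a rule-based fusion of the three cues named above; the cue scores, thresholds, and rules are illustrative assumptions, not the paper's rule base.

      # Hypothetical rule-based fusion of the three water cues named in the
      # abstract (color, texture, stereo reflections). Scores are assumed to be
      # normalized to [0, 1] per image cell; thresholds are illustrative only.
      def classify_cell(color_score, texture_score, reflection_score):
          # A detected reflection in stereo range data is strong evidence on its own.
          if reflection_score > 0.8:
              return "water"
          # Otherwise require the color cue and a low-texture cue to agree.
          if color_score > 0.6 and texture_score < 0.3:
              return "water"
          return "not_water"

      print(classify_cell(0.2, 0.9, 0.95))  # water (reflection cue dominates)
      print(classify_cell(0.7, 0.1, 0.1))   # water (sky-colored, low texture)
      print(classify_cell(0.4, 0.8, 0.2))   # not_water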

  9. An adaptive SVSF-SLAM algorithm to improve the success and solving the UGVs cooperation problem

    NASA Astrophysics Data System (ADS)

    Demim, Fethi; Nemra, Abdelkrim; Louadj, Kahina; Hamerlain, Mustapha; Bazoula, Abdelouahab

    2018-05-01

    This paper aims to present a Decentralised Cooperative Simultaneous Localization and Mapping (DCSLAM) solution based on 2D laser data using an Adaptive Covariance Intersection (ACI). The ACI-DCSLAM algorithm will be validated on a swarm of Unmanned Ground Vehicles (UGVs) receiving features to estimate the position and covariance of shared features before adding them to the global map. With the proposed solution, a group of UGVs will be able to construct a large reliable map and localise themselves within this map without any user intervention. The most popular solutions to this problem are the EKF-SLAM, the nonlinear H-infinity (H∞) SLAM, and the FastSLAM. The former suffers from two important problems: the poor consistency caused by linearization and the calculation of Jacobians. The second is very promising because the H∞ filter makes no assumptions about noise characteristics, while the latter is not suitable for real-time implementation. Therefore, a new alternative solution based on the smooth variable structure filter (SVSF) is adopted. A cooperative adaptive SVSF-SLAM algorithm is proposed in this paper to solve the UGVs SLAM problem. Our main contribution consists in adapting the SVSF filter to solve the decentralised cooperative SLAM problem for multiple UGVs. The algorithms developed in this paper were implemented using two Pioneer mobile robots equipped with 2D laser telemetry sensors. Good results are obtained by the cooperative adaptive SVSF-SLAM algorithm compared to the cooperative EKF/H∞-SLAM algorithms, especially when the noise is colored or affected by a variable bias. Simulation results confirm and show the efficiency of the proposed algorithm, which is more robust, stable, and adapted to real-time applications.
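
    A minimal sketch of covariance intersection, the fusion rule underlying the ACI step described above; the weight here is chosen by a coarse trace-minimizing search, which may differ from the paper's adaptive scheme.

      import numpy as np

      # Covariance intersection fuses two estimates of the same feature without
      # knowing their cross-correlation. The weight omega is chosen here by a
      # coarse search minimizing the trace of the fused covariance (a common
      # criterion); the paper's adaptive selection may differ.
      def covariance_intersection(xa, Pa, xb, Pb):
          best = None
          for omega in np.linspace(0.01, 0.99, 99):
              P_inv = omega * np.linalg.inv(Pa) + (1.0 - omega) * np.linalg.inv(Pb)
              P = np.linalg.inv(P_inv)
              if best is None or np.trace(P) < np.trace(best[1]):
                  x = P @ (omega * np.linalg.inv(Pa) @ xa
                           + (1.0 - omega) * np.linalg.inv(Pb) @ xb)
                  best = (x, P)
          return best

      xa, Pa = np.array([1.0, 2.0]), np.diag([0.5, 2.0])
      xb, Pb = np.array([1.2, 1.8]), np.diag([2.0, 0.4])
      x_fused, P_fused = covariance_intersection(xa, Pa, xb, Pb)
      print(x_fused, np.diag(P_fused))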

  10. Advancing Underwater Acoustic Communication for Autonomous Distributed Networks via Sparse Channel Sensing, Coding, and Navigation Support

    DTIC Science & Technology

    2013-09-30

    underwater acoustic communication technologies for autonomous distributed underwater networks, through innovative signal processing, coding, and navigation...in real environments, an offshore testbed has been developed to conduct field experiments. The testbed consists of four nodes and has been deployed...Leadership by the Connecticut Technology Council. Dr. Zhaohui Wang joined the faculty of the Department of Electrical and Computer Engineering at

  11. Relative receiver autonomous integrity monitoring for future GNSS-based aircraft navigation

    NASA Astrophysics Data System (ADS)

    Gratton, Livio Rafael

    The Global Positioning System (GPS) has enabled reliable, safe, and practical aircraft positioning for en-route and non-precision phases of flight for more than a decade. Intense research is currently devoted to extending the use of Global Navigation Satellite Systems (GNSS), including GPS, to precision approach and landing operations. In this context, this work is focused on the development, analysis, and verification of the concept of Relative Receiver Autonomous Integrity Monitoring (RRAIM) and its potential applications to precision approach navigation. RRAIM fault detection algorithms are developed, and associated mathematical bounds on position error are derived. These are investigated as possible solutions to some current key challenges in precision approach navigation, discussed below. Augmentation systems serving continent-size areas (like the Wide Area Augmentation System or WAAS) allow certain precision approach operations within the covered region. More and better satellites, with dual frequency capabilities, are expected to be in orbit in the mid-term future, which will potentially allow WAAS-like capabilities worldwide with a sparse ground station network. Two main challenges in achieving this goal are (1) ensuring that navigation fault detection functions are fast enough to alert worldwide users of hazardously misleading information, and (2) minimizing situations in which navigation is unavailable because the user's local satellite geometry is insufficient for safe position estimation. Local augmentation systems (implemented at individual airports, like the Local Area Augmentation System or LAAS) have the potential to allow precision approach and landing operations by providing precise corrections to user-satellite range measurements. An exception to these capabilities arises during ionospheric storms (caused by solar activity), when hazardous situations can exist with residual range errors several orders of magnitude higher than nominal. Until dual

  12. Unmanned Ground Vehicle Perception Using Thermal Infrared Cameras

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo; Huertas, Andres; Matthies, Larry; Bajracharya, Max; Assad, Christopher; Brennan, Shane; Bellutta, Paolo; Sherwin, Gary W.

    2011-01-01

    The ability to perform off-road autonomous navigation at any time of day or night is a requirement for some unmanned ground vehicle (UGV) programs. Because there are times when it is desirable for military UGVs to operate without emitting strong, detectable electromagnetic signals, a passive-only terrain perception mode of operation is also often a requirement. Thermal infrared (TIR) cameras can be used to provide day and night passive terrain perception. TIR cameras have a detector sensitive to either mid-wave infrared (MWIR) radiation (3-5 µm) or long-wave infrared (LWIR) radiation (8-12 µm). With the recent emergence of high-quality uncooled LWIR cameras, TIR cameras have become viable passive perception options for some UGV programs. The Jet Propulsion Laboratory (JPL) has used a stereo pair of TIR cameras under several UGV programs to perform stereo ranging, terrain mapping, tree-trunk detection, pedestrian detection, negative obstacle detection, and water detection based on object reflections. In addition, we have evaluated stereo range data at a variety of UGV speeds, evaluated dual-band TIR classification of soil, vegetation, and rock terrain types, analyzed 24-hour water and 12-hour mud TIR imagery, and analyzed TIR imagery for hazard detection through smoke. Since TIR cameras do not currently provide the resolution available from megapixel color cameras, a UGV's daytime safe speed is often reduced when using TIR instead of color cameras. In this paper, we summarize the UGV terrain perception work JPL has performed with TIR cameras over the last decade and describe a calibration target developed by General Dynamics Robotic Systems (GDRS) for TIR cameras and other sensors.

  13. Autonomous navigation system. [gyroscopic pendulum for air navigation

    NASA Technical Reports Server (NTRS)

    Merhav, S. J. (Inventor)

    1981-01-01

    An inertial navigation system utilizing a servo-controlled two degree of freedom pendulum to obtain specific force components in the locally level coordinate system is described. The pendulum includes a leveling gyroscope and an azimuth gyroscope supported on a two gimbal system. The specific force components in the locally level coordinate system are converted to components in the geographical coordinate system by means of a single Euler transformation. The standard navigation equations are solved to determine longitudinal and lateral velocities. Finally, vehicle position is determined by a further integration.

  14. Field experiments using SPEAR: a speech control system for UGVs

    NASA Astrophysics Data System (ADS)

    Chhatpar, Siddharth R.; Blanco, Chris; Czerniak, Jeffrey; Hoffman, Orin; Juneja, Amit; Pruthi, Tarun; Liu, Dongqing; Karlsen, Robert; Brown, Jonathan

    2009-05-01

    This paper reports on a field experiment carried out by the Human Research and Engineering Directorate at Ft. Benning to evaluate the efficacy of using speech to control an Unmanned Ground Vehicle (UGV) concurrently with a hand-controller. The SPEAR system, developed by Think-A-Move, provides speech control of UGVs. The system picks up user speech in the ear canal with an in-ear microphone. This property allows it to work efficiently in high-noise environments, where traditional speech systems, employing external microphones, fail. It has been integrated with an iRobot PackBot 510 with EOD kit. The integrated system allows the hand-controller to be supplemented with speech for concurrent control. At Ft. Benning, the integrated system was tested by soldiers from the Officer Candidate School. The experiment had a dual focus: 1) quantitative measurement of the time taken to complete each station and the cognitive load on users; 2) qualitative evaluation of ease-of-use and ergonomics through soldier feedback. Also of significant benefit to Think-A-Move was soldier feedback on the speech-command vocabulary employed: what spoken commands are intuitive, and how the commands should be executed, e.g., limited-motion vs. unlimited-motion commands. Overall results from the experiment are reported in the paper.

  15. Learning for autonomous navigation : extrapolating from underfoot to the far field

    NASA Technical Reports Server (NTRS)

    Matthies, Larry; Turmon, Michael; Howard, Andrew; Angelova, Anelia; Tang, Benyang; Mjolsness, Eric

    2005-01-01

    Autonomous off-road navigation of robotic ground vehicles has important applications on Earth and in space exploration. Progress in this domain has been retarded by the limited lookahead range of 3-D sensors and by the difficulty of preprogramming systems to understand the traversability of the wide variety of terrain they can encounter. Enabling robots to learn from experience may alleviate both of these problems. We define two paradigms for this, learning from 3-D geometry and learning from proprioception, and describe initial instantiations of them we have developed under DARPA and NASA programs. Field test results show promise for learning traversability of vegetated terrain, learning to extend the lookahead range of the vision system, and learning how slip varies with slope.

  16. Preliminary navigation accuracy analysis for the TDRSS Onboard Navigation System (TONS) experiment on EP/EUVE

    NASA Technical Reports Server (NTRS)

    Gramling, C. J.; Long, A. C.; Lee, T.; Ottenstein, N. A.; Samii, M. V.

    1991-01-01

    A Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) is currently being developed by NASA to provide a high-accuracy autonomous navigation capability for users of TDRSS and its successor, the Advanced TDRSS (ATDRSS). The fully autonomous user onboard navigation system will support orbit determination, time determination, and frequency determination, based on observation of a continuously available, unscheduled navigation beacon signal. A TONS experiment will be performed in conjunction with the Explorer Platform (EP) Extreme Ultraviolet Explorer (EUVE) mission to flight qualify TONS Block 1. An overview is presented of TONS and a preliminary analysis of the navigation accuracy anticipated for the TONS experiment. Descriptions of the TONS experiment and the associated navigation objectives, as well as a description of the onboard navigation algorithms, are provided. The accuracy of the selected algorithms is evaluated based on the processing of realistic simulated TDRSS one-way forward-link Doppler measurements. The analysis process is discussed and the associated navigation accuracy results are presented.

  17. Development for SSV on a parallel processing system (PARAGON)

    NASA Astrophysics Data System (ADS)

    Gothard, Benny M.; Allmen, Mark; Carroll, Michael J.; Rich, Dan

    1995-12-01

    A goal of the surrogate semi-autonomous vehicle (SSV) program is to have multiple vehicles navigate autonomously and cooperatively with other vehicles. This paper describes the process and tools used in porting UGV/SSV (unmanned ground vehicle) autonomous mobility and target recognition algorithms from a SISD (single instruction single data) processor architecture (i.e., a Sun SPARC workstation running C/UNIX) to a MIMD (multiple instruction multiple data) parallel processor architecture (i.e., PARAGON-a parallel set of i860 processors running C/UNIX). It discusses the gains in performance and the pitfalls of such a venture. It also examines the merits of this processor architecture (based on this conceptual prototyping effort) and programming paradigm to meet the final SSV demonstration requirements.

  18. Autonomous sensor-transponder RFID with supply energy conditioning for object navigation systems

    NASA Astrophysics Data System (ADS)

    Skoczylas, M.; Kamuda, K.; Jankowski-Mihułowicz, P.; Kalita, W.; Weglarski, Mariusz

    2014-08-01

    The paper analyzes and presents the properties of energy-conditioning circuits developed to power additional functional blocks of autonomous RFID transponders operating in the HF band. Autonomy is realized by implementing extra functions in a typical transponder: the autonomous system should harvest energy, e.g. from the electromagnetic field of read/write devices, and should also be able to gather information about its environment, e.g. by measuring various physical quantities. In such a device the crucial problem is energy conditioning, because the output voltage-current characteristic of the front-end (antenna with matching and harvesting circuit), as well as the total and instantaneous power load generated by internal circuits, depend strongly on the function being performed and on the energy and communication conditions at the RFID interface. A properly designed solution should improve harvesting efficiency, reduce current leakage from the supply storage, and improve matching between the antenna and input circuits, in order to save energy and increase operating time in such a battery-free system. The authors present methods for increasing the autonomous operation time even with advanced measuring algorithms. A measuring system with a wide spectrum of sensors dedicated to different quantities (physical, chemical, etc.) is also presented. Results of model calculations and experimental verifications are discussed on the basis of investigations conducted in a unique laboratory stand for object navigation systems.

  19. Relative Navigation of Formation Flying Satellites

    NASA Technical Reports Server (NTRS)

    Long, Anne; Kelbel, David; Lee, Taesul; Leung, Dominic; Carpenter, Russell; Gramling, Cheryl; Bauer, Frank (Technical Monitor)

    2002-01-01

    The Guidance, Navigation, and Control Center (GNCC) at Goddard Space Flight Center (GSFC) has successfully developed high-accuracy autonomous satellite navigation systems using the National Aeronautics and Space Administration's (NASA's) space and ground communications systems and the Global Positioning System (GPS). In addition, an autonomous navigation system that uses celestial object sensor measurements is currently under development and has been successfully tested using real Sun and Earth horizon measurements. The GNCC has developed advanced spacecraft systems that provide autonomous navigation and control of formation flyers in near-Earth, high-Earth, and libration point orbits. To support this effort, the GNCC is assessing the relative navigation accuracy achievable for proposed formations using GPS, intersatellite crosslink, ground-to-satellite Doppler, and celestial object sensor measurements. This paper evaluates the performance of these relative navigation approaches for three proposed missions with two or more vehicles maintaining relatively tight formations. High-fidelity simulations were performed to quantify the absolute and relative navigation accuracy as a function of navigation algorithm and measurement type. Realistically simulated measurements were processed using the extended Kalman filter implemented in the GPS Enhanced Onboard Navigation System (GEONS) flight software developed by GSFC GNCC. Solutions obtained by simultaneously estimating all satellites in the formation were compared with the results obtained using a simpler approach based on differencing independently estimated state vectors.

  20. Flight Analysis of an Autonomously Navigated Experimental Lander for High Altitude Recovery

    NASA Technical Reports Server (NTRS)

    Chin, Jeffrey; Niehaus, Justin; Goodenow, Debra; Dunker, Storm; Montague, David

    2016-01-01

    First steps have been taken to qualify a family of parafoil systems capable of increasing the survivability and reusability of high-altitude balloon payloads. The research is motivated by the common risk facing balloon payloads where expensive flight hardware can often land in inaccessible areas that make them difficult or impossible to recover. The Autonomously Navigated Experimental Lander (ANGEL) flight test introduced a commercial Guided Parachute Aerial Delivery System (GPADS) to a previously untested environment at 108,000 ft MSL to determine its high-altitude survivability and capabilities. Following release, ANGEL descended under a drogue until approximately 25,000 ft, at which point the drogue was jettisoned and the main parachute was deployed, commencing navigation. Multiple data acquisition platforms were used to characterize the return-to-point technology performance and help determine its suitability for returning future scientific payloads ranging from 180 to 10,000 lbs to safer and more convenient landing locations. This report describes the test vehicle design, and summarizes the captured sensor data. Various post-flight analyses are used to quantify the system's performance, gondola load data, and serve as a reference point for subsequent missions.

  1. Lessons Learned from OSIRIS-REx Autonomous Navigation Using Natural Feature Tracking

    NASA Technical Reports Server (NTRS)

    Lorenz, David A.; Olds, Ryan; May, Alexander; Mario, Courtney; Perry, Mark E.; Palmer, Eric E.; Daly, Michael

    2017-01-01

    The Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) spacecraft is scheduled to launch in September 2016 to embark on an asteroid sample return mission. It is expected to rendezvous with the asteroid Bennu, navigate to the surface, collect a sample (July 2020), and return the sample to Earth (September 2023). The original mission design called for using one of two Flash Lidar units to provide autonomous navigation to the surface. Following preliminary design and initial development of the Lidars, reliability issues with the hardware and test program prompted the project to begin development of an alternative navigation technique to be used as a backup to the Lidar. At the critical design review, Natural Feature Tracking (NFT) was added to the mission. NFT is an onboard optical navigation system that compares observed images to a set of asteroid terrain models which are rendered in real-time from a catalog stored in memory on the flight computer. Onboard knowledge of the spacecraft state is then updated by a Kalman filter using the measured residuals between the rendered reference images and the actual observed images. The asteroid terrain models used by NFT are built from a shape model generated from observations collected during earlier phases of the mission and include both terrain shape and albedo information about the asteroid surface. As a result, the success of NFT is highly dependent on selecting a set of topographic features that can be both identified during descent as well as reliably rendered using the shape model data available. During development, the OSIRIS-REx team faced significant challenges in developing a process conducive to robust operation. This was especially true for terrain models to be used as the spacecraft gets close to the asteroid and higher fidelity models are required for reliable image correlation. This paper will present some of the challenges and lessons learned from the development

  2. Fast fingerprint database maintenance for indoor positioning based on UGV SLAM.

    PubMed

    Tang, Jian; Chen, Yuwei; Chen, Liang; Liu, Jingbin; Hyyppä, Juha; Kukko, Antero; Kaartinen, Harri; Hyyppä, Hannu; Chen, Ruizhi

    2015-03-04

    Indoor positioning technology has become more and more important in the last two decades. Utilizing Received Signal Strength Indicator (RSSI) fingerprints of Signals of OPportunity (SOP) is a promising alternative navigation solution. However, as the RSSIs vary during operation due to their physical nature and are easily affected by the environmental change, one challenge of the indoor fingerprinting method is maintaining the RSSI fingerprint database in a timely and effective manner. In this paper, a solution for rapidly updating the fingerprint database is presented, based on a self-developed Unmanned Ground Vehicles (UGV) platform NAVIS. Several SOP sensors were installed on NAVIS for collecting indoor fingerprint information, including a digital compass collecting magnetic field intensity, a light sensor collecting light intensity, and a smartphone which collects the access point number and RSSIs of the pre-installed WiFi network. The NAVIS platform generates a map of the indoor environment and collects the SOPs during processing of the mapping, and then the SOP fingerprint database is interpolated and updated in real time. Field tests were carried out to evaluate the effectiveness and efficiency of the proposed method. The results showed that the fingerprint databases can be quickly created and updated with a higher sampling frequency (5Hz) and denser reference points compared with traditional methods, and the indoor map can be generated without prior information. Moreover, environmental changes could also be detected quickly for fingerprint indoor positioning.
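
    A minimal sketch of how a fingerprint database of this kind can be queried with weighted k-nearest-neighbor matching in RSSI space; the reference points and RSSI values are hypothetical, and the paper's SLAM-based interpolation scheme is not reproduced.

      import numpy as np

      # Hypothetical fingerprint database: each reference point stores a position
      # and the RSSI values observed for a fixed set of WiFi access points.
      fingerprints = {
          (0.0, 0.0): np.array([-45.0, -70.0, -80.0]),
          (5.0, 0.0): np.array([-60.0, -55.0, -75.0]),
          (0.0, 5.0): np.array([-70.0, -72.0, -50.0]),
      }

      def locate(observed_rssi, k=2):
          """Weighted k-nearest-neighbor position estimate in RSSI space."""
          items = list(fingerprints.items())
          dists = [np.linalg.norm(observed_rssi - rssi) for _, rssi in items]
          order = np.argsort(dists)[:k]
          weights = np.array([1.0 / (dists[i] + 1e-6) for i in order])
          positions = np.array([items[i][0] for i in order])
          return (weights[:, None] * positions).sum(axis=0) / weights.sum()

      print(locate(np.array([-50.0, -68.0, -78.0])))  # close to reference (0, 0)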

  3. Fast Fingerprint Database Maintenance for Indoor Positioning Based on UGV SLAM

    PubMed Central

    Tang, Jian; Chen, Yuwei; Chen, Liang; Liu, Jingbin; Hyyppä, Juha; Kukko, Antero; Kaartinen, Harri; Hyyppä, Hannu; Chen, Ruizhi

    2015-01-01

    Indoor positioning technology has become more and more important in the last two decades. Utilizing Received Signal Strength Indicator (RSSI) fingerprints of Signals of OPportunity (SOP) is a promising alternative navigation solution. However, as the RSSIs vary during operation due to their physical nature and are easily affected by the environmental change, one challenge of the indoor fingerprinting method is maintaining the RSSI fingerprint database in a timely and effective manner. In this paper, a solution for rapidly updating the fingerprint database is presented, based on a self-developed Unmanned Ground Vehicles (UGV) platform NAVIS. Several SOP sensors were installed on NAVIS for collecting indoor fingerprint information, including a digital compass collecting magnetic field intensity, a light sensor collecting light intensity, and a smartphone which collects the access point number and RSSIs of the pre-installed WiFi network. The NAVIS platform generates a map of the indoor environment and collects the SOPs during processing of the mapping, and then the SOP fingerprint database is interpolated and updated in real time. Field tests were carried out to evaluate the effectiveness and efficiency of the proposed method. The results showed that the fingerprint databases can be quickly created and updated with a higher sampling frequency (5Hz) and denser reference points compared with traditional methods, and the indoor map can be generated without prior information. Moreover, environmental changes could also be detected quickly for fingerprint indoor positioning. PMID:25746096

  4. Optimization design about gimbal structure of high-precision autonomous celestial navigation tracking mirror system

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Yang, Xiao-xu; Han, Jun-feng; Wei, Yu; Zhang, Jing; Xie, Mei-lin; Yue, Peng

    2016-01-01

    A high-precision celestial navigation tracking platform adopts a servo-controlled mirror structure to overcome the disadvantages of large volume, high rotational inertia, and slow response, improving the stability and tracking accuracy of the platform. Because the optical sensor and mirror are installed on the middle gimbal, its stiffness and resonant frequency requirements are high. Based on finite element modal analysis theory, the dynamic characteristics of the middle gimbal were studied, and ANSYS was used for the finite element dynamic simulation analysis. The computed results were used to identify the weak links of the structure, and improvements were proposed and reanalyzed. The lowest resonant frequency of the optimized middle gimbal avoids the bandwidth of the platform servo mechanism, is much higher than the disturbance frequency of the carrier aircraft, and reduces mechanical resonance of the framework. This provides a theoretical basis for the structural optimization design of the complete high-precision autonomous celestial navigation tracking mirror system.

  5. Search Problems in Mission Planning and Navigation of Autonomous Aircraft. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Krozel, James A.

    1988-01-01

    An architecture for the control of an autonomous aircraft is presented. The architecture is a hierarchical system representing an anthropomorphic breakdown of the control problem into planner, navigator, and pilot systems. The planner system determines high level global plans from overall mission objectives. This abstract mission planning is investigated by focusing on the Traveling Salesman Problem with variations on local and global constraints. Tree search techniques are applied, including the breadth-first, depth-first, and best-first algorithms. The minimum column and row entries of the Traveling Salesman Problem cost matrix provide a powerful heuristic to guide these search techniques. Mission planning subgoals are directed from the planner to the navigator for planning routes in mountainous terrain with threats. Terrain/threat information is abstracted into a graph of possible paths for which graph searches are performed. It is shown that paths can be well represented by a search graph based on the Voronoi diagram of points representing the vertices of mountain boundaries. A comparison of Dijkstra's dynamic programming algorithm and the A* graph search algorithm from artificial intelligence/operations research is performed for several navigation path planning examples. These examples illustrate paths that minimize a combination of distance and exposure to threats. Finally, the pilot system synthesizes the flight trajectory by creating the control commands to fly the aircraft.
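
    A minimal sketch of best-first TSP search guided by a minimum-entry lower bound of the kind mentioned above; the cost matrix is hypothetical and the thesis' exact heuristic may differ in detail.

      import heapq
      import numpy as np

      # Best-first TSP search: the remaining tour cost is bounded below by the
      # sum of each unvisited city's cheapest outgoing edge (an admissible
      # heuristic). The 4-city cost matrix is hypothetical.
      cost = np.array([[0, 2, 9, 10],
                       [2, 0, 6, 4],
                       [9, 6, 0, 3],
                       [10, 4, 3, 0]], dtype=float)
      n = len(cost)

      def row_min_bound(unvisited):
          return sum(min(cost[i][j] for j in range(n) if j != i) for i in unvisited)

      def best_first_tsp(start=0):
          # Each heap entry: (g + h, g, current city, visited path)
          heap = [(row_min_bound(range(n)), 0.0, start, (start,))]
          while heap:
              f, g, city, path = heapq.heappop(heap)
              if len(path) == n + 1:          # tour closed back at the start
                  return g, path
              if len(path) == n:              # all cities visited: close the tour
                  closed = g + cost[city][start]
                  heapq.heappush(heap, (closed, closed, start, path + (start,)))
                  continue
              for nxt in range(n):
                  if nxt not in path:
                      g2 = g + cost[city][nxt]
                      remaining = [i for i in range(n) if i not in path and i != nxt]
                      heapq.heappush(heap, (g2 + row_min_bound(remaining), g2, nxt,
                                            path + (nxt,)))

      print(best_first_tsp())  # optimal tour cost is 18.0 for this matrix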

  6. Autonomous Wheeled Robot Platform Testbed for Navigation and Mapping Using Low-Cost Sensors

    NASA Astrophysics Data System (ADS)

    Calero, D.; Fernandez, E.; Parés, M. E.

    2017-11-01

    This paper presents the concept of an architecture for a wheeled robot system that helps researchers in the field of geomatics speed up their daily research in kinematic geodesy, indoor navigation, and indoor positioning. The presented ideas correspond to an extensible and modular hardware and software system aimed at the development of new low-cost mapping algorithms as well as at the evaluation of sensor performance. The concept, already implemented in CTTC's ARAS system (Autonomous Rover for Automatic Surveying), is generic and extensible, which means that new navigation algorithms or sensors can be incorporated at no maintenance cost; only the development effort required to create such algorithms needs to be taken into account. As a consequence, change poses a much smaller problem for research activities in this specific area. The system includes several standalone sensors that may be combined in different ways to accomplish several goals; that is, it may be used to perform a variety of tasks, for instance evaluating the performance of positioning or mapping algorithms.

  7. DDDAMS-based Urban Surveillance and Crowd Control via UAVs and UGVs

    DTIC Science & Technology

    2015-12-04

    for crowd dynamics modeling by incorporating multi-resolution data, where a grid-based method is used to model crowd motion with UAVs’ low-resolution...information and more computationally intensive (and time-consuming). Given that the deployment of fidelity selection results in simulation faces computational... Table 1: Parameters for UAV and UGV for their detection

  8. Unmanned Ground Vehicle Perception Using Thermal Infrared Cameras

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo; Huertas, Andres; Matthies, Larry; Bajracharya, Max; Assad, Christopher; Brennan, Shane; Bellutta, Paolo; Sherwin, Gary

    2011-01-01

    TIR cameras can be used for day/night Unmanned Ground Vehicle (UGV) autonomous navigation when stealth is required. The quality of uncooled TIR cameras has significantly improved over the last decade, making them a viable option at low speeds. Limiting factors for stereo ranging with uncooled LWIR cameras are image blur and low-texture scenes. TIR perception capabilities JPL has explored include: (1) single- and dual-band TIR terrain classification, (2) obstacle detection (pedestrians, vehicles, tree trunks, ditches, and water), and (3) perception through obscurants.

  9. Autonomous precision landing using terrain-following navigation

    NASA Technical Reports Server (NTRS)

    Vaughan, R. M.; Gaskell, R. W.; Halamek, P.; Klumpp, A. R.; Synnott, S. P.

    1991-01-01

    Terrain-following navigation studies that have been done over the past two years in the navigation system section at JPL are described. A descent to Mars scenario based on Mars Rover and Sample Return mission profiles is described, and navigation and image processing issues pertaining to descent phases where landmark pictures can be obtained are examined. A covariance analysis is performed to verify that landmark measurements from a terrain-following navigation system can satisfy precision landing requirements. Image processing problems involving known landmarks in actual pictures are considered. Mission design alternatives that can alleviate some of these problems are suggested.

  10. Development of a Commercially Viable, Modular Autonomous Robotic Systems for Converting any Vehicle to Autonomous Control

    NASA Technical Reports Server (NTRS)

    Parish, David W.; Grabbe, Robert D.; Marzwell, Neville I.

    1994-01-01

    A Modular Autonomous Robotic System (MARS), consisting of a modular autonomous vehicle control system that can be retrofitted onto any vehicle to convert it to autonomous control and support a modular payload for multiple applications, is being developed. The MARS design is scalable, reconfigurable, and cost-effective due to the use of modern open system architecture design methodologies, including serial control bus technology to simplify system wiring and enhance scalability. The design is augmented with modular, object-oriented (C++) software implementing a hierarchy of five levels of control: teleoperated, continuous guidepath following, periodic guidepath following, absolute position autonomous navigation, and relative position autonomous navigation. The present effort is focused on producing a system that is commercially viable for routine autonomous patrolling of known, semistructured environments, such as environmental monitoring of chemical and petroleum refineries, exterior physical security and surveillance, perimeter patrolling, and intrafacility transport applications.
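
    A minimal sketch of a five-level control hierarchy with a simple fallback policy; the level names follow the abstract, but the policy and code below are illustrative assumptions only (the MARS software itself is described as C++).

      from enum import IntEnum

      # The five control levels named in the abstract, ordered from least to most
      # autonomous. The fallback policy is hypothetical, not the MARS design.
      class ControlLevel(IntEnum):
          TELEOPERATED = 0
          CONTINUOUS_GUIDEPATH = 1
          PERIODIC_GUIDEPATH = 2
          ABSOLUTE_POSITION_NAV = 3
          RELATIVE_POSITION_NAV = 4

      def select_level(requested: ControlLevel, gps_ok: bool, guidepath_ok: bool):
          """Fall back to a less autonomous level when its inputs are unavailable."""
          if requested >= ControlLevel.ABSOLUTE_POSITION_NAV and not gps_ok:
              requested = ControlLevel.PERIODIC_GUIDEPATH
          if requested >= ControlLevel.CONTINUOUS_GUIDEPATH and not guidepath_ok:
              requested = ControlLevel.TELEOPERATED
          return requested

      print(select_level(ControlLevel.ABSOLUTE_POSITION_NAV, gps_ok=False, guidepath_ok=True))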

  11. Autonomous navigation method for substation inspection robot based on travelling deviation

    NASA Astrophysics Data System (ADS)

    Yang, Guoqing; Xu, Wei; Li, Jian; Fu, Chongguang; Zhou, Hao; Zhang, Chuanyou; Shao, Guangting

    2017-06-01

    A new method of edge detection is proposed for the substation environment, which enables autonomous navigation of the substation inspection robot. First, the road image and information are obtained using an image acquisition device. Second, the noise in the region of interest selected in the road image is removed with a digital image processing algorithm, the road edges are extracted by the Canny operator, and the road boundaries are extracted by the Hough transform. Finally, the distance between the robot and the left and right boundaries is calculated, and the travelling deviation is obtained. The robot's route is controlled according to the travel deviation and a preset threshold. Experimental results show that the proposed method can detect the road area in real time, and the algorithm has high accuracy and stable performance.
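
    The following is a minimal sketch of the edge-based deviation pipeline summarized above, written with OpenCV; the thresholds, region-of-interest split, and sign convention are illustrative assumptions rather than the paper's tuned values.

        # Sketch of a Canny + Hough road-edge deviation computation (illustrative parameters).
        import cv2
        import numpy as np

        def travel_deviation(frame):
            """Return the lateral deviation (pixels) of the robot from the road centre."""
            h, w = frame.shape[:2]
            roi = frame[int(0.5 * h):, :]                       # region of interest: lower half
            gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
            gray = cv2.GaussianBlur(gray, (5, 5), 0)            # suppress noise
            edges = cv2.Canny(gray, 50, 150)                    # road edges
            lines = cv2.HoughLinesP(edges, 1, np.pi / 180, 40,
                                    minLineLength=40, maxLineGap=10)
            if lines is None:
                return None
            left, right = [], []
            for x1, y1, x2, y2 in lines[:, 0]:
                (left if (x1 + x2) / 2 < w / 2 else right).append((x1 + x2) / 2)
            if not left or not right:
                return None
            road_centre = (np.mean(left) + np.mean(right)) / 2
            return road_centre - w / 2   # positive: road centre lies to the robot's right

        # The route controller would steer only when |deviation| exceeds a preset threshold.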

  12. Crew-Aided Autonomous Navigation

    NASA Technical Reports Server (NTRS)

    Holt, Greg N.

    2015-01-01

    A sextant provides manual capability to perform star/planet-limb sightings and offers a cheap, simple, robust backup navigation source for exploration missions independent from the ground. Sextant sightings from spacecraft were first exercised in Gemini and flew as the lost-communication backup for all Apollo missions. This study characterized error sources of navigation-grade sextants for feasibility of taking star and planetary limb sightings from inside a spacecraft. A series of similar studies was performed in the early/mid-1960s in preparation for Apollo missions. This study modernized and updated those findings in addition to showing feasibility using Linear Covariance analysis techniques. The human eyeball is a remarkable piece of optical equipment and provides many advantages over camera-based systems, including dynamic range and detail resolution. This technique utilizes those advantages and provides important autonomy to the crew in the event of lost communication with the ground. It can also provide confidence and verification of low-TRL automated onboard systems. The technique is extremely flexible and is not dependent on any particular vehicle type. The investigation involved procuring navigation-grade sextants and characterizing their performance under a variety of conditions encountered in exploration missions. The JSC optical sensor lab and Orion mockup were the primary testing locations. For the accuracy assessment, a group of test subjects took sextant readings on calibrated targets while instrument/operator precision was measured. The study demonstrated repeatability of star/planet-limb sightings with bias and standard deviation around 10 arcseconds, then used high-fidelity simulations to verify those accuracy levels met the needs for targeting mid-course maneuvers in preparation for Earth reentry.

  13. Cloud Absorption Radiometer Autonomous Navigation System - CANS

    NASA Technical Reports Server (NTRS)

    Kahle, Duncan; Gatebe, Charles; McCune, Bill; Hellwig, Dustan

    2013-01-01

    CAR (cloud absorption radiometer) acquires spatial reference data from host aircraft navigation systems. This poses various problems during CAR data reduction, including navigation data format, accuracy of position data, accuracy of airframe inertial data, and navigation data rate. Incorporating its own navigation system, which included GPS (Global Positioning System), roll axis inertia and rates, and three axis acceleration, CANS expedites data reduction and increases the accuracy of the CAR end data product. CANS provides a self-contained navigation system for the CAR, using inertial reference and GPS positional information. The intent of the software application was to correct the sensor with respect to aircraft roll in real time based upon inputs from a precision navigation sensor. In addition, the navigation information (including GPS position), attitude data, and sensor position details are all streamed to a remote system for recording and later analysis. CANS comprises a commercially available inertial navigation system with integral GPS capability (Attitude Heading Reference System AHRS) integrated into the CAR support structure and data system. The unit is attached to the bottom of the tripod support structure. The related GPS antenna is located on the P-3 radome immediately above the CAR. The AHRS unit provides a RS-232 data stream containing global position and inertial attitude and velocity data to the CAR, which is recorded concurrently with the CAR data. This independence from aircraft navigation input provides for position and inertial state data that accounts for very small changes in aircraft attitude and position, sensed at the CAR location as opposed to aircraft state sensors typically installed close to the aircraft center of gravity. More accurate positional data enables quicker CAR data reduction with better resolution. The CANS software operates in two modes: initialization/calibration and operational. In the initialization/calibration mode

  14. Parameterization of norfolk sandy loam properties for stochastic modeling of light in-wheel motor UGV

    USDA-ARS?s Scientific Manuscript database

    To accurately develop a mathematical model for an In-Wheel Motor Unmanned Ground Vehicle (IWM UGV) on soft terrain, parameterization of terrain properties is essential to stochastically model tire-terrain interaction for each wheel independently. Operating in off-road conditions requires paying clos...

  15. Design and test of a simulation system for autonomous optic-navigated planetary landing

    NASA Astrophysics Data System (ADS)

    Cai, Sheng; Yin, Yanhe; Liu, Yanjun; He, Fengyun

    2018-02-01

    In this paper, a simulation system based on commercial projector is proposed to test the optical navigation algorithms for autonomous planetary landing in laboratorial scenarios. The design work of optics, mechanics and synchronization control are carried out. Furthermore, the whole simulation system is set up and tested. Through the calibration of the system, two main problems, synchronization between the projector and CCD and pixel-level shifting caused by the low repeatability of DMD used in the projector, are settled. The experimental result shows that the RMS errors of pitch, yaw and roll angles are 0.78', 0.48', and 2.95' compared with the theoretical calculation, which can fulfill the requirement of experimental simulation for planetary landing in laboratory.

  16. Multi-Spacecraft Autonomous Positioning System

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan

    2015-01-01

    As the number of spacecraft in simultaneous operation continues to grow, there is an increased dependency on ground-based navigation support. The current baseline system for deep space navigation utilizes Earth-based radiometric tracking, requiring long-duration observations to perform orbit determination and generate a state update. The age, complexity, and high utilization of the ground assets pose a risk to spacecraft navigation performance. In order to perform complex operations at large distances from Earth, such as extraterrestrial landing and proximity operations, autonomous systems are required. With increasingly complex mission operations, the need for frequent and Earth-independent navigation capabilities is further reinforced. The Multi-spacecraft Autonomous Positioning System (MAPS) takes advantage of the growing interspacecraft communication network and infrastructure to allow for Earth-autonomous state measurements to enable network-based space navigation. A notional concept of operations is given in figure 1. This network is already being implemented and routinely used in Martian communications through the use of the Mars Reconnaissance Orbiter and Mars Odyssey spacecraft as relays for surface assets. The growth of this communications architecture is continued through MAVEN, and future potential commercial Mars telecom orbiters. This growing network provides an initial Mars-local capability for inter-spacecraft communication and navigation. These navigation updates are enabled by cross-communication between assets in the network, coupled with onboard navigation estimation routines to integrate packet travel time to generate ranging measurements. Inter-spacecraft communication allows for frequent state broadcasts and time updates from trusted references. The architecture is a software-based solution, enabling its implementation on a wide variety of current assets, with the operational constraints and measurement accuracy determined by onboard systems.
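
    As a rough illustration of the ranging measurements MAPS derives from packet exchange, the sketch below converts time-stamped transmit/receive events into a range and a relative clock offset; the symmetric-path, constant-offset clock model and the function names are assumptions for this sketch, not the MAPS formulation.

        # Illustrative two-way ranging from time-stamped packet exchange between spacecraft A and B.
        C = 299_792_458.0  # speed of light, m/s

        def two_way_range(t_tx_a, t_rx_b, t_tx_b, t_rx_a):
            """A transmits at t_tx_a (A's clock), B receives/replies at t_rx_b/t_tx_b (B's clock),
            A receives at t_rx_a.  A constant clock offset between A and B cancels in the
            round-trip combination."""
            round_trip = (t_rx_a - t_tx_a) - (t_tx_b - t_rx_b)   # remove B's turnaround time
            return 0.5 * C * round_trip                          # one-way range, metres

        def clock_offset(t_tx_a, t_rx_b, t_tx_b, t_rx_a):
            """Offset of B's clock relative to A's, assuming a symmetric signal path."""
            return 0.5 * ((t_rx_b - t_tx_a) + (t_tx_b - t_rx_a))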

  17. Crew-Aided Autonomous Navigation Project

    NASA Technical Reports Server (NTRS)

    Holt, Greg

    2015-01-01

    Manual capability to perform star/planet-limb sightings provides a cheap, simple, and robust backup navigation source for exploration missions independent from the ground. Sextant sightings from spacecraft were first exercised in Gemini and flew as the loss-of-communications backup for all Apollo missions. This study seeks to procure and characterize error sources of navigation-grade sextants for feasibility of taking star and planetary limb sightings from inside a spacecraft. A series of similar studies was performed in the early/mid-1960s in preparation for Apollo missions, and one goal of this study is to modernize and update those findings. This technique has the potential to deliver significant risk mitigation, validation, and backup to more complex low-TRL automated systems under development involving cameras.

  18. LiDAR Scan Matching Aided Inertial Navigation System in GNSS-Denied Environments.

    PubMed

    Tang, Jian; Chen, Yuwei; Niu, Xiaoji; Wang, Li; Chen, Liang; Liu, Jingbin; Shi, Chuang; Hyyppä, Juha

    2015-07-10

    A new scan-matching-aided Inertial Navigation System (INS) using a low-cost LiDAR is proposed as an alternative to GNSS-based navigation systems in GNSS-degraded or -denied environments such as indoor areas, dense forests, or urban canyons. In these areas, INS-based Dead Reckoning (DR) and Simultaneous Localization and Mapping (SLAM) technologies are normally used to estimate positions as separate tools. However, there are critical implementation problems with each standalone system. The drift errors of velocity, position, and heading angles in an INS will accumulate over time, and on-line calibration is a must for sustaining positioning accuracy. SLAM performance is poor in featureless environments where the matching errors can significantly increase. Neither standalone positioning method can offer a sustainable navigation solution with acceptable accuracy. This paper integrates two complementary technologies, INS and LiDAR SLAM, into one navigation frame with a loosely coupled Extended Kalman Filter (EKF) to use the advantages and overcome the drawbacks of each system and establish a stable long-term navigation process. Static and dynamic field tests were carried out with a self-developed Unmanned Ground Vehicle (UGV) platform, NAVIS. The results prove that the proposed approach can provide positioning accuracy at the centimetre level for long-term operations, even in a featureless indoor environment.
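
    A minimal planar sketch of the loosely coupled fusion described in this abstract: INS/odometry dead reckoning drives the EKF prediction, and each LiDAR scan-matching position fix drives the update. The three-element state, noise values, and 2-D simplification are illustrative assumptions, not the paper's full formulation.

        # Loosely coupled 2-D EKF: dead-reckoning prediction + SLAM position-fix update (sketch).
        import numpy as np

        class LooselyCoupledEKF:
            def __init__(self):
                self.x = np.zeros(3)                 # state: [east, north, heading]
                self.P = np.eye(3) * 0.1             # state covariance

            def predict(self, d, dtheta, q_d=0.02, q_th=0.01):
                """Dead-reckoning step with travelled distance d and heading change dtheta."""
                th = self.x[2]
                self.x += np.array([d * np.cos(th), d * np.sin(th), dtheta])
                F = np.array([[1, 0, -d * np.sin(th)],
                              [0, 1,  d * np.cos(th)],
                              [0, 0,  1]])
                Q = np.diag([q_d**2, q_d**2, q_th**2])
                self.P = F @ self.P @ F.T + Q

            def update_slam_fix(self, pos, r=0.05):
                """Absorb a LiDAR scan-matching position fix (east, north), sigma r metres."""
                H = np.array([[1.0, 0, 0], [0, 1.0, 0]])
                R = np.eye(2) * r**2
                y = np.asarray(pos) - H @ self.x
                S = H @ self.P @ H.T + R
                K = self.P @ H.T @ np.linalg.inv(S)
                self.x += K @ y
                self.P = (np.eye(3) - K @ H) @ self.P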

  19. 76 FR 21772 - Navigation Safety Advisory Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-18

    ..., routing measures, marine information, diving safety, and aids to navigation systems. Agenda The NAVSAC... discussion of autonomous unmanned vessels and discuss their implications for the Inland Navigation Rules. A... DEPARTMENT OF HOMELAND SECURITY Coast Guard [Docket No. USCG-2011-0204] Navigation Safety Advisory...

  20. Autonomous Underwater Vehicle Navigation

    DTIC Science & Technology

    2008-02-01

    three standard deviations are ignored as indicated by the × marker. ...autonomous underwater vehicle with six degrees of freedom. We approach this problem using an error state formulation of the Kalman filter. Integration...each position fix, but is this ad-hoc method optimal? Here, we present an approach using an error state formulation of the Kalman filter to provide an

  1. Draper Laboratory small autonomous aerial vehicle

    NASA Astrophysics Data System (ADS)

    DeBitetto, Paul A.; Johnson, Eric N.; Bosse, Michael C.; Trott, Christian A.

    1997-06-01

    The Charles Stark Draper Laboratory, Inc. and students from the Massachusetts Institute of Technology and Boston University have cooperated to develop an autonomous aerial vehicle that won the 1996 International Aerial Robotics Competition. This paper describes the approach, system architecture, and subsystem designs for the entry. The entry represents a combination of many technology areas: navigation, guidance, control, vision processing, human factors, packaging, power, real-time software, and others. The aerial vehicle, an autonomous helicopter, performs navigation and control functions using multiple sensors: differential GPS, an inertial measurement unit, a sonar altimeter, and a flux compass. The aerial vehicle transmits video imagery to the ground. A ground-based vision processor converts the image data into target position and classification estimates. The system was designed, built, and flown in less than one year and has provided many lessons about autonomous vehicle systems, several of which are discussed. In an appendix, our current research in augmenting the navigation system with vision-based estimates is presented.

  2. A Dynamic Navigation Model for Unmanned Aircraft Systems and an Application to Autonomous Front-On Environmental Sensing and Photography Using Low-Cost Sensor Systems.

    PubMed

    Cooper, Andrew James; Redman, Chelsea Anne; Stoneham, David Mark; Gonzalez, Luis Felipe; Etse, Victor Kwesi

    2015-08-28

    This paper presents an unmanned aircraft system (UAS) that uses a probabilistic model for autonomous front-on environmental sensing or photography of a target. The system is based on low-cost and readily-available sensor systems in dynamic environments and with the general intent of improving the capabilities of dynamic waypoint-based navigation systems for a low-cost UAS. The behavioural dynamics of target movement for the design of a Kalman filter and Markov model-based prediction algorithm are included. Geometrical concepts and the Haversine formula are applied to the maximum likelihood case in order to make a prediction regarding a future state of a target, thus delivering a new waypoint for autonomous navigation. The results of the application to aerial filming with low-cost UAS are presented, achieving the desired goal of maintained front-on perspective without significant constraint to the route or pace of target movement.
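
    A small sketch of the geometric step the abstract mentions: the Haversine distance and a great-circle destination-point projection that turn the predicted target state into a new front-on waypoint. The constant-course target model, standoff parameter, and example coordinates are assumptions for illustration.

        # Haversine distance and waypoint projection for front-on target tracking (sketch).
        import math

        R_EARTH = 6_371_000.0  # mean Earth radius, metres

        def haversine(lat1, lon1, lat2, lon2):
            """Great-circle distance in metres between two latitude/longitude points."""
            p1, p2 = math.radians(lat1), math.radians(lat2)
            dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
            a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
            return 2 * R_EARTH * math.asin(math.sqrt(a))

        def predict_waypoint(lat, lon, course_deg, speed_mps, horizon_s, standoff_m):
            """Project the target forward along its course and push the point standoff_m
            further so the UAS ends up front-on to the target."""
            d = speed_mps * horizon_s + standoff_m
            brg, lat1 = math.radians(course_deg), math.radians(lat)
            new_lat = math.asin(math.sin(lat1) * math.cos(d / R_EARTH) +
                                math.cos(lat1) * math.sin(d / R_EARTH) * math.cos(brg))
            new_lon = math.radians(lon) + math.atan2(
                math.sin(brg) * math.sin(d / R_EARTH) * math.cos(lat1),
                math.cos(d / R_EARTH) - math.sin(lat1) * math.sin(new_lat))
            return math.degrees(new_lat), math.degrees(new_lon)

        # Example: target last seen at (-27.47, 153.03) heading 090 degrees at 3 m/s.
        wp = predict_waypoint(-27.47, 153.03, 90.0, 3.0, 10.0, 20.0)
        print(wp, haversine(-27.47, 153.03, *wp))   # new waypoint and its offset in metres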

  3. A Dynamic Navigation Model for Unmanned Aircraft Systems and an Application to Autonomous Front-On Environmental Sensing and Photography Using Low-Cost Sensor Systems

    PubMed Central

    Cooper, Andrew James; Redman, Chelsea Anne; Stoneham, David Mark; Gonzalez, Luis Felipe; Etse, Victor Kwesi

    2015-01-01

    This paper presents an unmanned aircraft system (UAS) that uses a probabilistic model for autonomous front-on environmental sensing or photography of a target. The system is based on low-cost and readily-available sensor systems in dynamic environments and with the general intent of improving the capabilities of dynamic waypoint-based navigation systems for a low-cost UAS. The behavioural dynamics of target movement for the design of a Kalman filter and Markov model-based prediction algorithm are included. Geometrical concepts and the Haversine formula are applied to the maximum likelihood case in order to make a prediction regarding a future state of a target, thus delivering a new waypoint for autonomous navigation. The results of the application to aerial filming with low-cost UAS are presented, achieving the desired goal of maintained front-on perspective without significant constraint to the route or pace of target movement. PMID:26343680

  4. Fuzzy Logic Based Control for Autonomous Mobile Robot Navigation

    PubMed Central

    Masmoudi, Mohamed Slim; Masmoudi, Mohamed

    2016-01-01

    This paper describes the design and implementation of a trajectory tracking controller using fuzzy logic for a mobile robot navigating in indoor environments. Most previous works used two independent controllers for navigation and obstacle avoidance. The main contribution of the paper is that only one fuzzy controller is used for both navigation and obstacle avoidance. The mobile robot used is equipped with a DC motor, nine infrared range (IR) sensors to measure the distance to obstacles, and two optical encoders to provide the actual position and speeds. To evaluate the performance of the intelligent navigation algorithms, different trajectories are used and simulated using MATLAB software and the SIMIAM navigation platform. Simulation results show the performance of the intelligent navigation algorithms in terms of simulation times and travelled path. PMID:27688748
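
    A toy sketch of the paper's central idea, a single fuzzy controller blending goal tracking and obstacle avoidance; the triangular memberships, rule weights, and sign conventions below are invented for illustration and are not taken from the paper.

        # One fuzzy controller for both goal seeking and obstacle avoidance (illustrative rules).
        def tri(x, a, b, c):
            """Triangular membership with support [a, c] and peak at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def fuzzy_steer(heading_error_deg, obstacle_dist_m, obstacle_side):
            """Steering command in [-1, 1], negative steers left.
            obstacle_side is -1 if the nearest IR return is on the left, +1 if on the right."""
            near = min(1.0, max(0.0, 1.0 - obstacle_dist_m))        # obstacle closer than 1 m
            far = 1.0 - near
            err_left = tri(heading_error_deg, -180.0, -60.0, 0.0)   # goal lies to the left
            err_right = tri(heading_error_deg, 0.0, 60.0, 180.0)    # goal lies to the right
            # Single rule base: avoid when near, track the goal when far.
            steer = near * (-obstacle_side) + far * (err_right - err_left)
            return max(-1.0, min(1.0, steer))

        print(fuzzy_steer(30.0, 2.5, +1))   # clear path, goal to the right -> steer right
        print(fuzzy_steer(30.0, 0.4, +1))   # obstacle close on the right  -> steer left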

  5. High Speed Lunar Navigation for Crewed and Remotely Piloted Vehicles

    NASA Technical Reports Server (NTRS)

    Pedersen, L.; Allan, M.; To, V.; Utz, H.; Wojcikiewicz, W.; Chautems, C.

    2010-01-01

    Increased navigation speed is desirable for lunar rovers, whether autonomous, crewed, or remotely operated, but is hampered by the low gravity, high-contrast lighting, and rough terrain. We describe a lidar-based navigation system deployed on NASA's K10 autonomous rover and used to increase the terrain hazard situational awareness of the Lunar Electric Rover crew.

  6. Autonomous Flight Safety System

    NASA Technical Reports Server (NTRS)

    Simpson, James

    2010-01-01

    The Autonomous Flight Safety System (AFSS) is an independent, self-contained subsystem mounted onboard a launch vehicle. AFSS has been developed by and is owned by the US Government. It autonomously makes flight termination/destruct decisions using configurable software-based rules implemented on redundant flight processors using data from redundant GPS/IMU navigation sensors. AFSS implements rules determined by the appropriate Range Safety officials.

  7. LiDAR Scan Matching Aided Inertial Navigation System in GNSS-Denied Environments

    PubMed Central

    Tang, Jian; Chen, Yuwei; Niu, Xiaoji; Wang, Li; Chen, Liang; Liu, Jingbin; Shi, Chuang; Hyyppä, Juha

    2015-01-01

    A new scan-matching-aided Inertial Navigation System (INS) using a low-cost LiDAR is proposed as an alternative to GNSS-based navigation systems in GNSS-degraded or -denied environments such as indoor areas, dense forests, or urban canyons. In these areas, INS-based Dead Reckoning (DR) and Simultaneous Localization and Mapping (SLAM) technologies are normally used to estimate positions as separate tools. However, there are critical implementation problems with each standalone system. The drift errors of velocity, position, and heading angles in an INS will accumulate over time, and on-line calibration is a must for sustaining positioning accuracy. SLAM performance is poor in featureless environments where the matching errors can significantly increase. Neither standalone positioning method can offer a sustainable navigation solution with acceptable accuracy. This paper integrates two complementary technologies, INS and LiDAR SLAM, into one navigation frame with a loosely coupled Extended Kalman Filter (EKF) to use the advantages and overcome the drawbacks of each system and establish a stable long-term navigation process. Static and dynamic field tests were carried out with a self-developed Unmanned Ground Vehicle (UGV) platform, NAVIS. The results prove that the proposed approach can provide positioning accuracy at the centimetre level for long-term operations, even in a featureless indoor environment. PMID:26184206

  8. University of Pennsylvania MAGIC 2010 Final Report

    DTIC Science & Technology

    2011-01-10

    and mapping (SLAM) techniques are employed to build a local map of the environment surrounding the robot. Readings from the two complementary LIDAR sen... [Figure 2 block-diagram labels: Sensor UGV and Disrupter UGV local navigation with GPS, IMU, LIDAR, and camera sensors; laser control; localization; task planner; strategy/plan] ...various components shown in Figure 2. This is comprised of the following subsystems: • Sensor UGV: Mobile UGVs with LIDAR and camera sensors, GPS, and

  9. Design and Development of the WVU Advanced Technology Satellite for Optical Navigation

    NASA Astrophysics Data System (ADS)

    Straub, Miranda

    In order to meet the demands of future space missions, it is beneficial for spacecraft to have the capability to support autonomous navigation. This is true for both crewed and uncrewed vehicles. For crewed vehicles, autonomous navigation would allow the crew to safely navigate home in the event of a communication system failure. For uncrewed missions, autonomous navigation reduces the demand on ground-based infrastructure and could allow for more flexible operation. One promising technique for achieving these goals is through optical navigation. To this end, the present work considers how camera images of the Earth's surface could enable autonomous navigation of a satellite in low Earth orbit. Specifically, this study will investigate the use of coastlines and other natural land-water boundaries for navigation. Observed coastlines can be matched to a pre-existing coastline database in order to determine the location of the spacecraft. This paper examines how such measurements may be processed in an on-board extended Kalman filter (EKF) to provide completely autonomous estimates of the spacecraft state throughout the duration of the mission. In addition, future work includes implementing this work on a CubeSat mission within the WVU Applied Space Exploration Lab (ASEL). The mission titled WVU Advanced Technology Satellite for Optical Navigation (WATSON) will provide students with an opportunity to experience the life cycle of a spacecraft from design through operation while hopefully meeting the primary and secondary goals defined for mission success. The spacecraft design process, although simplified by CubeSat standards, will be discussed in this thesis as well as the current results of laboratory testing with the CubeSat model in the ASEL.

  10. Evaluation of Relative Navigation Algorithms for Formation-Flying Satellites

    NASA Technical Reports Server (NTRS)

    Kelbel, David; Lee, Taesul; Long, Anne; Carpenter, J. Russell; Gramling, Cheryl

    2001-01-01

    Goddard Space Flight Center is currently developing advanced spacecraft systems to provide autonomous navigation and control of formation flyers. This paper discusses autonomous relative navigation performance for formations in eccentric, medium, and high-altitude Earth orbits using Global Positioning System (GPS) Standard Positioning Service (SPS) and intersatellite range measurements. The performance of several candidate relative navigation approaches is evaluated. These analyses indicate that the relative navigation accuracy is primarily a function of the frequency of acquisition and tracking of the GPS signals. A relative navigation position accuracy of 0.5 meters root-mean-square (RMS) can be achieved for formations in medium-altitude eccentric orbits that can continuously track at least one GPS signal. A relative navigation position accuracy of better than 75 meters RMS can be achieved for formations in high-altitude eccentric orbits that have sparse tracking of the GPS signals. The addition of round-trip intersatellite range measurements can significantly improve relative navigation accuracy for formations with sparse tracking of the GPS signals.

  11. Rule-based navigation control design for autonomous flight

    NASA Astrophysics Data System (ADS)

    Contreras, Hugo; Bassi, Danilo

    2008-04-01

    This article depicts a navigation control system design that is based on a set of rules in order to follow a desired trajectory. The full control of the aircraft considered here comprises a low-level stability control loop, based on a classic PID controller, and higher-level navigation whose main job is to exercise lateral (course) control and altitude control while trying to follow the desired trajectory. The rules and PID gains were adjusted systematically according to the results of flight simulation. In spite of its simplicity, the rule-based navigation control proved to be robust, even with large perturbations such as crosswinds.

  12. Development Of Autonomous Systems

    NASA Astrophysics Data System (ADS)

    Kanade, Takeo

    1989-03-01

    In the last several years at the Robotics Institute of Carnegie Mellon University, we have been working on two projects for developing autonomous systems: Navlab for the Autonomous Land Vehicle and Ambler for the Mars Rover. These two systems are for different purposes: the Navlab is a four-wheeled vehicle (van) for road and open terrain navigation, and the Ambler is a six-legged locomotor for Mars exploration. The two projects, however, share many common aspects. Both are large-scale integrated systems for navigation. In addition to the development of individual components (e.g., construction and control of the vehicle, vision and perception, and planning), integration of those component technologies into a system by means of an appropriate architecture is a major issue.

  13. Autonomous Locator of Thermals (ALOFT) Autonomous Soaring Algorithm

    DTIC Science & Technology

    2015-04-03

    estimator used on the NRL CICADA Mk 3 micro air vehicle [13]. An extended Kalman filter (EKF) was designed to estimate the airspeed sensor bias and...Boulder, 2007. 13. A.D. Kahn and D.J. Edwards, "Navigation, Guidance and Control for the CICADA Expendable

  14. New distributed radar technology based on UAV or UGV application

    NASA Astrophysics Data System (ADS)

    Molchanov, Pavlo A.; Contarino, Vincent M.

    2013-05-01

    Regular micro and nano radars cannot provide reliable tracking of low-altitude, low-profile aerial targets in urban and mountain areas because of reflections and re-reflections from buildings and terrain. They become visible and vulnerable to guided missiles if positioned on a tower or blimp. Doppler radar cannot distinguish between moving cars and small low-altitude aerial targets in an urban area. A new concept of pocket-size distributed radar technology based on the application of UAVs (Unmanned Air Vehicles) and UGVs (Unmanned Ground Vehicles) is proposed for tracking low-altitude, low-profile aerial targets at short and medium distances for protection of stadiums, camps, and military facilities in urban or mountain areas.

  15. Multi-Sensor Mud Detection

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo L.; Matthies, Larry H.

    2010-01-01

    Robust mud detection is a critical perception requirement for Unmanned Ground Vehicle (UGV) autonomous off-road navigation. A military UGV stuck in a mud body during a mission may have to be sacrificed or rescued, both of which are unattractive options. There are several characteristics of mud that may be detectable with appropriate UGV-mounted sensors. For example, mud only occurs on the ground surface, is cooler than surrounding dry soil during the daytime under nominal weather conditions, is generally darker than surrounding dry soil in visible imagery, and is highly polarized. However, none of these cues are definitive on their own. Dry soil also occurs on the ground surface; shadows, snow, ice, and water can also be cooler than surrounding dry soil; shadows are also darker than surrounding dry soil in visible imagery; and cars, water, and some vegetation are also highly polarized. Shadows, snow, ice, water, cars, and vegetation can all be disambiguated from mud by using a suite of sensors that span multiple bands in the electromagnetic spectrum. Because there are military operations when it is imperative for UGVs to operate without emitting strong, detectable electromagnetic signals, passive sensors are desirable. JPL has developed a daytime mud detection capability using multiple passive imaging sensors. Cues for mud from multiple passive imaging sensors are fused into a single mud detection image using a rule base, and the resultant mud detection is localized in a terrain map using range data generated from a stereo pair of color cameras.
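
    A compact sketch of the rule-based cue fusion described above, assuming co-registered thermal, visible-intensity, and polarization (degree of linear polarization) images plus a ground mask from stereo range data; the thresholds and two-of-three voting rule are placeholders, not JPL's tuned rule base.

        # Rule-based fusion of passive mud cues into a single detection image (sketch).
        import numpy as np

        def detect_mud(thermal, intensity, dolp, ground_mask,
                       temp_margin=2.0, dark_margin=0.15, pol_thresh=0.3):
            """Return a boolean mud map. thermal in deg C, intensity and DoLP in [0, 1]."""
            soil_temp = np.median(thermal[ground_mask])         # nominal dry-soil temperature
            soil_intensity = np.median(intensity[ground_mask])
            cooler = thermal < soil_temp - temp_margin          # mud is cooler in daytime
            darker = intensity < soil_intensity - dark_margin   # mud is darker in visible imagery
            polarized = dolp > pol_thresh                       # mud is highly polarized
            # No single cue is definitive; require the ground surface plus at least
            # two of the three supporting cues.
            votes = cooler.astype(int) + darker.astype(int) + polarized.astype(int)
            return ground_mask & (votes >= 2)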

  16. Experiment D009: Simple navigation

    NASA Technical Reports Server (NTRS)

    Silva, R. M.; Jorris, T. R.; Vallerie, E. M., III

    1971-01-01

    Space position-fixing techniques have been investigated by collecting data on the observable phenomena of space flight that could be used to solve the problem of autonomous navigation by the use of optical data and manual computations to calculate the position of a spacecraft. After completion of the developmental and test phases, the product of the experiment would be a manual-optical technique of orbital space navigation that could be used as a backup to onboard and ground-based spacecraft-navigation systems.

  17. Mamdani Fuzzy System for Indoor Autonomous Mobile Robot

    NASA Astrophysics Data System (ADS)

    Khan, M. K. A. Ahamed; Rashid, Razif; Elamvazuthi, I.

    2011-06-01

    Several control algorithms for autonomous mobile robot navigation have been proposed in the literature. Recently, the employment of non-analytical methods of computing such as fuzzy logic, evolutionary computation, and neural networks has demonstrated the utility and potential of these paradigms for intelligent control of mobile robot navigation. In this paper, Mamdani fuzzy system for an autonomous mobile robot is developed. The paper begins with the discussion on the conventional controller and then followed by the description of fuzzy logic controller in detail.

  18. Small Body Landing Accuracy Using In-Situ Navigation

    NASA Technical Reports Server (NTRS)

    Bhaskaran, Shyam; Nandi, Sumita; Broschart, Stephen; Wallace, Mark; Olson, Corwin; Cangahuala, L. Alberto

    2011-01-01

    Spacecraft landings on small bodies (asteroids and comets) can require target accuracies too stringent to be met using ground-based navigation alone, especially if specific landing site requirements must be met for safety or to meet science goals. In-situ optical observations coupled with onboard navigation processing can meet the tighter accuracy requirements to enable such missions. Recent developments in deep space navigation capability include a self-contained autonomous navigation system (used in flight on three missions) and a landmark tracking system (used experimentally on the Japanese Hayabusa mission). The merging of these two technologies forms a methodology to perform autonomous onboard navigation around small bodies. This paper presents an overview of these systems, as well as the results from Monte Carlo studies to quantify the achievable landing accuracies by using these methods. Sensitivity of the results to variations in spacecraft maneuver execution error, attitude control accuracy, and unmodeled forces is examined. Cases for two bodies, a small asteroid and a mid-size comet, are presented.

  19. An Algorithm for Autonomous Formation Obstacle Avoidance

    NASA Astrophysics Data System (ADS)

    Cruz, Yunior I.

    The level of human interaction with Unmanned Aerial Systems varies greatly from remotely piloted aircraft to fully autonomous systems. In the latter end of the spectrum, the challenge lies in designing effective algorithms to dictate the behavior of the autonomous agents. A swarm of autonomous Unmanned Aerial Vehicles requires collision avoidance and formation flight algorithms to negotiate environmental challenges it may encounter during the execution of its mission, which may include obstacles and chokepoints. In this work, a simple algorithm is developed to allow a formation of autonomous vehicles to perform point to point navigation while avoiding obstacles and navigating through chokepoints. Emphasis is placed on maintaining formation structures. Rather than breaking formation and individually navigating around the obstacle or through the chokepoint, vehicles are required to assemble into appropriately sized/shaped sub-formations, bifurcate around the obstacle or negotiate the chokepoint, and reassemble into the original formation at the far side of the obstruction. The algorithm receives vehicle and environmental properties as inputs and outputs trajectories for each vehicle from start to the desired ending location. Simulation results show that the algorithm safely routes all vehicles past the obstruction while adhering to the aforementioned requirements. The formation adapts and successfully negotiates the obstacles and chokepoints in its path while maintaining proper vehicle separation.

  20. Towards Commanding Unmanned Ground Vehicle Movement in Unfamiliar Environments Using Unconstrained English: Initial Research Results

    DTIC Science & Technology

    2007-06-01

    constrained list of command words could be valuable in many systems, as would the ability of driverless vehicles to navigate through a route...Sensemaking in UGVs • Future Combat Systems UGV roles – Driverless trucks – Robotic mules (soldier, squad aid) – Intelligent munitions – And more! • Some

  1. Navigation Concepts for the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Long, Anne; Leung, Dominic; Kelbel, David; Beckman, Mark; Gramling, Cheryl

    2003-01-01

    This paper evaluates the performance that can be achieved using candidate ground and onboard navigation approaches for operation of the James Webb Space Telescope, which will be in an orbit about the Sun-Earth L2 libration point. The ground navigation approach processes standard range and Doppler measurements from the Deep Space Network. The onboard navigation approach processes celestial object measurements and/or ground-to-spacecraft Doppler measurements to autonomously estimate the spacecraft's position and velocity and Doppler reference frequency. Particular attention is given to assessing the absolute position and velocity accuracy that can be achieved in the presence of the frequent spacecraft reorientations and momentum unloads planned for this mission. The ground navigation approach provides stable navigation solutions using a tracking schedule of one 30-minute contact per day. The onboard navigation approach that uses only optical quality celestial object measurements provides stable autonomous navigation solutions. This study indicates that unmodeled changes in the solar radiation pressure cross-sectional area and modeled momentum unload velocity changes are the major error sources. These errors can be mitigated by modeling these changes, by estimating corrections to compensate for the changes, or by including acceleration measurements.

  2. Tracked robot controllers for climbing obstacles autonomously

    NASA Astrophysics Data System (ADS)

    Vincent, Isabelle

    2009-05-01

    Research in mobile robot navigation has demonstrated some success in navigating flat indoor environments while avoiding obstacles. However, the challenge of analyzing complex environments to climb obstacles autonomously has had very little success due to the complexity of the task. Unmanned ground vehicles currently exhibit simple autonomous behaviours compared to the human ability to move in the world. This paper presents the control algorithms designed for a tracked mobile robot to autonomously climb obstacles by varying its tracks configuration. Two control algorithms are proposed to solve the autonomous locomotion problem for climbing obstacles. First, a reactive controller evaluates the appropriate geometric configuration based on terrain and vehicle geometric considerations. Then, a reinforcement learning algorithm finds alternative solutions when the reactive controller gets stuck while climbing an obstacle. The methodology combines reactivity with learning. The controllers have been demonstrated in box and stair climbing simulations. The experiments illustrate the effectiveness of the proposed approach for crossing obstacles.

  3. Navigation of robotic system using cricket motes

    NASA Astrophysics Data System (ADS)

    Patil, Yogendra J.; Baine, Nicholas A.; Rattan, Kuldip S.

    2011-06-01

    This paper presents a novel algorithm for self-mapping of the cricket motes that can be used for indoor navigation of autonomous robotic systems. The cricket system is a wireless sensor network that can provide an indoor localization service to its users via acoustic ranging techniques. The behavior of the ultrasonic transducer on the cricket mote is studied, and the regions where satisfactory distance measurements can be obtained are recorded. Placing the motes in these regions results in fine-grained mapping of the cricket motes. Trilateration is used to obtain a rigid coordinate system, but is insufficient if the network is to be used for navigation. A modified SLAM algorithm is applied to overcome the shortcomings of trilateration. Finally, the self-mapped cricket motes can be used for navigation of autonomous robotic systems in an indoor location.
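
    A brief sketch of the trilateration step the mapping builds on: with three or more motes at known positions and acoustic ranges to an unknown node, subtracting one range equation from the others leaves a linear least-squares problem. The anchor layout and noise-free ranges in the example are illustrative.

        # Linearized least-squares trilateration from range measurements (sketch).
        import numpy as np

        def trilaterate(anchors, ranges):
            """Solve for a 2-D position from >= 3 anchor positions and measured ranges.
            Subtracting the first range equation from the others removes the quadratic
            terms and leaves a linear system A p = b."""
            anchors = np.asarray(anchors, float)
            ranges = np.asarray(ranges, float)
            x0, y0 = anchors[0]
            A = 2.0 * (anchors[1:] - anchors[0])
            b = (ranges[0] ** 2 - ranges[1:] ** 2
                 + np.sum(anchors[1:] ** 2, axis=1) - (x0 ** 2 + y0 ** 2))
            pos, *_ = np.linalg.lstsq(A, b, rcond=None)
            return pos

        # Example: three motes at known positions and ranges to a node actually at (2, 1).
        motes = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
        truth = np.array([2.0, 1.0])
        print(trilaterate(motes, [np.linalg.norm(truth - m) for m in motes]))   # ~[2. 1.]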

  4. Development of a GPS/INS/MAG navigation system and waypoint navigator for a VTOL UAV

    NASA Astrophysics Data System (ADS)

    Meister, Oliver; Mönikes, Ralf; Wendel, Jan; Frietsch, Natalie; Schlaile, Christian; Trommer, Gert F.

    2007-04-01

    Unmanned aerial vehicles (UAVs) can be used for versatile surveillance and reconnaissance missions. If a UAV is capable of flying automatically on a predefined path, the range of possible applications is widened significantly. This paper addresses the development of an integrated GPS/INS/MAG navigation system and a waypoint navigator for a small vertical take-off and landing (VTOL) unmanned four-rotor helicopter with a take-off weight below 1 kg. The core of the navigation system consists of low-cost inertial sensors that are continuously aided with GPS, magnetometer compass, and barometric height information. Because the yaw angle becomes unobservable during hovering flight, integration with a magnetic compass is mandatory. This integration must be robust with respect to errors caused by the terrestrial magnetic field deviation and interference from surrounding electronic devices as well as ferrous metals. The described integration concept with a Kalman filter overcomes the problem that erroneous magnetic measurements lead to an attitude error in the roll and pitch axes. The algorithm provides long-term stable navigation information even during GPS outages, which is mandatory for the flight control of the UAV. In the second part of the paper the guidance algorithms are discussed in detail. These algorithms allow the UAV to operate in a semi-autonomous position-hold mode as well as a completely autonomous waypoint mode. In the position-hold mode the helicopter maintains its position regardless of wind disturbances, which eases the pilot's job during hold-and-stare missions. The autonomous waypoint navigator enables flight outside the range of vision and beyond the range of the radio link. Flight test results of the implemented modes of operation are shown.
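
    One way to picture the magnetometer aiding that keeps yaw observable while hovering is the tilt-compensated compass computation sketched below, which levels the body-frame field with INS roll/pitch before extracting heading; hard/soft-iron calibration and the Kalman filter integration itself are omitted, and the axis convention is an assumption.

        # Tilt-compensated magnetic heading from body-frame magnetometer readings (sketch).
        import math

        def magnetic_heading(mx, my, mz, roll, pitch):
            """Heading in radians from magnetometer readings and INS roll/pitch
            (all angles in radians; body axes x forward, y right, z down)."""
            # Rotate the measured field into the locally level frame.
            xh = (mx * math.cos(pitch) + my * math.sin(roll) * math.sin(pitch)
                  + mz * math.cos(roll) * math.sin(pitch))
            yh = my * math.cos(roll) - mz * math.sin(roll)
            return math.atan2(-yh, xh)      # heading relative to magnetic north

        # Level vehicle with the field in the x/z plane points straight at magnetic north.
        print(math.degrees(magnetic_heading(0.3, 0.0, 0.5, 0.0, 0.0)))   # -> 0.0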

  5. Scalable autonomous operations of unmanned assets

    NASA Astrophysics Data System (ADS)

    Jung, Sunghun

    Although there have been great theoretical advances in the area of Unmanned Aerial Vehicle (UAV) autonomy, application of those theories in the real world is still hesitant due to unexpected disturbances. Most UAVs currently in use are, strictly speaking, Remotely Piloted Aircraft (RPA), since most work related to flight control, sensor data analysis, and decision making is done by human operators. To increase the degree of autonomy, much research is focused on developing an Unmanned Autonomous Aerial Vehicle (UAAV) which can take off, fly to the area of interest while avoiding unexpected obstacles, perform various missions with decision making, come back to the base station, and land by itself without any human operators. To improve the performance of UAVs, the accuracies of position and orientation sensors are enhanced by integrating an Unmanned Ground Vehicle (UGV) or a solar compass with a UAV: position accuracy of a GPS sensor on a UAV is improved by referencing the position of a UGV, which is calculated using three GPS sensors and the Weighted Centroid Localization (WCL) method; orientation sensor accuracy is improved as well by using the Three Pixel Theorem (TPT) and integrating a solar compass composed of nine light sensors with a magnetic compass. Also, improved health management of a UAV is achieved by developing a wireless autonomous charging station which uses four pairs of transmitter and receiver magnetic loops with four robotic arms. For the software aspect, I also analyze the error propagation of the proposed mission planning hierarchy to determine the safest size of the buffer zone. In addition, among seven future research areas regarding UAVs, this paper mainly focuses on developing algorithms for path planning, trajectory generation, and cooperative tactics for the operation of multiple UAVs using a GA-based multiple Traveling Salesman Problem (mTSP), which is solved by dividing into m number of Traveling Salesman
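
    A small sketch of the Weighted Centroid Localization idea mentioned above, taking the UGV position as an accuracy-weighted mean of several GPS fixes; the inverse-variance weighting and example numbers are assumptions for illustration, not the thesis's exact scheme.

        # Weighted centroid of multiple GPS fixes (sketch with inverse-variance weights).
        import numpy as np

        def weighted_centroid(fixes, sigmas):
            """fixes: (N, 2) array of east/north positions; sigmas: reported 1-sigma
            horizontal accuracies.  Each fix is weighted by the inverse of its variance."""
            fixes = np.asarray(fixes, float)
            w = 1.0 / np.square(np.asarray(sigmas, float))
            return (w[:, None] * fixes).sum(axis=0) / w.sum()

        # Example: three receivers, the first reporting the best accuracy.
        print(weighted_centroid([[0.0, 0.0], [1.0, 0.2], [0.4, 1.0]], [1.0, 2.0, 2.0]))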

  6. NAVIS-An UGV Indoor Positioning System Using Laser Scan Matching for Large-Area Real-Time Applications

    PubMed Central

    Tang, Jian.; Chen, Yuwei.; Jaakkola, Anttoni.; Liu, Jinbing.; Hyyppä, Juha.; Hyyppä, Hannu.

    2014-01-01

    Laser scan matching with grid-based maps is a promising tool for real-time indoor positioning of mobile Unmanned Ground Vehicles (UGVs). However, there are critical implementation problems, such as the ability to estimate the position by sensing the unknown indoor environment with sufficient accuracy and low enough latency for stable vehicle control, so further development work is necessary. Unfortunately, most of the existing methods employ heuristics for quick positioning in which numerous accumulated errors easily lead to loss of positioning accuracy. This severely restricts their application in large areas and over lengthy periods of time. This paper introduces an efficient real-time mobile UGV indoor positioning system for large-area applications using laser scan matching with an improved probabilistically-motivated Maximum Likelihood Estimation (IMLE) algorithm, which is based on a multi-resolution patch-divided grid likelihood map. Compared with traditional methods, the improvements embodied in IMLE include: (a) Iterative Closest Point (ICP) preprocessing, which adaptively decreases the search scope; (b) a brute-force search matching method on multi-resolution map layers, based on the likelihood value between the current laser scan and the grid map within the refined search scope, adopted to obtain the global optimum position at each scan matching; and (c) a patch-divided likelihood map supporting a large indoor area. A UGV platform called NAVIS was designed, manufactured, and tested based on a low-cost robot integrating a LiDAR and an odometer sensor to verify the IMLE algorithm. A series of experiments based on simulated data and field tests with NAVIS proved that the proposed IMLE algorithm is a better way to perform local scan matching that can offer a quick and stable positioning solution with high accuracy, so it can be part of a large-area localization/mapping application. The NAVIS platform can reach an updating rate of 12 Hz in a feature-rich environment and 2 Hz

  7. NAVIS-An UGV indoor positioning system using laser scan matching for large-area real-time applications.

    PubMed

    Tang, Jian; Chen, Yuwei; Jaakkola, Anttoni; Liu, Jinbing; Hyyppä, Juha; Hyyppä, Hannu

    2014-07-04

    Laser scan matching with grid-based maps is a promising tool for real-time indoor positioning of mobile Unmanned Ground Vehicles (UGVs). However, there are critical implementation problems, such as the ability to estimate the position by sensing the unknown indoor environment with sufficient accuracy and low enough latency for stable vehicle control, so further development work is necessary. Unfortunately, most of the existing methods employ heuristics for quick positioning in which numerous accumulated errors easily lead to loss of positioning accuracy. This severely restricts their application in large areas and over lengthy periods of time. This paper introduces an efficient real-time mobile UGV indoor positioning system for large-area applications using laser scan matching with an improved probabilistically-motivated Maximum Likelihood Estimation (IMLE) algorithm, which is based on a multi-resolution patch-divided grid likelihood map. Compared with traditional methods, the improvements embodied in IMLE include: (a) Iterative Closest Point (ICP) preprocessing, which adaptively decreases the search scope; (b) a brute-force search matching method on multi-resolution map layers, based on the likelihood value between the current laser scan and the grid map within the refined search scope, adopted to obtain the global optimum position at each scan matching; and (c) a patch-divided likelihood map supporting a large indoor area. A UGV platform called NAVIS was designed, manufactured, and tested based on a low-cost robot integrating a LiDAR and an odometer sensor to verify the IMLE algorithm. A series of experiments based on simulated data and field tests with NAVIS proved that the proposed IMLE algorithm is a better way to perform local scan matching that can offer a quick and stable positioning solution with high accuracy, so it can be part of a large-area localization/mapping application. The NAVIS platform can reach an updating rate of 12 Hz in a feature-rich environment and 2 Hz
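
    A compressed sketch of the coarse-to-fine matching idea behind IMLE: candidate poses around an odometry guess are scored on a low-resolution likelihood map first, and the best candidate is then refined on the full-resolution map. Map construction, ICP preprocessing, and the patch division are omitted, and the search windows and resolutions are illustrative.

        # Coarse-to-fine grid scan matching over a pose likelihood map (sketch).
        import numpy as np

        def score(pose, scan_xy, grid, res):
            """Sum of likelihood-map values hit by the scan when placed at pose = (x, y, theta)."""
            x, y, th = pose
            c, s = np.cos(th), np.sin(th)
            pts = scan_xy @ np.array([[c, -s], [s, c]]).T + (x, y)
            ij = np.floor(pts / res).astype(int)
            ok = ((ij >= 0) & (ij < np.array(grid.shape))).all(axis=1)
            return grid[ij[ok, 0], ij[ok, 1]].sum()

        def match(scan_xy, coarse, fine, res_coarse, res_fine, guess):
            """Brute-force search around an odometry guess, coarse grid first, then fine grid."""
            best = guess
            for grid, res, dxy, dth in ((coarse, res_coarse, 0.4, 0.05),
                                        (fine, res_fine, 0.05, 0.01)):
                cands = [(best[0] + dx, best[1] + dy, best[2] + da)
                         for dx in np.arange(-5, 6) * dxy
                         for dy in np.arange(-5, 6) * dxy
                         for da in np.arange(-5, 6) * dth]
                best = max(cands, key=lambda p: score(p, scan_xy, grid, res))
            return best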

  8. Development of Navigation Doppler Lidar for Future Landing Mission

    NASA Technical Reports Server (NTRS)

    Amzajerdian, Farzin; Hines, Glenn D.; Petway, Larry B.; Barnes, Bruce W.; Pierrottet, Diego F.; Carson, John M., III

    2016-01-01

    A coherent Navigation Doppler Lidar (NDL) sensor has been developed under the Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) project to support future NASA missions to planetary bodies. This lidar sensor provides accurate surface-relative altitude and vector velocity data during the descent phase that can be used by an autonomous Guidance, Navigation, and Control (GN&C) system to precisely navigate the vehicle from a few kilometers above the ground to a designated location and execute a controlled soft touchdown. The operation and performance of the NDL was demonstrated through closed-loop flights onboard the rocket-propelled Morpheus vehicle in 2014. In Morpheus flights, conducted at the NASA Kennedy Space Center, the NDL data was used by an autonomous GN&C system to navigate and land the vehicle precisely at the selected location surrounded by hazardous rocks and craters. Since then, development efforts for the NDL have shifted toward enhancing performance, optimizing design, and addressing spaceflight size and mass constraints and environmental and reliability requirements. The next generation NDL, with expanded operational envelope and significantly reduced size, will be demonstrated in 2017 through a new flight test campaign onboard a commercial rocket-propelled test vehicle.
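
    A worked sketch of how a multi-beam Doppler lidar yields vehicle velocity: each beam measures a line-of-sight speed v = lambda * f_D / 2, and three non-coplanar beams give a solvable linear system for the full velocity vector. The wavelength, cant angle, and Doppler values below are illustrative, not the NDL's actual design parameters.

        # Vector velocity from three-beam Doppler lidar measurements (sketch).
        import numpy as np

        WAVELENGTH = 1.55e-6        # metres (typical fibre-laser wavelength, assumed)

        def velocity_from_doppler(beam_units, doppler_hz):
            """beam_units: (3, 3) matrix whose rows are beam unit vectors in the vehicle frame;
            doppler_hz: measured Doppler shifts.  Solves U v = v_los for the velocity v."""
            v_los = 0.5 * WAVELENGTH * np.asarray(doppler_hz)   # line-of-sight speeds, m/s
            return np.linalg.solve(np.asarray(beam_units), v_los)

        # Example: three beams canted 22.5 degrees off nadir, spaced 120 degrees in azimuth.
        a = np.radians(22.5)
        beams = np.array([[np.sin(a) * np.cos(np.radians(g)),
                           np.sin(a) * np.sin(np.radians(g)),
                           np.cos(a)] for g in (0, 120, 240)])
        print(velocity_from_doppler(beams, [1.0e6, 1.2e6, 0.9e6]))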

  9. Vector Pursuit Path Tracking for Autonomous Ground Vehicles

    DTIC Science & Technology

    2000-08-01

    ...other geometric path-tracking techniques. An autonomous vehicle is one that is capable of automatic navigation. It is...Joint Architecture for Unmanned Ground Vehicles (JAUGS) working group meeting held at the University of Florida. Figure 1.5: Autonomous

  10. The JPL roadmap for Deep Space navigation

    NASA Technical Reports Server (NTRS)

    Martin-Mur, Tomas J.; Abraham, Douglas S.; Berry, David; Bhaskaran, Shyam; Cesarone, Robert J.; Wood, Lincoln

    2006-01-01

    This paper reviews the tentative set of deep space missions that will be supported by NASA's Deep Space Mission System in the next twenty-five years, and extracts the driving set of navigation capabilities that these missions will require. There will be many challenges including the support of new mission navigation approaches such as formation flying and rendezvous in deep space, low-energy and low-thrust orbit transfers, precise landing and ascent vehicles, and autonomous navigation. Innovative strategies and approaches will be needed to develop and field advanced navigation capabilities.

  11. Visual Odometry for Autonomous Deep-Space Navigation Project

    NASA Technical Reports Server (NTRS)

    Robinson, Shane; Pedrotty, Sam

    2016-01-01

    Autonomous rendezvous and docking (AR&D) is a critical need for manned spaceflight, especially in deep space where communication delays essentially leave crews on their own for critical operations like docking. Previously developed AR&D sensors have been large, heavy, power-hungry, and may still require further development (e.g. Flash LiDAR). Other approaches to vision-based navigation are not computationally efficient enough to operate quickly on slower, flight-like computers. The key technical challenge for visual odometry is to adapt it from the current terrestrial applications it was designed for to function in the harsh lighting conditions of space. This effort leveraged Draper Laboratory’s considerable prior development and expertise, benefitting both parties. The algorithm Draper has created is unique from other pose estimation efforts as it has a comparatively small computational footprint (suitable for use onboard a spacecraft, unlike alternatives) and potentially offers accuracy and precision needed for docking. This presents a solution to the AR&D problem that only requires a camera, which is much smaller, lighter, and requires far less power than competing AR&D sensors. We have demonstrated the algorithm’s performance and ability to process ‘flight-like’ imagery formats with a ‘flight-like’ trajectory, positioning ourselves to easily process flight data from the upcoming ‘ISS Selfie’ activity and then compare the algorithm’s quantified performance to the simulated imagery. This will bring visual odometry beyond TRL 5, proving its readiness to be demonstrated as part of an integrated system.Once beyond TRL 5, visual odometry will be poised to be demonstrated as part of a system in an in-space demo where relative pose is critical, like Orion AR&D, ISS robotic operations, asteroid proximity operations, and more.

  12. Visual Odometry for Autonomous Deep-Space Navigation Project

    NASA Technical Reports Server (NTRS)

    Robinson, Shane; Pedrotty, Sam

    2016-01-01

    Autonomous rendezvous and docking (AR&D) is a critical need for manned spaceflight, especially in deep space where communication delays essentially leave crews on their own for critical operations like docking. Previously developed AR&D sensors have been large, heavy, power-hungry, and may still require further development (e.g. Flash LiDAR). Other approaches to vision-based navigation are not computationally efficient enough to operate quickly on slower, flight-like computers. The key technical challenge for visual odometry is to adapt it from the current terrestrial applications it was designed for to function in the harsh lighting conditions of space. This effort leveraged Draper Laboratory's considerable prior development and expertise, benefitting both parties. The algorithm Draper has created is unique from other pose estimation efforts as it has a comparatively small computational footprint (suitable for use onboard a spacecraft, unlike alternatives) and potentially offers accuracy and precision needed for docking. This presents a solution to the AR&D problem that only requires a camera, which is much smaller, lighter, and requires far less power than competing AR&D sensors. We have demonstrated the algorithm's performance and ability to process 'flight-like' imagery formats with a 'flight-like' trajectory, positioning ourselves to easily process flight data from the upcoming 'ISS Selfie' activity and then compare the algorithm's quantified performance to the simulated imagery. This will bring visual odometry beyond TRL 5, proving its readiness to be demonstrated as part of an integrated system. Once beyond TRL 5, visual odometry will be poised to be demonstrated as part of a system in an in-space demo where relative pose is critical, like Orion AR&D, ISS robotic operations, asteroid proximity operations, and more.

  13. Navigation Architecture For A Space Mobile Network

    NASA Technical Reports Server (NTRS)

    Valdez, Jennifer E.; Ashman, Benjamin; Gramling, Cheryl; Heckler, Gregory W.; Carpenter, Russell

    2016-01-01

    The Tracking and Data Relay Satellite System (TDRSS) Augmentation Service for Satellites (TASS) is a proposed beacon service to provide a global, space-based GPS augmentation service based on the NASA Global Differential GPS (GDGPS) System. The TASS signal will be tied to the GPS time system and usable as an additional ranging and Doppler radiometric source. Additionally, it will provide data vital to autonomous navigation in the near Earth regime, including space weather information, TDRS ephemerides, Earth Orientation Parameters (EOP), and forward commanding capability. TASS benefits include enhancing situational awareness, enabling increased autonomy, and providing near real-time command access for user platforms. As NASA Headquarters Space Communication and Navigation Office (SCaN) begins to move away from a centralized network architecture and towards a Space Mobile Network (SMN) that allows for user initiated services, autonomous navigation will be a key part of such a system. This paper explores how a TASS beacon service enables the Space Mobile Networking paradigm, what a typical user platform would require, and provides an in-depth analysis of several navigation scenarios and operations concepts.

  14. Multiagent pursuit-evasion games: Algorithms and experiments

    NASA Astrophysics Data System (ADS)

    Kim, Hyounjin

    Deployment of intelligent agents has been made possible through advances in control software, microprocessors, sensor/actuator technology, communication technology, and artificial intelligence. Intelligent agents now play important roles in many applications where human operation is too dangerous or inefficient. There is little doubt that the world of the future will be filled with intelligent robotic agents employed to autonomously perform tasks, or embedded in systems all around us, extending our capabilities to perceive, reason and act, and replacing human efforts. There are numerous real-world applications in which a single autonomous agent is not suitable and multiple agents are required. However, after years of active research in multi-agent systems, current technology is still far from achieving many of these real-world applications. Here, we consider the problem of deploying a team of unmanned ground vehicles (UGV) and unmanned aerial vehicles (UAV) to pursue a second team of UGV evaders while concurrently building a map in an unknown environment. This pursuit-evasion game encompasses many of the challenging issues that arise in operations using intelligent multi-agent systems. We cast the problem in a probabilistic game theoretic framework and consider two computationally feasible pursuit policies: greedy and global-max. We also formulate this probabilistic pursuit-evasion game as a partially observable Markov decision process and employ a policy search algorithm to obtain a good pursuit policy from a restricted class of policies. The estimated value of this policy is guaranteed to be uniformly close to the optimal value in the given policy class under mild conditions. To implement this scenario on real UAVs and UGVs, we propose a distributed hierarchical hybrid system architecture which emphasizes the autonomy of each agent yet allows for coordinated team efforts. We then describe our implementation on a fleet of UGVs and UAVs, detailing components such

  15. Development and Evaluation of Positioning Systems for Autonomous Vehicle Navigation

    DTIC Science & Technology

    2001-12-01

generation of autonomous vehicles to utilize NTV technology is built on a commercially-available vehicle built by ASV. The All-Purpose Remote Transport...larger scale, AFRL and CIMAR are involved in the development of a standard approach in the design and specification of autonomous vehicles being...1996. Shi92 Shin, D.H., Sanjiv, S., and Lee, J.J., “Explicit Path Tracking by Autonomous Vehicles,” Robotica, 10, (1992), 69-87. Ste95

  16. An Efficient Model-Based Image Understanding Method for an Autonomous Vehicle.

    DTIC Science & Technology

    1997-09-01

The problem discussed in this dissertation is the development of an efficient method for visual navigation of autonomous vehicles. The approach is to... autonomous vehicles. Thus the new method is implemented as a component of the image-understanding system in the autonomous mobile robot Yamabico-11 at

  17. Towards collaboration between unmanned aerial and ground vehicles for precision agriculture

    NASA Astrophysics Data System (ADS)

    Bhandari, Subodh; Raheja, Amar; Green, Robert L.; Do, Dat

    2017-05-01

This paper presents the work being conducted at Cal Poly Pomona on the collaboration between unmanned aerial and ground vehicles for precision agriculture. The unmanned aerial vehicles (UAVs), equipped with multispectral/hyperspectral cameras and RGB cameras, take images of the crops while flying autonomously. The images are post processed or can be processed onboard. The processed images are used in the detection of unhealthy plants. Aerial data can be used by the UAVs and unmanned ground vehicles (UGVs) for various purposes including care of crops, harvest estimation, etc. The images can also be useful for optimized harvesting by isolating low yielding plants. These vehicles can be operated autonomously with limited or no human intervention, thereby reducing cost and limiting human exposure to agricultural chemicals. The paper discusses the autonomous UAV and UGV platforms used for the research, sensor integration, and experimental testing. Methods for ground truthing the results obtained from the UAVs will be used. The paper will also discuss equipping the UGV with a robotic arm for removing the unhealthy plants and/or weeds.

  18. A navigation and control system for an autonomous rescue vehicle in the space station environment

    NASA Technical Reports Server (NTRS)

    Merkel, Lawrence

    1991-01-01

A navigation and control system was designed and implemented for an orbital autonomous rescue vehicle envisioned to retrieve astronauts or equipment in the case that they become disengaged from the space station. The rescue vehicle, termed the Extra-Vehicular Activity Retriever (EVAR), has an on-board inertial measurement unit and GPS receivers for self state estimation, a laser range imager (LRI) and cameras for object state estimation, and a data link for reception of space station state information. The states of the retriever and objects (obstacles and the target object) are estimated by inertial state propagation which is corrected via measurements from the GPS, the LRI system, or the camera system. Kalman filters are utilized to perform sensor fusion and estimate the state propagation errors. Control actuation is performed by a Manned Maneuvering Unit (MMU). Phase plane control techniques are used to control the rotational and translational state of the retriever. The translational controller provides station-keeping or motion along either Clohessy-Wiltshire trajectories or straight line trajectories in the LVLH frame of any sufficiently observed object or of the space station. The software was used to successfully control a prototype EVAR on an air bearing floor facility, and a simulated EVAR operating in a simulated orbital environment. The design of the navigation system and the control system are presented. Also discussed are the hardware systems and the overall software architecture.

  19. An Autonomous Gps-Denied Unmanned Vehicle Platform Based on Binocular Vision for Planetary Exploration

    NASA Astrophysics Data System (ADS)

    Qin, M.; Wan, X.; Shao, Y. Y.; Li, S. Y.

    2018-04-01

Vision-based navigation has become an attractive solution for autonomous navigation for planetary exploration. This paper presents our work on designing and building an autonomous vision-based GPS-denied unmanned vehicle and developing an ARFM (Adaptive Robust Feature Matching) based VO (Visual Odometry) software for its autonomous navigation. The hardware system is mainly composed of a binocular stereo camera, a pan-and-tilt unit, a master machine, and a tracked chassis. The ARFM-based VO software system contains four modules: camera calibration, ARFM-based 3D reconstruction, position and attitude calculation, and BA (Bundle Adjustment). Two VO experiments were carried out using both outdoor images from an open dataset and indoor images captured by our vehicle; the results demonstrate that our vision-based unmanned vehicle is able to achieve autonomous localization and has the potential for future planetary exploration.

  20. The use of x-ray pulsar-based navigation method for interplanetary flight

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Guo, Xingcan; Yang, Yong

    2009-07-01

As interplanetary missions become increasingly complex, the only mature interplanetary navigation method, based mainly on radiometric tracking by the Deep Space Network, cannot meet the rising demands of autonomous real-time navigation. This paper studied the application to interplanetary flight of a navigation technology under rapid development, X-ray pulsar-based navigation for spacecraft (XPNAV), and evaluated its performance with a computer simulation. XPNAV is an excellent autonomous real-time navigation method that can provide comprehensive navigation information, including position, velocity, attitude, attitude rate and time. The paper analyzes the fundamental principles and time transformation of XPNAV, and then discusses the Delta-correction XPNAV, which blends the vehicle's trajectory dynamics with the pulse time-of-arrival differences at the nominal and estimated spacecraft locations within an Unscented Kalman Filter (UKF), using the Mars Pathfinder heliocentric transfer orbit as the background mission. XPNAV has an intractable problem of integer pulse phase cycle ambiguities, similar to GPS carrier-phase navigation. This article proposes a non-ambiguity assumption approach, based on an analysis of the search space array method, to resolve pulse phase cycle ambiguities between the nominal and estimated positions of the spacecraft. The simulation results show that the search space array method is computationally intensive and requires long processing times when the position errors are large, whereas the non-ambiguity assumption method can resolve the ambiguities quickly and reliably. It is expected that an autonomous real-time integrated navigation system blending XPNAV with DSN tracking, celestial navigation, inertial navigation and other sources will be the development direction of interplanetary flight navigation systems in the future.
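
    A minimal sketch of the delta-correction measurement idea described above: to first order, the pulse time-of-arrival difference between the estimated and nominal spacecraft positions is the projection of the position offset onto the pulsar line of sight, divided by the speed of light. The function and the example numbers are illustrative assumptions, not the paper's implementation.

    ```python
    import numpy as np

    C = 299_792_458.0  # speed of light, m/s

    def toa_difference(r_estimated, r_nominal, pulsar_unit_vector):
        """First-order pulse TOA difference (seconds) for one pulsar."""
        offset = np.asarray(r_estimated) - np.asarray(r_nominal)
        return np.dot(pulsar_unit_vector, offset) / C

    # a 30 km position offset along the pulsar direction shifts the TOA by ~0.1 ms:
    print(toa_difference([30e3, 0.0, 0.0], [0.0, 0.0, 0.0], [1.0, 0.0, 0.0]))
    ```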

  1. Relative Navigation of Formation-Flying Satellites

    NASA Technical Reports Server (NTRS)

Long, Anne; Kelbel, David; Lee, Taesul; Leung, Dominic; Carpenter, J. Russell; Gramling, Cheryl

    2002-01-01

    This paper compares autonomous relative navigation performance for formations in eccentric, medium and high-altitude Earth orbits using Global Positioning System (GPS) Standard Positioning Service (SPS), crosslink, and celestial object measurements. For close formations, the relative navigation accuracy is highly dependent on the magnitude of the uncorrelated measurement errors. A relative navigation position accuracy of better than 10 centimeters root-mean-square (RMS) can be achieved for medium-altitude formations that can continuously track at least one GPS signal. A relative navigation position accuracy of better than 15 meters RMS can be achieved for high-altitude formations that have sparse tracking of the GPS signals. The addition of crosslink measurements can significantly improve relative navigation accuracy for formations that use sparse GPS tracking or celestial object measurements for absolute navigation.

  2. Robot navigation research using the HERMIES mobile robot

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnett, D.L.

    1989-01-01

In recent years robot navigation has attracted much attention from researchers around the world. Not only are theoretical studies being simulated on sophisticated computers, but many mobile robots are now used as test vehicles for these theoretical studies. Various algorithms have been perfected for navigation in a known static environment; but navigation in an unknown and dynamic environment poses a much more challenging problem for researchers. Many different methodologies have been developed for autonomous robot navigation, but each methodology is usually restricted to a particular type of environment. One important research focus of the Center for Engineering Systems Advanced Research (CESAR) at Oak Ridge National Laboratory is autonomous navigation in unknown and dynamic environments using the series of HERMIES mobile robots. The research uses an expert system for high-level planning interfaced with C-coded routines for implementing the plans, and for quick processing of data requested by the expert system. In using this approach, the navigation is not restricted to one methodology since the expert system can activate a rule module for the methodology best suited for the current situation. Rule modules can be added to the rule base as they are developed and tested. Modules are being developed or enhanced for navigating from a map, searching for a target, exploring, artificial potential-field navigation, navigation using edge-detection, etc. This paper will report on the various rule modules and methods of navigation in use, or under development at CESAR, using the HERMIES-IIB robot as a testbed. 13 refs., 5 figs., 1 tab.
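
    A minimal sketch of the rule-module dispatch idea described above, in Python: navigation behaviours register an applicability test with the planner, which activates whichever module best matches the current situation, and new modules can be added as they are developed. The module names and predicates are illustrative assumptions, not the CESAR rule base.

    ```python
    NAV_MODULES = []

    def nav_module(applies):
        """Register a navigation behaviour together with an applicability test."""
        def wrap(fn):
            NAV_MODULES.append((applies, fn))
            return fn
        return wrap

    @nav_module(lambda state: state.get("map_available", False))
    def navigate_from_map(state):
        return "follow planned path from map"

    @nav_module(lambda state: not state.get("map_available", False))
    def explore(state):
        return "wall-following exploration"

    def plan(state):
        # the high-level planner activates the first module whose test matches
        for applies, fn in NAV_MODULES:
            if applies(state):
                return fn(state)
        return "stop"

    print(plan({"map_available": False}))   # -> wall-following exploration
    ```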

  3. SLAM algorithm applied to robotics assistance for navigation in unknown environments.

    PubMed

    Cheein, Fernando A Auat; Lopez, Natalia; Soria, Carlos M; di Sciascio, Fernando A; Pereira, Fernando Lobo; Carelli, Ricardo

    2010-02-17

The combination of robotic tools with assistance technology determines a slightly explored area of applications and advantages for disabled or elderly people in their daily tasks. Autonomous motorized wheelchair navigation inside an environment, behaviour based control of orthopaedic arms or user's preference learning from a friendly interface are some examples of this new field. In this paper, a Simultaneous Localization and Mapping (SLAM) algorithm is implemented to allow the environmental learning by a mobile robot while its navigation is governed by electromyographic signals. The entire system is part autonomous and part user-decision dependent (semi-autonomous). The environmental learning executed by the SLAM algorithm and the low level behaviour-based reactions of the mobile robot are robotic autonomous tasks, whereas the mobile robot navigation inside an environment is commanded by a Muscle-Computer Interface (MCI). In this paper, a sequential Extended Kalman Filter (EKF) feature-based SLAM algorithm is implemented. The features correspond to lines and corners -concave and convex- of the environment. From the SLAM architecture, a global metric map of the environment is derived. The electromyographic signals that command the robot's movements can be adapted to the patient's disabilities. For mobile robot navigation purposes, five commands were obtained from the MCI: turn to the left, turn to the right, stop, start and exit. A kinematic controller to control the mobile robot was implemented. A low level behavior strategy was also implemented to avoid robot's collisions with the environment and moving agents. The entire system was tested in a population of seven volunteers: three elderly, two below-elbow amputees and two young normally limbed patients. The experiments were performed within a closed low dynamic environment. Subjects took an average time of 35 minutes to navigate the environment and to learn how to use the MCI. The SLAM results have shown a

  4. SLAM algorithm applied to robotics assistance for navigation in unknown environments

    PubMed Central

    2010-01-01

Background The combination of robotic tools with assistance technology determines a slightly explored area of applications and advantages for disabled or elderly people in their daily tasks. Autonomous motorized wheelchair navigation inside an environment, behaviour based control of orthopaedic arms or user's preference learning from a friendly interface are some examples of this new field. In this paper, a Simultaneous Localization and Mapping (SLAM) algorithm is implemented to allow the environmental learning by a mobile robot while its navigation is governed by electromyographic signals. The entire system is part autonomous and part user-decision dependent (semi-autonomous). The environmental learning executed by the SLAM algorithm and the low level behaviour-based reactions of the mobile robot are robotic autonomous tasks, whereas the mobile robot navigation inside an environment is commanded by a Muscle-Computer Interface (MCI). Methods In this paper, a sequential Extended Kalman Filter (EKF) feature-based SLAM algorithm is implemented. The features correspond to lines and corners -concave and convex- of the environment. From the SLAM architecture, a global metric map of the environment is derived. The electromyographic signals that command the robot's movements can be adapted to the patient's disabilities. For mobile robot navigation purposes, five commands were obtained from the MCI: turn to the left, turn to the right, stop, start and exit. A kinematic controller to control the mobile robot was implemented. A low level behavior strategy was also implemented to avoid robot's collisions with the environment and moving agents. Results The entire system was tested in a population of seven volunteers: three elderly, two below-elbow amputees and two young normally limbed patients. The experiments were performed within a closed low dynamic environment. Subjects took an average time of 35 minutes to navigate the environment and to learn how to use the MCI. The SLAM
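
    A minimal sketch of one sequential EKF-SLAM measurement update of the kind described above, assuming a planar robot state, a single corner landmark, and a range/bearing observation model; the state layout and noise handling are illustrative assumptions rather than the paper's implementation.

    ```python
    import numpy as np

    def ekf_slam_update(x, P, z, k, R):
        """One range/bearing update for landmark k.
        x: [xr, yr, theta, lx0, ly0, lx1, ly1, ...]; P: covariance;
        z: measured (range, bearing); R: 2x2 measurement noise."""
        xr, yr, th = x[0], x[1], x[2]
        j = 3 + 2 * k
        dx, dy = x[j] - xr, x[j + 1] - yr
        q = dx * dx + dy * dy
        r = np.sqrt(q)
        z_hat = np.array([r, np.arctan2(dy, dx) - th])       # predicted measurement
        H = np.zeros((2, x.size))
        H[:, 0:3] = [[-dx / r, -dy / r, 0.0],
                     [dy / q, -dx / q, -1.0]]
        H[:, j:j + 2] = [[dx / r, dy / r],
                         [-dy / q, dx / q]]
        y = z - z_hat
        y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi           # wrap bearing innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        return x + K @ y, (np.eye(x.size) - K @ H) @ P
    ```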

  5. Autonomous Spacecraft Navigation Using Above-the-Constellation GPS Signals

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke

    2017-01-01

GPS-based spacecraft navigation offers many performance and cost benefits, and GPS receivers are now standard GNC components for LEO missions. Recently, more and more high-altitude missions are taking advantage of the benefits of GPS navigation as well. High-altitude applications pose challenges, however, because receivers operating above the GPS constellations are subject to reduced signal strength and availability, and uncertain signal quality. This presentation covers the history and state of the art in high-altitude GPS spacecraft navigation, including early experiments, current missions and receivers, and efforts to characterize and protect signals available to high-altitude users. Recent results from the very-high altitude MMS mission are also provided.

  6. Developments in Acoustic Navigation and Communication for High-Latitude Ocean Research

    NASA Astrophysics Data System (ADS)

    Gobat, J.; Lee, C.

    2006-12-01

Developments in autonomous platforms (profiling floats, drifters, long-range gliders and propeller-driven vehicles) offer the possibility of unprecedented access to logistically difficult polar regions that challenge conventional techniques. Currently, however, navigation and telemetry for these platforms rely on satellite positioning and communications poorly suited for high-latitude applications where ice cover restricts access to the sea surface. A similar infrastructure offering basin-wide acoustic geolocation and telemetry would allow the community to employ autonomous platforms to address previously intractable problems in Arctic oceanography. Two recent efforts toward the development of such an infrastructure are reported here. As part of an observational array monitoring fluxes through Davis Strait, development of real-time RAFOS acoustic navigation for gliders has been ongoing since autumn 2004. To date, test deployments have been conducted in a 260 Hz field in the Pacific and 780 Hz fields off Norway and in Davis Strait. Real-time navigation accuracy of ~1 km is achievable. Autonomously navigating gliders will operate under ice cover beginning in autumn 2006. In addition to glider navigation development, the Davis Strait array moorings carry fixed RAFOS recorders to study propagation over a range of distances under seasonally varying ice cover. Results from the under-ice propagation and glider navigation experiments are presented. Motivated by the need to coordinate these types of development efforts, an international group of acousticians, autonomous platform developers, high-latitude oceanographers and marine mammal researchers gathered in Seattle, U.S.A. from 27 February -- 1 March 2006 for an NSF Office of Polar Programs sponsored Acoustic Navigation and Communication for High-latitude Ocean Research (ANCHOR) workshop. Workshop participants focused on summarizing the current state of knowledge concerning Arctic acoustics, navigation and communications
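
    A minimal sketch of the RAFOS-style acoustic geolocation idea described above: measured arrival times from moored sound sources of known position are converted to ranges using a nominal sound speed, and a position fix is obtained by least squares. The source positions, sound speed, and Gauss-Newton solver are illustrative assumptions, not the fielded glider navigation software.

    ```python
    import numpy as np

    SOUND_SPEED = 1480.0   # m/s, nominal

    def ranges_from_arrivals(emission_time, arrival_times):
        """Convert measured arrival times from the sound sources into ranges."""
        return SOUND_SPEED * (np.asarray(arrival_times) - emission_time)

    def fix_position(source_xy, ranges, guess, iters=10):
        """Gauss-Newton range-based 2D fix from two or more sources."""
        x = np.asarray(guess, dtype=float)
        S = np.asarray(source_xy, dtype=float)
        for _ in range(iters):
            d = np.linalg.norm(S - x, axis=1)                 # predicted ranges
            J = (x - S) / d[:, None]                          # range Jacobian
            r = np.asarray(ranges) - d                        # range residuals
            x = x + np.linalg.solve(J.T @ J, J.T @ r)
        return x

    src = [(0.0, 0.0), (10_000.0, 0.0)]
    rng = ranges_from_arrivals(0.0, [5000.0 / SOUND_SPEED, 8062.3 / SOUND_SPEED])
    print(fix_position(src, rng, guess=(2000.0, 3000.0)))     # ~ (3000, 4000)
    ```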

  7. Optimization of eyesafe avalanche photodiode lidar for automobile safety and autonomous navigation systems

    NASA Astrophysics Data System (ADS)

    Williams, George M.

    2017-03-01

    Newly emerging accident-reducing, driver-assistance, and autonomous-navigation technology for automobiles is based on real-time three-dimensional mapping and object detection, tracking, and classification using lidar sensors. Yet, the lack of lidar sensors suitable for meeting application requirements appreciably limits practical widespread use of lidar in trucking, public livery, consumer cars, and fleet automobiles. To address this need, a system-engineering perspective to eyesafe lidar-system design for high-level advanced driver-assistance sensor systems and a design trade study including 1.5-μm spot-scanned, line-scanned, and flash-lidar systems are presented. A cost-effective lidar instrument design is then proposed based on high-repetition-rate diode-pumped solid-state lasers and high-gain, low-excess-noise InGaAs avalanche photodiode receivers and focal plane arrays. Using probabilistic receiver-operating-characteristic analysis, derived from measured component performance, a compact lidar system is proposed that is capable of 220 m ranging with 5-cm accuracy, which can be readily scaled to a 360-deg field of regard.

  8. Communication and Control for Fleets of Autonomous Underwater Vehicles

    DTIC Science & Technology

    2006-10-30

Washington State University (WSU) on fuzzy logic control systems [2-4] and autonomous vehicles [5-10]. The ALWSE-MC program developed at NAVSEA CSS was...rotating head sonar on crawlers as an additional sensor for navigation. We have previously investigated the use of video cameras on autonomous vehicles for...simulates autonomous vehicles performing mine reconnaissance/mapping, clearance, and surveillance in a littoral region. Three simulations were performed

  9. Acoustic Communications and Navigation for Mobile Under-Ice Sensors

    DTIC Science & Technology

    2017-02-04

Final Report (04/02/2017). Acoustic Communications and Navigation for Mobile Under-Ice Sensors...development and fielding of a new acoustic communications and navigation system for use on autonomous platforms (gliders and profiling floats) under the...contact below the ice. Subject terms: Arctic Ocean, Undersea Workstations & Vehicles, Signal Processing, Navigation, Underwater Acoustics

  10. Deep Impact Autonomous Navigation : the trials of targeting the unknown

    NASA Technical Reports Server (NTRS)

    Kubitschek, Daniel G.; Mastrodemos, Nickolaos; Werner, Robert A.; Kennedy, Brian M.; Synnott, Stephen P.; Null, George W.; Bhaskaran, Shyam; Riedel, Joseph E.; Vaughan, Andrew T.

    2006-01-01

    On July 4, 2005 at 05:44:34.2 UTC the Impactor Spacecraft (s/c) impacted comet Tempel 1 with a relative speed of 10.3 km/s capturing high-resolution images of the surface of a cometary nucleus just seconds before impact. Meanwhile, the Flyby s/c captured the impact event using both the Medium Resolution Imager (MRI) and the High Resolution Imager (HRI) and tracked the nucleus for the entire 800 sec period between impact and shield attitude transition. The objective of the Impactor s/c was to impact in an illuminated area viewable from the Flyby s/c and capture high-resolution context images of the impact site. This was accomplished by using autonomous navigation (AutoNav) algorithms and precise attitude information from the attitude determination and control subsystem (ADCS). The Flyby s/c had two primary objectives: 1) capture the impact event with the highest temporal resolution possible in order to observe the ejecta plume expansion dynamics; and 2) track the impact site for at least 800 sec to observe the crater formation and capture the highest resolution images possible of the fully developed crater. These two objectives were met by estimating the Flyby s/c trajectory relative to Tempel 1 using the same AutoNav algorithms along with precise attitude information from ADCS and independently selecting the best impact site. This paper describes the AutoNav system, what happened during the encounter with Tempel 1 and what could have happened.

  11. Optimal scheme of star observation of missile-borne inertial navigation system/stellar refraction integrated navigation

    NASA Astrophysics Data System (ADS)

    Lu, Jiazhen; Yang, Lie

    2018-05-01

    To achieve accurate and completely autonomous navigation for spacecraft, inertial/celestial integrated navigation gets increasing attention. In this study, a missile-borne inertial/stellar refraction integrated navigation scheme is proposed. Position Dilution of Precision (PDOP) for stellar refraction is introduced and the corresponding equation is derived. Based on the condition when PDOP reaches the minimum value, an optimized observation scheme is proposed. To verify the feasibility of the proposed scheme, numerical simulation is conducted. The results of the Extended Kalman Filter (EKF) and Unscented Kalman Filter (UKF) are compared and impact factors of navigation accuracy are studied in the simulation. The simulation results indicated that the proposed observation scheme has an accurate positioning performance, and the results of EKF and UKF are similar.

  12. Optimal scheme of star observation of missile-borne inertial navigation system/stellar refraction integrated navigation.

    PubMed

    Lu, Jiazhen; Yang, Lie

    2018-05-01

    To achieve accurate and completely autonomous navigation for spacecraft, inertial/celestial integrated navigation gets increasing attention. In this study, a missile-borne inertial/stellar refraction integrated navigation scheme is proposed. Position Dilution of Precision (PDOP) for stellar refraction is introduced and the corresponding equation is derived. Based on the condition when PDOP reaches the minimum value, an optimized observation scheme is proposed. To verify the feasibility of the proposed scheme, numerical simulation is conducted. The results of the Extended Kalman Filter (EKF) and Unscented Kalman Filter (UKF) are compared and impact factors of navigation accuracy are studied in the simulation. The simulation results indicated that the proposed observation scheme has an accurate positioning performance, and the results of EKF and UKF are similar.
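
    A minimal sketch of a PDOP computation of the kind used above, assuming PDOP is taken as the square root of the trace of (H^T H)^(-1), where each row of H is the unit line of sight to one observed refracted star; the example geometry is illustrative only, not the paper's optimized observation scheme.

    ```python
    import numpy as np

    def pdop(los_directions):
        """PDOP from unit line-of-sight vectors to the observed refracted stars."""
        H = np.asarray(los_directions, dtype=float)
        return float(np.sqrt(np.trace(np.linalg.inv(H.T @ H))))

    # three well-spread lines of sight give a lower (better) PDOP than three
    # nearly coplanar or parallel ones:
    print(pdop([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))   # ~1.73
    ```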

  13. Cybersecurity for aerospace autonomous systems

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2015-05-01

    High profile breaches have occurred across numerous information systems. One area where attacks are particularly problematic is autonomous control systems. This paper considers the aerospace information system, focusing on elements that interact with autonomous control systems (e.g., onboard UAVs). It discusses the trust placed in the autonomous systems and supporting systems (e.g., navigational aids) and how this trust can be validated. Approaches to remotely detect the UAV compromise, without relying on the onboard software (on a potentially compromised system) as part of the process are discussed. How different levels of autonomy (task-based, goal-based, mission-based) impact this remote characterization is considered.

  14. Daytime Water Detection Based on Sky Reflections

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo L.; Matthies, Larry H.; Bellutta, Paolo

    2011-01-01

    Robust water detection is a critical perception requirement for unmanned ground vehicle (UGV) autonomous navigation. This is particularly true in wide-open areas where water can collect in naturally occurring terrain depressions during periods of heavy precipitation and form large water bodies. One of the properties of water useful for detecting it is that its surface acts as a horizontal mirror at large incidence angles. Water bodies can be indirectly detected by detecting reflections of the sky below the horizon in color imagery. The Jet Propulsion Laboratory (JPL) has implemented a water detector based on sky reflections that geometrically locates the pixel in the sky that is reflecting on a candidate water pixel on the ground and predicts if the ground pixel is water based on color similarity and local terrain features. This software detects water bodies in wide-open areas on cross-country terrain at mid- to far-range using imagery acquired from a forward-looking stereo pair of color cameras mounted on a terrestrial UGV. In three test sequences approaching a pond under a clear, overcast, and cloudy sky, the true positive detection rate was 100% when the UGV was beyond 7 meters of the water's leading edge and the largest false positive detection rate was 0.58%. The sky reflection based water detector has been integrated on an experimental unmanned vehicle and field tested at Ft. Indiantown Gap, PA, USA.
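
    A minimal sketch of the sky-reflection test described above for a single candidate ground pixel: the camera ray is mirrored about a horizontal water surface, the corresponding sky pixel is looked up, and color similarity is scored. The camera model, image conventions, and threshold are illustrative assumptions, not the JPL implementation.

    ```python
    import numpy as np

    def pixel_to_ray(K, R_cam_to_world, u, v):
        """Unit viewing ray in a world frame with z up, for pixel (u, v)."""
        d = R_cam_to_world @ (np.linalg.inv(K) @ np.array([u, v, 1.0]))
        return d / np.linalg.norm(d)

    def ray_to_pixel(K, R_world_to_cam, d_world):
        p = K @ (R_world_to_cam @ d_world)
        return p[0] / p[2], p[1] / p[2]

    def is_water_candidate(img, K, R_cw, u, v, thresh=20.0):
        d = pixel_to_ray(K, R_cw, u, v)
        if d[2] >= 0.0:                              # must look below the horizon
            return False
        d_sky = d * np.array([1.0, 1.0, -1.0])       # mirror about the water surface
        us, vs = ray_to_pixel(K, R_cw.T, d_sky)
        if not (0 <= vs < img.shape[0] and 0 <= us < img.shape[1]):
            return False
        diff = img[int(v), int(u)].astype(float) - img[int(vs), int(us)].astype(float)
        return bool(np.linalg.norm(diff) < thresh)   # similar color -> possible water
    ```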

  15. Autonomous interplanetary constellation design

    NASA Astrophysics Data System (ADS)

    Chow, Cornelius Channing, II

    According to NASA's integrated space technology roadmaps, space-based infrastructures are envisioned as necessary ingredients to a sustained effort in continuing space exploration. Whether it be for extra-terrestrial habitats, roving/cargo vehicles, or space tourism, autonomous space networks will provide a vital communications lifeline for both future robotic and human missions alike. Projecting that the Moon will be a bustling hub of activity within a few decades, a near-term opportunity for in-situ infrastructure development is within reach. This dissertation addresses the anticipated need for in-space infrastructure by investigating a general design methodology for autonomous interplanetary constellations; to illustrate the theory, this manuscript presents results from an application to the Earth-Moon neighborhood. The constellation design methodology is formulated as an optimization problem, involving a trajectory design step followed by a spacecraft placement sequence. Modeling the dynamics as a restricted 3-body problem, the investigated design space consists of families of periodic orbits which play host to the constellations, punctuated by arrangements of spacecraft autonomously guided by a navigation strategy called LiAISON (Linked Autonomous Interplanetary Satellite Orbit Navigation). Instead of more traditional exhaustive search methods, a numerical continuation approach is implemented to map the admissible configuration space. In particular, Keller's pseudo-arclength technique is used to follow folding/bifurcating solution manifolds, which are otherwise inaccessible with other parameter continuation schemes. A succinct characterization of the underlying structure of the local, as well as global, extrema is thus achievable with little a priori intuition of the solution space. Furthermore, the proposed design methodology offers benefits in computation speed plus the ability to handle mildly stochastic systems. An application of the constellation design

  16. Autonomous path-planning navigation system for site characterization

    NASA Astrophysics Data System (ADS)

    Rankin, Arturo L.; Crane, Carl D., III; Armstrong, David G., II; Nease, Allen D.; Brown, H. Edward

    1996-05-01

The location and removal of buried munitions is an important yet hazardous task. Current development is aimed at performing both the ordnance location and removal tasks autonomously. An autonomous survey vehicle (ASV) named the Gator has been developed at the Center for Intelligent Machines and Robotics, under the direction of Wright Laboratory, Tyndall Air Force Base, Florida, and the Navy Explosive Ordnance Disposal Technology Division, Indian Head, Maryland. The primary task of the survey vehicle is to autonomously traverse an off-road site, towing behind it a trailer containing a sensor package capable of characterizing the sub-surface contents. Achieving 100 percent coverage of the site is critical to fully characterizing the site. This paper presents a strategy for planning efficient paths for the survey vehicle that guarantees near-complete coverage of a site. A small library of three in-house developed path planners is reviewed. A strategy is also presented to keep the trailer on-path and to calculate the percent of coverage of a site with a resolution of 0.01 m2. All of the algorithms discussed in this paper were initially developed in simulation on a Silicon Graphics computer and subsequently implemented on the survey vehicle.
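
    A minimal sketch of coverage accounting at the 0.01 m2 resolution mentioned above: the site is discretized into 0.1 m cells, cells swept by the towed sensor footprint along the planned path are marked, and the covered fraction is reported. The site size, swath width, and lane geometry are illustrative assumptions, not the Gator's planners.

    ```python
    import numpy as np

    RES = 0.1   # grid resolution in metres, i.e. 0.01 m^2 cells

    def mark_swath(grid, x, y, half_width):
        """Mark all cells within half_width metres of the sensor centre (x, y)."""
        r = int(np.ceil(half_width / RES))
        ci, cj = int(y / RES), int(x / RES)
        for i in range(max(0, ci - r), min(grid.shape[0], ci + r + 1)):
            for j in range(max(0, cj - r), min(grid.shape[1], cj + r + 1)):
                if ((i - ci) ** 2 + (j - cj) ** 2) * RES ** 2 <= half_width ** 2:
                    grid[i, j] = True

    site = np.zeros((int(50 / RES), int(80 / RES)), dtype=bool)    # 50 m x 80 m site
    lane = [(x, 10.0) for x in np.arange(0.0, 80.0, RES)]          # one survey lane
    for x, y in lane:
        mark_swath(site, x, y, half_width=1.0)
    print(f"coverage: {100.0 * site.sum() / site.size:.2f}%")
    ```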

  17. RAIM availability for supplemental GPS navigation

    DOT National Transportation Integrated Search

    1992-06-29

    This paper examines GPS receiver autonomous integrity monitoring (RAIM) availability for supplemental navigation based on the approximate radial-error protection (ARP) method. This method applies ceiling levels for the ARP figure of merit to screen o...

  18. Terminal Homing for Autonomous Underwater Vehicle Docking

    DTIC Science & Technology

    2016-06-01

underwater domain, accurate navigation. Above the water, light and electromagnetic signals travel well through air and space, mediums that allow for a...The use of docking stations for autonomous underwater vehicles (AUV) provides the ability to keep a vehicle on...

  19. An onboard navigation system which fulfills Mars aerocapture guidance requirements

    NASA Technical Reports Server (NTRS)

    Brand, Timothy J.; Fuhry, Douglas P.; Shepperd, Stanley W.

    1989-01-01

The development of a candidate autonomous onboard Mars approach navigation scheme capable of supporting aerocapture into Mars orbit is discussed. An aerocapture guidance and navigation system which can run independently of the preaerocapture navigation was used to define a preliminary set of accuracy requirements at entry interface. These requirements are used to evaluate the proposed preaerocapture navigation scheme. This scheme uses optical sightings on Deimos with a star tracker and an inertial measurement unit for instrumentation as a source for navigation information. Preliminary results suggest that the approach will adequately support aerocapture into Mars orbit.

  20. Three-dimensional motor schema based navigation

    NASA Technical Reports Server (NTRS)

    Arkin, Ronald C.

    1989-01-01

    Reactive schema-based navigation is possible in space domains by extending the methods developed for ground-based navigation found within the Autonomous Robot Architecture (AuRA). Reformulation of two dimensional motor schemas for three dimensional applications is a straightforward process. The manifold advantages of schema-based control persist, including modular development, amenability to distributed processing, and responsiveness to environmental sensing. Simulation results show the feasibility of this methodology for space docking operations in a cluttered work area.
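
    A minimal sketch of schema-based reactive control extended to three dimensions, as described above: each active motor schema contributes a velocity vector and the commanded motion is their sum. The gains and the obstacle-repulsion profile are illustrative assumptions, not the AuRA formulation.

    ```python
    import numpy as np

    def move_to_goal(pos, goal, gain=1.0):
        d = goal - pos
        n = np.linalg.norm(d)
        return gain * d / n if n > 1e-9 else np.zeros(3)

    def avoid_obstacle(pos, obstacle, influence=2.0, gain=1.0):
        d = pos - obstacle
        n = np.linalg.norm(d)
        if n > influence or n < 1e-9:
            return np.zeros(3)                       # outside the sphere of influence
        return gain * (influence - n) / influence * d / n

    def schema_velocity(pos, goal, obstacles):
        """Commanded 3D velocity: vector sum of all active schema outputs."""
        v = move_to_goal(pos, goal)
        for ob in obstacles:
            v += avoid_obstacle(pos, np.asarray(ob, dtype=float))
        return v

    print(schema_velocity(np.zeros(3), np.array([5.0, 0.0, 0.0]), [[1.0, 0.5, 0.0]]))
    ```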

  1. Real-time visual mosaicking and navigation on the seafloor

    NASA Astrophysics Data System (ADS)

    Richmond, Kristof

    Remote robotic exploration holds vast potential for gaining knowledge about extreme environments accessible to humans only with great difficulty. Robotic explorers have been sent to other solar system bodies, and on this planet into inaccessible areas such as caves and volcanoes. In fact, the largest unexplored land area on earth lies hidden in the airless cold and intense pressure of the ocean depths. Exploration in the oceans is further hindered by water's high absorption of electromagnetic radiation, which both inhibits remote sensing from the surface, and limits communications with the bottom. The Earth's oceans thus provide an attractive target for developing remote exploration capabilities. As a result, numerous robotic vehicles now routinely survey this environment, from remotely operated vehicles piloted over tethers from the surface to torpedo-shaped autonomous underwater vehicles surveying the mid-waters. However, these vehicles are limited in their ability to navigate relative to their environment. This limits their ability to return to sites with precision without the use of external navigation aids, and to maneuver near and interact with objects autonomously in the water and on the sea floor. The enabling of environment-relative positioning on fully autonomous underwater vehicles will greatly extend their power and utility for remote exploration in the furthest reaches of the Earth's waters---even under ice and under ground---and eventually in extraterrestrial liquid environments such as Europa's oceans. This thesis presents an operational, fielded system for visual navigation of underwater robotic vehicles in unexplored areas of the seafloor. The system does not depend on external sensing systems, using only instruments on board the vehicle. As an area is explored, a camera is used to capture images and a composite view, or visual mosaic, of the ocean bottom is created in real time. Side-to-side visual registration of images is combined with dead

  2. Autonomous Navigation Performance During The Hartley 2 Comet Flyby

    NASA Technical Reports Server (NTRS)

    Abrahamson, Matthew J; Kennedy, Brian A.; Bhaskaran, Shyam

    2012-01-01

    On November 4, 2010, the EPOXI spacecraft performed a 700-km flyby of the comet Hartley 2 as follow-on to the successful 2005 Deep Impact prime mission. EPOXI, an extended mission for the Deep Impact Flyby spacecraft, returned a wealth of visual and infrared data from Hartley 2, marking the fifth time that high-resolution images of a cometary nucleus have been captured by a spacecraft. The highest resolution science return, captured at closest approach to the comet nucleus, was enabled by use of an onboard autonomous navigation system called AutoNav. AutoNav estimates the comet-relative spacecraft trajectory using optical measurements from the Medium Resolution Imager (MRI) and provides this relative position information to the Attitude Determination and Control System (ADCS) for maintaining instrument pointing on the comet. For the EPOXI mission, AutoNav was tasked to enable continuous tracking of a smaller, more active Hartley 2, as compared to Tempel 1, through the full encounter while traveling at a higher velocity. To meet the mission goal of capturing the comet in all MRI science images, position knowledge accuracies of +/- 3.5 km (3-?) cross track and +/- 0.3 seconds (3-?) time of flight were required. A flight-code-in-the-loop Monte Carlo simulation assessed AutoNav's statistical performance under the Hartley 2 flyby dynamics and determined optimal configuration. The AutoNav performance at Hartley 2 was successful, capturing the comet in all of the MRI images. The maximum residual between observed and predicted comet locations was 20 MRI pixels, primarily influenced by the center of brightness offset from the center of mass in the observations and attitude knowledge errors. This paper discusses the Monte Carlo-based analysis that led to the final AutoNav configuration and a comparison of the predicted performance with the flyby performance.

  3. SOVEREIGN: An autonomous neural system for incrementally learning planned action sequences to navigate towards a rewarded goal.

    PubMed

    Gnadt, William; Grossberg, Stephen

    2008-06-01

    How do reactive and planned behaviors interact in real time? How are sequences of such behaviors released at appropriate times during autonomous navigation to realize valued goals? Controllers for both animals and mobile robots, or animats, need reactive mechanisms for exploration, and learned plans to reach goal objects once an environment becomes familiar. The SOVEREIGN (Self-Organizing, Vision, Expectation, Recognition, Emotion, Intelligent, Goal-oriented Navigation) animat model embodies these capabilities, and is tested in a 3D virtual reality environment. SOVEREIGN includes several interacting subsystems which model complementary properties of cortical What and Where processing streams and which clarify similarities between mechanisms for navigation and arm movement control. As the animat explores an environment, visual inputs are processed by networks that are sensitive to visual form and motion in the What and Where streams, respectively. Position-invariant and size-invariant recognition categories are learned by real-time incremental learning in the What stream. Estimates of target position relative to the animat are computed in the Where stream, and can activate approach movements toward the target. Motion cues from animat locomotion can elicit head-orienting movements to bring a new target into view. Approach and orienting movements are alternately performed during animat navigation. Cumulative estimates of each movement are derived from interacting proprioceptive and visual cues. Movement sequences are stored within a motor working memory. Sequences of visual categories are stored in a sensory working memory. These working memories trigger learning of sensory and motor sequence categories, or plans, which together control planned movements. Predictively effective chunk combinations are selectively enhanced via reinforcement learning when the animat is rewarded. Selected planning chunks effect a gradual transition from variable reactive exploratory

  4. Experiments in autonomous robotics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamel, W.R.

    1987-01-01

    The Center for Engineering Systems Advanced Research (CESAR) is performing basic research in autonomous robotics for energy-related applications in hazardous environments. The CESAR research agenda includes a strong experimental component to assure practical evaluation of new concepts and theories. An evolutionary sequence of mobile research robots has been planned to support research in robot navigation, world sensing, and object manipulation. A number of experiments have been performed in studying robot navigation and path planning with planar sonar sensing. Future experiments will address more complex tasks involving three-dimensional sensing, dexterous manipulation, and human-scale operations.

  5. Design of all-weather celestial navigation system

    NASA Astrophysics Data System (ADS)

    Sun, Hongchi; Mu, Rongjun; Du, Huajun; Wu, Peng

    2018-03-01

In order to realize autonomous navigation in the atmosphere, an all-weather celestial navigation system is designed. The research on the celestial navigation system includes a discrimination method based on comentropy and an adaptive navigation algorithm based on the P value. The comentropy discrimination method is studied to realize independent switching between the two celestial navigation modes, starlight and radio. Finally, an adaptive filtering algorithm based on the P value is proposed, which can greatly improve the disturbance rejection capability of the system. The experimental results show that the accuracy of the three-axis attitude is better than 10″ and that the system can work in all weather. In a perturbation environment, the position accuracy of the integrated navigation system can be improved by 20% compared with the traditional method. It basically meets the requirements of an all-weather celestial navigation system and offers stability, reliability, high accuracy and strong anti-interference capability.
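
    A minimal sketch of a comentropy (information entropy) check of the kind described above for switching between the starlight and radio modes, assuming an 8-bit star image; the decision sense and the threshold are illustrative assumptions, not the paper's criterion.

    ```python
    import numpy as np

    def comentropy(image, bins=256):
        """Information entropy (bits) of an 8-bit image's intensity histogram."""
        hist, _ = np.histogram(image, bins=bins, range=(0, 255))
        p = hist / hist.sum()
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    def select_mode(star_image, entropy_threshold=4.0):
        # assumption: a clear night image (a few bright stars on a dark background)
        # has low entropy, while cloud or daylight clutter raises it, so the
        # system falls back to the radio mode above the threshold
        return "starlight" if comentropy(star_image) < entropy_threshold else "radio"
    ```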

  6. On-Orbit Autonomous Assembly from Nanosatellites

    NASA Technical Reports Server (NTRS)

    Murchison, Luke S.; Martinez, Andres; Petro, Andrew

    2015-01-01

The On-Orbit Autonomous Assembly from Nanosatellites (OAAN) project will demonstrate autonomous control algorithms for rendezvous and docking maneuvers; low-power reconfigurable magnetic docking technology; and compact, lightweight and inexpensive precision relative navigation using carrier-phase differential (CD) GPS with a three-degree-of-freedom ground demonstration. CDGPS is a specific relative position determination method that measures the phase of the GPS carrier wave to yield relative position data accurate to 0.4 inch (1 centimeter). CDGPS is a technology commonly found in the surveying industry. The development and demonstration of these technologies will fill a current gap in the availability of proven autonomous rendezvous and docking systems for small satellites.
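
    A minimal sketch of the carrier-phase differential idea behind CDGPS: the single-differenced carrier phase between two receivers tracking the same GPS satellite depends on the projection of their baseline onto the line of sight plus an integer ambiguity, which is why centimeter-level relative positioning is possible once the ambiguity is resolved. The wavelength constant is the standard L1 value; the geometry and the simplified model (no clock or atmospheric terms) are illustrative assumptions.

    ```python
    import numpy as np

    L1_WAVELENGTH = 0.1903   # metres, GPS L1 carrier

    def single_diff_phase(baseline_enu, sat_unit_vector, integer_ambiguity):
        """Single-differenced carrier phase (cycles) between two receivers
        tracking the same satellite, ignoring clock and atmospheric terms."""
        geometric = np.dot(baseline_enu, sat_unit_vector) / L1_WAVELENGTH
        return geometric + integer_ambiguity

    # a 1 cm baseline component along the line of sight shifts the differenced
    # phase by about 0.05 cycles, which is what makes cm-level relative
    # positioning observable once the integer ambiguity is fixed:
    print(single_diff_phase(np.array([0.01, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]), 0))
    ```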

  7. Canoe: An Autonomous Infrastructure-Free Indoor Navigation System.

    PubMed

    Dong, Kai; Wu, Wenjia; Ye, Haibo; Yang, Ming; Ling, Zhen; Yu, Wei

    2017-04-30

The development of the Internet of Things (IoT) has accelerated research in indoor navigation systems, a majority of which rely on adequate wireless signals and sources. Nonetheless, deploying such a system requires periodic site-survey, which is time consuming and labor intensive. To address this issue, in this paper we present Canoe, an indoor navigation system that considers shopping mall scenarios. In our system, we do not assume any prior knowledge, such as floor-plan or the shop locations, access point placement or power settings, historical RSS measurements or fingerprints, etc. Instead, Canoe requires only that the shop owners collect and publish RSS values at the entrances of their shops and can direct a consumer to any of these shops by comparing the observed RSS values. The locations of the consumers and the shops are estimated using maximum likelihood estimation. In doing this, the direction of the target shop relative to the current orientation of the consumer can be precisely computed, such that the direction that a consumer should move can be determined. We have conducted extensive simulations using a real-world dataset. Our experiments in a real shopping mall demonstrate that if 50% of the shops publish their RSS values, Canoe can precisely navigate a consumer within 30 s, with an error rate below 9%.

  8. Canoe: An Autonomous Infrastructure-Free Indoor Navigation System

    PubMed Central

    Dong, Kai; Wu, Wenjia; Ye, Haibo; Yang, Ming; Ling, Zhen; Yu, Wei

    2017-01-01

    The development of the Internet of Things (IoT) has accelerated research in indoor navigation systems, a majority of which rely on adequate wireless signals and sources. Nonetheless, deploying such a system requires periodic site-survey, which is time consuming and labor intensive. To address this issue, in this paper we present Canoe, an indoor navigation system that considers shopping mall scenarios. In our system, we do not assume any prior knowledge, such as floor-plan or the shop locations, access point placement or power settings, historical RSS measurements or fingerprints, etc. Instead, Canoe requires only that the shop owners collect and publish RSS values at the entrances of their shops and can direct a consumer to any of these shops by comparing the observed RSS values. The locations of the consumers and the shops are estimated using maximum likelihood estimation. In doing this, the direction of the target shop relative to the current orientation of the consumer can be precisely computed, such that the direction that a consumer should move can be determined. We have conducted extensive simulations using a real-world dataset. Our experiments in a real shopping mall demonstrate that if 50% of the shops publish their RSS values, Canoe can precisely navigate a consumer within 30 s, with an error rate below 9%. PMID:28468291
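
    A minimal sketch of the final step described above: given the estimated consumer and shop positions and the consumer's current heading, compute the direction in which the consumer should move. The log-distance RSS model shown as the basis for the position estimates is an assumption for illustration, not necessarily the model used by Canoe.

    ```python
    import numpy as np

    def rss_log_distance(d, p0=-40.0, n=2.5):
        """Expected RSS (dBm) at distance d under a log-distance path-loss model."""
        return p0 - 10.0 * n * np.log10(max(d, 0.1))

    def direction_to_shop(consumer_xy, consumer_heading_rad, shop_xy):
        """Angle (degrees) to the target shop relative to the consumer's heading;
        positive means turn left, negative means turn right."""
        dx, dy = shop_xy[0] - consumer_xy[0], shop_xy[1] - consumer_xy[1]
        bearing = np.arctan2(dy, dx)
        rel = (bearing - consumer_heading_rad + np.pi) % (2 * np.pi) - np.pi
        return np.degrees(rel)

    # a shop 5 m ahead-left of a consumer facing along +y is ~45 degrees to the left:
    print(direction_to_shop((0.0, 0.0), np.pi / 2, (-5.0, 5.0)))
    ```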

  9. Satellite Imagery Assisted Road-Based Visual Navigation System

    NASA Astrophysics Data System (ADS)

    Volkova, A.; Gibbens, P. W.

    2016-06-01

    There is a growing demand for unmanned aerial systems as autonomous surveillance, exploration and remote sensing solutions. Among the key concerns for robust operation of these systems is the need to reliably navigate the environment without reliance on global navigation satellite system (GNSS). This is of particular concern in Defence circles, but is also a major safety issue for commercial operations. In these circumstances, the aircraft needs to navigate relying only on information from on-board passive sensors such as digital cameras. An autonomous feature-based visual system presented in this work offers a novel integral approach to the modelling and registration of visual features that responds to the specific needs of the navigation system. It detects visual features from Google Earth* build a feature database. The same algorithm then detects features in an on-board cameras video stream. On one level this serves to localise the vehicle relative to the environment using Simultaneous Localisation and Mapping (SLAM). On a second level it correlates them with the database to localise the vehicle with respect to the inertial frame. The performance of the presented visual navigation system was compared using the satellite imagery from different years. Based on comparison results, an analysis of the effects of seasonal, structural and qualitative changes of the imagery source on the performance of the navigation algorithm is presented. * The algorithm is independent of the source of satellite imagery and another provider can be used

  10. Underwater terrain-aided navigation system based on combination matching algorithm.

    PubMed

    Li, Peijuan; Sheng, Guoliang; Zhang, Xiaofei; Wu, Jingqiu; Xu, Baochun; Liu, Xing; Zhang, Yao

    2018-07-01

Considering that the terrain-aided navigation (TAN) system based on the iterated closest contour point (ICCP) algorithm diverges easily when the error in the indicative track of the strapdown inertial navigation system (SINS) is large, a Kalman filter is incorporated into the traditional ICCP algorithm: the difference between the matching result and the SINS output is used as the measurement of the Kalman filter, the cumulative error of the SINS is then corrected in time by filter feedback, and the indicative track used by ICCP is improved. The mathematical model of the autonomous underwater vehicle (AUV) integrated navigation system and the observation model of TAN are built. A proper number of matching points is selected by comparing simulation results for matching time and matching precision. Simulation experiments are carried out according to the ICCP algorithm and the mathematical model. The simulation experiments show that navigation accuracy and stability are improved with the proposed combinational algorithm when a proper number of matching points is used. The integrated navigation system is effective in preventing divergence of the indicative track and can meet the requirements of underwater, long-term, high-precision navigation for autonomous underwater vehicles.
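
    A minimal sketch of the feedback-correction step described above: the difference between the ICCP matching result and the SINS-indicated position is used as a Kalman filter measurement, and the estimated error is fed back to correct the indicated track. The two-state error model and the noise matrices are illustrative assumptions, not the paper's formulation.

    ```python
    import numpy as np

    def tan_kf_correction(x_err, P, sins_pos, iccp_pos, Q, R):
        """x_err: estimated SINS position error [dE, dN]; the measurement is the
        ICCP-minus-SINS position difference; returns the corrected position."""
        F = np.eye(2)                        # error treated as near-constant per step
        H = np.eye(2)
        x_err = F @ x_err                    # time update
        P = F @ P @ F.T + Q
        z = np.asarray(iccp_pos) - np.asarray(sins_pos)
        S = H @ P @ H.T + R                  # measurement update
        K = P @ H.T @ np.linalg.inv(S)
        x_err = x_err + K @ (z - H @ x_err)
        P = (np.eye(2) - K @ H) @ P
        return np.asarray(sins_pos) + x_err, x_err, P   # feedback-corrected track point
    ```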

  11. Towards high-speed autonomous navigation of unknown environments

    NASA Astrophysics Data System (ADS)

    Richter, Charles; Roy, Nicholas

    2015-05-01

    In this paper, we summarize recent research enabling high-speed navigation in unknown environments for dynamic robots that perceive the world through onboard sensors. Many existing solutions to this problem guarantee safety by making the conservative assumption that any unknown portion of the map may contain an obstacle, and therefore constrain planned motions to lie entirely within known free space. In this work, we observe that safety constraints may significantly limit performance and that faster navigation is possible if the planner reasons about collision with unobserved obstacles probabilistically. Our overall approach is to use machine learning to approximate the expected costs of collision using the current state of the map and the planned trajectory. Our contribution is to demonstrate fast but safe planning using a learned function to predict future collision probabilities.
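
    A minimal sketch of the planning idea summarized above: candidate trajectories are scored by an expected cost that trades traversal time against a predicted probability of collision with unobserved obstacles, with the learned predictor stubbed out. The cost weights and interfaces are illustrative assumptions, not the authors' learned model.

    ```python
    def predicted_collision_prob(map_features, trajectory):
        """Placeholder for a learned model of p(collision | current map, trajectory)."""
        return 0.05   # a regression or classification model in practice

    def expected_cost(map_features, trajectory, traversal_time, collision_penalty=30.0):
        p = predicted_collision_prob(map_features, trajectory)
        return (1.0 - p) * traversal_time + p * collision_penalty

    def select_trajectory(map_features, candidates):
        """candidates: iterable of (trajectory, traversal_time) pairs."""
        return min(candidates, key=lambda c: expected_cost(map_features, c[0], c[1]))
    ```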

  12. An Analysis of Navigation Algorithms for Smartphones Using J2ME

    NASA Astrophysics Data System (ADS)

    Santos, André C.; Tarrataca, Luís; Cardoso, João M. P.

    Embedded systems are considered one of the most potential areas for future innovations. Two embedded fields that will most certainly take a primary role in future innovations are mobile robotics and mobile computing. Mobile robots and smartphones are growing in number and functionalities, becoming a presence in our daily life. In this paper, we study the current feasibility of a smartphone to execute navigation algorithms. As a test case, we use a smartphone to control an autonomous mobile robot. We tested three navigation problems: Mapping, Localization and Path Planning. For each of these problems, an algorithm has been chosen, developed in J2ME, and tested on the field. Results show the current mobile Java capacity for executing computationally demanding algorithms and reveal the real possibility of using smartphones for autonomous navigation.

  13. Polarized skylight navigation.

    PubMed

    Hamaoui, Moshe

    2017-01-20

    Vehicle state estimation is an essential prerequisite for navigation. The present approach seeks to use skylight polarization to facilitate state estimation under autonomous unconstrained flight conditions. Atmospheric scattering polarizes incident sunlight such that solar position is mathematically encoded in the resulting skylight polarization pattern. Indeed, several species of insects are able to sense skylight polarization and are believed to navigate polarimetrically. Sun-finding methodologies for polarized skylight navigation (PSN) have been proposed in the literature but typically rely on calibration updates to account for changing atmospheric conditions and/or are limited to 2D operation. To address this technology gap, a gradient-based PSN solution is developed based upon the Rayleigh sky model. The solution is validated in simulation, and effects of measurement error and changing atmospheric conditions are investigated. Finally, an experimental effort is described wherein polarimetric imagery is collected, ground-truth is established through independent imager-attitude measurement, the gradient-based PSN solution is applied, and results are analyzed.
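
    A minimal sketch of the single-scattering Rayleigh sky model underlying the gradient-based approach above: for a viewing direction and a sun direction, the polarization E-vector is perpendicular to the scattering plane, and the degree of polarization depends on the scattering angle. The vector conventions and the maximum degree of polarization are illustrative assumptions.

    ```python
    import numpy as np

    def rayleigh_polarization(view_dir, sun_dir, p_max=0.8):
        """E-vector direction and degree of polarization for one sky point."""
        v = np.asarray(view_dir, dtype=float); v /= np.linalg.norm(v)
        s = np.asarray(sun_dir, dtype=float); s /= np.linalg.norm(s)
        cos_gamma = float(np.clip(np.dot(v, s), -1.0, 1.0))    # scattering angle
        e = np.cross(v, s)                                     # normal to scattering plane
        n = np.linalg.norm(e)
        if n < 1e-9:                                           # looking at/away from the sun
            return None, 0.0
        dop = p_max * (1.0 - cos_gamma ** 2) / (1.0 + cos_gamma ** 2)
        return e / n, dop

    # viewing the zenith with the sun on the horizon gives the maximum polarization:
    print(rayleigh_polarization([0.0, 0.0, 1.0], [1.0, 0.0, 0.0]))
    ```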

  14. DEMONSTRATION OF AUTONOMOUS AIR MONITORING THROUGH ROBOTICS

    EPA Science Inventory

    This project included modifying an existing teleoperated robot to include autonomous navigation, large object avoidance, and air monitoring and demonstrating that prototype robot system in indoor and outdoor environments. An existing teleoperated "Surveyor" robot developed by ARD...

  15. Systems Engineering Approach to Develop Guidance, Navigation and Control Algorithms for Unmanned Ground Vehicle

    DTIC Science & Technology

    2016-09-01

identification and tracking algorithm. Subject terms: unmanned ground vehicles, pure pursuit, vector field histogram, feature recognition...located within the various theaters of war. The pace for the development and deployment of unmanned ground vehicles (UGV) was, however, not keeping...The development and fielding of UGVs in an operational role are not a new concept in the battlefield. In

  16. Achieving integrated convoys: cargo unmanned ground vehicle development and experimentation

    NASA Astrophysics Data System (ADS)

    Zych, Noah; Silver, David; Stager, David; Green, Colin; Pilarski, Thomas; Fischer, Jacob

    2013-05-01

    The Cargo UGV project was initiated in 2010 with the aim of developing and experimenting with advanced autonomous vehicles capable of being integrated unobtrusively into manned logistics convoys. The intent was to validate two hypotheses in complex, operationally representative environments: first, that unmanned tactical wheeled vehicles provide a force protection advantage by creating standoff distance to warfighters during ambushes or improvised explosive device attacks; and second, that these UGVs serve as force multipliers by enabling a single operator to control multiple unmanned assets. To assess whether current state-of-the-art autonomous vehicle technology was sufficiently capable to permit resupply missions to be executed with decreased risk and reduced manpower, and to assess the effect of UGVs on customary convoy tactics, the Marine Corps Warfighting Laboratory and the Joint Ground Robotics Enterprise sponsored Oshkosh Defense and the National Robotics Engineering Center to equip two standard Marine Corps cargo trucks for autonomous operation. This paper details the system architecture, hardware implementation, and software modules developed to meet the vehicle control, perception, and planner requirements compelled by this application. Additionally, the design of a custom human machine interface and an accompanying training program are described, as is the creation of a realistic convoy simulation environment for rapid system development. Finally, results are conveyed from a warfighter experiment in which the effectiveness of the training program for novice operators was assessed, and the impact of the UGVs on convoy operations was observed in a variety of scenarios via direct comparison to a fully manned convoy.

  17. Magician Simulator: A Realistic Simulator for Heterogenous Teams of Autonomous Robots. MAGIC 2010 Challenge

    DTIC Science & Technology

    2011-02-07

Sensor UGVs (SUGV) or Disruptor UGVs, depending on their payload. The SUGVs included vision, GPS/IMU, and LIDAR systems for identifying and tracking...employed by all the MAGICian research groups. Objects of interest were tracked using standard LIDAR and Computer Vision template-based feature...tracking approaches. Mapping was solved through Multi-Agent particle-filter based Simultaneous Localization and Mapping (SLAM). Our system contains

  18. Libration Point Navigation Concepts Supporting the Vision for Space Exploration

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Folta, David C.; Moreau, Michael C.; Quinn, David A.

    2004-01-01

    This work examines the autonomous navigation accuracy achievable for a lunar exploration trajectory from a translunar libration point lunar navigation relay satellite, augmented by signals from the Global Positioning System (GPS). We also provide a brief analysis comparing the libration point relay to lunar orbit relay architectures, and discuss some issues of GPS usage for cis-lunar trajectories.

  19. Study of the Navigation Method for a Snake Robot Based on the Kinematics Model with MEMS IMU

    PubMed Central

    Dou, Lihua; Su, Zhong; Liu, Ning

    2018-01-01

A snake robot is a type of highly redundant mobile robot that significantly differs from a tracked robot, wheeled robot and legged robot. To address the issue of a snake robot performing self-localization in the application environment without orientation assistance, an autonomous navigation method is proposed based on the snake robot’s motion characteristic constraints. The method realized autonomous navigation of the snake robot without external nodes or assistance, using only its own Micro-Electromechanical-Systems (MEMS) Inertial-Measurement-Unit (IMU). First, it studies the snake robot’s motion characteristics, builds the kinematics model, and then analyses the motion constraint characteristics and motion error propagation properties. Second, it explores the snake robot’s navigation layout, proposes a constraint criterion and the fixed relationship, and makes zero-state constraints based on the motion features and control modes of a snake robot. Finally, it realizes autonomous navigation positioning based on the Extended-Kalman-Filter (EKF) position estimation method under the constraints of its motion characteristics. Tests with the self-developed snake robot verify the proposed method, and the position error is less than 5% of Total-Traveled-Distance (TDD). In a short-distance environment, this method is able to meet the requirements of a snake robot in order to perform autonomous navigation and positioning in traditional applications and can be extended to other familiar multi-link robots. PMID:29547515

  20. Study of the Navigation Method for a Snake Robot Based on the Kinematics Model with MEMS IMU.

    PubMed

    Zhao, Xu; Dou, Lihua; Su, Zhong; Liu, Ning

    2018-03-16

    A snake robot is a type of highly redundant mobile robot that differs significantly from tracked, wheeled and legged robots. To address the problem of a snake robot performing self-localization in its application environment without external orientation aids, an autonomous navigation method is proposed based on the snake robot's motion characteristic constraints. The method realizes autonomous navigation of the snake robot without external nodes or assistance, using only its own Micro-Electromechanical-Systems (MEMS) Inertial Measurement Unit (IMU). First, it studies the snake robot's motion characteristics, builds the kinematics model, and then analyses the motion constraint characteristics and motion error propagation properties. Second, it explores the snake robot's navigation layout, proposes a constraint criterion and a fixed relationship, and applies zero-state constraints based on the motion features and control modes of a snake robot. Finally, it realizes autonomous navigation and positioning based on an Extended Kalman Filter (EKF) position estimation method under the constraints of its motion characteristics. Tests with a self-developed snake robot verify the proposed method, with a position error of less than 5% of Total-Traveled-Distance (TDD). Over short distances, this method meets the requirements for a snake robot to perform autonomous navigation and positioning in typical applications, and it can be extended to other similar multi-link robots.
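
    As a rough illustration of the constrained-EKF idea described in this abstract (a minimal sketch under stated assumptions, not the authors' filter), the snippet below dead-reckons a planar state from a heading input and then applies a zero-lateral-velocity motion constraint as a pseudo-measurement. The state layout, constant-velocity model, and noise values are illustrative assumptions.

    ```python
    import numpy as np

    # Minimal planar EKF sketch: state x = [px, py, v_forward, v_lateral].
    # A "no sustained lateral slip" motion constraint is applied as a
    # zero-valued pseudo-measurement on v_lateral, in the spirit of the
    # zero-state constraints mentioned in the abstract. Values are illustrative.

    def predict(x, P, heading, dt, q=0.05):
        c, s = np.cos(heading), np.sin(heading)
        F = np.array([[1, 0, dt * c, -dt * s],
                      [0, 1, dt * s,  dt * c],
                      [0, 0, 1,       0],
                      [0, 0, 0,       1]])
        Q = q * np.eye(4)
        return F @ x, F @ P @ F.T + Q

    def constrain_lateral_velocity(x, P, r=1e-3):
        H = np.array([[0.0, 0.0, 0.0, 1.0]])   # observe lateral velocity only
        z = np.array([0.0])                     # pseudo-measurement: zero
        y = z - H @ x
        S = H @ P @ H.T + r
        K = P @ H.T @ np.linalg.inv(S)
        return x + K @ y, (np.eye(4) - K @ H) @ P

    x, P = np.array([0.0, 0.0, 0.5, 0.1]), np.eye(4)
    for _ in range(100):                         # toy run with constant heading
        x, P = predict(x, P, heading=0.3, dt=0.1)
        x, P = constrain_lateral_velocity(x, P)
    print(x)                                     # lateral velocity driven toward zero
    ```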

  1. Navigation Architecture for a Space Mobile Network

    NASA Technical Reports Server (NTRS)

    Valdez, Jennifer E.; Ashman, Benjamin; Gramling, Cheryl; Heckler, Gregory W.; Carpenter, Russell

    2016-01-01

    The Tracking and Data Relay Satellite System (TDRSS) Augmentation Service for Satellites (TASS) is a proposed beacon service to provide a global, space-based GPS augmentation service based on the NASA Global Differential GPS (GDGPS) System. The TASS signal will be tied to the GPS time system and usable as an additional ranging and Doppler radiometric source. Additionally, it will provide data vital to autonomous navigation in the near Earth regime, including space weather information, TDRS ephemerides, Earth Orientation Parameters (EOP), and forward commanding capability. TASS benefits include enhancing situational awareness, enabling increased autonomy, and providing near real-time command access for user platforms. As NASA Headquarters' Space Communication and Navigation Office (SCaN) begins to move away from a centralized network architecture and towards a Space Mobile Network (SMN) that allows for user-initiated services, autonomous navigation will be a key part of such a system. This paper explores how a TASS beacon service enables the Space Mobile Networking paradigm, what a typical user platform would require, and provides an in-depth analysis of several navigation scenarios and operations concepts. This paper provides an overview of the TASS beacon and its role within the SMN and user community. Supporting navigation analysis is presented for two user mission scenarios: an Earth-observing spacecraft in low Earth orbit (LEO), and a highly elliptical spacecraft in a lunar resonance orbit. These diverse flight scenarios indicate the breadth of applicability of the TASS beacon for upcoming users within the current network architecture and in the SMN.

  2. A simplified satellite navigation system for an autonomous Mars roving vehicle.

    NASA Technical Reports Server (NTRS)

    Janosko, R. E.; Shen, C. N.

    1972-01-01

    The use of a retroflecting satellite and a laser rangefinder to navigate a Martian roving vehicle is considered in this paper. It is shown that a simple system can be employed to perform this task. An error analysis is performed on the navigation equations and it is shown that the error inherent in the scheme proposed can be minimized by the proper choice of measurement geometry. A nonlinear programming approach is used to minimize the navigation error subject to constraints that are due to geometric and laser requirements. The problem is solved for a particular set of laser parameters and the optimal solution is presented.

  3. Open-Loop Flight Testing of COBALT Navigation and Sensor Technologies for Precise Soft Landing

    NASA Technical Reports Server (NTRS)

    Carson, John M., III; Restrepo, Caroline I.; Seubert, Carl R.; Amzajerdian, Farzin; Pierrottet, Diego F.; Collins, Steven M.; O'Neal, Travis V.; Stelling, Richard

    2017-01-01

    An open-loop flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) payload was conducted onboard the Masten Xodiac suborbital rocket testbed. The payload integrates two complementary sensor technologies that together provide a spacecraft with knowledge during planetary descent and landing to precisely navigate and touch down softly in close proximity to targeted surface locations. The two technologies are the Navigation Doppler Lidar (NDL), for high-precision velocity and range measurements, and the Lander Vision System (LVS) for map-relative state estimates. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a very precise Terrain Relative Navigation (TRN) solution that is suitable for future, autonomous planetary landing systems that require precise and soft landing capabilities. During the open-loop flight campaign, the COBALT payload acquired measurements and generated a precise navigation solution, but the Xodiac vehicle planned and executed its maneuvers based on an independent, GPS-based navigation solution. This minimized the risk to the vehicle during the integration and testing of the new navigation sensing technologies within the COBALT payload.

  4. ARV robotic technologies (ART): a risk reduction effort for future unmanned systems

    NASA Astrophysics Data System (ADS)

    Jaster, Jeffrey F.

    2006-05-01

    The Army's ARV (Armed Robotic Vehicle) Robotic Technologies (ART) program is working on the development of various technological thrusts for use in the robotic forces of the future. The ART program will develop, integrate and demonstrate the technology required to advance the maneuver technologies (i.e., perception, mobility, tactical behaviors) and increase the survivability of unmanned platforms for the future force while focusing on reducing the soldiers' burden by providing an increase in vehicle autonomy coinciding with a decrease in the total number of user interventions required to control the unmanned assets. This program will advance the state of the art in perception technologies to provide the unmanned platform an increasingly accurate view of the terrain that surrounds it, while developing tactical/mission behavior technologies to provide the Unmanned Ground Vehicle (UGV) the capability to maneuver tactically, in conjunction with the manned systems, in an autonomous mode. The ART testbed will be integrated with the advanced technology software and associated hardware developed under this effort, and incorporate appropriate mission modules (e.g. RSTA sensors, MILES, etc.) to support Warfighter experiments and evaluations (virtual and field) in a militarily significant environment (open/rolling and complex/urban terrain). The outcome of these experiments as well as other lessons learned throughout the program life cycle will be used to reduce the current risks that are identified for the future UGV systems that will be developed under the Future Combat Systems (FCS) program, including the early integration of an FCS-like autonomous navigation system onto a tracked skid-steer platform.

  5. Navigation system for autonomous mapper robots

    NASA Astrophysics Data System (ADS)

    Halbach, Marc; Baudoin, Yvan

    1993-05-01

    This paper describes the conception and realization of a fast, robust, and general navigation system for a mobile (wheeled or legged) robot. A database representing a high-level map of the environment is generated and continuously updated. The first part describes the legged target vehicle and the hexapod robot being developed. The second section deals with spatial and temporal sensor fusion for dynamic environment modeling within an obstacle/free-space probabilistic classification grid. Ultrasonic sensors are used, other sensor types are expected to be integrated, and a priori knowledge is incorporated. The ultrasonic sensors are controlled by the path planning module. The third part concerns path planning; a simulation of a wheeled robot is also presented.

  6. Visual Requirements for Human Drivers and Autonomous Vehicles

    DOT National Transportation Integrated Search

    2016-03-01

    Identification of published literature between 1995 and 2013, focusing on determining the quantity and quality of visual information needed under both driving modes (i.e., human and autonomous) to navigate the road safely, especially as it pertains t...

  7. Design of an autonomous exterior security robot

    NASA Technical Reports Server (NTRS)

    Myers, Scott D.

    1994-01-01

    This paper discusses the requirements and preliminary design of a robotic vehicle for performing autonomous exterior perimeter security patrols around warehouse areas, ammunition supply depots, and industrial parks for the U.S. Department of Defense. The preliminary design allows for the operation of up to eight vehicles in a six-kilometer by six-kilometer zone with autonomous navigation and obstacle avoidance. In addition to detecting crawling intruders at 100 meters, the system must perform real-time inventory checking and database comparisons using a microwave tag system.

  8. Analysis of key technologies in geomagnetic navigation

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoming; Zhao, Yan

    2008-10-01

    Because of the high cost and error accumulation of precise Inertial Navigation Systems (INS) and the vulnerability of Global Navigation Satellite Systems (GNSS), geomagnetic navigation, a passive autonomous navigation method, is receiving renewed attention. The geomagnetic field is a natural spatial physical field and a function of position and time in near-Earth space. Navigation technology based on the geomagnetic field is being researched for a wide range of commercial and military applications. This paper presents the main features and the state of the art of the Geomagnetic Navigation System (GMNS). Geomagnetic field models and reference maps are described. Obtaining, modeling, and updating accurate magnetic anomaly field information is an important step for high-precision geomagnetic navigation. In addition, the errors of geomagnetic measurement using strapdown magnetometers are analyzed. Precise geomagnetic data are obtained by means of magnetometer calibration and vehicle magnetic field compensation. From the measurement data and a reference map or model of the geomagnetic field, the vehicle's position and attitude can be obtained using a matching algorithm or a state-estimation method. Trends in geomagnetic navigation for the near future are discussed at the end of the paper.
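
    The matching step can be illustrated with a toy one-dimensional correlation sketch (an illustrative assumption, not the survey's specific algorithm): slide a short measured anomaly profile along a reference map and keep the offset with the smallest mean-squared difference.

    ```python
    import numpy as np

    # Toy geomagnetic profile matching (correlation-style), illustrative only:
    # slide a measured anomaly profile along a 1-D reference map and choose
    # the offset that minimizes the mean-squared difference.

    def match_profile(reference_map, measured_profile):
        n, m = len(reference_map), len(measured_profile)
        errors = [np.mean((reference_map[i:i + m] - measured_profile) ** 2)
                  for i in range(n - m + 1)]
        return int(np.argmin(errors))            # best-fitting start index

    rng = np.random.default_rng(0)
    ref = np.cumsum(rng.normal(size=500))        # synthetic anomaly map
    true_start = 200
    meas = ref[true_start:true_start + 50] + rng.normal(scale=0.1, size=50)
    print(match_profile(ref, meas))              # typically recovers 200
    ```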

  9. Autonomous navigation and mobility for a planetary rover

    NASA Technical Reports Server (NTRS)

    Miller, David P.; Mishkin, Andrew H.; Lambert, Kenneth E.; Bickler, Donald; Bernard, Douglas E.

    1989-01-01

    This paper presents an overview of the onboard subsystems that will be used in guiding a planetary rover. Particular emphasis is placed on the planning and sensing systems and their associated costs, particularly in computation. Issues that will be used in evaluating trades between the navigation system and mobility system are also presented.

  10. Investigating the Usefulness of Operator Aids for Autonomous Unmanned Ground Vehicles Performing Reconnaissance Tasks

    DTIC Science & Technology

    2013-09-01

    generated using data from the ANS about the path that the automation attempted to follow. The STP operator aid was displayed as a translucent green...intended route of the UGV projected for the next several seconds. Similarly, the LTP operator aid was displayed as a translucent blue line overlaid on...route of the UGV projected for the next several minutes or more. The combination of STP and LTP operator aids simply displayed both translucent green

  11. An Architecture for Autonomous Rovers on Future Planetary Missions

    NASA Astrophysics Data System (ADS)

    Ocon, J.; Avilés, M.; Graziano, M.

    2018-04-01

    This paper proposes an architecture for autonomous planetary rovers. The architecture combines a set of characteristics required in this type of system: a high level of abstraction, reactive event-based activity execution, and autonomous navigation.

  12. Simple autonomous Mars walker

    NASA Technical Reports Server (NTRS)

    Larimer, Stanley J.; Lisec, Thomas R.; Spiessbach, Andrew J.

    1989-01-01

    Under a contract with NASA's Jet Propulsion Laboratory, Martin Marietta has developed several alternative rover concepts for unmanned exploration of the planet Mars. One of those concepts, the 'Walking Beam', is the subject of this paper. This concept was developed with the goal of achieving many of the capabilities of more sophisticated articulated-leg walkers with a much simpler, more robust, less computationally demanding, and more power-efficient design. It consists of two large-base tripods nested one within the other that alternately translate with respect to each other along a 5-meter beam to propel the vehicle. The semiautonomous navigation system relies on terrain geometry sensors and tactile feedback from each foot to autonomously select a path which avoids hazards along a route designated from Earth. Both mobility and navigation features of this concept are discussed, including a top-level description of the vehicle's physical characteristics, deployment strategy, mobility elements, sensor suite, theory of operation, navigation and control processes, and estimated performance.

  13. A reactive system for open terrain navigation: Performance and limitations

    NASA Technical Reports Server (NTRS)

    Langer, D.; Rosenblatt, J.; Hebert, M.

    1994-01-01

    We describe a core system for autonomous navigation in outdoor natural terrain. The system consists of three parts: a perception module which processes range images to identify untraversable regions of the terrain, a local map management module which maintains a representation of the environment in the vicinity of the vehicle, and a planning module which issues commands to the vehicle controller. Our approach relies on the concept of 'early traversability evaluation' and on the use of reactive planning for generating commands to drive the vehicle. We argue that our approach leads to a robust and efficient navigation system. We illustrate our approach with an experiment in which a vehicle travelled autonomously for one kilometer through unmapped cross-country terrain.

  14. Neuro-fuzzy controller to navigate an unmanned vehicle.

    PubMed

    Selma, Boumediene; Chouraqui, Samira

    2013-12-01

    A neuro-fuzzy control method for an Unmanned Vehicle (UV) simulation is described. The objective is to guide an autonomous vehicle to a desired destination along a desired path in an environment characterized by a terrain and a set of distinct objects, such as obstacles like donkeys, traffic lights, and cars circulating along the trajectory. The vehicle's autonomous navigation ability and road-following precision are mainly influenced by its control strategy and real-time control performance. A fuzzy logic controller can describe the desired system behavior well with simple "if-then" relations, allowing the designer to derive the rules manually by trial and error. Neural networks, on the other hand, perform function approximation of a system but can neither interpret the solution obtained nor check whether it is plausible. The two approaches are complementary. Combining them, neural networks contribute learning capability while fuzzy logic brings knowledge representation (neuro-fuzzy). In this paper, an artificial neural network fuzzy inference system (ANFIS) controller is described and implemented to navigate the autonomous vehicle. Results show several improvements in the control system tuned by neuro-fuzzy techniques in comparison to previous methods such as an Artificial Neural Network (ANN) alone.
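
    To make the "if-then" knowledge representation concrete, here is a minimal two-rule fuzzy steering sketch with a weighted-average defuzzification step. It is an illustration of the fuzzy-logic side only, not the paper's ANFIS controller; the membership shapes and steering gains are assumptions.

    ```python
    # Minimal two-rule fuzzy steering sketch (illustrative assumptions only):
    # Rule 1: IF obstacle is to the LEFT  THEN steer RIGHT (+20 deg)
    # Rule 2: IF obstacle is to the RIGHT THEN steer LEFT  (-20 deg)

    def tri(x, a, b, c):
        """Triangular membership function on [a, c] peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fuzzy_steer(obstacle_bearing_deg):
        left = tri(obstacle_bearing_deg, -90, -45, 0)    # degree of "obstacle left"
        right = tri(obstacle_bearing_deg, 0, 45, 90)     # degree of "obstacle right"
        num = left * 20.0 + right * (-20.0)              # weighted rule outputs
        den = left + right
        return num / den if den > 0 else 0.0             # nothing ahead: go straight

    print(fuzzy_steer(-30))   # obstacle slightly left -> positive (rightward) command
    ```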

  15. Conceptual Design of a Communication-Based Deep Space Navigation Network

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan J.; Chuang, C. H.

    2012-01-01

    As the need grows for increased autonomy and position knowledge accuracy to support missions beyond Earth orbit, engineers must develop more advanced navigation sensors and systems that operate independently of Earth-based analysis and processing. Several spacecraft are approaching this problem using inter-spacecraft radiometric tracking and onboard autonomous optical navigation methods. This paper proposes an alternative implementation to aid in spacecraft position fixing. The proposed Network-Based Navigation technique takes advantage of the communication data being sent between spacecraft and between spacecraft and ground control to embed navigation information. The navigation system uses these packets to provide navigation estimates to an onboard navigation filter to augment traditional ground-based radiometric tracking techniques. As opposed to using digital signal measurements to capture inherent information of the transmitted signal itself, this method relies on the embedded navigation packet headers to calculate a navigation estimate. This method is heavily dependent on clock accuracy, and initial results show promising performance for a notional system.
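
    A sketch of the core observable may help: comparing the transmit time carried in a packet header against the local receive time yields a pseudorange, which is why the abstract stresses clock accuracy. The function, field names, and clock-bias term below are illustrative assumptions, not the paper's formulation.

    ```python
    # Toy one-way range observable for the network-based navigation idea:
    # the timestamp embedded in a packet header, compared with the local
    # receive time, gives a pseudorange corrupted by the receiver clock bias.

    C = 299_792_458.0   # speed of light, m/s

    def pseudorange(t_transmit_header, t_receive_local, clock_bias=0.0):
        """Range estimate in metres from embedded packet timestamps."""
        return C * (t_receive_local - t_transmit_header - clock_bias)

    # A packet that travelled 3,000 km, received with a 1-microsecond clock bias:
    t_tx = 100.0
    t_rx = t_tx + 3.0e6 / C + 1.0e-6
    print(pseudorange(t_tx, t_rx))                    # ~3.0003e6 m (bias uncorrected)
    print(pseudorange(t_tx, t_rx, clock_bias=1e-6))   # ~3.0e6 m
    ```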

  16. Navigation of military and space unmanned ground vehicles in unstructured terrains

    NASA Technical Reports Server (NTRS)

    Lescoe, Paul; Lavery, David; Bedard, Roger

    1991-01-01

    Development of unmanned vehicles for local navigation in terrains unstructured by humans is reviewed. Modes of navigation include teleoperation or remote control, computer assisted remote driving (CARD), and semiautonomous navigation (SAN). A first implementation of a CARD system was successfully tested using the Robotic Technology Test Vehicle developed by Jet Propulsion Laboratory. Stereo pictures were transmitted to a remotely located human operator, who performed the sensing, perception, and planning functions of navigation. A computer provided range and angle measurements and the path plan was transmitted to the vehicle which autonomously executed the path. This implementation is to be enhanced by providing passive stereo vision and a reflex control system for autonomously stopping the vehicle if blocked by an obstacle. SAN achievements include implementation of a navigation testbed on a six wheel, three-body articulated rover vehicle, development of SAN algorithms and code, integration of SAN software onto the vehicle, and a successful feasibility demonstration that represents a step forward towards the technology required for long-range exploration of the lunar or Martian surface. The vehicle includes a passive stereo vision system with real-time area-based stereo image correlation, a terrain matcher, a path planner, and a path execution planner.

  17. Autonomous Rovers for Polar Science Campaigns

    NASA Astrophysics Data System (ADS)

    Lever, J. H.; Ray, L. E.; Williams, R. M.; Morlock, A. M.; Burzynski, A. M.

    2012-12-01

    We have developed and deployed two over-snow autonomous rovers able to conduct remote science campaigns on Polar ice sheets. Yeti is an 80-kg, four-wheel-drive (4WD) battery-powered robot with 3-4 hr endurance, and Cool Robot is a 60-kg 4WD solar-powered robot with unlimited endurance during Polar summers. Both robots navigate using GPS waypoint-following to execute pre-planned courses autonomously, and they can each carry or tow 20-160 kg instrument payloads over typically firm Polar snowfields. In 2008-12, we deployed Yeti to conduct autonomous ground-penetrating radar (GPR) surveys to detect hidden crevasses to help establish safe routes for overland resupply of research stations at South Pole, Antarctica, and Summit, Greenland. We also deployed Yeti with GPR at South Pole in 2011 to identify the locations of potentially hazardous buried buildings from the original 1950s-era station. Autonomous surveys remove personnel from safety risks posed during manual GPR surveys by undetected crevasses or buried buildings. Furthermore, autonomous surveys can yield higher quality and more comprehensive data than manual ones: Yeti's low ground pressure (20 kPa) allows it to cross thinly bridged crevasses or other voids without interrupting a survey, and well-defined survey grids allow repeated detection of buried voids to improve detection reliability and map their extent. To improve survey efficiency, we have automated the mapping of detected hazards, currently identified via post-survey manual review of the GPR data. Additionally, we are developing machine-learning algorithms to detect crevasses autonomously in real time, with reliability potentially higher than manual real-time detection. These algorithms will enable the rover to relay crevasse locations to a base station for near real-time mapping and decision-making. We deployed Cool Robot at Summit Station in 2005 to verify its mobility and power budget over Polar snowfields. Using solar power, this zero

  18. Innovation Talk at TARDEC by Dr. Tulga Ersal

    Science.gov Websites

    problems of teleoperation and fully autonomous operation of large Unmanned Ground Vehicles (UGVs) at high ... wide spectrum in their mode of operation, ranging from teleoperated, in which the remote human operator ... implementable solution. High speeds also present a challenge to fully autonomous operation with respect to

  19. Experiments in teleoperator and autonomous control of space robotic vehicles

    NASA Technical Reports Server (NTRS)

    Alexander, Harold L.

    1990-01-01

    A research program and strategy are described which include fundamental teleoperation issues and autonomous-control issues of sensing and navigation for satellite robots. The program consists of developing interfaces for visual operation and studying the consequences of interface designs as well as developing navigation and control technologies based on visual interaction. A space-robot-vehicle simulator is under development for use in virtual-environment teleoperation experiments and neutral-buoyancy investigations. These technologies can be utilized in a study of visual interfaces to address tradeoffs between head-tracking and manual remote cameras, panel-mounted and helmet-mounted displays, and stereoscopic and monoscopic display systems. The present program can provide significant data for the development of control experiments for autonomously controlled satellite robots.

  20. ALHAT COBALT: CoOperative Blending of Autonomous Landing Technology

    NASA Technical Reports Server (NTRS)

    Carson, John M.

    2015-01-01

    The COBALT project is a flight demonstration of two NASA ALHAT (Autonomous precision Landing and Hazard Avoidance Technology) capabilities that are key for future robotic or human landing GN&C (Guidance, Navigation and Control) systems. The COBALT payload integrates the Navigation Doppler Lidar (NDL) for ultraprecise velocity and range measurements with the Lander Vision System (LVS) for Terrain Relative Navigation (TRN) position estimates. Terrestrial flight tests of the COBALT payload in an open-loop and closed-loop GN&C configuration will be conducted onboard a commercial, rocket-propulsive Vertical Test Bed (VTB) at a test range in Mojave, CA.

  1. INS/GPS/LiDAR Integrated Navigation System for Urban and Indoor Environments Using Hybrid Scan Matching Algorithm.

    PubMed

    Gao, Yanbin; Liu, Shifei; Atia, Mohamed M; Noureldin, Aboelmagd

    2015-09-15

    This paper takes advantage of the complementary characteristics of the Global Positioning System (GPS) and Light Detection and Ranging (LiDAR) to provide periodic corrections to an Inertial Navigation System (INS), alternating between them in different environmental conditions. In open sky, where GPS signals are available and LiDAR measurements are sparse, GPS is integrated with the INS. Meanwhile, in confined outdoor environments and indoors, where GPS is unreliable or unavailable and LiDAR measurements are rich, LiDAR replaces GPS to integrate with the INS. This paper also proposes an innovative hybrid scan matching algorithm that combines a feature-based scan matching method and an Iterative Closest Point (ICP) based scan matching method. The algorithm can operate in and transition between the two modes depending on the number of matched line features over two scans, thus achieving efficiency and robustness concurrently. Two integration schemes of INS and LiDAR with the hybrid scan matching algorithm are implemented and compared. Real experiments are performed on an Unmanned Ground Vehicle (UGV) for both outdoor and indoor environments. Experimental results show that the multi-sensor integrated system maintains sub-meter navigation accuracy during the whole trajectory.

  2. INS/GPS/LiDAR Integrated Navigation System for Urban and Indoor Environments Using Hybrid Scan Matching Algorithm

    PubMed Central

    Gao, Yanbin; Liu, Shifei; Atia, Mohamed M.; Noureldin, Aboelmagd

    2015-01-01

    This paper takes advantage of the complementary characteristics of the Global Positioning System (GPS) and Light Detection and Ranging (LiDAR) to provide periodic corrections to an Inertial Navigation System (INS), alternating between them in different environmental conditions. In open sky, where GPS signals are available and LiDAR measurements are sparse, GPS is integrated with the INS. Meanwhile, in confined outdoor environments and indoors, where GPS is unreliable or unavailable and LiDAR measurements are rich, LiDAR replaces GPS to integrate with the INS. This paper also proposes an innovative hybrid scan matching algorithm that combines a feature-based scan matching method and an Iterative Closest Point (ICP) based scan matching method. The algorithm can operate in and transition between the two modes depending on the number of matched line features over two scans, thus achieving efficiency and robustness concurrently. Two integration schemes of INS and LiDAR with the hybrid scan matching algorithm are implemented and compared. Real experiments are performed on an Unmanned Ground Vehicle (UGV) for both outdoor and indoor environments. Experimental results show that the multi-sensor integrated system maintains sub-meter navigation accuracy during the whole trajectory. PMID:26389906
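
    The aiding-source switching described in this abstract can be sketched as a small decision rule: GPS aids the INS when enough satellites are visible; otherwise, LiDAR scan matching aids it, using the feature-based mode when enough line features match across scans and ICP otherwise. The thresholds and labels below are assumptions for illustration, not the paper's actual parameters.

    ```python
    # Sketch of the GPS/LiDAR aiding-source switching idea (illustrative values).

    MIN_SATELLITES = 6          # assumed threshold for "reliable" GPS
    MIN_MATCHED_LINES = 3       # assumed threshold to trust feature-based matching

    def select_aiding(num_satellites, num_matched_lines):
        if num_satellites >= MIN_SATELLITES:
            return "GPS/INS"
        if num_matched_lines >= MIN_MATCHED_LINES:
            return "LiDAR(feature)/INS"   # feature-based scan matching
        return "LiDAR(ICP)/INS"           # fall back to ICP scan matching

    print(select_aiding(num_satellites=8, num_matched_lines=0))   # GPS/INS
    print(select_aiding(num_satellites=2, num_matched_lines=5))   # LiDAR(feature)/INS
    print(select_aiding(num_satellites=2, num_matched_lines=1))   # LiDAR(ICP)/INS
    ```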

  3. Vision-Aided Autonomous Landing and Ingress of Micro Aerial Vehicles

    NASA Technical Reports Server (NTRS)

    Brockers, Roland; Ma, Jeremy C.; Matthies, Larry H.; Bouffard, Patrick

    2012-01-01

    Micro aerial vehicles have limited sensor suites and computational power. For reconnaissance tasks and to conserve energy, these systems need the ability to autonomously land at vantage points or enter buildings (ingress). But for autonomous navigation, information is needed to identify and guide the vehicle to the target. Vision algorithms can provide egomotion estimation and target detection using input from cameras that are easy to include in miniature systems.

  4. Pulsar Timing and Its Application for Navigation and Gravitational Wave Detection

    NASA Astrophysics Data System (ADS)

    Becker, Werner; Kramer, Michael; Sesana, Alberto

    2018-02-01

    Pulsars are natural cosmic clocks. On long timescales they rival the precision of terrestrial atomic clocks. Using a technique called pulsar timing, the exact measurement of pulse arrival times allows a number of applications, ranging from testing theories of gravity to detecting gravitational waves. Also an external reference system suitable for autonomous space navigation can be defined by pulsars, using them as natural navigation beacons, not unlike the use of GPS satellites for navigation on Earth. By comparing pulse arrival times measured on-board a spacecraft with predicted pulse arrivals at a reference location (e.g. the solar system barycenter), the spacecraft position can be determined autonomously and with high accuracy everywhere in the solar system and beyond. We describe the unique properties of pulsars that suggest that such a navigation system will certainly have its application in future astronautics. We also describe the on-going experiments to use the clock-like nature of pulsars to "construct" a galactic-sized gravitational wave detector for low-frequency (f_GW ≈ 10^-9 to 10^-7 Hz) gravitational waves. We present the current status and provide an outlook for the future.
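
    The geometric idea behind pulsar-based positioning admits a compact sketch: for each pulsar, the delay between the on-board and predicted pulse arrival times corresponds to the projection of the position offset onto the pulsar's line of sight, so several pulsars give a 3-D fix by least squares. The directions and delays below are made-up numbers used only to illustrate the relation; this is not any mission's actual algorithm.

    ```python
    import numpy as np

    # Toy pulsar-navigation fix: delay_i = (n_i . dr) / c for unit vector n_i
    # toward pulsar i and position offset dr from the reference location.

    C = 299_792_458.0                                  # speed of light, m/s

    def solve_offset(directions, delays):
        A = np.asarray(directions, dtype=float)        # rows: unit vectors n_i
        b = C * np.asarray(delays, dtype=float)        # range differences, m
        dr, *_ = np.linalg.lstsq(A, b, rcond=None)
        return dr

    dirs = np.vstack([np.eye(3), np.ones(3) / np.sqrt(3)])   # four pulsar directions
    true_dr = np.array([1.0e4, -2.0e4, 5.0e3])               # metres
    delays = dirs @ true_dr / C                               # simulated timing delays
    print(solve_offset(dirs, delays))                         # recovers ~[1e4, -2e4, 5e3]
    ```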

  5. Neural Network-Based Landmark Recognition and Navigation with IAMRs. Understanding the Principles of Thought and Behavior.

    ERIC Educational Resources Information Center

    Doty, Keith L.

    1999-01-01

    Research on neural networks and hippocampal function demonstrating how mammals construct mental maps and develop navigation strategies is being used to create Intelligent Autonomous Mobile Robots (IAMRs). Such robots are able to recognize landmarks and navigate without "vision." (SK)

  6. The Development of a Simulator System and Hardware Test Bed for Deep Space X-Ray Navigation

    NASA Astrophysics Data System (ADS)

    Doyle, Patrick T.

    2013-03-01

    Currently, there is a considerable interest in developing technologies that will allow using photon measurements from celestial x-ray sources for deep space navigation. The impetus for this is that many envisioned future space missions will require spacecraft to have autonomous navigation capabilities. For missions close to Earth, Global Navigation Satellite Systems (GNSS) such as GPS are readily available for use, but for missions far from Earth, other alternatives must be provided. While existing systems such as the Deep Space Network (DSN) can be used, latencies associated with servicing a fleet of vehicles may not be compatible with some autonomous operations requiring timely updates of their navigation solution. Because of their somewhat predictable emissions, pulsars are the ideal candidates for x-ray sources that can be used to provide key parameters for navigation. Algorithms and simulation tools that will enable designing and analyzing x-ray navigation concepts are presented. The development of a compact x-ray detector system is pivotal to the eventual deployment of such navigation systems. Therefore, results of a high altitude balloon test to evaluate the design of a compact x-ray detector system are described as well.

  7. Navigation Aiding by a Hybrid Laser-Camera Motion Estimator for Micro Aerial Vehicles.

    PubMed

    Atman, Jamal; Popp, Manuel; Ruppelt, Jan; Trommer, Gert F

    2016-09-16

    Micro Air Vehicles (MAVs) equipped with various sensors are able to carry out autonomous flights. However, the self-localization of autonomous agents is mostly dependent on Global Navigation Satellite Systems (GNSS). In order to provide an accurate navigation solution in the absence of GNSS signals, this article presents a hybrid sensor: a deep integration of a monocular camera and a 2D laser rangefinder that estimates the motion of the MAV. This realization is expected to be more flexible with respect to the environment than laser-scan-matching approaches. The estimated ego-motion is then integrated into the MAV's navigation system. First, however, the relative pose between the two sensors must be known; an improved calibration method is proposed to obtain it. For both calibration and ego-motion estimation, 3D-to-2D correspondences are used and the Perspective-3-Point (P3P) problem is solved. Moreover, the covariance estimation of the relative motion is presented. The experiments show very accurate calibration and navigation results.

  8. Navigation Aiding by a Hybrid Laser-Camera Motion Estimator for Micro Aerial Vehicles

    PubMed Central

    Atman, Jamal; Popp, Manuel; Ruppelt, Jan; Trommer, Gert F.

    2016-01-01

    Micro Air Vehicles (MAVs) equipped with various sensors are able to carry out autonomous flights. However, the self-localization of autonomous agents is mostly dependent on Global Navigation Satellite Systems (GNSS). In order to provide an accurate navigation solution in the absence of GNSS signals, this article presents a hybrid sensor: a deep integration of a monocular camera and a 2D laser rangefinder that estimates the motion of the MAV. This realization is expected to be more flexible with respect to the environment than laser-scan-matching approaches. The estimated ego-motion is then integrated into the MAV's navigation system. First, however, the relative pose between the two sensors must be known; an improved calibration method is proposed to obtain it. For both calibration and ego-motion estimation, 3D-to-2D correspondences are used and the Perspective-3-Point (P3P) problem is solved. Moreover, the covariance estimation of the relative motion is presented. The experiments show very accurate calibration and navigation results. PMID:27649203

  9. Integrating Terrain Maps Into a Reactive Navigation Strategy

    NASA Technical Reports Server (NTRS)

    Howard, Ayanna; Werger, Barry; Seraji, Homayoun

    2006-01-01

    An improved method of processing information for autonomous navigation of a robotic vehicle across rough terrain involves the integration of terrain maps into a reactive navigation strategy. Somewhat more precisely, the method involves the incorporation, into navigation logic, of data equivalent to regional traversability maps. Terrain characteristics are mapped using a fuzzy-logic representation of the difficulty of traversing the terrain. The method is robust in that it integrates a global path-planning strategy with sensor-based regional and local navigation strategies to ensure a high probability of success in reaching a destination and avoiding obstacles along the way. The sensor-based strategies use cameras aboard the vehicle to observe the regional terrain, defined as the area covering the immediate vicinity of the vehicle out to a specified distance of a few meters.
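
    One way to picture how map-derived traversability can bias a reactive navigator is a simple heading arbiter that scores candidate headings by both local obstacle proximity and mapped terrain difficulty and picks the cheapest one. This is a hedged sketch of the general idea only; the 0-1 difficulty values, weights, and function name are assumptions, not the method described in the record.

    ```python
    import numpy as np

    # Blend a regional traversability map into a reactive heading choice:
    # each candidate heading is penalized for local obstacle cost and for the
    # mapped difficulty of the terrain it leads into (values are illustrative).

    def pick_heading(candidate_headings_deg, obstacle_cost, terrain_difficulty,
                     w_obstacle=0.6, w_terrain=0.4):
        obstacle_cost = np.asarray(obstacle_cost, dtype=float)         # 0 = clear
        terrain_difficulty = np.asarray(terrain_difficulty, dtype=float)  # 0 = easy
        score = w_obstacle * obstacle_cost + w_terrain * terrain_difficulty
        return candidate_headings_deg[int(np.argmin(score))]

    headings = [-40, -20, 0, 20, 40]
    print(pick_heading(headings,
                       obstacle_cost=[0.9, 0.2, 0.1, 0.3, 0.8],
                       terrain_difficulty=[0.2, 0.7, 0.6, 0.1, 0.3]))   # -> 20
    ```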

  10. State estimation for autonomous flight in cluttered environments

    NASA Astrophysics Data System (ADS)

    Langelaan, Jacob Willem

    Safe, autonomous operation in complex, cluttered environments is a critical challenge facing autonomous mobile systems. The research described in this dissertation was motivated by a particularly difficult example of autonomous mobility: flight of a small Unmanned Aerial Vehicle (UAV) through a forest. In cluttered environments (such as forests or natural and urban canyons), signals from navigation beacons such as GPS may frequently be occluded. Direct measurements of vehicle position are therefore unavailable, and information required for flight control, obstacle avoidance, and navigation must be obtained using only on-board sensors. However, payload limitations of small UAVs restrict both the mass and physical dimensions of sensors that can be carried. This dissertation describes the development and proof-of-concept demonstration of a navigation system that uses only a low-cost inertial measurement unit and a monocular camera. Micro-electromechanical inertial measurement units are well suited to small UAV applications and provide measurements of acceleration and angular rate. However, they do not provide information about nearby obstacles (needed for collision avoidance) and their noise and bias characteristics lead to unbounded growth in computed position. A monocular camera can provide bearings to nearby obstacles and landmarks. These bearings can be used both to enable obstacle avoidance and to aid navigation. Presented here is a solution to the problem of estimating vehicle state (position, orientation and velocity) as well as positions of obstacles in the environment using only inertial measurements and bearings to obstacles. This is a highly nonlinear estimation problem, and standard estimation techniques such as the Extended Kalman Filter are prone to divergence in this application. In this dissertation, a Sigma Point Kalman Filter is implemented, resulting in an estimator which is able to cope with the significant nonlinearities in the system equations and
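
    The core step that distinguishes a Sigma Point (Unscented) Kalman Filter from an EKF is the unscented transform: deterministically chosen sigma points are pushed through the nonlinearity and the transformed mean and covariance are recovered from weighted sums, with no Jacobians. The sketch below shows that step with one common parameter choice and an arbitrary toy nonlinearity; it is an illustration of the general technique, not the dissertation's filter.

    ```python
    import numpy as np

    # Minimal unscented-transform sketch (the heart of a Sigma Point KF).
    # Parameters follow one common simple choice; f is a toy nonlinearity.

    def unscented_transform(f, mean, cov, alpha=1.0, beta=2.0, kappa=0.0):
        n = mean.size
        lam = alpha**2 * (n + kappa) - n
        S = np.linalg.cholesky((n + lam) * cov)          # matrix square root
        sigma = np.vstack([mean, mean + S.T, mean - S.T])  # 2n+1 sigma points
        wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))   # mean weights
        wc = wm.copy()                                   # covariance weights
        wm[0] = lam / (n + lam)
        wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
        y = np.array([f(p) for p in sigma])              # propagate points
        y_mean = wm @ y
        d = y - y_mean
        y_cov = (wc[:, None] * d).T @ d
        return y_mean, y_cov

    # Toy nonlinearity: polar (range, bearing) -> Cartesian (x, y).
    f = lambda p: np.array([p[0] * np.cos(p[1]), p[0] * np.sin(p[1])])
    m, C = unscented_transform(f, np.array([10.0, 0.5]), np.diag([0.5, 0.01]))
    print(m)
    print(C)
    ```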

  11. TDRSS Onboard Navigation System (TONS) experiment for the Explorer Platform (EP)

    NASA Astrophysics Data System (ADS)

    Gramling, C. J.; Hornstein, R. S.; Long, A. C.; Samii, M. V.; Elrod, B. D.

    A TDRSS Onboard Navigation System (TONS) is currently being developed by NASA to provide a high-accuracy autonomous spacecraft navigation capability for users of TDRSS and its successor, the Advanced TDRSS. A TONS experiment will be performed in conjunction with the Explorer Platform (EP)/EUV Explorer mission to flight-qualify TONS Block I. This paper presents an overview of TDRSS on-board navigation goals and plans and the technical objectives of the TONS experiment. The operations concept of the experiment is described, including the characteristics of the ultrastable oscillator, the Doppler extractor, the signal-acquisition process, the TONS ground-support system, and the navigation flight software. A description of the on-board navigation algorithms and the rationale for their selection is also presented.

  12. Navigation, behaviors, and control modes in an autonomous vehicle

    NASA Astrophysics Data System (ADS)

    Byler, Eric A.

    1995-01-01

    An Intelligent Mobile Sensing System (IMSS) has been developed for the automated inspection of radioactive and hazardous waste storage containers in warehouse facilities at Department of Energy sites. A 2D space of control modes was used that provides a combined view of reactive and planning approaches, wherein a 2D situation space is defined by dimensions representing the predictability of the agent's task environment and the constraint imposed by its goals. In this sense, the selection of appropriate systems for planning, navigation, and control depends on the problem at hand. The IMSS vehicle navigation system is based on a combination of feature-based motion, landmark sightings, and an a priori logical map of the mockup storage facility. Motion for the inspection activities is composed of different interactions of several available control modes, several obstacle avoidance modes, and several feature identification modes. Features used to drive these behaviors are both visual and acoustic.

  13. Science Benefits of Onboard Spacecraft Navigation

    NASA Technical Reports Server (NTRS)

    Cangahuala, Al; Bhaskaran, Shyam; Owen, Bill

    2012-01-01

    navigation can be accomplished through a self-contained system that, by eliminating light-time restrictions, dramatically improves the relative trajectory knowledge and control and subsequently increases the amount of quality data collected. Flybys are one-time events, so the system's underlying algorithms and software must be extremely robust. The autonomous software must also be able to cope with the unknown size, shape, and orientation of the previously unseen comet nucleus. Furthermore, algorithms must be reliable in the presence of imperfections and/or damage to onboard cameras accrued after many years of deep-space operations. The AutoNav operational flight software packages, developed by scientists at the Jet Propulsion Laboratory (JPL) under contract with NASA, meet all these requirements. They have been directly responsible for the successful encounters on all of NASA's close-up comet-imaging missions (see Figure 1). AutoNav is the only system to date that has autonomously tracked comet nuclei during encounters and performed autonomous interplanetary navigation. AutoNav has enabled five cometary flyby missions (Table 1) residing on four NASA spacecraft provided by three different spacecraft builders. Using this software, missions were able to process a combined total of nearly 1000 images previously unseen by humans. By eliminating the need to navigate spacecraft from Earth, the accuracy gained by AutoNav during flybys compared to ground-based navigation is about 1 order of magnitude in targeting and 2 orders of magnitude in time of flight. These benefits ensure that pointing errors do not compromise data gathered during flybys. In addition, these benefits can be applied to flybys of other solar system objects, flybys at much slower relative velocities, mosaic imaging campaigns, and other proximity activities (e.g., orbiting, hovering, and descent/ascent).

  14. Control of autonomous robot using neural networks

    NASA Astrophysics Data System (ADS)

    Barton, Adam; Volna, Eva

    2017-07-01

    The aim of the article is to design a method of controlling an autonomous robot using artificial neural networks. The introductory part describes control issues from the perspective of autonomous robot navigation and surveys current mobile robots controlled by neural networks. The core of the article is the design of the controlling neural network and the generation and filtering of the training set using ART1 (Adaptive Resonance Theory). The outcome of the practical part is an assembled Lego Mindstorms EV3 robot solving the problem of avoiding obstacles in space. To verify the autonomous robot's behavior models, a set of experiments and evaluation criteria was created. The speed of each motor was adjusted by the controlling neural network with respect to the situation in which the robot found itself.

  15. A Self-Tuning Kalman Filter for Autonomous Navigation Using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, Son H.

    1999-01-01

    Most navigation systems currently operated by NASA are ground-based and require extensive support to produce accurate results. Recently developed systems that use a Kalman filter and Global Positioning System (GPS) data for orbit determination greatly reduce dependency on ground support and have the potential to provide significant economies for NASA spacecraft navigation. These systems, however, still rely on manual tuning by analysts. A sophisticated neuro-fuzzy component fully integrated with the flight navigation system can provide the self-tuning capability for the Kalman filter and help the navigation system recover from estimation errors in real time.

  16. A Self-Tuning Kalman Filter for Autonomous Navigation using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, S. H.

    1999-01-01

    Most navigation systems currently operated by NASA are ground-based and require extensive support to produce accurate results. Recently developed systems that use a Kalman filter and GPS data for orbit determination greatly reduce dependency on ground support and have the potential to provide significant economies for NASA spacecraft navigation. These systems, however, still rely on manual tuning by analysts. A sophisticated neuro-fuzzy component fully integrated with the flight navigation system can provide the self-tuning capability for the Kalman filter and help the navigation system recover from estimation errors in real time.
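
    To make the notion of filter "self-tuning" concrete, here is a deliberately simple stand-in: an innovation-based rule that inflates the measurement-noise covariance when the normalized innovation is inconsistent and relaxes it back toward a nominal value otherwise. This is not the neuro-fuzzy scheme the abstract describes; the function, thresholds, and gains are illustrative assumptions only.

    ```python
    import numpy as np

    # Innovation-based covariance adaptation (a stand-in for the self-tuning idea):
    # if the normalized innovation squared (NIS) exceeds a gate, inflate R so the
    # filter trusts the model more; otherwise decay R back toward its nominal value.

    def adapt_R(R, R_nominal, innovation, S, inflate=2.0, relax=0.9, gate=9.0):
        nis = float(innovation.T @ np.linalg.inv(S) @ innovation)
        if nis > gate:                          # measurement looks inconsistent
            return R * inflate
        return relax * R + (1 - relax) * R_nominal   # relax toward nominal

    R_nom = np.eye(2) * 0.5
    R = R_nom.copy()
    innov = np.array([[3.0], [2.5]])            # example innovation vector
    S = R + np.eye(2) * 0.1                     # example innovation covariance
    R = adapt_R(R, R_nom, innov, S)
    print(R)                                    # inflated, since NIS > gate here
    ```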

  17. Development of Mission Enabling Infrastructure — Cislunar Autonomous Positioning System (CAPS)

    NASA Astrophysics Data System (ADS)

    Cheetham, B. W.

    2017-10-01

    Advanced Space, LLC is developing the Cislunar Autonomous Positioning System (CAPS) which would provide a scalable and evolvable architecture for navigation to reduce ground congestion and improve operations for missions throughout cislunar space.

  18. Modular Software for Spacecraft Navigation Using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, S. H.; Hartman, K. R.; Weidow, D. A.; Berry, D. L.; Oza, D. H.; Long, A. C.; Joyce, E.; Steger, W. L.

    1996-01-01

    The Goddard Space Flight Center Flight Dynamics and Mission Operations Divisions have jointly investigated the feasibility of engineering modular Global Positioning System (GPS) navigation software to support both real-time flight and ground postprocessing configurations. The goals of this effort are to define standard GPS data interfaces and to engineer standard, reusable navigation software components that can be used to build a broad range of GPS navigation support applications. The paper discusses the GPS modular software (GMOD) system and operations concepts, major requirements, candidate software architecture, feasibility assessment, and recommended software interface standards. In addition, ongoing efforts to broaden the scope of the initial study and to develop modular software to support autonomous navigation using GPS are addressed.

  19. Autonomous orbital navigation using Kepler's equation

    NASA Technical Reports Server (NTRS)

    Boltz, F. W.

    1974-01-01

    A simple method of determining the six elements of elliptic satellite orbits has been developed for use aboard manned and unmanned spacecraft orbiting the earth, moon, or any planet. The system requires the use of a horizon sensor or other device for determining the local vertical, a precision clock or timing device, and Apollo-type navigation equipment including an inertial measurement unit (IMU), a digital computer, and a coupling data unit. The three elements defining the in-plane motion are obtained from simultaneous measurements of central angle traversed around the planet and elapsed flight time using a linearization of Kepler's equation about a reference orbit. It is shown how Kalman filter theory may also be used to determine the in-plane orbital elements. The three elements defining the orbit orientation are obtained from position angles in celestial coordinates derived from the IMU with the spacecraft vertically oriented after alignment of the IMU to a known inertial coordinate frame.
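
    The in-plane element determination described here rests on Kepler's equation M = E - e sin E, which relates mean anomaly M (proportional to elapsed time) to eccentric anomaly E (tied to the central angle traversed); the paper linearizes it about a reference orbit. As a small supporting sketch under illustrative values, the snippet below simply solves the equation directly by Newton iteration.

    ```python
    import numpy as np

    # Solve Kepler's equation M = E - e*sin(E) for E by Newton iteration.
    # Illustrates the relation the paper linearizes; values are illustrative.

    def solve_kepler(M, e, tol=1e-12, max_iter=50):
        E = M if e < 0.8 else np.pi              # standard starting guess
        for _ in range(max_iter):
            dE = (E - e * np.sin(E) - M) / (1.0 - e * np.cos(E))
            E -= dE
            if abs(dE) < tol:
                break
        return E

    e = 0.1                                      # illustrative eccentricity
    M = 1.2                                      # mean anomaly, rad
    E = solve_kepler(M, e)
    print(E, E - e * np.sin(E))                  # second value reproduces M
    ```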

  20. Early Synthetic Prototyping: Exploring Designs and Concepts Within Games

    DTIC Science & Technology

    2014-12-01

    UAS unmanned aircraft system; UGV unmanned ground vehicle; USD(AT&L) Under Secretary of Defense for Acquisition, Technology, and Logistics ... unmanned aircraft system (UAS) realm for the wingman concept? The players were familiar with the Marine Corps' unmanned tactical autonomous control and ... UTACCS Unmanned Tactical Autonomous Control and Collaboration System; VBIED vehicle-borne improvised explosive device; VBS2/3 Virtual Battlespace

  1. Autonomous integrated GPS/INS navigation experiment for OMV. Phase 1: Feasibility study

    NASA Technical Reports Server (NTRS)

    Upadhyay, Triveni N.; Priovolos, George J.; Rhodehamel, Harley

    1990-01-01

    The phase 1 research focused on the experiment definition. A tightly integrated Global Positioning System/Inertial Navigation System (GPS/INS) navigation filter design was analyzed and was shown, via detailed computer simulation, to provide precise position, velocity, and attitude (alignment) data to support navigation and attitude control requirements of future NASA missions. The application of the integrated filter was also shown to provide the opportunity to calibrate inertial instrument errors which is particularly useful in reducing INS error growth during times of GPS outages. While the Orbital Maneuvering Vehicle (OMV) provides a good target platform for demonstration and for possible flight implementation to provide improved capability, a successful proof-of-concept ground demonstration can be obtained using any simulated mission scenario data, such as Space Transfer Vehicle, Shuttle-C, Space Station.

  2. A Simulation Study of a Speed Control System for Autonomous On-Road Operation of Automotive Vehicles.

    DTIC Science & Technology

    1987-06-01

    The study of human driving of automotive vehicles is an important aid to the development of viable autonomous vehicle navigation ... of human driving which could provide some different insights into possible approaches to autonomous vehicle control. At the start of this work, it was ... advanced work in the behavioral aspects of human driving. Research of this nature can have a significant impact on the development of autonomous vehicles

  3. Autonomous formation flying based on GPS — PRISMA flight results

    NASA Astrophysics Data System (ADS)

    D'Amico, Simone; Ardaens, Jean-Sebastien; De Florio, Sergio

    2013-01-01

    This paper presents flight results from the early harvest of the Spaceborne Autonomous Formation Flying Experiment (SAFE) conducted in the frame of the Swedish PRISMA technology demonstration mission. SAFE represents one of the first demonstrations in low Earth orbit of an advanced guidance, navigation and control system for dual-spacecraft formations. Innovative techniques based on differential GPS-based navigation and relative orbital elements control are validated and tuned in orbit to fulfill the typical requirements of future distributed scientific instruments for remote sensing.

  4. Autonomous spacecraft maintenance study group

    NASA Technical Reports Server (NTRS)

    Marshall, M. H.; Low, G. D.

    1981-01-01

    A plan to incorporate autonomous spacecraft maintenance (ASM) capabilities into Air Force spacecraft by 1989 is outlined. It includes the successful operation of the spacecraft without ground operator intervention for extended periods of time. Mechanisms, along with a fault-tolerant data processing system (including a nonvolatile backup memory) and an autonomous navigation capability, are needed to replace the routine servicing that is presently performed by the ground system. The state-of-the-art fault handling capabilities of various spacecraft and computers are described, and a set of conceptual design requirements needed to achieve ASM is established. Near-term technology development needed for an ASM proof-of-concept demonstration by 1985, and a research agenda addressing long-range academic research for an advanced ASM system for the 1990s, are also established.

  5. HERMIES-3: A step toward autonomous mobility, manipulation, and perception

    NASA Technical Reports Server (NTRS)

    Weisbin, C. R.; Burks, B. L.; Einstein, J. R.; Feezell, R. R.; Manges, W. W.; Thompson, D. H.

    1989-01-01

    HERMIES-III is an autonomous robot comprised of a seven degree-of-freedom (DOF) manipulator designed for human scale tasks, a laser range finder, a sonar array, an omni-directional wheel-driven chassis, multiple cameras, and a dual computer system containing a 16-node hypercube expandable to 128 nodes. The current experimental program involves performance of human-scale tasks (e.g., valve manipulation, use of tools), integration of a dexterous manipulator and platform motion in geometrically complex environments, and effective use of multiple cooperating robots (HERMIES-IIB and HERMIES-III). The environment in which the robots operate has been designed to include multiple valves, pipes, meters, obstacles on the floor, valves occluded from view, and multiple paths of differing navigation complexity. The ongoing research program supports the development of autonomous capability for HERMIES-IIB and III to perform complex navigation and manipulation under time constraints, while dealing with imprecise sensory information.

  6. The Role of X-Rays in Future Space Navigation and Communication

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke M. B.; Gendreau, Keith C.; Hasouneh, Monther A.; Mitchell, Jason W.; Fong, Wai H.; Lee, Wing-Tsz; Gavriil, Fotis; Arzoumanian, Zaven

    2013-01-01

    In the near future, applications using X-rays will enable autonomous navigation and time distribution throughout the solar system, high capacity and low-power space data links, highly accurate attitude sensing, and extremely high-precision formation flying capabilities. Each of these applications alone has the potential to revolutionize mission capabilities, particularly beyond Earth orbit. This paper will outline the NASA Goddard Space Flight Center vision and efforts toward realizing the full potential of X-ray navigation and communications.

  7. Evaluation of Hardware and Software for a Small Autonomous Underwater Vehicle Navigation System (SANS)

    DTIC Science & Technology

    1994-09-01

    Hyslop, G.L., Schieber, G.E., Schwartz, M.K., "Automated Mission Planning for the Standoff Land Attack Missile (SLAM)", Proceedings of the ... 1993, pp. 277-290. [PARK80] Parkinson, B.W., "Overview", Global Positioning System, Vol. 1, The Institute of Navigation, Washington, D.C., 1980, pp ... Navigation Message", Global Positioning System, Vol. 1, The Institute of Navigation, Washington, D.C., 1980, pp. 55-73. [WOOD85] Wooden, W. H

  8. Improving CAR Navigation with a Vision-Based System

    NASA Astrophysics Data System (ADS)

    Kim, H.; Choi, K.; Lee, I.

    2015-08-01

    The real-time acquisition of accurate positions is very important for the proper operation of driver assistance systems and autonomous vehicles. Since current systems mostly depend on a GPS and map-matching technique, they show poor and unreliable performance in areas where GPS signals are blocked or weak. In this study, we propose a vision-oriented car navigation method based on sensor fusion of a GPS and in-vehicle sensors. We employ a single photo resection process to derive the position and attitude of the camera and thus those of the car. These image georeferencing results are combined with other sensory data under the sensor fusion framework for more accurate estimation of the positions using an extended Kalman filter. The proposed system estimated the positions with an accuracy of 15 m even though GPS signals were entirely unavailable during the 15-minute test drive. The proposed vision-based system can be effectively utilized for the low-cost yet highly accurate and reliable navigation systems required for intelligent or autonomous vehicles.

  9. Improving Car Navigation with a Vision-Based System

    NASA Astrophysics Data System (ADS)

    Kim, H.; Choi, K.; Lee, I.

    2015-08-01

    The real-time acquisition of accurate positions is very important for the proper operation of driver assistance systems and autonomous vehicles. Since current systems mostly depend on a GPS and map-matching technique, they show poor and unreliable performance in areas where GPS signals are blocked or weak. In this study, we propose a vision-oriented car navigation method based on sensor fusion of a GPS and in-vehicle sensors. We employ a single photo resection process to derive the position and attitude of the camera and thus those of the car. These image georeferencing results are combined with other sensory data under the sensor fusion framework for more accurate estimation of the positions using an extended Kalman filter. The proposed system estimated the positions with an accuracy of 15 m even though GPS signals were entirely unavailable during the 15-minute test drive. The proposed vision-based system can be effectively utilized for the low-cost yet highly accurate and reliable navigation systems required for intelligent or autonomous vehicles.

  10. ARK: Autonomous mobile robot in an industrial environment

    NASA Technical Reports Server (NTRS)

    Nickerson, S. B.; Jasiobedzki, P.; Jenkin, M.; Jepson, A.; Milios, E.; Down, B.; Service, J. R. R.; Terzopoulos, D.; Tsotsos, J.; Wilkes, D.

    1994-01-01

    This paper describes research on the ARK (Autonomous Mobile Robot in a Known Environment) project. The technical objective of the project is to build a robot that can navigate in a complex industrial environment using maps of permanent structures. The environment is not altered in any way, such as by adding easily identifiable beacons; the robot relies on naturally occurring objects as visual landmarks for navigation. The robot is equipped with various sensors that can detect unmapped obstacles, landmarks, and objects. In this paper we describe the robot's industrial environment, its architecture, a novel combined range and vision sensor, and our recent results in controlling the robot, in the real-time detection of objects using their color, and in the processing of the robot's range and vision sensor data for navigation.

  11. Error Analysis System for Spacecraft Navigation Using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, S. H.; Hart, R. C.; Hartman, K. R.; Tomcsik, T. L.; Searl, J. E.; Bernstein, A.

    1997-01-01

    The Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) is currently developing improved space-navigation filtering algorithms to use the Global Positioning System (GPS) for autonomous real-time onboard orbit determination. In connection with a GPS technology demonstration on the Small Satellite Technology Initiative (SSTI)/Lewis spacecraft, FDD analysts and programmers have teamed with the GSFC Guidance, Navigation, and Control Branch to develop the GPS Enhanced Orbit Determination Experiment (GEODE) system. The GEODE system consists of a Kalman filter operating as a navigation tool for estimating the position, velocity, and additional states required to accurately navigate the orbiting Lewis spacecraft by using astrodynamic modeling and GPS measurements from the receiver. A parallel effort at the FDD is the development of a GPS Error Analysis System (GEAS) that will be used to analyze and improve navigation filtering algorithms during development phases and during in-flight calibration. For GEAS, the Kalman filter theory is extended to estimate the errors in position, velocity, and other error states of interest. The estimation of errors in physical variables at regular intervals will allow the time, cause, and effect of navigation system weaknesses to be identified. In addition, by modeling a sufficient set of navigation system errors, a system failure that causes an observed error anomaly can be traced and accounted for. The GEAS software is formulated using Object Oriented Design (OOD) techniques implemented in the C++ programming language on a Sun SPARC workstation. Phase 1 of this effort is the development of a basic system to be used to evaluate navigation algorithms implemented in the GEODE system. This paper presents the GEAS mathematical methodology, systems and operations concepts, and software design and implementation. Results from the use of the basic system to evaluate

  12. Results of NASA's First Autonomous Formation Flying Experiment: Earth Observing-1 (EO-1)

    NASA Technical Reports Server (NTRS)

    Folta, David C.; Hawkins, Albin; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    NASA's first autonomous formation flying mission completed its primary goal of demonstrating an advanced technology called enhanced formation flying. To enable this technology, the Guidance, Navigation, and Control center at the Goddard Space Flight Center (GSFC) implemented a universal 3-axis formation flying algorithm in an autonomous executive flight code onboard the New Millennium Program's (NMP) Earth Observing-1 (EO-1) spacecraft. This paper describes the mathematical background of the autonomous formation flying algorithm and the onboard flight design and presents the validation results of this unique system. Results from functionality assessment through fully autonomous maneuver control are presented as comparisons between the onboard EO-1 operational autonomous control system called AutoCon(tm), its ground-based predecessor, and a standalone algorithm.

  13. Intelligent Behavioral Action Aiding for Improved Autonomous Image Navigation

    DTIC Science & Technology

    2012-09-13

    odometry, SICK laser scanning unit ( Lidar ), Inertial Measurement Unit (IMU) and ultrasonic distance measurement system (Figure 32). The Lidar , IMU...2010, July) GPS world. [Online]. http://www.gpsworld.com/tech-talk- blog/gnss-independent-navigation-solution-using-integrated- lidar -data-11378 [4...Milford, David McKinnon, Michael Warren, Gordon Wyeth, and Ben Upcroft, "Feature-based Visual Odometry and Featureless Place Recognition for SLAM in

  14. Autonomous Mars ascent and orbit rendezvous for earth return missions

    NASA Technical Reports Server (NTRS)

    Edwards, H. C.; Balmanno, W. F.; Cruz, Manuel I.; Ilgen, Marc R.

    1991-01-01

    The details of the assessment of autonomous Mars ascent and orbit rendezvous for earth return missions are presented. Analyses addressing navigation system assessments, trajectory planning, targeting approaches, flight control guidance strategies, and performance sensitivities are included. Tradeoffs in the analysis and design process are discussed.

  15. Survey of computer vision technology for UAV navigation

    NASA Astrophysics Data System (ADS)

    Xie, Bo; Fan, Xiang; Li, Sijian

    2017-11-01

    Navigation based on computer vision technology, which has the characteristics of strong independence and high precision and is not susceptible to electrical interference, has attracted more and more attention in the field of UAV navigation research. Early navigation projects based on computer vision technology were mainly applied to autonomous ground robots. In recent years, visual navigation systems have been widely applied to unmanned aircraft, deep space probes and underwater robots, which further stimulates research on integrated navigation algorithms based on computer vision technology. In China, with the development of many types of UAVs and the start of the third phase of the lunar exploration project, there has been significant progress in the study of visual navigation. The paper surveys the development of computer-vision-based navigation in the field of UAV navigation research and concludes that visual navigation is mainly applied to three aspects, as follows. (1) Acquisition of UAV navigation parameters. The parameters, including UAV attitude, position and velocity information, can be obtained from the relationship between the images from sensors and the carrier's attitude, the relationship between instant matching images and the reference images, and the relationship between the carrier's velocity and the characteristics of sequential images. (2) Autonomous obstacle avoidance. There are many ways to achieve obstacle avoidance in UAV navigation; the methods based on computer vision technology, including feature matching, template matching, image frames and so on, are mainly introduced. (3) Target tracking and positioning. Using the obtained images, the UAV position is calculated by using the optical flow method, the MeanShift algorithm, the CamShift algorithm, Kalman filtering and particle filter algorithms. The paper also expounds three kinds of mainstream visual systems. (1) High-speed visual systems. These use a parallel structure, with which image detection and processing are

  16. Simulating the Liaison Navigation Concept in a Geo + Earth-Moon Halo Constellation

    NASA Technical Reports Server (NTRS)

    Fujimoto, K.; Leonard, J. M.; McGranaghan, R. M.; Parker, J. S.; Anderson, R. L.; Born, G. H.

    2012-01-01

    Linked Autonomous Interplanetary Satellite Orbit Navigation, or LiAISON, is a novel satellite navigation technique where relative radiometric measurements between two or more spacecraft in a constellation are processed to obtain the absolute state of all spacecraft. The method leverages the asymmetry of the gravity field that the constellation exists in. This paper takes a step forward in developing a high fidelity navigation simulation for the LiAISON concept in an Earth-Moon constellation. In particular, we aim to process two-way Doppler measurements between a satellite in GEO orbit and another in a halo orbit about the Earth-Moon L1 point.

  17. Autonomous Robot Navigation in Human-Centered Environments Based on 3D Data Fusion

    NASA Astrophysics Data System (ADS)

    Steinhaus, Peter; Strand, Marcus; Dillmann, Rüdiger

    2007-12-01

    Efficient navigation of mobile platforms in dynamic human-centered environments is still an open research topic. We have already proposed an architecture (MEPHISTO) for a navigation system that is able to fulfill the main requirements of efficient navigation: fast and reliable sensor processing, extensive global world modeling, and distributed path planning. Our architecture uses a distributed system of sensor processing, world modeling, and path planning units. In this article, we present the implemented methods in the context of data fusion algorithms for 3D world modeling and real-time path planning. We also show results of the prototypic application of the system at the museum ZKM (center for art and media) in Karlsruhe.

  18. Dynamic multisensor fusion for mobile robot navigation in an indoor environment

    NASA Astrophysics Data System (ADS)

    Jin, Taeseok; Lee, Jang-Myung; Luk, Bing L.; Tso, Shiu K.

    2001-10-01

    This study is a preliminary step toward developing a multi-purpose, autonomous, robust carrier mobile robot to transport trolleys or heavy goods and serve as a robotic nursing assistant in hospital wards. The aim of this paper is to present the use of multi-sensor data fusion, from sensors such as sonar, a CCD camera and IR sensors, for a map-building mobile robot to navigate, and to present an experimental mobile robot designed to operate autonomously within both indoor and outdoor environments. Smart sensory systems are crucial for successful autonomous systems. We give an explanation of the robot system architecture designed and implemented in this study and a short review of existing techniques, since several recent thorough books and review papers exist on this topic. We focus instead on the main results with relevance to the intelligent service robot project at the Centre of Intelligent Design, Automation & Manufacturing (CIDAM), and conclude by discussing some possible future extensions of the project. The paper first deals with the general principles of the navigation and guidance architecture, then with the detailed functions of recognizing environment updates, obstacle detection and motion assessment, together with the first results from the simulation runs.

  19. Autonomous self-navigating drug-delivery vehicles: from science fiction to reality.

    PubMed

    Petrenko, Valery A

    2017-12-01

    Low efficacy of targeted nanomedicines in biological experiments forced us to challenge the traditional concept of drug targeting and suggest a paradigm of 'addressed self-navigating drug-delivery vehicles,' in which affinity selection of targeting peptides and vasculature-directed in vivo phage screening is replaced by migration selection, which explores the ability of 'promiscuous' phages and their proteins to migrate through the tumor-surrounding cellular barriers, using a 'hub and spoke' delivery strategy, and penetrate into the tumor, affecting the diverse tumor cell population. The 'self-navigating' drug-delivery paradigm can be used as a theoretical and technical platform in the design of a novel generation of molecular medications and imaging probes for precise and personal medicine.

  20. Unmanned ground vehicles for integrated force protection

    NASA Astrophysics Data System (ADS)

    Carroll, Daniel M.; Mikell, Kenneth; Denewiler, Thomas

    2004-09-01

    The combination of Command and Control (C2) systems with Unmanned Ground Vehicles (UGVs) provides Integrated Force Protection from the Robotic Operation Command Center. Autonomous UGVs are directed as Force Projection units. UGV payloads and fixed sensors provide situational awareness while unattended munitions provide a less-than-lethal response capability. Remote resources serve as automated interfaces to legacy physical devices such as manned response vehicles, barrier gates, fence openings, garage doors, and remote power on/off capability for unmanned systems. The Robotic Operations Command Center executes the Multiple Resource Host Architecture (MRHA) to simultaneously control heterogeneous unmanned systems. The MRHA graphically displays video, map, and status for each resource using wireless digital communications for integrated data, video, and audio. Events are prioritized and the user is prompted with audio alerts and text instructions for alarms and warnings. A control hierarchy of missions and duty rosters support autonomous operations. This paper provides an overview of the key technology enablers for Integrated Force Protection with details on a force-on-force scenario to test and demonstrate concept of operations using Unmanned Ground Vehicles. Special attention is given to development and applications for the Remote Detection Challenge and Response (REDCAR) initiative for Integrated Base Defense.

  1. Robust analysis of an underwater navigational strategy in electrically heterogeneous corridors.

    PubMed

    Dimble, Kedar D; Ranganathan, Badri N; Keshavan, Jishnu; Humbert, J Sean

    2016-08-01

    Obstacles and other global stimuli provide relevant navigational cues to a weakly electric fish. In this work, robust analysis of a control strategy based on electrolocation for performing obstacle avoidance in electrically heterogeneous corridors is presented and validated. Static output feedback control is shown to achieve the desired goal of reflexive obstacle avoidance in such environments in simulation and experimentation. The proposed approach is computationally inexpensive and readily implementable on a small scale underwater vehicle, making underwater autonomous navigation feasible in real-time.
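
    As a toy illustration of the static output feedback strategy, the Python sketch below maps a measured left/right range asymmetry directly to a steering command through a single fixed gain; the gain, sensor model and sign convention are assumptions for illustration, not the controller analyzed in the paper.

      # Minimal sketch of reflexive corridor centering via static output feedback:
      # the steering command is one fixed gain times the measured left/right range
      # asymmetry.  Gain and sign convention are assumptions for illustration.
      K = 0.8                               # static output-feedback gain (assumed)

      def steer_command(left_range, right_range):
          """Negative output steers left, positive steers right (assumed convention)."""
          y = left_range - right_range      # output: lateral asymmetry measurement
          return -K * y                     # u = -K*y drives the asymmetry toward zero

      # A vehicle closer to the right wall receives a negative (steer-left) command.
      print(steer_command(2.0, 1.2))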

  2. Image Dependent Relative Formation Navigation for Autonomous Aerial Refueling

    DTIC Science & Technology

    2011-03-01

    and local variations of the Earth’s surface make a mathematical model difficult to create and use. The definition of an equipotential surface ...controlled with flight control surfaces attached to it. To refuel using this method, the receiver pilot flies the aircraft to within a defined refueling...I-frame would unnecessarily complicate aircraft navigation that, by definition, is limited to altitudes relatively close to the surface of the Earth

  3. Bio-Inspired Navigation of Chemical Plumes

    DTIC Science & Technology

    2006-07-01

    Bio-Inspired Navigation of Chemical Plumes Maynard J. Porter III, Captain, USAF Department of Electrical and Computer Engineering Air Force Institute...Li. " Chemical plume tracing via an autonomous underwater vehicle". IEEE Journal of Ocean Engineering , 30(2):428— 442, 2005. [6] G. A. Nevitt...Electrical and Computer Engineering Air Force Institute of Technology Dayton, OH 45433-7765, U.S.A. juan.vasquez@afit.edu May 31, 2006 Abstract - The

  4. The study of stereo vision technique for the autonomous vehicle

    NASA Astrophysics Data System (ADS)

    Li, Pei; Wang, Xi; Wang, Jiang-feng

    2015-08-01

    Stereo vision technology, using two or more cameras, can recover 3D information of the field of view. This technology can effectively help the autonomous navigation system of an unmanned vehicle judge the pavement conditions within the field of view and measure the obstacles on the road. In this paper, the use of stereo vision for obstacle-avoidance measurement on an autonomous vehicle is studied, and the key techniques are analyzed and discussed. The system hardware is built and the software debugged, and finally the measurement performance is illustrated with measured data. Experiments show that the 3D structure within the field of view can be reconstructed effectively by stereo vision, providing the basis for pavement condition judgment. Compared with the radar used in unmanned vehicle navigation measurement systems, the stereo vision system has advantages such as low cost and measurement range, and it has a good application prospect.
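
    The ranging principle behind such a stereo system can be sketched in a few lines of Python: depth follows from disparity for a rectified camera pair. The focal length, baseline and disparity values below are illustrative assumptions, not the paper's calibration.

      import numpy as np

      # Minimal sketch of the stereo ranging principle: depth from disparity for
      # a rectified camera pair.  Focal length, baseline and disparities are
      # illustrative assumptions, not the paper's calibration.
      FOCAL_PX = 700.0        # focal length in pixels (assumed)
      BASELINE_M = 0.30       # camera separation in meters (assumed)

      def depth_from_disparity(disparity_px):
          """Z = f * B / d for a rectified pair; invalid (infinite) where d <= 0."""
          d = np.asarray(disparity_px, dtype=float)
          with np.errstate(divide="ignore"):
              return np.where(d > 0, FOCAL_PX * BASELINE_M / d, np.inf)

      # Obstacles producing 21-pixel and 70-pixel disparities lie at 10 m and 3 m.
      print(depth_from_disparity([21.0, 70.0]))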

  5. The Ship Movement Trajectory Prediction Algorithm Using Navigational Data Fusion.

    PubMed

    Borkowski, Piotr

    2017-06-20

    It is essential for the marine navigator conducting maneuvers of his ship at sea to know future positions of himself and target ships in a specific time span to effectively solve collision situations. This article presents an algorithm of ship movement trajectory prediction, which, through data fusion, takes into account measurements of the ship's current position from a number of doubled autonomous devices. This increases the reliability and accuracy of prediction. The algorithm has been implemented in NAVDEC, a navigation decision support system, and is used in practice on board ships.
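
    Two of the ingredients described above, fusing redundant position fixes and extrapolating the fused track, can be sketched in Python as follows; the inverse-variance weighting, constant-velocity model and numeric values are illustrative assumptions and do not reproduce the NAVDEC algorithm.

      import numpy as np

      # Minimal sketch of two ideas from the abstract: fuse redundant position
      # fixes of the own ship, then extrapolate the fused track over a horizon.
      # The variances and the constant-velocity model are assumptions.
      def fuse(positions, variances):
          """Inverse-variance weighted average of redundant position fixes."""
          w = 1.0 / np.asarray(variances, dtype=float)
          return (w[:, None] * np.asarray(positions, dtype=float)).sum(axis=0) / w.sum()

      def predict_track(p_now, p_prev, dt, horizon, step):
          """Constant-velocity extrapolation of future positions."""
          v = (p_now - p_prev) / dt
          return [p_now + v * t for t in np.arange(step, horizon + step, step)]

      p_prev = fuse([[100.0, 200.0], [100.4, 199.6]], [4.0, 9.0])   # fixes 10 s ago
      p_now = fuse([[105.0, 204.0], [105.2, 204.4]], [4.0, 9.0])    # current fixes
      print(predict_track(p_now, p_prev, dt=10.0, horizon=60.0, step=30.0))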

  6. The Ship Movement Trajectory Prediction Algorithm Using Navigational Data Fusion

    PubMed Central

    Borkowski, Piotr

    2017-01-01

    It is essential for the marine navigator conducting maneuvers of his ship at sea to know future positions of himself and target ships in a specific time span to effectively solve collision situations. This article presents an algorithm of ship movement trajectory prediction, which, through data fusion, takes into account measurements of the ship’s current position from a number of doubled autonomous devices. This increases the reliability and accuracy of prediction. The algorithm has been implemented in NAVDEC, a navigation decision support system, and is used in practice on board ships. PMID:28632176

  7. Intelligent behavior generator for autonomous mobile robots using planning-based AI decision making and supervisory control logic

    NASA Astrophysics Data System (ADS)

    Shah, Hitesh K.; Bahl, Vikas; Martin, Jason; Flann, Nicholas S.; Moore, Kevin L.

    2002-07-01

    In earlier research, the Center for Self-Organizing and Intelligent Systems (CSOIS) at Utah State University (USU) was funded by the US Army Tank-Automotive and Armaments Command's (TACOM) Intelligent Mobility Program to develop and demonstrate enhanced mobility concepts for unmanned ground vehicles (UGVs). One of the several outgrowths of this work has been the development of a grammar-based approach to intelligent behavior generation for commanding autonomous robotic vehicles. In this paper we describe the use of this grammar for enabling autonomous behaviors. A supervisory task controller (STC) sequences high-level action commands (taken from the grammar) to be executed by the robot. It takes as input a set of goals and a partial (static) map of the environment and produces, from the grammar, a flexible script (or sequence) of the high-level commands that are to be executed by the robot. The sequence is derived by a planning function that uses a graph-based heuristic search (A* algorithm). Each action command has specific exit conditions that are evaluated by the STC following each task completion or interruption (in the case of disturbances or new operator requests). Depending on the system's state at task completion or interruption (including updated environmental and robot sensor information), the STC invokes a reactive response. This can include sequencing the pending tasks or initiating a re-planning event, if necessary. Though applicable to a wide variety of autonomous robots, an application of this approach is demonstrated via simulations of ODIS, an omni-directional inspection system developed for security applications.
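
    The planning function described above is a graph-based heuristic search. The Python sketch below shows a generic A* search over a small occupancy grid as a stand-in for the task graph; the grid, costs and start/goal cells are illustrative assumptions, and the STC grammar itself is not modeled.

      import heapq

      # Minimal sketch of a graph-based heuristic search (A*), here over a small
      # 4-connected occupancy grid standing in for the task graph.  Grid, costs
      # and start/goal cells are illustrative assumptions.
      def astar(grid, start, goal):
          def h(p):                                     # Manhattan-distance heuristic
              return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
          open_set = [(h(start), 0, start, None)]
          came_from, g_best = {}, {start: 0}
          while open_set:
              _, g, node, parent = heapq.heappop(open_set)
              if node in came_from:                     # already expanded
                  continue
              came_from[node] = parent
              if node == goal:                          # reconstruct the path
                  path = [node]
                  while came_from[path[-1]] is not None:
                      path.append(came_from[path[-1]])
                  return path[::-1]
              r, c = node
              for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                  if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and not grid[nr][nc]:
                      ng = g + 1
                      if ng < g_best.get((nr, nc), float("inf")):
                          g_best[(nr, nc)] = ng
                          heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), node))
          return None

      GRID = [[0, 0, 0],
              [1, 1, 0],
              [0, 0, 0]]
      print(astar(GRID, (0, 0), (2, 0)))   # path that detours around the blocked row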

  8. Integrated INS/GPS Navigation from a Popular Perspective

    NASA Technical Reports Server (NTRS)

    Omerbashich, Mensur

    2002-01-01

    Inertial navigation, blended with other navigation aids, Global Positioning System (GPS) in particular, has gained significance due to enhanced navigation and inertial reference performance and dissimilarity for fault tolerance and anti-jamming. Relatively new concepts based upon using Differential GPS (DGPS) blended with Inertial (and visual) Navigation Sensors (INS) offer the possibility of low cost, autonomous aircraft landing. The FAA has decided to implement the system in a sophisticated form as a new standard navigation tool during this decade. There have been a number of new inertial sensor concepts in the recent past that emphasize increased accuracy of INS/GPS versus INS and reliability of navigation, as well as lower size and weight, and higher power, fault tolerance, and long life. The principles of GPS are not discussed; rather the attention is directed towards general concepts and comparative advantages. A short introduction to the problems faced in kinematics is presented. The intention is to relate the basic principles of kinematics to probably the most used navigation method in the future: INS/GPS. An example of the airborne INS is presented, with emphasis on how it works. The discussion of the error types and sources in navigation, and of the role of filters in optimal estimation of the errors then follows. The main question this paper is trying to answer is 'What are the benefits of the integration of INS and GPS and how is this navigation concept of the future achieved in reality?' The main goal is to communicate the idea about what stands behind a modern navigation method.

  9. Towards Autonomous Modular UAV Missions: The Detection, Geo-Location and Landing Paradigm

    PubMed Central

    Kyristsis, Sarantis; Antonopoulos, Angelos; Chanialakis, Theofilos; Stefanakis, Emmanouel; Linardos, Christos; Tripolitsiotis, Achilles; Partsinevelos, Panagiotis

    2016-01-01

    Nowadays, various unmanned aerial vehicle (UAV) applications become increasingly demanding since they require real-time, autonomous and intelligent functions. Towards this end, in the present study, a fully autonomous UAV scenario is implemented, including the tasks of area scanning, target recognition, geo-location, monitoring, following and finally landing on a high speed moving platform. The underlying methodology includes AprilTag target identification through Graphics Processing Unit (GPU) parallelized processing, image processing and several optimized locations and approach algorithms employing gimbal movement, Global Navigation Satellite System (GNSS) readings and UAV navigation. For the experimentation, a commercial and a custom made quad-copter prototype were used, portraying a high and a low-computational embedded platform alternative. Among the successful targeting and follow procedures, it is shown that the landing approach can be successfully performed even under high platform speeds. PMID:27827883

  10. Towards Autonomous Modular UAV Missions: The Detection, Geo-Location and Landing Paradigm.

    PubMed

    Kyristsis, Sarantis; Antonopoulos, Angelos; Chanialakis, Theofilos; Stefanakis, Emmanouel; Linardos, Christos; Tripolitsiotis, Achilles; Partsinevelos, Panagiotis

    2016-11-03

    Nowadays, various unmanned aerial vehicle (UAV) applications become increasingly demanding since they require real-time, autonomous and intelligent functions. Towards this end, in the present study, a fully autonomous UAV scenario is implemented, including the tasks of area scanning, target recognition, geo-location, monitoring, following and finally landing on a high speed moving platform. The underlying methodology includes AprilTag target identification through Graphics Processing Unit (GPU) parallelized processing, image processing and several optimized locations and approach algorithms employing gimbal movement, Global Navigation Satellite System (GNSS) readings and UAV navigation. For the experimentation, a commercial and a custom made quad-copter prototype were used, portraying a high and a low-computational embedded platform alternative. Among the successful targeting and follow procedures, it is shown that the landing approach can be successfully performed even under high platform speeds.

  11. Target Trailing With Safe Navigation for Maritime Autonomous Surface Vehicles

    NASA Technical Reports Server (NTRS)

    Wolf, Michael; Kuwata, Yoshiaki; Zarzhitsky, Dimitri V.

    2013-01-01

    This software implements a motion-planning module for a maritime autonomous surface vehicle (ASV). The module trails a given target while also avoiding static and dynamic surface hazards. When surface hazards are other moving boats, the motion planner must apply International Regulations for Avoiding Collisions at Sea (COLREGS). A key subset of these rules has been implemented in the software. In case contact with the target is lost, the software can receive and follow a "reacquisition route," provided by a complementary system, until the target is reacquired. The programmatic intention is that the trailed target is a submarine, although any mobile naval platform could serve as the target. The algorithmic approach to combining motion with a (possibly moving) goal location, while avoiding local hazards, may be applicable to robotic rovers, automated landing systems, and autonomous airships. The software operates in JPL's CARACaS (Control Architecture for Robotic Agent Command and Sensing) software architecture and relies on other modules for environmental perception data and information on the predicted detectability of the target, as well as the low-level interface to the boat controls.

  12. Preliminary Results of NASA's First Autonomous Formation Flying Experiment: Earth Observing-1 (EO-1)

    NASA Technical Reports Server (NTRS)

    Folta, David; Hawkins, Albin

    2001-01-01

    NASA's first autonomous formation flying mission is completing a primary goal of demonstrating an advanced technology called enhanced formation flying. To enable this technology, the Guidance, Navigation, and Control center at the Goddard Space Flight Center has implemented an autonomous universal three-axis formation flying algorithm in executive flight code onboard the New Millennium Program's (NMP) Earth Observing-1 (EO-1) spacecraft. This paper describes the mathematical background of the autonomous formation flying algorithm and the onboard design and presents the preliminary validation results of this unique system. Results from functionality assessment and autonomous maneuver control are presented as comparisons between the onboard EO-1 operational autonomous control system called AutoCon(tm), its ground-based predecessor, and a stand-alone algorithm.

  13. First Results from a Hardware-in-the-Loop Demonstration of Closed-Loop Autonomous Formation Flying

    NASA Technical Reports Server (NTRS)

    Gill, E.; Naasz, Bo; Ebinuma, T.

    2003-01-01

    A closed-loop system for the demonstration of formation flying technologies has been developed at NASA's Goddard Space Flight Center. Making use of a GPS signal simulator with a dual radio frequency outlet, the system includes two GPS space receivers as well as a powerful onboard navigation processor dedicated to the GPS-based guidance, navigation, and control of a satellite formation in real-time. The closed-loop system allows realistic simulations of autonomous formation flying scenarios, enabling research in the fields of tracking and orbit control strategies for a wide range of applications. A sample scenario has been set up where the autonomous transition of a satellite formation from an initial along-track separation of 800 m to a final distance of 100 m has been demonstrated. As a result, a typical control accuracy of about 5 m has been achieved which proves the applicability of autonomous formation flying techniques to formations of satellites as close as 50 m.

  14. Mobile Autonomous Humanoid Assistant

    NASA Technical Reports Server (NTRS)

    Diftler, M. A.; Ambrose, R. O.; Tyree, K. S.; Goza, S. M.; Huber, E. L.

    2004-01-01

    A mobile autonomous humanoid robot is assisting human co-workers at the Johnson Space Center with tool handling tasks. This robot combines the upper body of the National Aeronautics and Space Administration (NASA)/Defense Advanced Research Projects Agency (DARPA) Robonaut system with a Segway(TradeMark) Robotic Mobility Platform yielding a dexterous, maneuverable humanoid perfect for aiding human co-workers in a range of environments. This system uses stereo vision to locate human teammates and tools and a navigation system that uses laser range and vision data to follow humans while avoiding obstacles. Tactile sensors provide information to grasping algorithms for efficient tool exchanges. The autonomous architecture utilizes these pre-programmed skills to form human assistant behaviors. The initial behavior demonstrates a robust capability to assist a human by acquiring a tool from a remotely located individual and then following the human in a cluttered environment with the tool for future use.

  15. Bioinspired magnetoreception and navigation using magnetic signatures as waypoints.

    PubMed

    Taylor, Brian K

    2018-05-15

    Diverse taxa use Earth's magnetic field in conjunction with other sensory modalities to accomplish navigation tasks ranging from local homing to long-distance migration across continents and ocean basins. However, despite extensive research, the mechanisms that underlie animal magnetoreception are not clearly understood, and how animals use Earth's magnetic field to navigate is an active area of investigation. Concurrently, Earth's magnetic field offers a signal that engineered systems can leverage for navigation in environments where man-made systems such as GPS are unavailable or unreliable. Using a proxy for Earth's magnetic field, and inspired by migratory animal behavior, this work implements a behavioral strategy that uses combinations of magnetic field properties as rare or unique signatures that mark specific locations. Using a discrete number of these signatures as goal waypoints, the strategy navigates through a closed set of points several times in a variety of environmental conditions, and with various levels of sensor noise. The results from this engineering/quantitative biology approach support existing notions that some animals may use combinations of magnetic properties as navigational markers, and provides insights into features and constraints that would enable navigational success or failure. The findings also offer insights into how autonomous engineered platforms might be designed to leverage the magnetic field as a navigational resource.
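
    The waypoint-recognition idea can be sketched simply: a location is declared reached when the measured combination of magnetic properties falls within a tolerance of a stored signature. In the Python sketch below, the signature values and tolerances are illustrative assumptions, not the study's parameters.

      # Minimal sketch of the waypoint idea: a location is recognized when the
      # measured combination of magnetic properties falls within a tolerance of a
      # stored signature.  Signature values and tolerances are assumptions.
      SIGNATURES = {                         # waypoint -> (intensity uT, inclination deg)
          "A": (48.2, 62.0),
          "B": (50.7, 64.5),
          "C": (46.9, 60.3),
      }
      TOL_INTENSITY = 0.3
      TOL_INCLINATION = 0.4

      def matched_waypoint(intensity, inclination):
          """Return the waypoint whose stored signature matches the measurement, if any."""
          for name, (f0, i0) in SIGNATURES.items():
              if abs(intensity - f0) <= TOL_INTENSITY and abs(inclination - i0) <= TOL_INCLINATION:
                  return name
          return None

      # A measurement close to waypoint B's stored signature is recognized as B.
      print(matched_waypoint(50.6, 64.3))   # prints: B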

  16. Optical Navigation for the Orion Vehicle

    NASA Technical Reports Server (NTRS)

    Crain, Timothy; Getchius, Joel; D'Souza, Christopher

    2008-01-01

    The Orion vehicle is being designed to provide nominal crew transport to the lunar transportation stack in low Earth orbit, crew abort during transit to the Moon, and crew return to Earth once lunar orbit is achieved. One of the design requirements levied on the Orion vehicle is the ability to return the vehicle and crew to Earth in the case of loss of communications and command with the Mission Control Center. Central to fulfilling this requirement is the ability of Orion to navigate autonomously. In low Earth orbit, this may be solved with the use of GPS, but in cis-lunar and lunar orbit it requires optical navigation. This paper documents the preliminary analyses performed by members of the Orion Orbit GN&C System team.

  17. GPS/DR Error Estimation for Autonomous Vehicle Localization.

    PubMed

    Lee, Byung-Hyun; Song, Jong-Hwa; Im, Jun-Hyuck; Im, Sung-Hyuck; Heo, Moon-Beom; Jee, Gyu-In

    2015-08-21

    Autonomous vehicles require highly reliable navigation capabilities. For example, a lane-following method cannot be applied in an intersection without lanes, and since typical lane detection is performed using a straight-line model, errors can occur when the lateral distance is estimated in curved sections due to a model mismatch. Therefore, this paper proposes a localization method that uses GPS/DR error estimation based on a lane detection method with curved lane models, stop line detection, and curve matching in order to improve the performance during waypoint following procedures. The advantage of using the proposed method is that position information can be provided for autonomous driving through intersections, in sections with sharp curves, and in curved sections following a straight section. The proposed method was applied in autonomous vehicles at an experimental site to evaluate its performance, and the results indicate that the positioning achieved accuracy at the sub-meter level.
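
    The dead-reckoning half of the approach can be illustrated with a short Python sketch: propagate a pose from speed and yaw rate, then estimate and remove the accumulated error whenever an absolute, map-referenced fix (for example, a detected stop line) becomes available. The motion model, blending weight and numeric values are illustrative assumptions, not the paper's estimator.

      import numpy as np

      # Minimal sketch of the GPS/DR idea: propagate a dead-reckoned pose from
      # speed and yaw rate, then estimate and remove the accumulated error when a
      # map-referenced fix (e.g. a detected stop line) is available.  The motion
      # model, blending weight and all numeric values are assumptions.
      pose = np.array([0.0, 0.0])            # dead-reckoned position (x, y) in meters
      heading = 0.0                          # radians

      def dr_step(speed, yaw_rate, dt):
          """Propagate the dead-reckoned pose one time step."""
          global pose, heading
          heading += yaw_rate * dt
          pose = pose + speed * dt * np.array([np.cos(heading), np.sin(heading)])

      def correct_with_fix(map_position, weight=0.8):
          """Blend the DR pose toward an absolute fix; the residual is the estimated DR error."""
          global pose
          error = pose - np.asarray(map_position, dtype=float)
          pose = pose - weight * error
          return error

      for _ in range(50):                    # drive ~50 m with a small heading drift
          dr_step(speed=10.0, yaw_rate=0.002, dt=0.1)
      print(correct_with_fix([49.8, 2.0]))   # estimated DR error at the stop-line fix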

  18. Systems and Methods for Automated Vessel Navigation Using Sea State Prediction

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance L. (Inventor); Howard, Andrew B. (Inventor); Reinhart, Rene Felix (Inventor); Aghazarian, Hrand (Inventor); Rankin, Arturo (Inventor)

    2017-01-01

    Systems and methods for sea state prediction and autonomous navigation in accordance with embodiments of the invention are disclosed. One embodiment of the invention includes a method of predicting a future sea state including generating a sequence of at least two 3D images of a sea surface using at least two image sensors, detecting peaks and troughs in the 3D images using a processor, identifying at least one wavefront in each 3D image based upon the detected peaks and troughs using the processor, characterizing at least one propagating wave based upon the propagation of wavefronts detected in the sequence of 3D images using the processor, and predicting a future sea state using at least one propagating wave characterizing the propagation of wavefronts in the sequence of 3D images using the processor. Another embodiment includes a method of autonomous vessel navigation based upon a predicted sea state and target location.
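
    The first step of the method, locating peaks and troughs in the reconstructed surface, can be illustrated along a single range profile. The Python sketch below uses a simple neighbor test on a synthetic swell; it is an illustrative stand-in under assumed data, not the patented processing chain.

      import numpy as np

      # Minimal sketch of the first step: find peaks and troughs along one range
      # profile of a reconstructed sea surface.  The synthetic swell and the
      # simple neighbor test are illustrative assumptions.
      def peaks_and_troughs(heights):
          """Indices that are strictly higher (peaks) or lower (troughs) than both neighbors."""
          h = np.asarray(heights, dtype=float)
          peaks = [i for i in range(1, len(h) - 1) if h[i] > h[i - 1] and h[i] > h[i + 1]]
          troughs = [i for i in range(1, len(h) - 1) if h[i] < h[i - 1] and h[i] < h[i + 1]]
          return peaks, troughs

      x = np.linspace(0.0, 4.0 * np.pi, 60)
      profile = 0.5 * np.sin(x)                  # synthetic 0.5 m swell
      print(peaks_and_troughs(profile))          # two peaks and two troughs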

  19. Systems and Methods for Automated Vessel Navigation Using Sea State Prediction

    NASA Technical Reports Server (NTRS)

    Aghazarian, Hrand (Inventor); Reinhart, Rene Felix (Inventor); Huntsberger, Terrance L. (Inventor); Rankin, Arturo (Inventor); Howard, Andrew B. (Inventor)

    2015-01-01

    Systems and methods for sea state prediction and autonomous navigation in accordance with embodiments of the invention are disclosed. One embodiment of the invention includes a method of predicting a future sea state including generating a sequence of at least two 3D images of a sea surface using at least two image sensors, detecting peaks and troughs in the 3D images using a processor, identifying at least one wavefront in each 3D image based upon the detected peaks and troughs using the processor, characterizing at least one propagating wave based upon the propagation of wavefronts detected in the sequence of 3D images using the processor, and predicting a future sea state using at least one propagating wave characterizing the propagation of wavefronts in the sequence of 3D images using the processor. Another embodiment includes a method of autonomous vessel navigation based upon a predicted sea state and target location.

  20. Analysis of Unmanned Systems in Military Logistics

    DTIC Science & Technology

    2016-12-01

    opportunities to employ unmanned systems to support logistic operations. 14. SUBJECT TERMS unmanned systems, robotics , UAVs, UGVs, USVs, UUVs, military...Industrial Robots at Warehouses / Distribution Centers .............................................................................. 17 2. Unmanned...Autonomous Robot Gun Turret. Source: Blain (2010)................................................... 33 Figure 4. Robot Sentries for Base Patrol

  1. Fundamentals of satellite navigation

    NASA Astrophysics Data System (ADS)

    Stiller, A. H.

    The basic operating principles and capabilities of conventional and satellite-based navigation systems for air, sea, and land vehicles are reviewed and illustrated with diagrams. Consideration is given to autonomous onboard systems; systems based on visible or radio beacons; the Transit, Cicada, Navstar-GPS, and Glonass satellite systems; the physical laws and parameters of satellite motion; the definition of time in satellite systems; and the content of the demodulated GPS data signal. The GPS and Glonass data format frames are presented graphically, and tables listing the GPS and Glonass satellites, their technical characteristics, and the (past or scheduled) launch dates are provided.

  2. Map based navigation for autonomous underwater vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuohy, S.T.; Leonard, J.J.; Bellingham, J.G.

    1995-12-31

    In this work, a map based navigation algorithm is developed wherein measured geophysical properties are matched to a priori maps. The objective is a complete algorithm applicable to a small, power-limited AUV which performs in real time to a required resolution with bounded position error. Interval B-Splines are introduced for the non-linear representation of two-dimensional geophysical parameters that have measurement uncertainty. Fine-scale position determination involves the solution of a system of nonlinear polynomial equations with interval coefficients. This system represents the complete set of possible vehicle locations and is formulated as the intersection of contours established on each map from the simultaneous measurement of associated geophysical parameters. A standard filter mechanism, based on a bounded interval error model, predicts the position of the vehicle and, therefore, screens extraneous solutions. When multiple solutions are found, a tracking mechanism is applied until a unique vehicle location is determined.
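
    The contour-intersection idea can be sketched as follows: candidate vehicle cells are those where every measured geophysical parameter agrees with its a priori map to within the measurement uncertainty. The Python sketch below uses toy maps and tolerances as illustrative assumptions; interval B-splines and the filtering mechanism are not reproduced.

      import numpy as np

      # Minimal sketch of the contour-intersection idea: a cell is a candidate
      # vehicle location only if both measured geophysical parameters agree with
      # their a priori maps within the stated tolerances.  Maps, measurements and
      # tolerances below are illustrative assumptions.
      depth_map = np.array([[100.0, 102.0, 105.0],
                            [101.0, 104.0, 108.0],
                            [103.0, 107.0, 112.0]])   # e.g. bathymetry (m)
      mag_map = np.array([[48.0, 48.4, 48.9],
                          [48.2, 48.7, 49.3],
                          [48.5, 49.1, 49.8]])        # e.g. magnetic intensity (uT)

      def candidate_cells(meas_depth, meas_mag, tol_depth=1.0, tol_mag=0.15):
          """Cells consistent with both measurements (intersection of two contours)."""
          ok = (np.abs(depth_map - meas_depth) <= tol_depth) & \
               (np.abs(mag_map - meas_mag) <= tol_mag)
          return [(int(r), int(c)) for r, c in zip(*np.nonzero(ok))]

      # Simultaneous measurements narrow the position to a single map cell here.
      print(candidate_cells(meas_depth=104.0, meas_mag=48.7))   # prints: [(1, 1)]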

  3. Autonomous mobile robot teams

    NASA Technical Reports Server (NTRS)

    Agah, Arvin; Bekey, George A.

    1994-01-01

    This paper describes autonomous mobile robot teams performing tasks in unstructured environments. The behavior and the intelligence of the group is distributed, and the system does not include a central command base or leader. The novel concept of the Tropism-Based Cognitive Architecture is introduced, which is used by the robots in order to produce behavior transforming their sensory information to proper action. The results of a number of simulation experiments are presented. These experiments include worlds where the robot teams must locate, decompose, and gather objects, and defend themselves against hostile predators, while navigating around stationary and mobile obstacles.

  4. 3D photo mosaicing of Tagiri shallow vent field by an autonomous underwater vehicle (3rd report) - Mosaicing method based on navigation data and visual features -

    NASA Astrophysics Data System (ADS)

    Maki, Toshihiro; Ura, Tamaki; Singh, Hanumant; Sakamaki, Takashi

    Large-area seafloor imaging will bring significant benefits to various fields such as academics, resource survey, marine development, security, and search-and-rescue. The authors have proposed a navigation method of an autonomous underwater vehicle for seafloor imaging, and verified its performance through mapping tubeworm colonies over an area of 3,000 square meters using the AUV Tri-Dog 1 at the Tagiri vent field, Kagoshima bay in Japan (Maki et al., 2008, 2009). This paper proposes a post-processing method to build a natural photo mosaic from a number of pictures taken by an underwater platform. The method first removes lens distortion and inconsistencies of color and lighting from each image, and then ortho-rectification is performed based on the camera pose and seafloor geometry estimated from navigation data. The image alignment is based on both navigation data and visual characteristics, implemented as an expansion of the image-based method (Pizarro et al., 2003). Using the two types of information realizes an image alignment that is consistent both globally and locally, as well as making the method applicable to data sets with few visual cues. The method was evaluated using a data set obtained by the AUV Tri-Dog 1 at the vent field in Sep. 2009. A seamless, uniformly illuminated photo mosaic covering an area of around 500 square meters was created from 391 pictures, which covers unique features of the field such as bacteria mats and tubeworm colonies.

  5. Laser Range and Bearing Finder for Autonomous Missions

    NASA Technical Reports Server (NTRS)

    Granade, Stephen R.

    2004-01-01

    NASA has recently re-confirmed their interest in autonomous systems as an enabling technology for future missions. In order for autonomous missions to be possible, highly-capable relative sensor systems are needed to determine an object's distance, direction, and orientation. This is true whether the mission is autonomous in-space assembly, rendezvous and docking, or rover surface navigation. Advanced Optical Systems, Inc. has developed a wide-angle laser range and bearing finder (RBF) for autonomous space missions. The laser RBF has a number of features that make it well-suited for autonomous missions. It has an operating range of 10 m to 5 km, with a 5 deg field of view. Its wide field of view removes the need for scanning systems such as gimbals, eliminating moving parts and making the sensor simpler and space qualification easier. Its range accuracy is 1% or better. It is designed to operate either as a stand-alone sensor or in tandem with a sensor that returns range, bearing, and orientation at close ranges, such as NASA's Advanced Video Guidance Sensor. We have assembled the initial prototype and are currently testing it. We will discuss the laser RBF's design and specifications. Keywords: laser range and bearing finder, autonomous rendezvous and docking, space sensors, on-orbit sensors, advanced video guidance sensor

  6. Augmented reality user interface for mobile ground robots with manipulator arms

    NASA Astrophysics Data System (ADS)

    Vozar, Steven; Tilbury, Dawn M.

    2011-01-01

    Augmented Reality (AR) is a technology in which real-world visual data is combined with an overlay of computer graphics, enhancing the original feed. AR is an attractive tool for teleoperated UGV UIs as it can improve communication between robots and users via an intuitive spatial and visual dialogue, thereby increasing operator situational awareness. The successful operation of UGVs often relies upon both chassis navigation and manipulator arm control, and since existing literature usually focuses on one task or the other, there is a gap in mobile robot UIs that take advantage of AR for both applications. This work describes the development and analysis of an AR UI system for a UGV with an attached manipulator arm. The system supplements a video feed shown to an operator with information about geometric relationships within the robot task space to improve the operator's situational awareness. Previous studies on AR systems and preliminary analyses indicate that such an implementation of AR for a mobile robot with a manipulator arm is anticipated to improve operator performance. A full user-study can determine if this hypothesis is supported by performing an analysis of variance on common test metrics associated with UGV teleoperation.

  7. Preliminary Operational Results of the TDRSS Onboard Navigation System (TONS) for the Terra Mission

    NASA Technical Reports Server (NTRS)

    Gramling, Cheryl; Lorah, John; Santoro, Ernest; Work, Kevin; Chambers, Robert; Bauer, Frank H. (Technical Monitor)

    2000-01-01

    The Earth Observing System Terra spacecraft was launched on December 18, 1999, to provide data for the characterization of the terrestrial and oceanic surfaces, clouds, radiation, aerosols, and radiative balance. The Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (ONS) (TONS) flying on Terra provides the spacecraft with an operational real-time navigation solution. TONS is a passive system that makes judicious use of Terra's communication and computer subsystems. An objective of the ONS developed by NASA's Goddard Space Flight Center (GSFC) Guidance, Navigation and Control Center is to provide autonomous navigation with minimal power, weight, and volume impact on the user spacecraft. TONS relies on extracting tracking measurements onboard from a TDRSS forward-link communication signal and processing these measurements in an onboard extended Kalman filter to estimate Terra's current state. Terra is the first NASA low Earth orbiting mission to fly autonomous navigation which produces accurate results. The science orbital accuracy requirements for Terra are 150 meters (3-sigma) per axis with a goal of 5 m (1-sigma) RSS, which TONS is expected to meet. The TONS solutions are telemetered in real-time to the mission scientists along with their science data for immediate processing. Once set in the operational mode, TONS eliminates the need for ground orbit determination and allows for a smooth flow from the spacecraft telemetry to planning products for the mission team. This paper will present the preliminary results of the operational TONS solution available from Terra.

  8. Water Detection Based on Sky Reflections

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo L.; Matthies, Larry H.

    2010-01-01

    This software has been designed to detect water bodies that are out in the open on cross-country terrain at mid- to far-range (approximately 20-100 meters), using imagery acquired from a stereo pair of color cameras mounted on a terrestrial, unmanned ground vehicle (UGV). Non-traversable water bodies, such as large puddles, ponds, and lakes, are indirectly detected by detecting reflections of the sky below the horizon in color imagery. The appearance of water bodies in color imagery largely depends on the ratio of light reflected off the water surface to the light coming out of the water body. When a water body is far away, the angle of incidence is large, and the light reflected off the water surface dominates. We have exploited this behavior to detect water bodies out in the open at mid- to far-range. When a water body is detected at far range, a UGV's path planner can begin to look for alternate routes to the goal position sooner, rather than later. As a result, detecting water hazards at far range generally reduces the time required to reach a goal position during autonomous navigation. This software implements a new water detector based on sky reflections that geometrically locates the exact pixel in the sky that is reflecting on a candidate water pixel on the ground, and predicts if the ground pixel is water based on color similarity and local terrain features.
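
    The geometric cue can be sketched with a simplifying assumption: for a level water surface and a camera with zero roll, the sky point reflected at a ground pixel lies as far above the horizon row as the ground pixel lies below it. The horizon row, color threshold and synthetic frame in the Python sketch below are illustrative assumptions, not the deployed detector.

      import numpy as np

      # Minimal sketch of the geometric cue: with a level water surface and zero
      # camera roll, the sky pixel reflected at a ground pixel mirrors it about
      # the horizon row.  Horizon row, threshold and image are assumptions.
      HORIZON_ROW = 240                          # image row of the horizon (assumed known)

      def reflected_sky_row(ground_row):
          """Row of the sky pixel whose reflection appears at ground_row."""
          return 2 * HORIZON_ROW - ground_row

      def looks_like_water(image, ground_row, col, threshold=30.0):
          """Water cue: candidate pixel color is close to its reflected sky pixel color."""
          sky_row = reflected_sky_row(ground_row)
          if not 0 <= sky_row < image.shape[0]:
              return False
          diff = image[ground_row, col].astype(float) - image[sky_row, col].astype(float)
          return float(np.linalg.norm(diff)) < threshold

      # Synthetic frame: sky above the horizon, its reflection lower in the image.
      img = np.zeros((480, 640, 3), dtype=np.uint8)
      img[:HORIZON_ROW] = (200, 220, 255)                     # sky color
      img[300] = (195, 215, 250)                              # reflected sky on "water"
      print(looks_like_water(img, ground_row=300, col=320))   # prints: True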

  9. Daytime Water Detection Based on Color Variation

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo L.; Matthies, Larry H.

    2010-01-01

    Robust water detection is a critical perception requirement for unmanned ground vehicle (UGV) autonomous navigation. This is particularly true in wide open areas where water can collect in naturally occurring terrain depressions during periods of heavy precipitation and form large water bodies (such as ponds). At far range, reflections of the sky provide a strong cue for water. But at close range, the color coming out of a water body dominates sky reflections and the water cue from sky reflections is of marginal use. We model this behavior by using water body intensity data from multiple frames of RGB imagery to estimate the total reflection coefficient contribution from surface reflections and the combination of all other factors. Then we describe an algorithm that uses one of the color cameras in a forward- looking, UGV-mounted stereo-vision perception system to detect water bodies in wide open areas. This detector exploits the knowledge that the change in saturation-to-brightness ratio across a water body from the leading to trailing edge is uniform and distinct from other terrain types. In test sequences approaching a pond under clear, overcast, and cloudy sky conditions, the true positive and false negative water detection rates were (95.76%, 96.71%, 98.77%) and (0.45%, 0.60%, 0.62%), respectively. This software has been integrated on an experimental unmanned vehicle and field tested at Ft. Indiantown Gap, PA.
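
    The color cue can be illustrated by computing the saturation-to-brightness ratio along a scanline through a candidate water region and checking that it varies smoothly from the leading to the trailing edge. The synthetic scanline and linear-trend test in the Python sketch below are illustrative assumptions, not the fielded algorithm.

      import colorsys
      import numpy as np

      # Minimal sketch of the color-variation cue: across a candidate water body
      # the saturation-to-brightness ratio changes smoothly from the leading to
      # the trailing edge.  The synthetic scanline and tolerance are assumptions.
      def sat_over_bright(rgb_row):
          """Per-pixel saturation/brightness ratio for one image row (RGB in 0..1)."""
          ratios = []
          for r, g, b in rgb_row:
              _, s, v = colorsys.rgb_to_hsv(r, g, b)
              ratios.append(s / v if v > 0 else 0.0)
          return np.array(ratios)

      def uniform_trend(ratios, tol=0.02):
          """True if the ratio varies nearly linearly across the candidate region."""
          idx = np.arange(len(ratios))
          fit = np.polyval(np.polyfit(idx, ratios, 1), idx)
          return float(np.max(np.abs(ratios - fit))) < tol

      # Synthetic scanline: brightness falls and saturation rises toward the camera.
      row = [(0.55 - 0.02 * i, 0.60 - 0.015 * i, 0.80 - 0.01 * i) for i in range(10)]
      print(uniform_trend(sat_over_bright(row)))   # prints: True for this smooth gradient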

  10. Sextant X-Ray Pulsar Navigation Demonstration: Initial On-Orbit Results

    NASA Technical Reports Server (NTRS)

    Mitchell, Jason W.; Winternitz, Luke M.; Hassouneh, Munther A.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wolff, Michael T.; Kerr, Matthew; Wood, Kent S.; hide

    2018-01-01

    The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a technology demonstration enhancement to the Neutron-star Interior Composition Explorer (NICER) mission. SEXTANT will be a first demonstration of in-space, autonomous, X-ray pulsar navigation (XNAV). Navigation using millisecond X-ray pulsars could provide a GPS-like navigation capability available throughout our Solar System and beyond. NICER is a NASA Astrophysics Explorer Mission of Opportunity to the International Space Station that was launched and installed in June of 2017. During NICER's nominal 18-month base mission, SEXTANT will perform a number of experiments to demonstrate XNAV and advance the technology on a number of fronts. In this work, we review SEXTANT and its goals, and present early results from SEXTANT experiments conducted in the first six months of operation. With these results, SEXTANT has made significant progress toward meeting its primary and secondary mission goals. We also describe the SEXTANT flight operations, calibration activities, and initial results.

  11. Terrain classification in navigation of an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Dodds, David R.

    1991-03-01

    In this paper we describe a method of path planning that integrates terrain classification (by means of fractals), the certainty grid method of spatial representation, Kehtarnavaz-Griswold collision zones, Dubois-Prade fuzzy temporal and spatial knowledge, and non-point-sized qualitative navigational planning. An initially planned ("end-to-end") path is piece-wise modified to accommodate known and inferred moving obstacles, and includes attention to time-varying multiple subgoals which may influence a section of path at a time after the robot has begun traversing that planned path.

  12. Insect-Based Vision for Autonomous Vehicles: A Feasibility Study

    NASA Technical Reports Server (NTRS)

    Srinivasan, Mandyam V.

    1999-01-01

    The aims of the project were to use a high-speed digital video camera to pursue two questions: (1) To explore the influence of temporal imaging constraints on the performance of vision systems for autonomous mobile robots; (2) To study the fine structure of insect flight trajectories in order to better understand the characteristics of flight control, orientation and navigation.

  13. Insect-Based Vision for Autonomous Vehicles: A Feasibility Study

    NASA Technical Reports Server (NTRS)

    Srinivasan, Mandyam V.

    1999-01-01

    The aims of the project were to use a high-speed digital video camera to pursue two questions: (1) to explore the influence of temporal imaging constraints on the performance of vision systems for autonomous mobile robots; (2) to study the fine structure of insect flight trajectories in order to better understand the characteristics of flight control, orientation and navigation.

  14. GPS/DR Error Estimation for Autonomous Vehicle Localization

    PubMed Central

    Lee, Byung-Hyun; Song, Jong-Hwa; Im, Jun-Hyuck; Im, Sung-Hyuck; Heo, Moon-Beom; Jee, Gyu-In

    2015-01-01

    Autonomous vehicles require highly reliable navigation capabilities. For example, a lane-following method cannot be applied in an intersection without lanes, and since typical lane detection is performed using a straight-line model, errors can occur when the lateral distance is estimated in curved sections due to a model mismatch. Therefore, this paper proposes a localization method that uses GPS/DR error estimation based on a lane detection method with curved lane models, stop line detection, and curve matching in order to improve the performance during waypoint following procedures. The advantage of using the proposed method is that position information can be provided for autonomous driving through intersections, in sections with sharp curves, and in curved sections following a straight section. The proposed method was applied in autonomous vehicles at an experimental site to evaluate its performance, and the results indicate that the positioning achieved accuracy at the sub-meter level. PMID:26307997

  15. An Autonomous Control System for an Intra-Vehicular Spacecraft Mobile Monitor Prototype

    NASA Technical Reports Server (NTRS)

    Dorais, Gregory A.; Desiano, Salvatore D.; Gawdiak, Yuri; Nicewarner, Keith

    2003-01-01

    This paper presents an overview of an ongoing research and development effort at the NASA Ames Research Center to create an autonomous control system for an internal spacecraft autonomous mobile monitor. Its primary functions are to provide crew support and perform intra-vehicular sensing activities by autonomously navigating onboard the International Space Station. We describe the mission roles and high-level functional requirements for an autonomous mobile monitor. The mobile monitor prototypes, of which two are operational and one is actively being designed, the physical test facilities used to perform ground testing, including a 3D micro-gravity test facility, and the simulators are briefly described. We provide an overview of the autonomy framework and describe each of its components, including those used for automated planning, goal-oriented task execution, diagnosis, and fault recovery. A sample mission test scenario is also described.

  16. Novel microbial diversity retrieved by autonomous robotic exploration of the world's deepest vertical phreatic sinkhole.

    PubMed

    Sahl, Jason W; Fairfield, Nathaniel; Harris, J Kirk; Wettergreen, David; Stone, William C; Spear, John R

    2010-03-01

    The deep phreatic thermal explorer (DEPTHX) is an autonomous underwater vehicle designed to navigate an unexplored environment, generate high-resolution three-dimensional (3-D) maps, collect biological samples based on an autonomous sampling decision, and return to its origin. In the spring of 2007, DEPTHX was deployed in Zacatón, a deep (approximately 318 m), limestone, phreatic sinkhole (cenote) in northeastern Mexico. As DEPTHX descended, it generated a 3-D map based on the processing of range data from 54 onboard sonars. The vehicle collected water column samples and wall biomat samples throughout the depth profile of the cenote. Post-expedition sample analysis via comparative analysis of 16S rRNA gene sequences revealed a wealth of microbial diversity. Traditional Sanger gene sequencing combined with a barcoded-amplicon pyrosequencing approach revealed novel, phylum-level lineages from the domains Bacteria and Archaea; in addition, several novel subphylum lineages were also identified. Overall, DEPTHX successfully navigated and mapped Zacatón, and collected biological samples based on an autonomous decision, which revealed novel microbial diversity in a previously unexplored environment.

  17. Under-vehicle autonomous inspection through undercarriage signatures

    NASA Astrophysics Data System (ADS)

    Schoenherr, Edward; Smuda, Bill

    2005-05-01

    Increased threats to gate security have caused recent need for improved vehicle inspection methods at security checkpoints in various fields of defense and security. A fast, reliable system of under-vehicle inspection that detects possibly harmful or unwanted materials hidden on vehicle undercarriages and notifies the user of their presence, while allowing the user a safe standoff distance from the inspection site, is desirable. An autonomous under-vehicle inspection system would provide for this. The proposed system would function as follows: a low-clearance tele-operated robotic platform would be equipped with sonar/laser range-finding sensors as well as a video camera. As a vehicle to be inspected enters a checkpoint, the robot would autonomously navigate under the vehicle, using algorithms to detect tire locations as waypoints. During this navigation, data would be collected from the sonar/laser range-finding hardware. This range data would be used to compile an impression of the vehicle undercarriage. Once this impression is complete, the system would compare it to a database of pre-scanned undercarriage impressions. Based on vehicle makes and models, any variance between the undercarriage being inspected and the impression compared against in the database would be marked as potentially threatening. If such variances exist, the robot would navigate to these locations and place the video camera in such a manner that the location in question can be viewed from a standoff position through a TV monitor. At this time, manual control of the robot navigation and camera can be taken to apply further, more detailed inspection of the area/materials in question. After-market vehicle modifications would provide some difficulty, yet with enough pre-screening of such modifications the system should still prove accurate. Also, impression scans that are taken in the field can be stored and tagged with a vehicle's license plate number, and future

  18. Efforts toward an autonomous wheelchair - biomed 2011.

    PubMed

    Barrett, Steven; Streeter, Robert

    2011-01-01

    An autonomous wheelchair is in development to provide mobility to those with significant physical challenges. The overall goal of the project is to develop a wheelchair that is fully autonomous with the ability to navigate about an environment and negotiate obstacles. As a starting point for the project, we have reverse engineered the joystick control system of an off-the-shelf commercially available wheelchair. The joystick control has been replaced with a microcontroller-based system. The microcontroller has the capability to interface with a number of subsystems currently under development including wheel odometers, obstacle avoidance sensors, and ultrasonic-based wall sensors. This paper will discuss the microcontroller-based system and provide a detailed system description. Results of this study may be adapted to commercial or military robot control.

  19. Recent CESAR (Center for Engineering Systems Advanced Research) research activities in sensor based reasoning for autonomous machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pin, F.G.; de Saussure, G.; Spelt, P.F.

    1988-01-01

    This paper describes recent research activities at the Center for Engineering Systems Advanced Research (CESAR) in the area of sensor based reasoning, with emphasis being given to their application and implementation on our HERMIES-IIB autonomous mobile vehicle. These activities, including navigation and exploration in a-priori unknown and dynamic environments, goal recognition, vision-guided manipulation and sensor-driven machine learning, are discussed within the framework of a scenario in which an autonomous robot is asked to navigate through an unknown dynamic environment, explore, find and dock at a process control panel, read and understand the status of the panel's meters and dials, learn the functioning of the panel, and successfully manipulate the control devices of the panel to solve a maintenance emergency problem. A demonstration of the successful implementation of the algorithms on our HERMIES-IIB autonomous robot for resolution of this scenario is presented. Conclusions are drawn concerning the applicability of the methodologies to more general classes of problems and implications for future work on sensor-driven reasoning for autonomous robots are discussed. 8 refs., 3 figs.

  20. Semi autonomous mine detection system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Douglas Few; Roelof Versteeg; Herman Herman

    2010-04-01

    CMMAD is a risk reduction effort for the AMDS program. As part of CMMAD, multiple instances of semi-autonomous robotic mine detection systems were created. Each instance consists of a robotic vehicle equipped with sensors required for navigation and marking, countermine sensors, and a number of integrated software packages which provide for real-time processing of the countermine sensor data as well as integrated control of the robotic vehicle, the sensor actuator and the sensor. These systems were used to investigate critical interest functions (CIF) related to countermine robotic systems. To address the autonomy CIF, the INL-developed RIK was extended to allow for interaction with a mine sensor processing code (MSPC). In limited field testing this system performed well in detecting, marking and avoiding both AT and AP mines. Based on the results of the CMMAD investigation we conclude that autonomous robotic mine detection is feasible. In addition, CMMAD contributed critical technical advances with regard to sensing, data processing and sensor manipulation, which will advance the performance of future fieldable systems. As a result, no substantial technical barriers exist which preclude – from an autonomous robotic perspective – the rapid development and deployment of fieldable systems.

  1. Relative navigation for spacecraft formation flying

    NASA Technical Reports Server (NTRS)

    Hartman, Kate R.; Gramling, Cheryl J.; Lee, Taesul; Kelbel, David A.; Long, Anne C.

    1998-01-01

    The Goddard Space Flight Center Guidance, Navigation, and Control Center (GNCC) is currently developing and implementing advanced satellite systems to provide autonomous control of formation flyers. The initial formation maintenance capability will be flight-demonstrated on the Earth-Orbiter-1 (EO-1) satellite, which is planned under the National Aeronautics and Space Administration New Millennium Program to be a coflight with the Landsat-7 (L-7) satellite. Formation flying imposes relative navigation accuracy requirements in addition to the orbit accuracy requirements for the individual satellites. In the case of EO-1 and L-7, the two satellites are in nearly coplanar orbits, with a small difference in the longitude of the ascending node to compensate for the Earth's rotation. The GNCC has performed trajectory error analysis for the relative navigation of the EO-1/L-7 formation, as well as for a more advanced tracking configuration using cross-link satellite communications. This paper discusses the orbit determination and prediction accuracy achievable for EO-1 and L-7 under various tracking and orbit determination scenarios and discusses the expected relative separation errors in their formation flying configuration.

  2. Relative Navigation for Spacecraft Formation Flying

    NASA Technical Reports Server (NTRS)

    Hartman, Kate R.; Gramling, Cheryl J.; Lee, Taesul; Kelbel, David A.; Long, Anne C.

    1998-01-01

    The Goddard Space Flight Center Guidance, Navigation, and Control Center (GNCC) is currently developing and implementing advanced satellite systems to provide autonomous control of formation flyers. The initial formation maintenance capability will be flight-demonstrated on the Earth-Orbiter-1 (EO-1) satellite, which is planned under the National Aeronautics and Space Administration New Millennium Program to be a coflight with the Landsat-7 (L-7) satellite. Formation flying imposes relative navigation accuracy requirements in addition to the orbit accuracy requirements for the individual satellites. In the case of EO-1 and L-7, the two satellites are in nearly coplanar orbits, with a small difference in the longitude of the ascending node to compensate for the Earth's rotation. The GNCC has performed trajectory error analysis for the relative navigation of the EO-1/L-7 formation, as well as for a more advanced tracking configuration using cross-link satellite communications. This paper discusses the orbit determination and prediction accuracy achievable for EO-1 and L-7 under various tracking and orbit determination scenarios and discusses the expected relative separation errors in their formation flying configuration.

  3. The Mathematics of Navigating the Solar System

    NASA Technical Reports Server (NTRS)

    Hintz, Gerald

    2000-01-01

    In navigating spacecraft throughout the solar system, the space navigator relies on three academic disciplines - optimization, estimation, and control - that work on mathematical models of the real world. Thus, the navigator determines the flight path that will consume propellant and other resources in an efficient manner, determines where the craft is and predicts where it will go, and transfers it onto the optimal trajectory that meets operational and mission constraints. Mission requirements, for example, demand that observational measurements be made with sufficient precision that relativity must be modeled in collecting and fitting (the estimation process) the data, and propagating the trajectory. Thousands of parameters are now determined in near real-time to model the gravitational forces acting on a spacecraft in the vicinity of an irregularly shaped body. Completing these tasks requires mathematical models, analyses, and processing techniques. Newton, Gauss, Lambert, Legendre, and others are justly famous for their contributions to the mathematics of these tasks. More recently, graduate students participated in research to update the gravity model of the Saturnian system, including higher order gravity harmonics, tidal effects, and the influence of the rings. This investigation was conducted for the Cassini project to incorporate new trajectory modeling features in the navigation software. The resulting trajectory model will be used in navigating the 4-year tour of the Saturnian satellites. Also, undergraduate students are determining the ephemerides (locations versus time) of asteroids that will be used as reference objects in navigating the New Millennium's Deep Space 1 spacecraft autonomously.

  4. Real-time adaptive off-road vehicle navigation and terrain classification

    NASA Astrophysics Data System (ADS)

    Muller, Urs A.; Jackel, Lawrence D.; LeCun, Yann; Flepp, Beat

    2013-05-01

    We are developing a complete, self-contained autonomous navigation system for mobile robots that learns quickly, uses commodity components, and has the added benefit of emitting no radiation signature. It builds on the autonomous navigation technology developed by Net-Scale and New York University during the Defense Advanced Research Projects Agency (DARPA) Learning Applied to Ground Robots (LAGR) program and takes advantage of recent scientific advancements achieved during the DARPA Deep Learning program. In this paper we will present our approach and algorithms, show results from our vision system, discuss lessons learned from the past, and present our plans for further advancing vehicle autonomy.

  5. Autonomous detection of indoor and outdoor signs

    NASA Astrophysics Data System (ADS)

    Holden, Steven; Snorrason, Magnus; Goodsell, Thomas; Stevens, Mark R.

    2005-05-01

    Most goal-oriented mobile robot tasks involve navigation to one or more known locations. This is generally done using GPS coordinates and landmarks outdoors, or wall-following and fiducial marks indoors. Such approaches ignore the rich source of navigation information that is already in place for human navigation in all man-made environments: signs. A mobile robot capable of detecting and reading arbitrary signs could be tasked using directions that are intuitive to humans, and it could report its location relative to intuitive landmarks (a street corner, a person's office, etc.). Such ability would not require active marking of the environment and would be functional in the absence of GPS. In this paper we present an updated version of a system we call Sign Understanding in Support of Autonomous Navigation (SUSAN). This system relies on cues common to most signs: the presence of text, vivid color, and compact shape. By not relying on templates, SUSAN can detect a wide variety of signs: traffic signs, street signs, store-name signs, building directories, room signs, etc. In this paper we focus on the text detection capability. We present results summarizing probability of detection and false alarm rate across many scenes containing signs of very different designs and in a variety of lighting conditions.
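    As an illustration of the colour/compactness cues mentioned above, the short OpenCV sketch below finds vividly coloured, compact regions as sign candidates; the HSV thresholds, minimum area, and compactness ratio are illustrative assumptions, not SUSAN's actual parameters, and the text-detection stage is omitted.

      import cv2

      def detect_sign_candidates(bgr_image, min_area=500):
          """Return bounding boxes of vividly coloured, compact regions
          (a rough stand-in for SUSAN's colour/shape cues)."""
          hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
          # "Vivid" pixels: reasonably saturated and bright (assumed values).
          mask = cv2.inRange(hsv, (0, 120, 80), (179, 255, 255))
          contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                         cv2.CHAIN_APPROX_SIMPLE)
          boxes = []
          for c in contours:
              area = cv2.contourArea(c)
              if area < min_area:
                  continue
              x, y, w, h = cv2.boundingRect(c)
              # Compact regions fill most of their bounding box.
              if area / float(w * h) > 0.6:
                  boxes.append((x, y, w, h))
          return boxes

      # Usage (assumes an image file exists); candidate boxes would then be
      # passed to a text detector, as in the paper's text-detection stage.
      # img = cv2.imread("street_scene.jpg")
      # print(detect_sign_candidates(img))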

  6. New High-Altitude GPS Navigation Results from the Magnetospheric Multiscale Spacecraft and Simulations at Lunar Distances

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke B.; Bamford, William A.; Price, Samuel R.

    2017-01-01

    As reported in a companion work, in its first phase, NASA's 2015 highly elliptic Magnetospheric Multiscale (MMS) mission set a record for the highest altitude operational use of on-board GPS-based navigation, returning state estimates at 12 Earth radii. In early 2017 MMS transitioned to its second phase which doubled the apogee distance to 25 Earth radii, approaching halfway to the Moon. This paper will present results for GPS observability and navigation performance achieved in MMS Phase 2. Additionally, it will provide simulation results predicting the performance of the MMS navigation system applied to a pair of concept missions at Lunar distances. These studies will demonstrate how high-sensitivity GPS (or GNSS) receivers paired with onboard navigation software, as in MMS-Navigation system, can extend the envelope of autonomous onboard GPS navigation far from the Earth.

  7. Autonomous docking ground demonstration (category 3)

    NASA Technical Reports Server (NTRS)

    Lamkin, Steve L.; Eick, Richard E.; Baxter, James M.; Boyd, M. G.; Clark, Fred D.; Lee, Thomas Q.; Othon, L. T.; Prather, Joseph L.; Spehar, Peter T.; Teders, Rebecca J.

    1991-01-01

    The NASA Johnson Space Center (JSC) is involved in the development of an autonomous docking ground demonstration. The demonstration combines the technologies, expertise and facilities of the JSC Tracking and Communications Division (EE), Structures and Mechanics Division (ES), and the Navigation, Guidance and Control Division (EG) and their supporting contractors. The autonomous docking ground demonstration is an evaluation of the capabilities of the laser sensor system to support the docking phase (12 ft to contact) when operated in conjunction with the Guidance, Navigation and Control Software. The docking mechanism being used was developed for the Apollo Soyuz Test Program. This demonstration will be conducted using the Six-Degrees of Freedom (6-DOF) Dynamic Test System (DTS). The DTS environment simulates the Space Station Freedom as the stationary or target vehicle and the Orbiter as the active or chase vehicle. For this demonstration the laser sensor will be mounted on the target vehicle and the retroreflectors on the chase vehicle. This arrangement was used to prevent potential damage to the laser. The sensor system, GN&C, and 6-DOF DTS will be operated closed-loop. Initial conditions to simulate vehicle misalignments, translational and rotational, will be introduced within the constraints of the systems involved. A detailed description of each of the demonstration components (e.g., sensor system, GN&C, 6-DOF DTS and supporting computer configuration), including their capabilities and limitations, will be discussed. A demonstration architecture drawing and photographs of the test configuration will be presented.

  8. Autonomous docking ground demonstration (category 3)

    NASA Astrophysics Data System (ADS)

    Lamkin, Steve L.; Eick, Richard E.; Baxter, James M.; Boyd, M. G.; Clark, Fred D.; Lee, Thomas Q.; Othon, L. T.; Prather, Joseph L.; Spehar, Peter T.; Teders, Rebecca J.

    The NASA Johnson Space Center (JSC) is involved in the development of an autonomous docking ground demonstration. The demonstration combines the technologies, expertise and facilities of the JSC Tracking and Communications Division (EE), Structures and Mechanics Division (ES), and the Navigation, Guidance and Control Division (EG) and their supporting contractors. The autonomous docking ground demonstration is an evaluation of the capabilities of the laser sensor system to support the docking phase (12 ft to contact) when operated in conjunction with the Guidance, Navigation and Control Software. The docking mechanism being used was developed for the Apollo Soyuz Test Program. This demonstration will be conducted using the Six-Degrees of Freedom (6-DOF) Dynamic Test System (DTS). The DTS environment simulates the Space Station Freedom as the stationary or target vehicle and the Orbiter as the active or chase vehicle. For this demonstration the laser sensor will be mounted on the target vehicle and the retroreflectors on the chase vehicle. This arrangement was used to prevent potential damage to the laser. The sensor system, GN&C, and 6-DOF DTS will be operated closed-loop. Initial conditions to simulate vehicle misalignments, translational and rotational, will be introduced within the constraints of the systems involved. A detailed description of each of the demonstration components (e.g., sensor system, GN&C, 6-DOF DTS and supporting computer configuration), including their capabilities and limitations, will be discussed. A demonstration architecture drawing and photographs of the test configuration will be presented.

  9. Integrity Analysis of Real-Time Ppp Technique with Igs-Rts Service for Maritime Navigation

    NASA Astrophysics Data System (ADS)

    El-Diasty, M.

    2017-10-01

    Open sea and inland waterways are the most widely used mode for transporting goods worldwide. It is the International Maritime Organization (IMO) that defines the requirements for position fixing equipment for a worldwide radio-navigation system, in terms of accuracy, integrity, continuity, availability and coverage for the various phases of navigation. Satellite positioning systems can contribute to meet these requirements, as well as optimize marine transportation. Marine navigation usually consists of three major phases identified as ocean/coastal/port approach/inland waterway, in-port navigation, and automatic docking, with alert limits ranging from 25 m to 0.25 m. GPS positioning is widely used for many applications and is currently recognized by IMO for future maritime navigation. With the advancement in autonomous GPS positioning techniques such as Precise Point Positioning (PPP) and with the advent of new real-time GNSS correction services such as the IGS Real-Time Service (RTS), it is necessary to investigate the integrity of the PPP-based positioning technique along with the IGS-RTS service in terms of availability and reliability for safe navigation in maritime applications. This paper monitors the integrity of an autonomous real-time PPP-based GPS positioning system using the IGS real-time service (RTS) for maritime applications that require a minimum availability of integrity of 99.8 % to fulfil the IMO integrity standards. To examine the integrity of the real-time IGS-RTS PPP-based technique for maritime applications, kinematic data from a dual-frequency GPS receiver were collected onboard a vessel and processed with the real-time IGS-RTS PPP-based GPS positioning technique. It is shown that the availability of integrity of the real-time IGS-RTS PPP-based GPS solution is 100 % for all navigation phases and therefore fulfils the IMO integrity standards (99.8 % availability) immediately (after 1 second), after 2 minutes and after
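    The availability-of-integrity figure discussed above can be computed epoch by epoch as the fraction of time the protection level stays below the alert limit of each navigation phase. The sketch below assumes per-epoch horizontal protection levels are already available; the numbers are synthetic and purely illustrative.

      import numpy as np

      def integrity_availability(protection_levels_m, alert_limit_m):
          """Fraction of epochs whose protection level is below the alert
          limit of the navigation phase (e.g. 25 m coastal, 0.25 m docking)."""
          pl = np.asarray(protection_levels_m, dtype=float)
          return float(np.mean(pl < alert_limit_m))

      # Synthetic 1 Hz protection levels over one hour (illustrative only).
      rng = np.random.default_rng(0)
      hpl = 0.5 + 0.2 * rng.random(3600)  # metres
      for phase, alert_limit in [("ocean/coastal", 25.0),
                                 ("in-port", 2.5),
                                 ("automatic docking", 0.25)]:
          pct = 100.0 * integrity_availability(hpl, alert_limit)
          print(f"{phase}: {pct:.1f} % availability of integrity")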

  10. Navigation of the autonomous vehicle reverse movement

    NASA Astrophysics Data System (ADS)

    Rachkov, M.; Petukhov, S.

    2018-02-01

    The paper presents a mathematical formulation of vehicle reverse motion along a multi-link polygonal trajectory consisting of rectilinear segments interconnected by nodal points. The relevance of the problem stems from the need to solve a number of tasks: to recover the vehicle in the event of a communication break by returning along the trajectory already traversed, to avoid turning around on the ground when constrained by obstacles or dangerous conditions, or to perform a partial return for a subsequent bypass of an obstacle and continuation of forward movement. The navigation method assumes that, during forward movement, the reverse path is established using landmarks. To measure landmarks on board, a block of cameras is placed on a vehicle controlled by the operator over a radio channel. Errors in estimating deviation from the nominal trajectory are determined using multidimensional correlation analysis based on the dynamics of the lateral deviation error and the vehicle speed error. Experimental results showed relatively high accuracy in determining the state vector that supports reverse motion relative to the reference trajectory, with a practically acceptable error on return to the start point.
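    The lateral deviation error that drives the correlation analysis above can be obtained geometrically as the signed distance from the vehicle to the nearest rectilinear segment of the polygonal reference trajectory. The sketch below is a generic implementation of that geometry, not the authors' estimator.

      import numpy as np

      def lateral_deviation(position, waypoints):
          """Signed lateral distance (m) from 'position' to the closest
          segment of the polygonal trajectory given by ordered 'waypoints'.
          Positive values are to the left of the direction of travel."""
          p = np.asarray(position, dtype=float)
          wp = np.asarray(waypoints, dtype=float)
          best = None
          for a, b in zip(wp[:-1], wp[1:]):
              ab = b - a
              t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
              dist = np.linalg.norm(p - (a + t * ab))
              # Sign from the 2-D cross product of segment direction and offset.
              sign = np.sign(ab[0] * (p[1] - a[1]) - ab[1] * (p[0] - a[0]))
              if best is None or dist < abs(best):
                  best = dist if sign == 0 else sign * dist
          return best

      path = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)]  # two rectilinear legs
      print(lateral_deviation((4.0, 0.8), path))      # ~ +0.8 (left of leg 1)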

  11. Fast and reliable obstacle detection and segmentation for cross-country navigation

    NASA Technical Reports Server (NTRS)

    Talukder, A.; Manduchi, R.; Rankin, A.; Matthies, L.

    2002-01-01

    Obstacle detection is one of the main components of the control system of autonomous vehicles. In the case of indoor/urban navigation, obstacles are typically defined as surface points that are higher than the ground plane. This characterization, however, cannot be used in cross-country and unstructured environments, where the notion of ground plane is often not meaningful.
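    For the indoor/urban baseline characterized above, the "higher than the ground plane" test can be sketched directly: fit a plane to the point cloud by least squares and flag points whose height above it exceeds a clearance threshold. This is a generic illustration of that baseline (with an assumed threshold), not the cross-country method the paper goes on to develop.

      import numpy as np

      def obstacle_points(points_xyz, height_thresh=0.15):
          """Flag points higher than a least-squares ground plane.
          points_xyz: (N, 3) array; returns a boolean obstacle mask.
          Assumes ground points dominate the scene (indoor/urban case)."""
          pts = np.asarray(points_xyz, dtype=float)
          # Fit z = a*x + b*y + c to all points.
          A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
          coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
          return (pts[:, 2] - A @ coeffs) > height_thresh

      # Flat, slightly noisy ground plus one raised point.
      rng = np.random.default_rng(1)
      ground = np.column_stack([rng.uniform(0, 10, 500),
                                rng.uniform(0, 10, 500),
                                rng.normal(0.0, 0.01, 500)])
      cloud = np.vstack([ground, [[5.0, 5.0, 0.6]]])
      print(obstacle_points(cloud).sum())  # -> 1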

  12. 3D Reconfigurable MPSoC for Unmanned Spacecraft Navigation

    NASA Astrophysics Data System (ADS)

    Dekoulis, George

    2016-07-01

    This paper describes the design of a new lightweight spacecraft navigation system for unmanned space missions. The system addresses the demands for more efficient autonomous navigation in the near-Earth environment or deep space. The proposed instrumentation is directly suitable for unmanned systems operation and testing of new airborne prototypes for remote sensing applications. The system features a new sensor technology and significant improvements over existing solutions. Fluxgate type sensors have been traditionally used in unmanned defense systems such as target drones, guided missiles, rockets and satellites, however, the guidance sensors' configurations exhibit lower specifications than the presented solution. The current implementation is based on a recently developed material in a reengineered optimum sensor configuration for unprecedented low-power consumption. The new sensor's performance characteristics qualify it for spacecraft navigation applications. A major advantage of the system is the efficiency in redundancy reduction achieved in terms of both hardware and software requirements.

  13. Situationally driven local navigation for mobile robots. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Slack, Marc Glenn

    1990-01-01

    For mobile robots to autonomously accommodate dynamically changing navigation tasks in a goal-directed fashion, they must employ navigation plans. Any such plan must provide for the robot's immediate and continuous need for guidance while remaining highly flexible in order to avoid costly computation each time the robot's perception of the world changes. Due to the world's uncertainties, creation and maintenance of navigation plans cannot involve arbitrarily complex processes, as the robot's perception of the world will be in constant flux, requiring modifications to be made quickly if they are to be of any use. This work introduces navigation templates (NaT's) which are building blocks for the construction and maintenance of rough navigation plans which capture the relationship that objects in the world have to the current navigation task. By encoding only the critical relationship between the objects in the world and the navigation task, a NaT-based navigation plan is highly flexible; allowing new constraints to be quickly incorporated into the plan and existing constraints to be updated or deleted from the plan. To satisfy the robot's need for immediate local guidance, the NaT's forming the current navigation plan are passed to a transformation function. The transformation function analyzes the plan with respect to the robot's current location to quickly determine (a few times a second) the locally preferred direction of travel. This dissertation presents NaT's and the transformation function as well as the needed support systems to demonstrate the usefulness of the technique for controlling the actions of a mobile robot operating in an uncertain world.
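    The transformation function described above, which turns the active constraints into a locally preferred direction of travel several times a second, can be approximated by a simple weighted-vector scheme: attractive constraints pull toward the goal and repulsive constraints push away from nearby objects. The sketch below is only an illustrative analogue with assumed gains, not the NaT formulation itself.

      import numpy as np

      def preferred_direction(robot, goal, repulsors, influence=3.0):
          """Return a unit vector for the locally preferred heading.
          robot, goal: (x, y); repulsors: positions the current plan marks
          as 'keep away from'. 'influence' (m) is an assumed gain."""
          robot = np.asarray(robot, dtype=float)
          heading = np.asarray(goal, dtype=float) - robot
          heading /= np.linalg.norm(heading)
          for obj in repulsors:
              away = robot - np.asarray(obj, dtype=float)
              d = np.linalg.norm(away)
              if 0.0 < d < influence:
                  # Repulsion grows as the object gets closer.
                  heading += (influence - d) / influence * (away / d)
          return heading / np.linalg.norm(heading)

      # Goal straight ahead, one object to avoid slightly left of the path.
      print(preferred_direction((0, 0), (10, 0), repulsors=[(2.0, 0.5)]))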

  14. Bioinspired polarization navigation sensor for autonomous munitions systems

    NASA Astrophysics Data System (ADS)

    Giakos, G. C.; Quang, T.; Farrahi, T.; Deshpande, A.; Narayan, C.; Shrestha, S.; Li, Y.; Agarwal, M.

    2013-05-01

    Small unmanned aerial vehicles (SUAVs), micro air vehicles (MAVs), Automated Target Recognition (ATR), and munitions guidance require extreme operational agility and robustness, needs that can be partially addressed by efficient bioinspired imaging sensor designs capable of providing enhanced guidance, navigation and control (GNC) capabilities. Bioinspired imaging technology can prove useful either for long-distance surveillance of targets in a cluttered environment, or at close distances limited by space surroundings and obstructions. The purpose of this study is to explore the phenomenology of image formation by different insect eye architectures, which would directly benefit the areas of defense and security, in the following four distinct areas: a) fabrication of the bioinspired sensor, b) optical architecture, c) topology, and d) artificial intelligence. The outcome of this study indicates that bioinspired imaging can impact the areas of defense and security significantly by dedicated designs fitting into different combat scenarios and applications.

  15. Performance Characterization of a Landmark Measurement System for ARRM Terrain Relative Navigation

    NASA Technical Reports Server (NTRS)

    Shoemaker, Michael A.; Wright, Cinnamon; Liounis, Andrew J.; Getzandanner, Kenneth M.; Van Eepoel, John M.; DeWeese, Keith D.

    2016-01-01

    This paper describes the landmark measurement system being developed for terrain relative navigation on NASA's Asteroid Redirect Robotic Mission (ARRM), and the results of a performance characterization study given realistic navigational and model errors. The system is called Retina, and is derived from the stereo-photoclinometry methods widely used on other small-body missions. The system is simulated using synthetic imagery of the asteroid surface and discussion is given on various algorithmic design choices. Unlike other missions, ARRM's Retina is the first planned autonomous use of these methods during the close-proximity and descent phase of the mission.

  16. Performance Characterization of a Landmark Measurement System for ARRM Terrain Relative Navigation

    NASA Technical Reports Server (NTRS)

    Shoemaker, Michael; Wright, Cinnamon; Liounis, Andrew; Getzandanner, Kenneth; Van Eepoel, John; Deweese, Keith

    2016-01-01

    This paper describes the landmark measurement system being developed for terrain relative navigation on NASA's Asteroid Redirect Robotic Mission (ARRM), and the results of a performance characterization study given realistic navigational and model errors. The system is called Retina, and is derived from the stereophotoclinometry methods widely used on other small-body missions. The system is simulated using synthetic imagery of the asteroid surface and discussion is given on various algorithmic design choices. Unlike other missions, ARRM's Retina is the first planned autonomous use of these methods during the close-proximity and descent phase of the mission.

  17. Nonlinear unbiased minimum-variance filter for Mars entry autonomous navigation under large uncertainties and unknown measurement bias.

    PubMed

    Xiao, Mengli; Zhang, Yongbo; Fu, Huimin; Wang, Zhihua

    2018-05-01

    A high-precision navigation algorithm is essential for future Mars pinpoint landing missions. The unknown inputs caused by large uncertainties in atmospheric density and aerodynamic coefficients, as well as unknown measurement biases, may cause large estimation errors in conventional Kalman filters. This paper proposes a derivative-free version of the nonlinear unbiased minimum-variance filter for Mars entry navigation. The filter addresses this problem by estimating the state and the unknown measurement biases simultaneously, without requiring derivatives, yielding a high-precision algorithm for Mars entry navigation. IMU/radio-beacon integrated navigation is introduced in the simulation, and the results show that, with or without radio blackout, the proposed filter achieves accurate state estimation, markedly better than the conventional unscented Kalman filter, demonstrating its suitability as a high-precision Mars entry navigation algorithm. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  18. The Design of a Navigator for a Testbed Autonomous Underwater Vehicle

    DTIC Science & Technology

    1989-12-01

    AD-A231 733, Naval Postgraduate School, Monterey, California.

  19. Relative navigation and attitude determination using a GPS/INS integrated system near the International Space Station

    NASA Astrophysics Data System (ADS)

    Um, Jaeyong

    2001-08-01

    The Space Integrated GPS/INS (SIGI) sensor is the primary navigation and attitude determination source for the International Space Station (ISS). The SIGI was successfully demonstrated on-orbit for the first time in the SIGI Orbital Attitude Readiness (SOAR) demonstration on the Space Shuttle Atlantis in May 2000. Numerous proximity operations near the ISS have been and will be performed over the lifetime of the Station. The development of an autonomous relative navigation system is needed to improve the safety and efficiency of vehicle operations near the ISS. A hardware simulation study was performed for GPS-based relative navigation using the state vector difference approach and the interferometric approach in the absence of multipath. The interferometric approach, where the relative states are estimated directly, showed comparable results for a 1 km baseline. One of the most pressing current technical issues is the design of an autonomous relative navigation system in the proximity of the ISS, where GPS signals are blocked and maneuvers happen frequently. An integrated GPS/INS system is investigated for the possibility of a fully autonomous relative navigation system. Another application of GPS measurements is determination of the vehicle's orientation in space. This study used the SOAR experiment data to characterize the SIGI's on-orbit performance for attitude determination. A cold start initialization algorithm was developed for integer ambiguity resolution in any initial orientation. The original algorithm used in the SIGI, which was developed for terrestrial applications, had an operational limitation in the integer ambiguity resolution that limited its effectiveness in space. The new algorithm was tested using the SOAR data and has been incorporated in the current SIGI flight software. The attitude estimation performance was examined using two different GPS/INS integration algorithms. The GPS/INS attitude solution using the SOAR data was as

  20. Advancing Navigation, Timing, and Science with the Deep Space Atomic Clock

    NASA Technical Reports Server (NTRS)

    Ely, Todd A.; Seubert, Jill; Bell, Julia

    2014-01-01

    NASA's Deep Space Atomic Clock mission is developing a small, highly stable mercury ion atomic clock with an Allan deviation of at most 1e-14 at one day, and with current estimates near 3e-15. This stability enables one-way radiometric tracking data with accuracy equivalent to and, in certain conditions, better than current two-way deep space tracking data; allowing a shift to a more efficient and flexible one-way deep space navigation architecture. DSAC-enabled one-way tracking will benefit navigation and radio science by increasing the quantity and quality of tracking data. Additionally, DSAC would be a key component to fully-autonomous onboard radio navigation useful for time-sensitive situations. Potential deep space applications of DSAC are presented, including orbit determination of a Mars orbiter and gravity science on a Europa flyby mission.
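    The stability figure quoted above (Allan deviation near 1e-14 to 3e-15 at one day) is conventionally computed with the overlapping Allan deviation of fractional-frequency data. The sketch below implements that standard estimator on synthetic white frequency noise; the data are illustrative only.

      import numpy as np

      def overlapping_allan_deviation(y, m):
          """Overlapping Allan deviation of fractional-frequency samples y
          at averaging factor m (tau = m * tau0 for sample interval tau0)."""
          y = np.asarray(y, dtype=float)
          if len(y) < 2 * m:
              raise ValueError("not enough samples for this averaging factor")
          csum = np.concatenate(([0.0], np.cumsum(y)))
          block = (csum[m:] - csum[:-m]) / m      # overlapping m-sample means
          diffs = block[m:] - block[:-m]
          return np.sqrt(0.5 * np.mean(diffs ** 2))

      rng = np.random.default_rng(2)
      y = 1e-13 * rng.standard_normal(100_000)    # white frequency noise
      for m in (1, 10, 100, 1000):
          print(m, overlapping_allan_deviation(y, m))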

  1. LWIR passive perception system for stealthy unmanned ground vehicle night operations

    NASA Astrophysics Data System (ADS)

    Lee, Daren; Rankin, Arturo; Huertas, Andres; Nash, Jeremy; Ahuja, Gaurav; Matthies, Larry

    2016-05-01

    Resupplying forward-deployed units in rugged terrain in the presence of hostile forces creates a high threat to manned air and ground vehicles. An autonomous unmanned ground vehicle (UGV) capable of navigating stealthily at night in off-road and on-road terrain could significantly increase the safety and success rate of such resupply missions for warfighters. Passive night-time perception of terrain and obstacle features is a vital requirement for such missions. As part of the ONR 30 Autonomy Team, the Jet Propulsion Laboratory developed a passive, low-cost night-time perception system under the ONR Expeditionary Maneuver Warfare and Combating Terrorism Applied Research program. Using a stereo pair of forward looking LWIR uncooled microbolometer cameras, the perception system generates disparity maps using a local window-based stereo correlator to achieve real-time performance while maintaining low power consumption. To overcome the lower signal-to-noise ratio and spatial resolution of LWIR thermal imaging technologies, a series of pre-filters were applied to the input images to increase the image contrast and stereo correlator enhancements were applied to increase the disparity density. To overcome false positives generated by mixed pixels, noisy disparities from repeated textures, and uncertainty in far range measurements, a series of consistency, multi-resolution, and temporal based post-filters were employed to improve the fidelity of the output range measurements. The stereo processing leverages multi-core processors and runs under the Robot Operating System (ROS). The night-time passive perception system was tested and evaluated on fully autonomous testbed ground vehicles at SPAWAR Systems Center Pacific (SSC Pacific) and Marine Corps Base Camp Pendleton, California. This paper describes the challenges, techniques, and experimental results of developing a passive, low-cost perception system for night-time autonomous navigation.
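    A minimal version of the pre-filter-then-correlate pipeline described above can be sketched with OpenCV: contrast-limited adaptive histogram equalization (CLAHE) boosts the low-contrast thermal imagery before a local block-matching correlator computes the disparity map. The parameter values are assumptions for illustration, not those of the JPL system, and the multi-resolution and temporal post-filters are omitted.

      import cv2
      import numpy as np

      def lwir_disparity(left_gray, right_gray, num_disp=64, block=15):
          """Pre-filter a rectified thermal stereo pair and compute disparity."""
          clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
          left_f = clahe.apply(left_gray)      # contrast boost (pre-filter)
          right_f = clahe.apply(right_gray)
          matcher = cv2.StereoBM_create(numDisparities=num_disp, blockSize=block)
          disp = matcher.compute(left_f, right_f).astype(np.float32) / 16.0
          disp[disp < 0] = np.nan              # simple post-filter: drop invalid
          return disp

      # Usage (assumes rectified 8-bit LWIR images are available on disk):
      # left = cv2.imread("left_lwir.png", cv2.IMREAD_GRAYSCALE)
      # right = cv2.imread("right_lwir.png", cv2.IMREAD_GRAYSCALE)
      # print(np.nanmean(lwir_disparity(left, right)))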

  2. Interaction dynamics of multiple mobile robots with simple navigation strategies

    NASA Technical Reports Server (NTRS)

    Wang, P. K. C.

    1989-01-01

    The global dynamic behavior of multiple interacting autonomous mobile robots with simple navigation strategies is studied. Here, the effective spatial domain of each robot is taken to be a closed ball about its mass center. It is assumed that each robot has a specified cone of visibility such that interaction with other robots takes place only when they enter its visibility cone. Based on a particle model for the robots, various simple homing and collision-avoidance navigation strategies are derived. Then, an analysis of the dynamical behavior of the interacting robots in unbounded spatial domains is made. The article concludes with the results of computer simulations studies of two or more interacting robots.
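    The particle-model behaviour analyzed above can be reproduced with a small simulation in which each robot steers toward its home position but veers away from any robot inside its visibility cone. The gains, cone angle and sensing range below are arbitrary illustrative values.

      import numpy as np

      def step(positions, homes, dt=0.1, speed=1.0,
               cone_half_angle=np.pi / 3, sense_range=2.0):
          """One update of a simple homing + collision-avoidance model."""
          pos = np.asarray(positions, dtype=float)
          new = pos.copy()
          for i, p in enumerate(pos):
              desired = np.asarray(homes[i], dtype=float) - p
              if np.linalg.norm(desired) > 1e-9:
                  desired = desired / np.linalg.norm(desired)
              heading = desired.copy()
              for j, q in enumerate(pos):
                  if j == i:
                      continue
                  rel = q - p
                  d = np.linalg.norm(rel)
                  # Interact only with robots inside the visibility cone.
                  if 1e-9 < d < sense_range and \
                          np.dot(rel / d, desired) > np.cos(cone_half_angle):
                      heading -= (sense_range - d) / sense_range * (rel / d)
              if np.linalg.norm(heading) > 1e-9:
                  heading = heading / np.linalg.norm(heading)
              new[i] = p + speed * dt * heading
          return new

      pos = np.array([[0.0, 0.0], [4.0, 0.2]])
      homes = [[5.0, 0.0], [-1.0, 0.0]]        # the robots must swap sides
      for _ in range(80):
          pos = step(pos, homes)
      print(pos)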

  3. Constrained navigation for unmanned systems

    NASA Astrophysics Data System (ADS)

    Vasseur, Laurent; Gosset, Philippe; Carpentier, Luc; Marion, Vincent; Morillon, Joel G.; Ropars, Patrice

    2005-05-01

    The French Military Robotic Study Program (introduced in Aerosense 2003), sponsored by the French Defense Procurement Agency and managed by Thales as the prime contractor, focuses on about 15 robotic themes which can provide an immediate "operational add-on value". The paper details the "constrained navigation" study (named TEL2), whose main goal is to identify and test a well-balanced task sharing between man and machine to accomplish a robotic task that cannot yet be performed autonomously because of technological limitations. The chosen function is "obstacle avoidance" on rough ground at fairly high speed (40 km/h). State-of-the-art algorithms have been implemented to perform autonomous obstacle avoidance and following of forest borders, using a scanning laser sensor and standard localization functions. Such an "obstacle avoidance" function works well most of the time, but sometimes fails. The study analyzed how the remote operator can manage such failures so that the system remains fully operationally reliable; the operator can act in two ways: a) finely adjust the vehicle's current heading; b) take control of the vehicle "on the fly" (without stopping) and return it to autonomous behavior when motion is secured again. The paper also presents results obtained from the military acceptance tests performed on the French 4x4 DARDS ATD.

  4. Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) Project Status as of May 2010

    NASA Technical Reports Server (NTRS)

    Striepe, Scott A.; Epp, Chirold D.; Robertson, Edward A.

    2010-01-01

    This paper presents the current status of NASA's Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) Project. The ALHAT team has completed several flight tests and two major design analysis cycles. These tests and analyses examine terrain relative navigation sensors, hazard detection and avoidance sensors and algorithms, and hazard relative navigation algorithms, as well as the guidance and navigation system that uses these ALHAT functions. The next flight test is scheduled for July 2010. The paper contains results from completed flight tests and analysis cycles. ALHAT system status and upcoming tests and analyses are also addressed. The current ALHAT plans as of May 2010 are discussed. Application of the ALHAT system to landing on bodies other than the Moon is included.

  5. Autonomous Robotic Inspection in Tunnels

    NASA Astrophysics Data System (ADS)

    Protopapadakis, E.; Stentoumis, C.; Doulamis, N.; Doulamis, A.; Loupos, K.; Makantasis, K.; Kopsiaftis, G.; Amditis, A.

    2016-06-01

    In this paper, an automatic robotic inspector for tunnel assessment is presented. The proposed platform is able to autonomously navigate within civil infrastructure, capture stereo images, and process/analyse them in order to identify defect types. First, cracks are detected via deep learning approaches. Then, a detailed 3D model of the cracked area is created using photogrammetric methods. Finally, laser profiling of the tunnel's lining is performed for a narrow region close to the detected crack, allowing potential deformations to be deduced. The robotic platform consists of an autonomous mobile vehicle and a crane arm, guided by the computer-vision-based crack detector, carrying ultrasound sensors, the stereo cameras and the laser scanner. Visual inspection is based on convolutional neural networks, which support the creation of high-level discriminative features for complex non-linear pattern classification. Real-time 3D information is then accurately calculated, and the crack position and orientation are passed to the robotic platform. The entire system has been evaluated in railway and road tunnels, i.e. on the Egnatia Highway and in London underground infrastructure.

  6. An integrated autonomous rendezvous and docking system architecture using Centaur modern avionics

    NASA Technical Reports Server (NTRS)

    Nelson, Kurt

    1991-01-01

    The avionics system for the Centaur upper stage is in the process of being modernized with the current state-of-the-art in strapdown inertial guidance equipment. This equipment includes an integrated flight control processor with a ring laser gyro based inertial guidance system. This inertial navigation unit (INU) uses two MIL-STD-1750A processors and communicates over the MIL-STD-1553B data bus. Commands are translated into load activation through a Remote Control Unit (RCU) which incorporates the use of solid state relays. Also, a programmable data acquisition system replaces separate multiplexer and signal conditioning units. This modern avionics suite is currently being enhanced through independent research and development programs to provide autonomous rendezvous and docking capability using advanced cruise missile image processing technology and integrated GPS navigational aids. A system concept was developed to combine these technologies in order to achieve a fully autonomous rendezvous, docking, and autoland capability. The current system architecture and the evolution of this architecture using advanced modular avionics concepts being pursued for the National Launch System are discussed.

  7. Advancing Underwater Acoustic Communication for Autonomous Distributed Networks via Sparse Channel Sensing, Coding, and Navigation Support

    DTIC Science & Technology

    2014-09-30

    This effort advances underwater acoustic communication technologies for autonomous distributed underwater networks through innovative signal processing, coding, and navigation support, including OFDM-modulated dynamic coded cooperation in underwater acoustic channels and work on localization, networking, and a testbed.

  8. Autonomous Flight Safety System

    NASA Technical Reports Server (NTRS)

    Ferrell, Bob; Santuro, Steve; Simpson, James; Zoerner, Roger; Bull, Barton; Lanzi, Jim

    2004-01-01

    Autonomous Flight Safety System (AFSS) is an independent flight safety system designed for small to medium sized expendable launch vehicles launching from or needing range safety protection while overlying relatively remote locations. AFSS replaces the need for a man-in-the-loop to make decisions for flight termination. AFSS could also serve as the prototype for an autonomous manned flight crew escape advisory system. AFSS utilizes onboard sensors and processors to emulate the human decision-making process using rule-based software logic and can dramatically reduce safety response time during critical launch phases. The Range Safety flight path nominal trajectory, its deviation allowances, limit zones and other flight safety rules are stored in the onboard computers. Position, velocity and attitude data obtained from onboard global positioning system (GPS) and inertial navigation system (INS) sensors are compared with these rules to determine the appropriate action to ensure that people and property are not jeopardized. The final system will be fully redundant and independent with multiple processors, sensors, and dead man switches to prevent inadvertent flight termination. AFSS is currently in Phase III which includes updated algorithms, integrated GPS/INS sensors, large scale simulation testing and initial aircraft flight testing.
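    The rule-based decision logic sketched below is a deliberately simplified caricature of the concept described above: the current GPS/INS position is checked against a nominal corridor and stored keep-out (limit) zones, and an action is returned. The corridor radius and zone geometry are invented for illustration and are not AFSS rules.

      import numpy as np

      def flight_safety_action(position, nominal_point, corridor_radius_m,
                               limit_zones):
          """Return 'CONTINUE', 'WARN', or 'TERMINATE' from simple stored rules.
          limit_zones: list of (center_xyz, radius_m) keep-out spheres."""
          p = np.asarray(position, dtype=float)
          for center, radius in limit_zones:
              if np.linalg.norm(p - np.asarray(center, dtype=float)) < radius:
                  return "TERMINATE"            # entered a protected zone
          deviation = np.linalg.norm(p - np.asarray(nominal_point, dtype=float))
          if deviation > corridor_radius_m:
              return "WARN"                     # outside the nominal corridor
          return "CONTINUE"

      zones = [((12_000.0, 3_000.0, 0.0), 1_500.0)]   # illustrative keep-out
      print(flight_safety_action((11_500.0, 2_600.0, 800.0),
                                 nominal_point=(10_000.0, 0.0, 900.0),
                                 corridor_radius_m=3_000.0,
                                 limit_zones=zones))  # -> TERMINATE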

  9. Free-Flight Terrestrial Rocket Lander Demonstration for NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) System

    NASA Technical Reports Server (NTRS)

    Rutishauser, David K.; Epp, Chirold; Robertson, Ed

    2012-01-01

    The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project is chartered to develop an autonomous system combining guidance, navigation and control with terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles, and to mature it to a Technology Readiness Level (TRL) of six. The ALHAT System must be capable of identifying and avoiding surface hazards to enable a safe and accurate landing to within tens of meters of designated and certified landing sites anywhere on a planetary surface under any lighting conditions. Since its inception in 2006, the ALHAT Project has executed four field test campaigns to characterize and mature sensors and algorithms that support real-time hazard detection and global/local precision navigation for planetary landings. The driving objective for Government Fiscal Year 2012 (GFY2012) is to successfully demonstrate autonomous, real-time, closed-loop operation of the ALHAT system in a realistic free flight scenario on Earth using the Morpheus lander developed at the Johnson Space Center (JSC). This goal represents an aggressive target consistent with a lean engineering culture of rapid prototyping and development. This culture is characterized by prioritizing early implementation to gain practical lessons learned and then building on this knowledge with subsequent prototyping design cycles of increasing complexity culminating in the implementation of the baseline design. This paper provides an overview of the ALHAT/Morpheus flight demonstration activities in GFY2012, including accomplishments, current status, results, and lessons learned. The ALHAT/Morpheus effort is also described in the context of a technology path in support of future crewed and robotic planetary exploration missions based upon the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN).

  10. Relative Navigation for Formation Flying of Spacecraft

    NASA Technical Reports Server (NTRS)

    Alonso, Roberto; Du, Ju-Young; Hughes, Declan; Junkins, John L.; Crassidis, John L.

    2001-01-01

    This paper presents a robust and efficient approach for relative navigation and attitude estimation of spacecraft flying in formation. This approach uses measurements from a new optical sensor that provides a line of sight vector from the master spacecraft to the secondary satellite. The overall system provides a novel, reliable, and autonomous relative navigation and attitude determination system, employing relatively simple electronic circuits with modest digital signal processing requirements and is fully independent of any external systems. Experimental calibration results are presented, which are used to achieve accurate line of sight measurements. State estimation for formation flying is achieved through an optimal observer design. Also, because the rotational and translational motions are coupled through the observation vectors, three approaches are suggested to separate both signals just for stability analysis. Simulation and experimental results indicate that the combined sensor/estimator approach provides accurate relative position and attitude estimates.

  11. Light Detection and Ranging-Based Terrain Navigation: A Concept Exploration

    NASA Technical Reports Server (NTRS)

    Campbell, Jacob; UijtdeHaag, Maarten; vanGraas, Frank; Young, Steve

    2003-01-01

    This paper discusses the use of Airborne Light Detection And Ranging (LiDAR) equipment for terrain navigation. Airborne LiDAR is a relatively new technology used primarily by the geo-spatial mapping community to produce highly accurate and dense terrain elevation maps. In this paper, the term LiDAR refers to a scanning laser ranger rigidly mounted to an aircraft, as opposed to an integrated sensor system that consists of a scanning laser ranger integrated with Global Positioning System (GPS) and Inertial Measurement Unit (IMU) data. Data from the laser range scanner and IMU will be integrated with a terrain database to estimate the aircraft position and data from the laser range scanner will be integrated with GPS to estimate the aircraft attitude. LiDAR data was collected using NASA Dryden's DC-8 flying laboratory in Reno, NV and was used to test the proposed terrain navigation system. The results of LiDAR-based terrain navigation shown in this paper indicate that airborne LiDAR is a viable technology enabler for fully autonomous aircraft navigation. The navigation performance is highly dependent on the quality of the terrain databases used for positioning and therefore high-resolution (2 m post-spacing) data was used as the terrain reference.
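    The terrain-referenced positioning idea above can be illustrated with a one-dimensional grid search: slide the LiDAR-derived elevation profile over the stored terrain database and keep the horizontal offset with the smallest elevation residual. The synthetic profile and search range below are assumptions; the paper's estimator integrates laser, IMU, and database data far more tightly.

      import numpy as np

      def best_along_track_offset(measured_profile, dem_profile, max_shift):
          """Integer cell shift of the measured elevation profile that best
          matches the terrain database (minimum RMS elevation residual)."""
          m = np.asarray(measured_profile, dtype=float)
          dem = np.asarray(dem_profile, dtype=float)
          best_shift, best_rms = 0, np.inf
          for s in range(-max_shift, max_shift + 1):
              seg = dem[max_shift + s: max_shift + s + len(m)]
              rms = np.sqrt(np.mean((m - seg) ** 2))
              if rms < best_rms:
                  best_shift, best_rms = s, rms
          return best_shift, best_rms

      dem = 20.0 * np.sin(np.linspace(0.0, 6 * np.pi, 400))  # synthetic DEM
      true_offset = 7
      measured = dem[50 + true_offset: 150 + true_offset] \
          + np.random.default_rng(3).normal(0.0, 0.3, 100)
      print(best_along_track_offset(measured, dem[30:170], max_shift=20))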

  12. Navigating a Mobile Robot Across Terrain Using Fuzzy Logic

    NASA Technical Reports Server (NTRS)

    Seraji, Homayoun; Howard, Ayanna; Bon, Bruce

    2003-01-01

    A strategy for autonomous navigation of a robotic vehicle across hazardous terrain involves the use of a measure of traversability of terrain within a fuzzy-logic conceptual framework. This navigation strategy requires no a priori information about the environment. Fuzzy logic was selected as a basic element of this strategy because it provides a formal methodology for representing and implementing a human driver's heuristic knowledge and operational experience. Within a fuzzy-logic framework, the attributes of human reasoning and decision-making can be formulated by simple IF (antecedent), THEN (consequent) rules coupled with easily understandable and natural linguistic representations. The linguistic values in the rule antecedents convey the imprecision associated with measurements taken by sensors onboard a mobile robot, while the linguistic values in the rule consequents represent the vagueness inherent in the reasoning processes to generate the control actions. The operational strategies of the human expert driver can be transferred, via fuzzy logic, to a robot-navigation strategy in the form of a set of simple conditional statements composed of linguistic variables. These linguistic variables are defined by fuzzy sets in accordance with user-defined membership functions. The main advantages of a fuzzy navigation strategy lie in the ability to extract heuristic rules from human experience and to obviate the need for an analytical model of the robot navigation process.
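    A minimal fuzzy-rule evaluation in the spirit of the strategy above: triangular membership functions grade a crisp traversability index into linguistic values, and three IF-THEN rules are combined by a weighted average to produce a speed command. The break-points and rule outputs are illustrative assumptions, not the published rule base.

      def tri(x, a, b, c):
          """Triangular membership function peaking at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x < b else (c - x) / (c - b)

      def fuzzy_speed(traversability):
          """IF traversability is LOW THEN stop; IF MEDIUM THEN slow;
          IF HIGH THEN fast. Input is a crisp index in [0, 1]."""
          low = tri(traversability, -0.5, 0.0, 0.5)
          med = tri(traversability, 0.2, 0.5, 0.8)
          high = tri(traversability, 0.5, 1.0, 1.5)
          rule_outputs = {0.0: low, 0.4: med, 1.2: high}   # speeds in m/s
          total = sum(rule_outputs.values())
          # Defuzzify with a weighted average of the rule output speeds.
          return sum(v * w for v, w in rule_outputs.items()) / total if total else 0.0

      for t in (0.1, 0.5, 0.9):
          print(t, round(fuzzy_speed(t), 2))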

  13. Mapping and navigational control for a “smart” wheelchair.

    PubMed

    Schultz, Dana L; Shea, Kathleen M; Barrett, Steven F

    2012-01-01

    A “smart” wheelchair is in development to provide mobility to those unable to control a traditional wheelchair. A “smart” wheelchair is an autonomous machine with the ability to navigate a mapped environment while avoiding obstacles. The flexibility and complex design of “smart” wheelchairs have made those currently available expensive. Ongoing research at the University of Wyoming has been aimed at designing a cheaper, alternative control system that could be interfaced with a typical powered wheelchair. The goal of this project is to determine methods for mapping and navigational control for the wheelchair. The control system acquires data from eighteen sensors and uses the data to navigate around a pre-programmed map which is stored on a micro SD card. The control system also provides a user interface in the form of a touchscreen LCD. The designed system will be an easy-to-use and cost effective alternative to current “smart” wheelchair technology.

  14. Dual RF Astrodynamic GPS Orbital Navigator Satellite

    NASA Technical Reports Server (NTRS)

    Kanipe, David B.; Provence, Robert Steve; Straube, Timothy M.; Reed, Helen; Bishop, Robert; Lightsey, Glenn

    2009-01-01

    Dual RF Astrodynamic GPS Orbital Navigator Satellite (DRAGONSat) will demonstrate autonomous rendezvous and docking (ARD) in low Earth orbit (LEO) and gather flight data with a global positioning system (GPS) receiver strictly designed for space applications. ARD is the capability of two independent spacecraft to rendezvous in orbit and dock without crew intervention. DRAGONSat consists of two picosatellites (one built by the University of Texas and one built by Texas A and M University) and the Space Shuttle Payload Launcher (SSPL); this project will ultimately demonstrate ARD in LEO.

  15. A design strategy for autonomous systems

    NASA Technical Reports Server (NTRS)

    Forster, Pete

    1989-01-01

    Some solutions to crucial issues regarding the competent performance of an autonomously operating robot are identified; namely, that of handling multiple and variable data sources containing overlapping information and maintaining coherent operation while responding adequately to changes in the environment. Support for the ideas developed for the construction of such behavior are extracted from speculations in the study of cognitive psychology, an understanding of the behavior of controlled mechanisms, and the development of behavior-based robots in a few robot research laboratories. The validity of these ideas is supported by some simple simulation experiments in the field of mobile robot navigation and guidance.

  16. Application of a distributed systems architecture for increased speed in image processing on an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Wright, Adam A.; Momin, Orko; Shin, Young Ho; Shakya, Rahul; Nepal, Kumud; Ahlgren, David J.

    2010-01-01

    This paper presents the application of a distributed systems architecture to an autonomous ground vehicle, Q, that participates in both the autonomous and navigation challenges of the Intelligent Ground Vehicle Competition. In the autonomous challenge the vehicle is required to follow a course, while avoiding obstacles and staying within the course boundaries, which are marked by white lines. For the navigation challenge, the vehicle is required to reach a set of target destinations, known as way points, with given GPS coordinates and avoid obstacles that it encounters in the process. Previously the vehicle utilized a single laptop to execute all processing activities including image processing, sensor interfacing and data processing, path planning and navigation algorithms and motor control. National Instruments' (NI) LabVIEW served as the programming language for software implementation. As an upgrade to last year's design, an NI compact Reconfigurable Input/Output system (cRIO) was incorporated into the system architecture. The cRIO is NI's solution for rapid prototyping that is equipped with a real-time processor, an FPGA and modular input/output. Under the current system, the real-time processor handles the path planning and navigation algorithms, and the FPGA gathers and processes sensor data. This setup leaves the laptop to focus on running the image processing algorithm. Image processing, as previously presented by Nepal et al., is a multi-step line extraction algorithm and constitutes the largest processor load. This distributed approach results in a faster image processing algorithm, which was previously Q's bottleneck. Additionally, the path planning and navigation algorithms are executed more reliably on the real-time processor due to the deterministic nature of operation. The implementation of this architecture required exploration of various inter-system communication techniques. Data transfer between the laptop and the real-time processor using UDP packets

  17. Single-Frequency GPS Relative Navigation in a High Ionosphere Orbital Environment

    NASA Technical Reports Server (NTRS)

    Conrad, Patrick R.; Naasz, Bo J.

    2007-01-01

    The Global Positioning System (GPS) provides a convenient source for space vehicle relative navigation measurements, especially for low Earth orbit formation flying and autonomous rendezvous mission concepts. For single-frequency GPS receivers, ionospheric path delay can be a significant error source if not properly mitigated. In particular, ionospheric effects are known to cause significant radial position error bias and add dramatically to relative state estimation error if the onboard navigation software does not force the use of measurements from common or shared GPS space vehicles. Results from GPS navigation simulations are presented for a pair of space vehicles flying in formation and using GPS pseudorange measurements to perform absolute and relative orbit determination. With careful measurement selection techniques relative state estimation accuracy to less than 20 cm with standard GPS pseudorange processing and less than 10 cm with single-differenced pseudorange processing is shown.
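    The single-differencing step referred to above cancels errors common to both receivers, notably ionospheric delay along similar lines of sight, by subtracting pseudoranges to the same GPS satellite. The sketch below only forms those differences over common satellites; the full relative orbit-determination filter is not shown, and the numbers are invented.

      def single_difference(pr_chief, pr_deputy):
          """Single-differenced pseudoranges over the satellites tracked by
          both receivers. Inputs are dicts {PRN: pseudorange_m}."""
          common = sorted(set(pr_chief) & set(pr_deputy))
          return {prn: pr_deputy[prn] - pr_chief[prn] for prn in common}

      chief = {5: 21_345_678.1, 12: 22_987_654.7, 25: 20_111_222.9}
      deputy = {5: 21_345_702.6, 25: 20_111_240.3, 31: 23_000_000.0}
      # PRNs 5 and 25 are common; errors shared by both receivers largely
      # cancel in each differenced value.
      print(single_difference(chief, deputy))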

  18. Precision analysis of autonomous orbit determination using star sensor for Beidou MEO satellite

    NASA Astrophysics Data System (ADS)

    Shang, Lin; Chang, Jiachao; Zhang, Jun; Li, Guotong

    2018-04-01

    This paper focuses on the autonomous orbit determination accuracy of Beidou MEO satellite using the onboard observations of the star sensors and infrared horizon sensor. A polynomial fitting method is proposed to calibrate the periodic error in the observation of the infrared horizon sensor, which will greatly influence the accuracy of autonomous orbit determination. Test results show that the periodic error can be eliminated using the polynomial fitting method. The User Range Error (URE) of Beidou MEO satellite is less than 2 km using the observations of the star sensors and infrared horizon sensor for autonomous orbit determination. The error of the Right Ascension of Ascending Node (RAAN) is less than 60 μrad and the observations of star sensors can be used as a spatial basis for Beidou MEO navigation constellation.
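    The calibration idea stated above, removing a slowly varying systematic component from the horizon-sensor observations by polynomial fitting over a data arc, can be sketched with numpy.polyfit; the signal, noise level, and polynomial order below are invented for illustration and are not the Beidou processing parameters.

      import numpy as np

      def polyfit_calibrate(times, residuals, order=5):
          """Fit a polynomial to sensor residuals over one data arc and
          return the calibrated (de-trended) residuals."""
          t = np.asarray(times, dtype=float)
          r = np.asarray(residuals, dtype=float)
          coeffs = np.polyfit(t, r, order)
          return r - np.polyval(coeffs, t)

      t = np.linspace(0.0, 1.0, 200)                 # one arc, normalized time
      systematic = 0.002 * np.sin(2 * np.pi * t)     # slowly varying error
      noise = np.random.default_rng(4).normal(0.0, 1e-4, t.size)
      calibrated = polyfit_calibrate(t, systematic + noise)
      print(np.std(systematic + noise), np.std(calibrated))  # spread shrinks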

  19. Advanced Integration of WiFi and Inertial Navigation Systems for Indoor Mobile Positioning

    NASA Astrophysics Data System (ADS)

    Evennou, Frédéric; Marx, François

    2006-12-01

    This paper presents an aided dead-reckoning navigation structure and signal processing algorithms for self localization of an autonomous mobile device by fusing pedestrian dead reckoning and WiFi signal strength measurements. WiFi and inertial navigation systems (INS) are used for positioning and attitude determination in a wide range of applications. Over the last few years, a number of low-cost inertial sensors have become available. Although they exhibit large errors, WiFi measurements can be used to correct the drift weakening the navigation based on this technology. On the other hand, INS sensors can interact with the WiFi positioning system as they provide high-accuracy real-time navigation. A structure based on a Kalman filter and a particle filter is proposed. It fuses the heterogeneous information coming from those two independent technologies. Finally, the benefits of the proposed architecture are evaluated and compared with the pure WiFi and INS positioning systems.
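    The fusion idea above can be illustrated with a one-dimensional Kalman filter that propagates pedestrian-dead-reckoning steps (prediction) and corrects with occasional WiFi position fixes (update). The real system operates in 2-D with a Kalman/particle-filter combination; the noise levels here are assumed for illustration.

      import numpy as np

      def kf_pdr_wifi(steps, wifi_fixes, q_step=0.05, r_wifi=4.0):
          """1-D position filter: predict with PDR step lengths, correct with
          sparse WiFi fixes. wifi_fixes: dict {step_index: position_m}."""
          x, p = 0.0, 1.0                    # position estimate and variance
          track = []
          for k, step in enumerate(steps):
              x, p = x + step, p + q_step    # PDR prediction (accumulates drift)
              if k in wifi_fixes:            # WiFi correction when available
                  gain = p / (p + r_wifi)
                  x = x + gain * (wifi_fixes[k] - x)
                  p = (1.0 - gain) * p
              track.append(x)
          return np.array(track)

      rng = np.random.default_rng(5)
      true_steps = np.full(60, 0.7)                       # 0.7 m per step
      pdr_steps = true_steps + rng.normal(0.0, 0.05, 60)  # noisy step lengths
      truth = np.cumsum(true_steps)
      fixes = {k: truth[k] + rng.normal(0.0, 2.0) for k in range(0, 60, 15)}
      print(abs(kf_pdr_wifi(pdr_steps, fixes)[-1] - truth[-1]))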

  20. The Deep Space Atomic Clock: Ushering in a New Paradigm for Radio Navigation and Science

    NASA Technical Reports Server (NTRS)

    Ely, Todd; Seubert, Jill; Prestage, John; Tjoelker, Robert

    2013-01-01

    The Deep Space Atomic Clock (DSAC) mission will demonstrate the on-orbit performance of a high-accuracy, high-stability miniaturized mercury ion atomic clock during a year-long experiment in Low Earth Orbit. DSAC's timing error requirement provides the frequency stability necessary to perform deep space navigation based solely on one-way radiometric tracking data. Compared to a two-way tracking paradigm, DSAC-enabled one-way tracking will benefit navigation and radio science by increasing the quantity and quality of tracking data. Additionally, DSAC also enables fully-autonomous onboard navigation useful for time-sensitive situations. The technology behind the mercury ion atomic clock and a DSAC mission overview are presented. Example deep space applications of DSAC, including navigation of a Mars orbiter and Europa flyby gravity science, highlight the benefits of DSAC-enabled one-way Doppler tracking.

  1. Orion Optical Navigation Progress Toward Exploration: Mission 1

    NASA Technical Reports Server (NTRS)

    Holt, Greg N.; D'Souza, Christopher N.; Saley, David

    2018-01-01

    Optical navigation of human spacecraft was proposed on Gemini and implemented successfully on Apollo as a means of autonomously operating the vehicle in the event of lost communication with controllers on Earth. It shares a history with the "method of lunar distances" that was used in the 18th century and gained some notoriety after its use by Captain James Cook during his 1768 Pacific voyage of the HMS Endeavor. The Orion emergency return system utilizing optical navigation has matured in design over the last several years, and is currently undergoing the final implementation and test phase in preparation for Exploration Mission 1 (EM-1) in 2019. The software development is being worked as a Government Furnished Equipment (GFE) project delivered as an application within the Core Flight Software of the Orion camera controller module. The mathematical formulation behind the initial ellipse fit in the image processing is detailed in Christian. The non-linear least squares refinement then follows the technique of Mortari as an estimation process of the planetary limb using the sigmoid function. The Orion optical navigation system uses a body fixed camera, a decision that was driven by mass and mechanism constraints. The general concept of operations involves a 2-hour pass once every 24 hours, with passes specifically placed before all maneuvers to supply accurate navigation information to guidance and targeting. The pass lengths are limited by thermal constraints on the vehicle since the OpNav attitude generally deviates from the thermally stable tail-to-sun attitude maintained during the rest of the orbit coast phase. Calibration is scheduled prior to every pass due to the unknown nature of thermal effects on the lens distortion and the mounting platform deformations between the camera and star trackers. The calibration technique is described in detail by Christian, et al. and simultaneously estimates the Brown-Conrady coefficients and the Star Tracker
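
    A toy version of the limb-fitting step is sketched below: a sigmoid is fit to a simulated one-dimensional intensity profile across the limb by non-linear least squares, giving a sub-pixel edge location. This is only an illustration of the general technique and is not the Orion flight formulation.

    ```python
    # Minimal sketch: sub-pixel limb location from a sigmoid fit to an intensity profile.
    import numpy as np
    from scipy.optimize import curve_fit

    def sigmoid(r, edge, width, dark, bright):
        return dark + (bright - dark) / (1.0 + np.exp(-(r - edge) / width))

    r = np.arange(0, 40, dtype=float)                 # pixel coordinate along the scan
    profile = sigmoid(r, 21.3, 1.4, 5.0, 180.0) + np.random.randn(r.size)
    popt, _ = curve_fit(sigmoid, r, profile, p0=[20.0, 2.0, 0.0, 200.0])
    print(f"estimated limb location: {popt[0]:.2f} px")
    ```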

  2. A Behavior-Based Strategy for Single and Multi-Robot Autonomous Exploration

    PubMed Central

    Cepeda, Jesus S.; Chaimowicz, Luiz; Soto, Rogelio; Gordillo, José L.; Alanís-Reyes, Edén A.; Carrillo-Arce, Luis C.

    2012-01-01

    In this paper, we consider the problem of autonomous exploration of unknown environments with single and multiple robots. This is a challenging task, with several potential applications. We propose a simple yet effective approach that combines a behavior-based navigation with an efficient data structure to store previously visited regions. This allows robots to safely navigate, disperse and efficiently explore the environment. A series of experiments performed using a realistic robotic simulator and a real testbed scenario demonstrate that our technique effectively distributes the robots over the environment and allows them to quickly accomplish their mission in large open spaces, narrow cluttered environments, dead-end corridors, as well as rooms with minimum exits.

  3. Experiments in teleoperator and autonomous control of space robotic vehicles

    NASA Technical Reports Server (NTRS)

    Alexander, Harold L.

    1991-01-01

    A program of research embracing teleoperator and automatic navigational control of freely flying satellite robots is presented. Current research goals include: (1) developing visual operator interfaces for improved vehicle teleoperation; (2) determining the effects of different visual interface system designs on operator performance; and (3) achieving autonomous vision-based vehicle navigation and control. This research program combines virtual-environment teleoperation studies and neutral-buoyancy experiments using a space-robot simulator vehicle currently under development. Visual-interface design options under investigation include monoscopic versus stereoscopic displays and cameras, helmet-mounted versus panel-mounted display monitors, head-tracking versus fixed or manually steerable remote cameras, and the provision of vehicle-fixed visual cues, or markers, in the remote scene for improved sensing of vehicle position, orientation, and motion.

  4. Landmark-Based Navigation of an Unmanned Ground Vehicle (UGV)

    DTIC Science & Technology

    2009-03-01

    [Abstract not available in this record; the extracted fragments of report DSTO-TR-2260 (Weapons Systems Division, approved for public release) refer only to robustness against large measurement errors, low-cost gyroscopes meeting the stated requirement, and a subsection on sensitivity to vehicle speed.]

  5. Application of parallelized software architecture to an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Shakya, Rahul; Wright, Adam; Shin, Young Ho; Momin, Orko; Petkovsek, Steven; Wortman, Paul; Gautam, Prasanna; Norton, Adam

    2011-01-01

    This paper presents improvements made to Q, an autonomous ground vehicle designed to participate in the Intelligent Ground Vehicle Competition (IGVC). For the 2010 IGVC, Q was upgraded with a new parallelized software architecture and a new vision processor. Improvements were also made to the power system, reducing the number of batteries required for operation from six to one. In previous years, a single state machine was used to execute the bulk of processing activities, including sensor interfacing, data processing, path planning, navigation algorithms and motor control. This inefficient approach led to poor software performance and made the code difficult to maintain or modify. For IGVC 2010, the team implemented a modular parallel architecture using the National Instruments (NI) LabVIEW programming language. The new architecture divides all the necessary tasks (motor control, navigation, sensor data collection, etc.) into well-organized components that execute in parallel, providing considerable flexibility and facilitating efficient use of processing power. Computer vision is used to detect white lines on the ground and determine their location relative to the robot. With the new vision processor and some optimization of the image processing algorithm used the previous year, two frames can be acquired and processed in 70 ms. With all these improvements, Q placed 2nd in the autonomous challenge.
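
    The sketch below shows one common way to prototype the white-line detection step with off-the-shelf tools: bright-pixel thresholding followed by a probabilistic Hough transform. The thresholds and parameters are assumptions for illustration and are not the team's tuned LabVIEW pipeline.

    ```python
    # Illustrative white-line extraction using OpenCV (not the authors' implementation).
    import cv2
    import numpy as np

    def detect_white_lines(frame_bgr: np.ndarray):
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)   # keep bright pixels
        edges = cv2.Canny(mask, 50, 150)
        # Line segments are returned as (x1, y1, x2, y2) in image coordinates.
        return cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                               minLineLength=30, maxLineGap=10)
    ```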

  6. Unmanned air vehicle: autonomous takeoff and landing

    NASA Astrophysics Data System (ADS)

    Lim, K. L.; Gitano-Briggs, Horizon Walker

    2010-03-01

    UAVs are increasing in popularity and sophistication due to demonstrated performance that cannot be attained by manned aircraft. These developments have been made possible by advances in sensors, instrumentation, telemetry and controls over the last few decades. UAVs are now common in areas such as aerial observation and as communication relays. Most UAVs, however, are still flown by a human pilot via remote control from a ground station. Even existing autonomous UAVs often require a human pilot to handle the most difficult tasks of takeoff and landing (TOL). This is mainly because navigating the airplane requires observation, constant situational assessment and hours of experience from the pilot. Therefore, an autonomous takeoff and landing system (TLS) for UAVs has been developed using a few practical design rules together with various sensors and instrumentation. This paper details the design and modeling of the UAV TLS. The model indicates that the UAV's TLS shows promising stability.

  7. Unmanned air vehicle: autonomous takeoff and landing

    NASA Astrophysics Data System (ADS)

    Lim, K. L.; Gitano-Briggs, Horizon Walker

    2009-12-01

    UAVs are increasing in popularity and sophistication due to demonstrated performance that cannot be attained by manned aircraft. These developments have been made possible by advances in sensors, instrumentation, telemetry and controls over the last few decades. UAVs are now common in areas such as aerial observation and as communication relays. Most UAVs, however, are still flown by a human pilot via remote control from a ground station. Even existing autonomous UAVs often require a human pilot to handle the most difficult tasks of takeoff and landing (TOL). This is mainly because navigating the airplane requires observation, constant situational assessment and hours of experience from the pilot. Therefore, an autonomous takeoff and landing system (TLS) for UAVs has been developed using a few practical design rules together with various sensors and instrumentation. This paper details the design and modeling of the UAV TLS. The model indicates that the UAV's TLS shows promising stability.

  8. Navigation through unknown and dynamic open spaces using topological notions

    NASA Astrophysics Data System (ADS)

    Miguel-Tomé, Sergio

    2018-04-01

    Until now, most algorithms used for navigation have had the purpose of directing a system towards one point in space. However, humans communicate tasks by specifying spatial relations among elements or places. In addition, the environments in which humans carry out their activities are extremely dynamic. The only option that allows for successful navigation in dynamic and unknown environments is making real-time decisions. Therefore, robots capable of collaborating closely with human beings must be able to make decisions based on the local information registered by the sensors and to interpret and express spatial relations. Furthermore, when a person is asked to perform a task in an environment, the task is communicated as a category of goals, so the person does not need to be supervised. Thus, two problems appear when one wants to create multifunctional robots: how to navigate in dynamic and unknown environments using spatial relations, and how to accomplish this without supervision. In this article, a new architecture to address the two cited problems is presented, called the topological qualitative navigation architecture. In previous works, a qualitative heuristic called the heuristic of topological qualitative semantics (HTQS) was developed to establish and identify spatial relations. However, that heuristic only allows for establishing one spatial relation with a specific object. In contrast, navigation requires a temporal sequence of goals with different objects. The new architecture attains continuous generation of goals and resolves them using HTQS. Thus, the new architecture achieves autonomous navigation in dynamic or unknown open environments.

  9. Wind-based navigation of a hot-air balloon on Titan: a feasibility study

    NASA Astrophysics Data System (ADS)

    Furfaro, Roberto; Lunine, Jonathan I.; Elfes, Alberto; Reh, Kim

    2008-04-01

    Current analysis of data streamed back to Earth by the Cassini spacecraft features Titan as one of the most exciting places in the solar system. NASA centers and universities around the US, as well as the European Space Agency, are studying the possibility of sending, as part of the next mission to this giant moon of Saturn, a hot-air balloon (Montgolfier-type) for further and more in-depth exploration. The basic idea would be to design a reliable, semi-autonomous, and yet cheap Montgolfier capable of using continuous flow of waste heat from a power source to lift the balloon and sustain its altitude in the Titan environment. In this paper we study the problem of locally navigating a hot-air balloon in the nitrogen-based Titan atmosphere. The basic idea is to define a strategy (i.e. design of a suitable guidance system) that allows autonomous and semi-autonomous navigation of the balloon using the available (and partial) knowledge of the wind structure blowing on the saturnian satellite surface. Starting from first principles we determined the appropriate thermal and dynamical models describing (a) the vertical dynamics of the balloon and (b) the dynamics of the balloon moving on a vertical plane (2-D motion). Next, various non-linear fuzzy-based control strategies have been evaluated, analyzed and implemented in MATLAB to numerically simulate the capability of the system to simultaneously maintain altitude, as well as a scientifically desirable trajectory. We also looked at the ability of the balloon to perform station keeping. The results of the simulation are encouraging and show the effectiveness of such a system to cheaply and effectively perform semi-autonomous exploration of Titan.

  10. Improved Modeling in a Matlab-Based Navigation System

    NASA Technical Reports Server (NTRS)

    Deutschmann, Julie; Bar-Itzhack, Itzhack; Harman, Rick; Larimore, Wallace E.

    1999-01-01

    An innovative approach to autonomous navigation is available for low Earth orbit satellites. The system is developed in Matlab and utilizes an Extended Kalman Filter (EKF) to estimate the attitude and trajectory based on spacecraft magnetometer and gyro data. Preliminary tests of the system with real spacecraft data from the Rossi X-Ray Timing Explorer Satellite (RXTE) indicate the existence of unmodeled errors in the magnetometer data. Incorporating into the EKF a statistical model that describes the colored component of the effective measurement of the magnetic field vector could improve the accuracy of the trajectory and attitude estimates and also improve the convergence time. This model is identified as a first-order Markov process. With the addition of the model, the EKF attempts to identify the non-white components of the noise, allowing for more accurate estimation of the original state vector, i.e. the orbital elements and the attitude. Working in Matlab allows for easy incorporation of new models into the EKF, and the resulting navigation system is generic and can easily be applied to future missions, providing an alternative for onboard or ground-based navigation.
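
    The sketch below illustrates the colored-noise modeling idea in isolation: a first-order Markov (Gauss-Markov) bias state with time constant tau is propagated alongside the filter state so the estimator can separate the non-white magnetometer error from the true field. The time constant and noise level are illustrative assumptions, not values from the paper.

    ```python
    # Minimal sketch of a first-order Markov bias state for colored measurement noise.
    import numpy as np

    dt, tau = 1.0, 600.0                      # sample interval [s], Markov time constant [s]
    phi_b = np.exp(-dt / tau)                 # discrete-time propagation factor for the bias

    def propagate_bias(b: float, sigma: float = 1e-3) -> float:
        """One prediction step of the Gauss-Markov bias, including its driving noise."""
        return phi_b * b + np.sqrt(1.0 - phi_b**2) * sigma * np.random.randn()

    def measurement_model(true_field: float, bias: float) -> float:
        """Effective magnetometer measurement: geomagnetic field plus the colored bias."""
        return true_field + bias
    ```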

  11. Approach-Phase Precision Landing with Hazard Relative Navigation: Terrestrial Test Campaign Results of the Morpheus/ALHAT Project

    NASA Technical Reports Server (NTRS)

    Crain, Timothy P.; Bishop, Robert H.; Carson, John M., III; Trawny, Nikolas; Hanak, Chad; Sullivan, Jacob; Christian, John; DeMars, Kyle; Campbell, Tom; Getchius, Joel

    2016-01-01

    The Morpheus Project began in late 2009 as an ambitious effort code-named Project M to integrate three ongoing multi-center NASA technology developments: humanoid robotics, liquid oxygen/liquid methane (LOX/LCH4) propulsion and Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) into a single engineering demonstration mission to be flown to the Moon by 2013. The humanoid robot effort was redirected to a deployment of Robonaut 2 on the International Space Station in February of 2011, while Morpheus continued as a terrestrial field test project integrating the existing ALHAT Project's technologies into a sub-orbital flight system using the world's first LOX/LCH4 main propulsion and reaction control system fed from the same blowdown tanks. A series of 33 tethered tests with the Morpheus 1.0 and Morpheus 1.5 vehicles was conducted from April 2011 to December 2013 before successful, sustained free flights with the primary Vertical Testbed (VTB) navigation configuration began with Free Flight 3 on December 10, 2013. Over the course of the following 12 free flights and 3 tethered flights, components of the ALHAT navigation system were integrated into the Morpheus vehicle, operations, and flight control loop. The ALHAT navigation system was integrated and run concurrently with the VTB navigation system as a reference and fail-safe option in flight (see touchdown position estimate comparisons in Fig. 1). Flight testing completed with Free Flight 15 on December 15, 2014, with a completely autonomous Hazard Detection and Avoidance (HDA), integration of surface relative and Hazard Relative Navigation (HRN) measurements into the onboard dual-state inertial estimator Kalman filter software, and landing within 2 meters of the VTB GPS-based navigation solution at the safe landing site target. This paper describes the Morpheus joint VTB/ALHAT navigation architecture, the sensors utilized during the terrestrial flight campaign, issues resolved during testing, and the navigation

  12. Supporting the joint warfighter by development, training, and fielding of man-portable UGVs

    NASA Astrophysics Data System (ADS)

    Ebert, Kenneth A.; Stratton, Benjamin V.

    2005-05-01

    The Robotic Systems Pool (RSP), sponsored by the Joint Robotics Program (JRP), is an inventory of small robotic systems, payloads, and components intended to expedite the development and integration of technology into effective, supportable, fielded robotic assets. The RSP loans systems to multiple users including the military, first-responders, research organizations, and academia. These users provide feedback in their specific domain, accelerating research and development improvements of robotic systems, which in turn allow the joint warfighter to benefit from such changes more quickly than from traditional acquisition cycles. Over the past year, RSP assets have been used extensively for pre-deployment operator and field training of joint Explosive Ordnance Disposal (EOD) teams, and for the training of Navy Reservist repair technicians. These Reservists are part of the Robotic Systems Combat Support Platoon (RSCSP), attached to Space and Naval Warfare Systems Center, San Diego. The RSCSP maintains and repairs RSP assets and provides deployable technical support for users of robotic systems. Currently, a small team from the RSCSP is deployed at Camp Victory repairing and maintaining man-portable unmanned ground vehicles (UGVs) used by joint EOD teams in Operation Iraqi Freedom. The focus of this paper is to elaborate on the RSP and RSCSP and their role as invaluable resources for spiral development in the robotics community by gaining first-hand technical feedback from the warfighter and other users.

  13. Autonomous Visual Navigation of an Indoor Environment Using a Parsimonious, Insect Inspired Familiarity Algorithm

    PubMed Central

    Brayfield, Brad P.

    2016-01-01

    The navigation of bees and ants from hive to food and back has captivated people for more than a century. Recently, the Navigation by Scene Familiarity Hypothesis (NSFH) has been proposed as a parsimonious approach that is congruent with the limited neural elements of these insects’ brains. In the NSFH approach, an agent completes an initial training excursion, storing images along the way. To retrace the path, the agent scans the area and compares the current scenes to those previously experienced. By turning and moving to minimize the pixel-by-pixel differences between encountered and stored scenes, the agent is guided along the path without having memorized the sequence. An important premise of the NSFH is that the visual information of the environment is adequate to guide navigation without aliasing. Here we demonstrate that an image landscape of an indoor setting possesses ample navigational information. We produced a visual landscape of our laboratory and part of the adjoining corridor consisting of 2816 panoramic snapshots arranged in a grid at 12.7-cm centers. We show that pixel-by-pixel comparisons of these images yield robust translational and rotational visual information. We also produced a simple algorithm that tracks previously experienced routes within our lab based on an insect-inspired scene familiarity approach and demonstrate that adequate visual information exists for an agent to retrace complex training routes, including those where the path’s end is not visible from its origin. We used this landscape to systematically test the interplay of sensor morphology, angles of inspection, and similarity threshold with the recapitulation performance of the agent. Finally, we compared the relative information content and chance of aliasing within our visually rich laboratory landscape to scenes acquired from indoor corridors with more repetitive scenery. PMID:27119720
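
    The familiarity computation itself is simple, as the sketch below suggests: the current view is compared pixel-by-pixel against the stored training snapshots, and the candidate heading with the smallest mismatch is followed. Array shapes and the rotation step are assumptions; this is not the authors' code.

    ```python
    # Minimal sketch of scene-familiarity heading selection by pixel-wise comparison.
    import numpy as np

    def familiarity(view: np.ndarray, snapshots: list) -> float:
        """Lower is more familiar: best sum of squared differences over stored views."""
        return min(float(np.sum((view.astype(float) - s.astype(float)) ** 2)) for s in snapshots)

    def best_heading_shift(panorama: np.ndarray, snapshots: list, step_px: int = 10) -> int:
        """Scan candidate headings by rolling the panorama; return the most familiar shift."""
        shifts = range(0, panorama.shape[1], step_px)
        return min(shifts, key=lambda s: familiarity(np.roll(panorama, -s, axis=1), snapshots))
    ```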

  14. Trajectory generation for an on-road autonomous vehicle

    NASA Astrophysics Data System (ADS)

    Horst, John; Barbera, Anthony

    2006-05-01

    We describe an algorithm that generates a smooth trajectory (position, velocity, and acceleration at uniformly sampled instants of time) for a car-like vehicle autonomously navigating within the constraints of lanes in a road. The technique models both vehicle paths and lane segments as straight line segments and circular arcs for mathematical simplicity and elegance, which we contrast with cubic spline approaches. We develop the path in an idealized space, warp the path into real space and compute path length, generate a one-dimensional trajectory along the path length that achieves target speeds and positions, and finally, warp, translate, and rotate the one-dimensional trajectory points onto the path in real space. The algorithm moves a vehicle in lane safely and efficiently within speed and acceleration maximums. The algorithm functions in the context of other autonomous driving functions within a carefully designed vehicle control hierarchy.
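
    As a small illustration of the one-dimensional trajectory step, the sketch below samples a trapezoidal speed profile along the path length at uniform time steps while respecting speed and acceleration maximums. The numbers and the simple accelerate/cruise/brake logic are assumptions, not the algorithm's exact formulation.

    ```python
    # Minimal sketch of a trapezoidal speed profile sampled along path length.
    def trapezoid_profile(total_len: float, v_max: float, a_max: float, dt: float = 0.1):
        """Return (time, arc_length, speed) samples: accelerate, cruise, then brake."""
        t, s, v = 0.0, 0.0, 0.0
        samples = []
        while s < total_len:
            brake_dist = v * v / (2.0 * a_max)        # distance needed to stop from speed v
            if total_len - s <= brake_dist:
                v = max(v - a_max * dt, 0.0)          # decelerate toward the endpoint
            else:
                v = min(v + a_max * dt, v_max)        # accelerate toward the speed limit
            s += v * dt
            t += dt
            samples.append((t, min(s, total_len), v))
        return samples

    profile = trapezoid_profile(total_len=50.0, v_max=2.0, a_max=0.5)
    ```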

  15. Experiment D005: Star occultation navigation

    NASA Technical Reports Server (NTRS)

    Silva, R. M.; Jorris, T. R.; Vallerie, E. M., III

    1971-01-01

    The usefulness of star occultation measurements for space navigation and the determination of a horizon density profile which could be used to update atmospheric models for horizon-based measurement systems were studied. The time of occultation of a known star by a celestial body, as seen by an orbiting observer, determines a cylinder of position, the axis of which is the line through the star and the body center, and the radius of which is equal to the occulting-body radius. The dimming percentage, with respect to the altitude of this grazing ray from the star to the observer, is a percentage altitude for occultation. That is, the star can be assumed to be occulted when it reaches a predetermined percentage of its unattenuated value. The procedure used was to measure this attenuation with respect to time to determine the usefulness of the measurements for autonomous space navigation. In this experiment, the crewmembers had to accomplish star acquisition, identification, calibration, and tracking. Instrumentation was required only for measurement of the relative intensity of the star as it set into the atmosphere.

  16. Teaching Young Adults with Intellectual and Developmental Disabilities Community-Based Navigation Skills to Take Public Transportation.

    PubMed

    Price, Richard; Marsh, Abbie J; Fisher, Marisa H

    2018-03-01

    Facilitating the use of public transportation enhances opportunities for independent living and competitive, community-based employment for individuals with intellectual and developmental disabilities (IDD). Four young adults with IDD were taught through total-task chaining to use the Google Maps application, a self-prompting, visual navigation system, to take the bus to locations around a college campus and the community. Three of four participants learned to use Google Maps to independently navigate public transportation. Google Maps may be helpful in supporting independent travel, highlighting the importance of future research in teaching navigation skills. Learning to independently use public transportation increases access to autonomous activities, such as opportunities to work and to attend postsecondary education programs on large college campuses. Individuals with IDD can be taught through chaining procedures to use the Google Maps application to navigate public transportation. Mobile map applications are an effective and functional modern tool that can be used to teach community navigation.

  17. Slime mold uses an externalized spatial "memory" to navigate in complex environments.

    PubMed

    Reid, Chris R; Latty, Tanya; Dussutour, Audrey; Beekman, Madeleine

    2012-10-23

    Spatial memory enhances an organism's navigational ability. Memory typically resides within the brain, but what if an organism has no brain? We show that the brainless slime mold Physarum polycephalum constructs a form of spatial memory by avoiding areas it has previously explored. This mechanism allows the slime mold to solve the U-shaped trap problem--a classic test of autonomous navigational ability commonly used in robotics--requiring the slime mold to reach a chemoattractive goal behind a U-shaped barrier. Drawn into the trap, the organism must rely on other methods than gradient-following to escape and reach the goal. Our data show that spatial memory enhances the organism's ability to navigate in complex environments. We provide a unique demonstration of a spatial memory system in a nonneuronal organism, supporting the theory that an externalized spatial memory may be the functional precursor to the internal memory of higher organisms.

  18. Hybrid optical navigation by crater detection for lunar pin-point landing: trajectories from helicopter flight tests

    NASA Astrophysics Data System (ADS)

    Trigo, Guilherme F.; Maass, Bolko; Krüger, Hans; Theil, Stephan

    2018-01-01

    Accurate autonomous navigation capabilities are essential for future lunar robotic landing missions with a pin-point landing requirement: in the absence of direct line of sight to ground control during the critical approach and landing phases, or when facing long signal delays, this capability is needed to establish a guidance solution that reaches the landing site reliably. This paper focuses on the processing and evaluation of data collected from flight tests in which scaled descent scenarios were flown by an unmanned helicopter of approximately 85 kg that approached a landing site from altitudes of 50 m down to 1 m over a downrange distance of 200 m. Printed crater targets were distributed along the ground track, and their detection provided Earth-fixed measurements. The Crater Navigation (CNav) algorithm used to detect and match the crater targets is an unmodified method intended for real lunar imagery. We analyze the absolute position and attitude solutions of CNav obtained and recorded during these flight tests, and investigate the attainable quality of vehicle pose estimation using both CNav and measurements from a tactical-grade inertial measurement unit. The navigation filter proposed for this purpose corrects and calibrates the high-rate inertial propagation with the less frequent crater navigation fixes through a closed-loop, loosely coupled hybrid setup. Finally, the attainable accuracy of the fused solution is evaluated by comparison with the onboard ground-truth solution of a dual-antenna high-grade GNSS receiver. It is shown that CNav is an enabler for building autonomous navigation systems of high quality, suitable for exploration mission scenarios.

  19. In-motion initial alignment and positioning with INS/CNS/ODO integrated navigation system for lunar rovers

    NASA Astrophysics Data System (ADS)

    Lu, Jiazhen; Lei, Chaohua; Yang, Yanqiang; Liu, Ming

    2017-06-01

    Many countries are paying great attention to space exploration, especially of the Moon and Mars. Autonomous, high-accuracy navigation systems are needed for probes and rovers to accomplish their missions. Navigation based on an inertial navigation system (INS) combined with a celestial navigation system (CNS) has been used widely on lunar rovers. Initialization is a particularly important step for navigation. This paper presents an in-motion alignment and positioning method for lunar rovers based on INS/CNS/odometer integrated navigation. The method can estimate not only the position and attitude errors but also the biases of the accelerometers and gyros using a standard Kalman filter. The differences between the platform star azimuth and elevation angles and the computed star azimuth and elevation angles, together with the difference between the velocity measured by the odometer and the velocity measured by the inertial sensors, are taken as measurements. Semi-physical experiments demonstrate that the position error can be reduced to 10 m and the attitude error remains within 2″ over 5 min. The experimental results show that this is an effective and attractive initialization approach for lunar rovers.

  20. Enabling Autonomous Rover Science through Dynamic Planning and Scheduling

    NASA Technical Reports Server (NTRS)

    Estlin, Tara A.; Gaines, Daniel; Chouinard, Caroline; Fisher, Forest; Castano, Rebecca; Judd, Michele; Nesnas, Issa

    2005-01-01

    This paper describes how dynamic planning and scheduling techniques can be used onboard a rover to autonomously adjust rover activities in support of science goals. These goals could be identified by scientists on the ground or could be identified by onboard data-analysis software. Several different types of dynamic decisions are described, including the handling of opportunistic science goals identified during rover traverses, preserving high priority science targets when resources, such as power, are unexpectedly over-subscribed, and dynamically adding additional, ground-specified science targets when rover actions are executed more quickly than expected. After describing our specific system approach, we discuss some of the particular challenges we have examined to support autonomous rover decision-making. These include interaction with rover navigation and path-planning software and handling large amounts of uncertainty in state and resource estimations.

  1. A Low Cost Sensors Approach for Accurate Vehicle Localization and Autonomous Driving Application.

    PubMed

    Vivacqua, Rafael; Vassallo, Raquel; Martins, Felipe

    2017-10-16

    Autonomous driving on public roads requires precise localization within a range of a few centimeters. Even the best current precise localization systems based on the Global Navigation Satellite System (GNSS) cannot always reach this level of precision, especially in urban environments, where the signal is disturbed by surrounding buildings and artifacts. Laser range finders and stereo vision have been successfully used for obstacle detection, mapping and localization to solve the autonomous driving problem. Unfortunately, Light Detection and Ranging (LIDAR) sensors are very expensive, and stereo vision requires powerful dedicated hardware to process the camera information. In this context, this article presents a low-cost architecture of sensors and a data fusion algorithm capable of autonomous driving on narrow two-way roads. Our approach exploits a combination of a short-range visual lane marking detector and a dead reckoning system to build a long and precise representation of the lane markings behind the vehicle. This information is used to localize the vehicle in a map that also contains the reference trajectory for autonomous driving. Experimental results show the successful application of the proposed system in a real autonomous driving situation.

  2. A Low Cost Sensors Approach for Accurate Vehicle Localization and Autonomous Driving Application

    PubMed Central

    Vassallo, Raquel

    2017-01-01

    Autonomous driving on public roads requires precise localization within a range of a few centimeters. Even the best current precise localization systems based on the Global Navigation Satellite System (GNSS) cannot always reach this level of precision, especially in urban environments, where the signal is disturbed by surrounding buildings and artifacts. Laser range finders and stereo vision have been successfully used for obstacle detection, mapping and localization to solve the autonomous driving problem. Unfortunately, Light Detection and Ranging (LIDAR) sensors are very expensive, and stereo vision requires powerful dedicated hardware to process the camera information. In this context, this article presents a low-cost architecture of sensors and a data fusion algorithm capable of autonomous driving on narrow two-way roads. Our approach exploits a combination of a short-range visual lane marking detector and a dead reckoning system to build a long and precise representation of the lane markings behind the vehicle. This information is used to localize the vehicle in a map that also contains the reference trajectory for autonomous driving. Experimental results show the successful application of the proposed system in a real autonomous driving situation. PMID:29035334

  3. Autonomous unmanned air vehicles (UAV) techniques

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Lee, Ting N.

    2007-04-01

    UAVs (Unmanned Air Vehicles) have great potential in different civilian applications, such as oil pipeline surveillance, precision farming, yearly forest fire fighting, search and rescue, and border patrol. The related UAV industries can generate billions of dollars each year. However, a major roadblock to adopting UAVs is that their operation conflicts with FAA (Federal Aviation Administration) and ATC (Air Traffic Control) regulations. In this paper, we review the latest technologies and research on UAV navigation and obstacle avoidance. We propose a system design of Jittering Mosaic Image Processing (JMIP) with stereo vision and optical flow to fulfill the functionalities of autonomous UAVs.

  4. Self Navigating Wheelchair - The Future Of Mobility

    NASA Astrophysics Data System (ADS)

    Nayak, M.

    2017-12-01

    In a hospital environment, about 10% of patients use wheelchairs, and all of these people face one common problem: how can they be independent while in a wheelchair? The goal of this project is to develop a system to autonomously navigate a wheelchair from one location in a hospital to another. I have designed a navigation system that not only determines the location of the wheelchair, but also determines the destination location and then autonomously moves the wheelchair to the destination. The design consists of a network of Bluetooth Low Energy Beacons (BLEBs) that allows a BLEB reader to determine its location in the hospital. The BLE beacons transmit the signal. The network was designed to consist of a minimum of four BLEBs, arranged in a quadrilateral with one BLEB at each corner. BLEBs were placed at or near wheelchair height, 45 inches, to minimize signal loss due to distance between a BLEB and the BLEB reader. A microcontroller-based robot was used as a wheelchair prototype and was placed in the center position. The navigation system used this BLEB network to map out a course from one location to a second location in the hospital. The system is based on a Raspberry Pi as the central device that reads the signals from the BLEBs in the network. The Raspberry Pi software interprets each signal and converts it into a pair of coordinates; each location in the hospital is represented as a coordinate pair. Upon reading the signals, the system deciphered and recognized each BLEB by its unique address value and used the RSSI signal strength from each BLEB in its vicinity to determine the distance to that BLEB. The user could then interact with the central device to input the desired destination. Upon obtaining the user input, the central device was able to determine its location and the signal strength with respect to the network of BLEBs. The wheels and motors can be controlled through the application. It then
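
    A minimal sketch of the ranging and positioning math this kind of beacon network typically relies on is given below: RSSI is converted to distance with a log-distance path-loss model, and position is obtained by linearized least squares over the fixed beacon coordinates. The radio parameters and geometry are assumptions, not measurements from this project.

    ```python
    # Minimal sketch: BLE RSSI -> range via path-loss model, then 2-D trilateration.
    import numpy as np

    def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -59.0, n: float = 2.0) -> float:
        """Log-distance path-loss model; tx_power_dbm is the expected RSSI at 1 m."""
        return 10.0 ** ((tx_power_dbm - rssi_dbm) / (10.0 * n))

    def trilaterate(beacons: np.ndarray, dists: np.ndarray) -> np.ndarray:
        """Solve for (x, y) from beacon positions (N x 2) and ranges (N,), N >= 3."""
        x0, y0, d0 = beacons[0, 0], beacons[0, 1], dists[0]
        A = 2.0 * (beacons[1:] - beacons[0])
        b = (d0**2 - dists[1:]**2
             + np.sum(beacons[1:]**2, axis=1) - (x0**2 + y0**2))
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pos

    beacons = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0]])
    ranges = np.array([rssi_to_distance(r) for r in (-65.0, -72.0, -70.0, -75.0)])
    print(trilaterate(beacons, ranges))
    ```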

  5. Using Planning, Scheduling and Execution for Autonomous Mars Rover Operations

    NASA Technical Reports Server (NTRS)

    Estlin, Tara A.; Gaines, Daniel M.; Chouinard, Caroline M.; Fisher, Forest W.; Castano, Rebecca; Judd, Michele J.; Nesnas, Issa A.

    2006-01-01

    With each new rover mission to Mars, rovers are traveling significantly longer distances. This distance increase raises not only the opportunities for science data collection, but also amplifies the amount of environment and rover state uncertainty that must be handled in rover operations. This paper describes how planning, scheduling and execution techniques can be used onboard a rover to autonomously generate and execute rover activities and in particular to handle new science opportunities that have been identified dynamically. We also discuss some of the particular challenges we face in supporting autonomous rover decision-making. These include interaction with rover navigation and path-planning software and handling large amounts of uncertainty in state and resource estimations. Finally, we describe our experiences in testing this work using several Mars rover prototypes in a realistic environment.

  6. First Results from a Hardware-in-the-Loop Demonstration of Closed-Loop Autonomous Formation Flying

    NASA Technical Reports Server (NTRS)

    Gill, E.; Naasz, Bo; Ebinuma, T.

    2003-01-01

    A closed-loop system for the demonstration of autonomous satellite formation flying technologies using hardware-in-the-loop has been developed. Making use of a GPS signal simulator with a dual radio frequency outlet, the system includes two GPS space receivers as well as a powerful onboard navigation processor dedicated to the GPS-based guidance, navigation, and control of a satellite formation in real time. The closed-loop system allows realistic simulations of autonomous formation flying scenarios, enabling research in the fields of tracking and orbit control strategies for a wide range of applications. The autonomous closed-loop formation acquisition and keeping strategy is based on Lyapunov's direct control method as applied to the standard set of Keplerian elements. This approach not only assures global and asymptotic stability of the control but also maintains valuable physical insight into the applied control vectors. Furthermore, the approach can account for system uncertainties and effectively avoids a computationally expensive solution of the two-point boundary value problem, which renders the concept particularly attractive for implementation in onboard processors. A guidance law has been developed which strictly separates the relative from the absolute motion, thus avoiding the numerical integration of a target trajectory in the onboard processor. Moreover, by using precise kinematic relative GPS solutions, dynamical modeling or filtering is avoided, which allows for an efficient implementation of the process on an onboard processor. A sample formation flying scenario has been created, aiming at the autonomous transition of a Low Earth Orbit satellite formation from an initial along-track separation of 800 m to a target distance of 100 m. Assuming a low-thrust actuator which may be accommodated on a small satellite, a typical control accuracy of less than 5 m has been achieved, which proves the applicability of autonomous formation flying techniques to

  7. Autonomous landing and ingress of micro-air-vehicles in urban environments based on monocular vision

    NASA Astrophysics Data System (ADS)

    Brockers, Roland; Bouffard, Patrick; Ma, Jeremy; Matthies, Larry; Tomlin, Claire

    2011-06-01

    Unmanned micro air vehicles (MAVs) will play an important role in future reconnaissance and search and rescue applications. In order to conduct persistent surveillance and to conserve energy, MAVs need the ability to land, and they need the ability to enter (ingress) buildings and other structures to conduct reconnaissance. To be safe and practical under a wide range of environmental conditions, landing and ingress maneuvers must be autonomous, using real-time, onboard sensor feedback. To address these key behaviors, we present a novel method for vision-based autonomous MAV landing and ingress using a single camera for two urban scenarios: landing on an elevated surface, representative of a rooftop, and ingress through a rectangular opening, representative of a door or window. Real-world scenarios will not include special navigation markers, so we rely on tracking arbitrary scene features; however, we do currently exploit planarity of the scene. Our vision system uses a planar homography decomposition to detect navigation targets and to produce approach waypoints as inputs to the vehicle control algorithm. Scene perception, planning, and control run onboard in real-time; at present we obtain aircraft position knowledge from an external motion capture system, but we expect to replace this in the near future with a fully self-contained, onboard, vision-aided state estimation algorithm. We demonstrate autonomous vision-based landing and ingress target detection with two different quadrotor MAV platforms. To our knowledge, this is the first demonstration of onboard, vision-based autonomous landing and ingress algorithms that do not use special purpose scene markers to identify the destination.

  8. Autonomous Landing and Ingress of Micro-Air-Vehicles in Urban Environments Based on Monocular Vision

    NASA Technical Reports Server (NTRS)

    Brockers, Roland; Bouffard, Patrick; Ma, Jeremy; Matthies, Larry; Tomlin, Claire

    2011-01-01

    Unmanned micro air vehicles (MAVs) will play an important role in future reconnaissance and search and rescue applications. In order to conduct persistent surveillance and to conserve energy, MAVs need the ability to land, and they need the ability to enter (ingress) buildings and other structures to conduct reconnaissance. To be safe and practical under a wide range of environmental conditions, landing and ingress maneuvers must be autonomous, using real-time, onboard sensor feedback. To address these key behaviors, we present a novel method for vision-based autonomous MAV landing and ingress using a single camera for two urban scenarios: landing on an elevated surface, representative of a rooftop, and ingress through a rectangular opening, representative of a door or window. Real-world scenarios will not include special navigation markers, so we rely on tracking arbitrary scene features; however, we do currently exploit planarity of the scene. Our vision system uses a planar homography decomposition to detect navigation targets and to produce approach waypoints as inputs to the vehicle control algorithm. Scene perception, planning, and control run onboard in real-time; at present we obtain aircraft position knowledge from an external motion capture system, but we expect to replace this in the near future with a fully self-contained, onboard, vision-aided state estimation algorithm. We demonstrate autonomous vision-based landing and ingress target detection with two different quadrotor MAV platforms. To our knowledge, this is the first demonstration of onboard, vision-based autonomous landing and ingress algorithms that do not use special purpose scene markers to identify the destination.
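
    The planar-homography step described in these two records can be prototyped with standard computer-vision tooling, as in the sketch below: matched feature points on the planar target yield a homography whose decomposition gives candidate relative poses of the camera with respect to the plane. The point sets and camera intrinsics are assumed inputs; this is not the flight code.

    ```python
    # Illustrative homography estimation and decomposition with OpenCV.
    import cv2
    import numpy as np

    def candidate_poses(pts_prev: np.ndarray, pts_curr: np.ndarray, K: np.ndarray):
        """Estimate H from tracked planar points and decompose into (R, t, n) hypotheses."""
        H, _ = cv2.findHomography(pts_prev, pts_curr, cv2.RANSAC, 3.0)
        num, Rs, ts, normals = cv2.decomposeHomographyMat(H, K)
        # Up to four hypotheses are returned; additional constraints must pick the valid one.
        return list(zip(Rs, ts, normals))
    ```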

  9. The Personal Satellite Assistant: An Internal Spacecraft Autonomous Mobile Monitor

    NASA Technical Reports Server (NTRS)

    Dorais, Gregory A.; Gawdiak, Yuri; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper presents an overview of the research and development effort at the NASA Ames Research Center to create an internal spacecraft autonomous mobile monitor capable of performing intra-vehicular sensing activities by autonomously navigating onboard the International Space Station. We describe the capabilities, mission roles, rationale, high-level functional requirements, and design challenges for an autonomous mobile monitor. The rapid prototyping design methodology used, in which five prototypes of increasing fidelity are designed, is described, as well as the status of these prototypes, of which two are operational and being tested and one is actively being designed. The physical test facilities used to perform ground testing are briefly described, including a micro-gravity test facility that permits a prototype to propel itself in three dimensions with six degrees of freedom as if it were in a micro-gravity environment. We also give an overview of the autonomy framework and its components, including the software simulators used in the development process. Sample mission test scenarios are also described. The paper concludes with a discussion of future and related work, followed by the summary.

  10. Recent Advances in Bathymetric Surveying of Continental Shelf Regions Using Autonomous Vehicles

    NASA Astrophysics Data System (ADS)

    Holland, K. T.; Calantoni, J.; Slocum, D.

    2016-02-01

    Obtaining bathymetric observations within the continental shelf in areas closer to the shore is often time consuming and dangerous, especially when uncharted shoals and rocks present safety concerns to survey ships and launches. However, surveys in these regions are critically important to numerical simulation of oceanographic processes, as bathymetry serves as the bottom boundary condition in operational forecasting models. We will present recent progress in bathymetric surveying using both traditional vessels retrofitted for autonomous operations and relatively inexpensive, small team deployable, Autonomous Underwater Vehicles (AUV). Both systems include either high-resolution multibeam echo sounders or interferometric sidescan sonar sensors with integrated inertial navigation system capabilities consistent with present commercial-grade survey operations. The advantages and limitations of these two configurations employing both unmanned and autonomous strategies are compared using results from several recent survey operations. We will demonstrate how sensor data collected from unmanned platforms can augment or even replace traditional data collection technologies. Oceanographic observations (e.g., sound speed, temperature and currents) collected simultaneously with bathymetry using autonomous technologies provide additional opportunities for advanced data assimilation in numerical forecasts. Discussion focuses on our vision for unmanned and autonomous systems working in conjunction with manned or in-situ systems to optimally and simultaneously collect data in environmentally hostile or difficult to reach areas.

  11. Method and System for Gamma-Ray Localization Induced Spacecraft Navigation Using Celestial Gamma-Ray Sources

    NASA Technical Reports Server (NTRS)

    Hisamoto, Chuck (Inventor); Arzoumanian, Zaven (Inventor); Sheikh, Suneel I. (Inventor)

    2015-01-01

    A method and system for spacecraft navigation using distant celestial gamma-ray bursts which offer detectable, bright, high-energy events that provide well-defined characteristics conducive to accurate time-alignment among spatially separated spacecraft. Utilizing assemblages of photons from distant gamma-ray bursts, relative range between two spacecraft can be accurately computed along the direction to each burst's source based upon the difference in arrival time of the burst emission at each spacecraft's location. Correlation methods used to time-align the high-energy burst profiles are provided. The spacecraft navigation may be carried out autonomously or in a central control mode of operation.

  12. A Hybrid FPGA/Tilera Compute Element for Autonomous Hazard Detection and Navigation

    NASA Technical Reports Server (NTRS)

    Villalpando, Carlos Y.; Werner, Robert A.; Carson, John M., III; Khanoyan, Garen; Stern, Ryan A.; Trawny, Nikolas

    2013-01-01

    To increase safety for future missions landing on other planetary or lunar bodies, the Autonomous Landing and Hazard Avoidance Technology (ALHAT) program is developing an integrated sensor for autonomous surface analysis and hazard determination. The ALHAT Hazard Detection System (HDS) consists of a Flash LIDAR for measuring the topography of the landing site, a gimbal to scan across the terrain, and an Inertial Measurement Unit (IMU), along with terrain analysis algorithms to identify the landing site and the local hazards. An FPGA and Manycore processor system was developed to interface all the devices in the HDS, to provide high-resolution timing to accurately measure system state, and to run the surface analysis algorithms quickly and efficiently. In this paper, we will describe how we integrated COTS components such as an FPGA evaluation board, a TILExpress64, and multi-threaded/multi-core aware software to build the HDS Compute Element (HDSCE). The ALHAT program is also working with the NASA Morpheus Project and has integrated the HDS as a sensor on the Morpheus Lander. This paper will also describe how the HDS is integrated with the Morpheus lander and the results of the initial test flights with the HDS installed. We will also describe future improvements to the HDSCE.

  13. A hybrid FPGA/Tilera compute element for autonomous hazard detection and navigation

    NASA Astrophysics Data System (ADS)

    Villalpando, C. Y.; Werner, R. A.; Carson, J. M.; Khanoyan, G.; Stern, R. A.; Trawny, N.

    To increase safety for future missions landing on other planetary or lunar bodies, the Autonomous Landing and Hazard Avoidance Technology (ALHAT) program is developing an integrated sensor for autonomous surface analysis and hazard determination. The ALHAT Hazard Detection System (HDS) consists of a Flash LIDAR for measuring the topography of the landing site, a gimbal to scan across the terrain, and an Inertial Measurement Unit (IMU), along with terrain analysis algorithms to identify the landing site and the local hazards. An FPGA and Manycore processor system was developed to interface all the devices in the HDS, to provide high-resolution timing to accurately measure system state, and to run the surface analysis algorithms quickly and efficiently. In this paper, we will describe how we integrated COTS components such as an FPGA evaluation board, a TILExpress64, and multi-threaded/multi-core aware software to build the HDS Compute Element (HDSCE). The ALHAT program is also working with the NASA Morpheus Project and has integrated the HDS as a sensor on the Morpheus Lander. This paper will also describe how the HDS is integrated with the Morpheus lander and the results of the initial test flights with the HDS installed. We will also describe future improvements to the HDSCE.

  14. Spoofing Detection Using GNSS/INS/Odometer Coupling for Vehicular Navigation

    PubMed Central

    Broumandan, Ali; Lachapelle, Gérard

    2018-01-01

    Location information is one of the most vital pieces of information required to achieve intelligent and context-aware capability for various applications such as driverless cars. However, related security and privacy threats are a major holdback. With increasing focus on using Global Navigation Satellite Systems (GNSS) for autonomous navigation and related applications, it is important to provide robust navigation solutions; yet signal spoofing for illegal or covert transportation and for misleading receiver timing is increasing and now frequent. Hence, detection and mitigation of spoofing attacks has become an important topic. Several contributions on spoofing detection have been made, focusing on different layers of a GNSS receiver. This paper focuses on spoofing detection utilizing self-contained sensors, namely inertial measurement units (IMUs) and vehicle odometer outputs. A spoofing detection approach based on a consistency check between GNSS and IMU/odometer mechanization is proposed. To detect a spoofing attack, the method analyses GNSS and IMU/odometer measurements independently during a pre-selected observation window and cross-checks the solutions provided by GNSS and the inertial navigation solution (INS)/odometer mechanization. The performance of the proposed method is verified in real vehicular environments. Mean spoofing detection time and detection performance in terms of receiver operating characteristics (ROC) in sub-urban and dense urban environments are evaluated. PMID:29695064

  15. Spoofing Detection Using GNSS/INS/Odometer Coupling for Vehicular Navigation.

    PubMed

    Broumandan, Ali; Lachapelle, Gérard

    2018-04-24

    Location information is one of the most vital pieces of information required to achieve intelligent and context-aware capability for various applications such as driverless cars. However, related security and privacy threats are a major holdback. With increasing focus on using Global Navigation Satellite Systems (GNSS) for autonomous navigation and related applications, it is important to provide robust navigation solutions; yet signal spoofing for illegal or covert transportation and for misleading receiver timing is increasing and now frequent. Hence, detection and mitigation of spoofing attacks has become an important topic. Several contributions on spoofing detection have been made, focusing on different layers of a GNSS receiver. This paper focuses on spoofing detection utilizing self-contained sensors, namely inertial measurement units (IMUs) and vehicle odometer outputs. A spoofing detection approach based on a consistency check between GNSS and IMU/odometer mechanization is proposed. To detect a spoofing attack, the method analyses GNSS and IMU/odometer measurements independently during a pre-selected observation window and cross-checks the solutions provided by GNSS and the inertial navigation solution (INS)/odometer mechanization. The performance of the proposed method is verified in real vehicular environments. Mean spoofing detection time and detection performance in terms of receiver operating characteristics (ROC) in sub-urban and dense urban environments are evaluated.
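
    The core consistency check lends itself to a very compact illustration, sketched below: over a fixed observation window, the displacement implied by the GNSS fixes is compared with the displacement accumulated from INS/odometer mechanization, and a large mismatch flags a possible spoofing attack. The threshold and data layout are illustrative assumptions, not the paper's tuned detector.

    ```python
    # Minimal sketch of a GNSS vs. dead-reckoning displacement consistency check.
    import numpy as np

    def spoofing_suspected(gnss_positions: np.ndarray, dr_steps: np.ndarray,
                           threshold_m: float = 15.0) -> bool:
        """gnss_positions: (N, 2) fixes over the window; dr_steps: (N-1, 2) odometer/INS steps."""
        gnss_disp = gnss_positions[-1] - gnss_positions[0]
        dr_disp = dr_steps.sum(axis=0)
        return bool(np.linalg.norm(gnss_disp - dr_disp) > threshold_m)
    ```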

  16. Global Precipitation Measurement (GPM) Orbit Design and Autonomous Maneuvers

    NASA Technical Reports Server (NTRS)

    Folta, David; Mendelsohn, Chad

    2003-01-01

    The NASA Goddard Space Flight Center's Global Precipitation Measurement (GPM) mission will meet the challenge of measuring worldwide precipitation every three hours. The GPM spacecraft, part of a constellation, will be required to maintain a circular orbit in a high-drag environment to accomplish this challenge. Analysis by the Flight Dynamics Analysis Branch has shown that maintaining the prime orbit altitude is necessary to prevent the ground track from repeating. Combined with the goals of minimizing maneuver impacts to science data collection and enabling reasonable long-term orbit predictions, this led the GPM project to decide to fly an autonomous maneuver system. This system is a derivative of the successful New Millennium Program technology flown onboard the Earth Observing-1 mission. This paper presents the driving science requirements and goals of the mission and shows how they will be met. Analysis of the orbit optimization and the delta-V requirements for several ballistic properties is presented. The architecture of the autonomous maneuvering system designed to meet the goals and requirements is presented, along with simulations using a GPM prototype. Additionally, the use of the GPM autonomous system to support collision avoidance and to aid other spacecraft systems during navigation outages is explored.

  17. An Outdoor Navigation Platform with a 3D Scanner and Gyro-assisted Odometry

    NASA Astrophysics Data System (ADS)

    Yoshida, Tomoaki; Irie, Kiyoshi; Koyanagi, Eiji; Tomono, Masahiro

    This paper proposes a light-weight navigation platform that consists of gyro-assisted odometry, a 3D laser scanner and map-based localization for human-scale robots. The gyro-assisted odometry provides highly accurate positioning only by dead-reckoning. The 3D laser scanner has a wide field of view and uniform measuring-point distribution. The map-based localization is robust and computationally inexpensive by utilizing a particle filter on a 2D grid map generated by projecting 3D points on to the ground. The system uses small and low-cost sensors, and can be applied to a variety of mobile robots in human-scale environments. Outdoor navigation experiments were conducted at the Tsukuba Challenge held in 2009 and 2010, which is an open proving ground for human-scale robots. Our robot successfully navigated the assigned 1-km courses in a fully autonomous mode multiple times.
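
    The map-based localization step can be caricatured by the particle-filter update sketched below: each particle is propagated with the gyro-assisted odometry increment (assumed here to be expressed in the map frame) and weighted by how many projected scan points land on occupied cells of the 2D grid map. Noise levels, the map representation, and the weighting rule are assumptions rather than the authors' implementation.

    ```python
    # Minimal particle-filter localization sketch on a 2-D occupancy grid.
    import numpy as np

    def pf_step(particles, odom_step, scan_xy, grid, resolution, rng):
        """particles: (N, 3) of (x, y, theta); odom_step: (dx, dy, dtheta) in map frame;
        scan_xy: (M, 2) scan points in the robot frame; grid: 2-D occupancy array."""
        n = particles.shape[0]
        dx, dy, dth = odom_step
        particles = particles + np.column_stack([
            dx + 0.02 * rng.standard_normal(n),      # motion update with small noise
            dy + 0.02 * rng.standard_normal(n),
            dth + 0.01 * rng.standard_normal(n)])
        weights = np.zeros(n)
        for i, (x, y, th) in enumerate(particles):
            c, s = np.cos(th), np.sin(th)
            world = scan_xy @ np.array([[c, -s], [s, c]]).T + np.array([x, y])
            cells = np.floor(world / resolution).astype(int)
            ok = (cells[:, 0] >= 0) & (cells[:, 0] < grid.shape[0]) & \
                 (cells[:, 1] >= 0) & (cells[:, 1] < grid.shape[1])
            weights[i] = grid[cells[ok, 0], cells[ok, 1]].sum() + 1e-9
        idx = rng.choice(n, size=n, p=weights / weights.sum())
        return particles[idx]                        # resampled particle set
    ```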

  18. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine their effect on the sizing of the integrated vehicle. The development of such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and the possible state measurements and observations that feed into the simulation implementation structure. These models also provide a common environment for capturing an increasingly complex operational architecture involving multiple spacecraft, ground stations, and communication networks. To address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions among spacecraft. This paper describes the development of this framework and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, its analysis capabilities, and its unification back to the overall system requirements and definition.

  19. Telecommunications, navigation and information management concept overview for the Space Exploration Initiative program

    NASA Technical Reports Server (NTRS)

    Bell, Jerome A.; Stephens, Elaine; Barton, Gregg

    1991-01-01

    An overview is provided of the Space Exploration Initiative (SEI) concepts for telecommunications, information systems, and navigation (TISN), and engineering and architecture issues are discussed. The SEI program data system is reviewed to identify mission TISN interfaces, and reference TISN concepts are described for nominal, degraded, and mission-critical data services. The infrastructures reviewed include telecommunications for robotics support, autonomous navigation without earth-based support, and information networks for tracking and data acquisition. Four options for TISN support architectures are examined which relate to unique SEI exploration strategies. Detailed support estimates are given for: (1) a manned stay on Mars; (2) permanent lunar and Martian settlements; (3) short-duration missions; and (4) systematic exploration of the moon and Mars.

  20. Ultra-Wideband Tracking System Design for Relative Navigation

    NASA Technical Reports Server (NTRS)

    Ni, Jianjun David; Arndt, Dickey; Ngo, Phong; Dekome, Kent; Dusl, John

    2011-01-01

    This presentation briefly discusses a design effort for a prototype ultra-wideband (UWB) time-difference-of-arrival (TDOA) tracking system that is currently under development at NASA Johnson Space Center (JSC). The system is being designed for use in localization and navigation of a rover in a GPS-deprived environment for surface missions. In one application enabled by the UWB tracking, a robotic vehicle carrying equipment can autonomously follow a crewed rover from work site to work site such that resources can be carried from one landing mission to the next, thereby saving up-mass. The UWB Systems Group at JSC has developed a UWB TDOA High Resolution Proximity Tracking System which can achieve sub-inch tracking accuracy of a target within the radius of the tracking baseline [1]. By extending the tracking capability beyond the radius of the tracking baseline, a tracking system is being designed to enable relative navigation between two vehicles for surface missions. A prototype UWB TDOA tracking system has been designed, implemented, tested, and proven feasible for relative navigation of robotic vehicles. Future work includes testing the system with the application code to increase the tracking update rate and evaluating the linear tracking baseline to improve the flexibility of antenna mounting on the following vehicle.
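
    A minimal sketch of time-difference-of-arrival (TDOA) localization of the sort such a system relies on (illustrative only; the receiver geometry and solver are assumptions, not the JSC design): each TDOA between a pair of receivers constrains the target to a hyperbola, and a least-squares solver finds the position that best matches all measured differences.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    C = 299_792_458.0  # speed of light [m/s]

    def tdoa_residuals(pos, receivers, tdoa):
        """Residuals between predicted and measured range differences (ref. receiver 0)."""
        ranges = np.linalg.norm(receivers - pos, axis=1)
        predicted = ranges[1:] - ranges[0]
        return predicted - C * tdoa

    # Assumed receiver baseline on the lead vehicle (meters, local frame)
    receivers = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 2.0]])
    target = np.array([12.0, 7.0])                      # true position of followed vehicle
    ranges = np.linalg.norm(receivers - target, axis=1)
    tdoa = (ranges[1:] - ranges[0]) / C                 # noise-free TDOA measurements

    estimate = least_squares(tdoa_residuals, x0=[5.0, 5.0], args=(receivers, tdoa)).x
    print(np.round(estimate, 2))                        # ~[12.  7.]
    ```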

  1. Lunar Navigation Determination System - LaNDS

    NASA Technical Reports Server (NTRS)

    Quinn, David; Talabac, Stephen

    2012-01-01

    A portable comprehensive navigational system has been developed that both robotic and human explorers can use to determine their location, attitude, and heading anywhere on the lunar surface independent of external infrastructure (needs no Lunar satellite network, line of sight to the Sun or Earth, etc.). The system combines robust processing power with an extensive topographical database to create a real-time atlas (GIS, Geospatial Information System) that is able to autonomously control and monitor both single unmanned rovers and fleets of rovers, as well as science payload stations. The system includes provisions for teleoperation and tele-presence. The system accepts (but does not require) inputs from a wide range of sensors. A means was needed to establish a location when the search is taken deep in a crater (looking for water ice) and out of view of Earth or any other references. A star camera can be employed to determine the user's attitude in inertial space and the stellar map in body space. A local nadir reference (e.g., an accelerometer that orients the nadir vector in body space) can be used in conjunction with a digital ephemeris and gravity model of the Moon to isolate the latitude, longitude, and azimuth of the user on the surface. That information can be used in conjunction with a Lunar GIS and advanced navigation planning algorithms to aid astronauts (or other assets) to navigate on the Lunar surface.

  2. Key Issues for Navigation and Time Dissemination in NASA's Space Exploration Program

    NASA Technical Reports Server (NTRS)

    Nelson, R. A.; Brodsky, B.; Oria, A. J.; Connolly, J. W.; Sands, O. S.; Welch, B. W.; Ely T.; Orr, R.; Schuchman, L.

    2006-01-01

    The renewed emphasis on robotic and human missions within NASA's space exploration program warrants a detailed consideration of how the positions of objects in space will be determined and tracked, whether they be spacecraft, human explorers, robots, surface vehicles, or science instrumentation. The Navigation Team within the NASA Space Communications Architecture Working Group (SCAWG) has addressed several key technical issues in this area and the principal findings are reported here. For navigation in the vicinity of the Moon, a variety of satellite constellations have been investigated that provide global or regional surface position determination and timing services analogous to those offered by GPS at Earth. In the vicinity of Mars, there are options for satellite constellations not available at the Moon due to the gravitational perturbations from Earth, such as two satellites in an aerostationary orbit. Alternate methods of radiometric navigation are considered, including one- and two-way signals, as well as autonomous navigation. The use of a software radio capable of receiving all available signal sources, such as GPS, pseudolites, and communication channels, is discussed. Methods of time transfer and dissemination are also considered in this paper.

  3. ODYSSEUS autonomous walking robot: The leg/arm design

    NASA Technical Reports Server (NTRS)

    Bourbakis, N. G.; Maas, M.; Tascillo, A.; Vandewinckel, C.

    1994-01-01

    ODYSSEUS is an autonomous walking robot, which makes use of three wheels and three legs for its movement in the free navigation space. More specifically, it makes use of its autonomous wheels to move around in an environment where the surface is smooth and not uneven. However, in the case that there are small-height obstacles, stairs, or small-height unevenness in the navigation environment, the robot makes use of both wheels and legs to travel efficiently. In this paper we present the detailed hardware design and the simulated behavior of the extended leg/arm part of the robot, since it plays a very significant role in the robot's actions (movements, selection of objects, etc.). In particular, the leg/arm consists of three major parts: The first part is a pipe attached to the robot base with a flexible 3-D joint. This pipe has a rotating bar as an extended part, which terminates in a 3-D flexible joint. The second part of the leg/arm is also a pipe similar to the first. The extended bar of the second part ends at a 2-D joint. The last part of the leg/arm is a clip-hand. It is used for picking up objects of small weight and size, and when it is in a 'closed' mode, it is used as a supporting part of the robot leg. The entire leg/arm part is controlled and synchronized by a microcontroller (68HC11) attached to the robot base.

  4. Moth-inspired navigation algorithm in a turbulent odor plume from a pulsating source.

    PubMed

    Liberzon, Alexander; Harrington, Kyra; Daniel, Nimrod; Gurka, Roi; Harari, Ally; Zilman, Gregory

    2018-01-01

    Some female moths attract male moths by emitting series of pulses of pheromone filaments propagating downwind. The turbulent nature of the wind creates a complex flow environment, and causes the filaments to propagate in the form of patches with varying concentration distributions. Inspired by moth navigation capabilities, we propose a navigation strategy that enables a flier to locate an upwind pulsating odor source in a windy environment using a single threshold-based detection sensor. This optomotor anemotaxis strategy is constructed based on the physical properties of the turbulent flow carrying discrete puffs of odor and does not involve learning, memory, complex decision making or statistical methods. We suggest that in turbulent plumes from a pulsating point source, an instantaneously measurable quantity, referred to as the "puff crossing time", improves the success rate as compared to navigation strategies based on temporally regular zigzags triggered by intermittent contact or by an "internal counter", which do not use this information. Using computer simulations of fliers navigating in turbulent plumes of the pulsating point source for varying flow parameters such as turbulence intensities, plume meandering and wind gusts, we obtained statistics of navigation paths towards the pheromone sources. We quantified the probability of successful navigation as well as flight parameters such as the time spent searching and the total flight time, with respect to different turbulence intensities, meandering or gusts. The concepts learned using this model may help to design odor-based navigation of miniature airborne autonomous vehicles.
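
    A toy sketch of a threshold-detector surge/cast navigator in the spirit of the strategy described above (illustrative only; the surge/cast rules, parameters, and plume model here are assumptions, not the paper's algorithm): on puff contact the flier surges upwind, and between contacts it casts crosswind with a period scaled by the last measured interval between contacts (a stand-in for the "puff crossing time").

    ```python
    import numpy as np

    def navigate(detect, wind_dir=np.array([1.0, 0.0]), steps=500, dt=0.1, speed=1.0):
        """Toy surge/cast navigator driven by a single threshold-based detector.

        detect(pos, t) -> bool is True when the odor at `pos` exceeds the sensor
        threshold (puff contact).  On contact the flier surges upwind; otherwise it
        casts crosswind, reversing roughly every "puff crossing time" (the last
        measured interval between contacts).  Before any contact it heads upwind.
        """
        pos = np.array([50.0, 1.0])
        upwind = -wind_dir / np.linalg.norm(wind_dir)
        crosswind = np.array([-upwind[1], upwind[0]])
        last_hit, puff_dt, cast_sign = -1.0, 1.0, 1.0
        for k in range(steps):
            t = k * dt
            if detect(pos, t):
                if last_hit >= 0:
                    puff_dt = max(t - last_hit, dt)      # update puff crossing time
                last_hit, heading = t, upwind            # surge toward the source
            elif last_hit < 0:
                heading = upwind                         # no contact yet: search upwind
            else:
                if (t - last_hit) > puff_dt:             # cast crosswind, reversing
                    cast_sign, last_hit = -cast_sign, t  # every ~puff crossing time
                heading = cast_sign * crosswind
            pos = pos + speed * dt * heading
        return pos

    # Toy pulsating plume: puffs released every 2 s from the origin drift downwind (+x) at 1 m/s
    def puff_contact(pos, t, radius=1.5):
        centers = [np.array([t - k, 0.0]) for k in np.arange(0.0, t, 2.0)]
        return any(np.linalg.norm(pos - c) < radius for c in centers)

    print(navigate(puff_contact))   # ends substantially closer to the source at the origin
    ```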

  5. Volunteers Oriented Interface Design for the Remote Navigation of Rescue Robots at Large-Scale Disaster Sites

    NASA Astrophysics Data System (ADS)

    Yang, Zhixiao; Ito, Kazuyuki; Saijo, Kazuhiko; Hirotsune, Kazuyuki; Gofuku, Akio; Matsuno, Fumitoshi

    This paper aims at constructing an efficient interface, similar to those widely used in daily life, to meet the needs of the many volunteer rescuers operating rescue robots at large-scale disaster sites. The developed system includes a force-feedback steering wheel interface and an artificial neural network (ANN) based mouse-screen interface. The former consists of a force-feedback steering control and a wall of six monitors; it provides manual, car-driving-like operation for navigating a rescue robot. The latter consists of a mouse and a camera view displayed on a monitor; it provides semi-autonomous operation, navigating a rescue robot by mouse clicks. Experimental results show that a novice volunteer can skillfully navigate a tank-type rescue robot through either interface after 20 to 30 minutes of training on each. The steering wheel interface offers high navigation speed in open areas, regardless of the terrain and surface conditions of the disaster site. The mouse-screen interface excels at precise navigation within complex structures while placing little strain on operators. The two interfaces can be switched at any time to provide a combined, efficient navigation method.

  6. IPS - a vision aided navigation system

    NASA Astrophysics Data System (ADS)

    Börner, Anko; Baumbach, Dirk; Buder, Maximilian; Choinowski, Andre; Ernst, Ines; Funk, Eugen; Grießbach, Denis; Schischmanow, Adrian; Wohlfeil, Jürgen; Zuev, Sergey

    2017-04-01

    Ego localization is an important prerequisite for several scientific, commercial, and statutory tasks. Only by knowing one's own position can guidance be provided, inspections be executed, and autonomous vehicles be operated. Localization becomes challenging if satellite-based navigation systems are not available or their data quality is not sufficient. To overcome this problem, a team of the German Aerospace Center (DLR) developed a multi-sensor system based on the human head and its navigation sensors - the eyes and the vestibular system. This system is called the integrated positioning system (IPS) and contains a stereo camera and an inertial measurement unit for determining an ego pose in six degrees of freedom in a local coordinate system. IPS is able to operate in real time and can be applied to indoor and outdoor scenarios without any external reference or prior knowledge. In this paper, the system and its key hardware and software components are introduced. The main issues during the development of such complex multi-sensor measurement systems are identified and discussed, and the performance of this technology is demonstrated. The development team started from scratch and is now transferring the technology into a commercial product. The paper closes with an outlook.

  7. Immune systems are not just for making you feel better: they are for controlling autonomous robots

    NASA Astrophysics Data System (ADS)

    Rosenblum, Mark

    2005-05-01

    The typical algorithm for robot autonomous navigation in off-road complex environments involves building a 3D map of the robot's surrounding environment using a 3D sensing modality such as stereo vision or active laser scanning, and generating an instantaneous plan to navigate around hazards. Although there has been steady progress using these methods, these systems suffer from several limitations that cannot be overcome with 3D sensing and planning alone. Geometric sensing alone has no ability to distinguish between compressible and non-compressible materials. As a result, these systems have difficulty in heavily vegetated environments and require sensitivity adjustments across different terrain types. On the planning side, these systems have no ability to learn from their mistakes and avoid problematic environmental situations on subsequent encounters. We have implemented an adaptive terrain classification system based on the Artificial Immune System (AIS) computational model, which is loosely based on the biological immune system, that combines various forms of imaging sensor inputs to produce a "feature labeled" image of the scene categorizing areas as benign or detrimental for autonomous robot navigation. Because of the qualities of the AIS computation model, the resulting system will be able to learn and adapt on its own through interaction with the environment by modifying its interpretation of the sensor data. The feature labeled results from the AIS analysis are inserted into a map and can then be used by a planner to generate a safe route to a goal point. The coupling of diverse visual cues with the malleable AIS computational model will lead to autonomous robotic ground vehicles that require less human intervention for deployment in novel environments and more robust operation as a result of the system's ability to improve its performance through interaction with the environment.

  8. Test Operations Procedure (TOP) 02-2-546 Teleoperated Unmanned Ground Vehicle (UGV) Latency Measurements

    DTIC Science & Technology

    2017-01-11

    discrete system components or measurements of latency in autonomous systems. Subject terms: Unmanned Ground Vehicles, Basic Video Latency, End-to... 1.1 Basic Video Latency. Teleoperation latency, or lag, describes

  9. Slime mold uses an externalized spatial “memory” to navigate in complex environments

    PubMed Central

    Reid, Chris R.; Latty, Tanya; Dussutour, Audrey; Beekman, Madeleine

    2012-01-01

    Spatial memory enhances an organism’s navigational ability. Memory typically resides within the brain, but what if an organism has no brain? We show that the brainless slime mold Physarum polycephalum constructs a form of spatial memory by avoiding areas it has previously explored. This mechanism allows the slime mold to solve the U-shaped trap problem—a classic test of autonomous navigational ability commonly used in robotics—requiring the slime mold to reach a chemoattractive goal behind a U-shaped barrier. Drawn into the trap, the organism must rely on methods other than gradient-following to escape and reach the goal. Our data show that spatial memory enhances the organism’s ability to navigate in complex environments. We provide a unique demonstration of a spatial memory system in a nonneuronal organism, supporting the theory that an externalized spatial memory may be the functional precursor to the internal memory of higher organisms. PMID:23045640

  10. Optimal Path Planning Program for Autonomous Speed Sprayer in Orchard Using Order-Picking Algorithm

    NASA Astrophysics Data System (ADS)

    Park, T. S.; Park, S. J.; Hwang, K. Y.; Cho, S. I.

    This study was conducted to develop a software program that computes an optimal path for autonomous navigation in orchards, especially for speed sprayers. The feasibility of autonomous navigation in orchards has been shown by other studies, which minimized the distance error between the planned path and the performed path; however, research on planning an optimal path for a speed sprayer in an orchard is rarely found. In this study, a digital map and a database for an orchard were designed, containing GPS coordinate information (coordinates of trees and the boundary of the orchard) and entity information (heights and widths of trees, radius of the main stem of each tree, tree diseases). An order-picking algorithm, of the kind used for warehouse management, was applied to calculate an optimum path based on the digital map. The database for the digital map was created using Microsoft Access, and the graphical interface for the database was built with Microsoft Visual C++ 6.0. Based on the digital map, it was possible to search and display information about the boundary of an orchard, the locations of trees, and the daily plan for spraying chemicals, and to plan an optimal path for different orchards and circumstances (starting the speed sprayer in a different location, spraying chemicals only on selected trees).
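
    A minimal sketch of the kind of tree-visiting route computation involved (illustrative only; the paper's order-picking algorithm is not specified here, so a simple nearest-neighbor heuristic over tree coordinates stands in for it):

    ```python
    import numpy as np

    def plan_route(start, trees):
        """Greedy nearest-neighbor visiting order over selected tree coordinates.

        start: (x, y) starting position of the sprayer
        trees: (N, 2) array of coordinates of the trees to be sprayed
        Returns the visiting order (indices into `trees`) and total path length.
        """
        remaining = list(range(len(trees)))
        order, pos, total = [], np.asarray(start, float), 0.0
        while remaining:
            dists = [np.linalg.norm(trees[i] - pos) for i in remaining]
            k = remaining.pop(int(np.argmin(dists)))   # visit the closest unvisited tree
            total += np.linalg.norm(trees[k] - pos)
            pos = trees[k]
            order.append(k)
        return order, total

    trees = np.array([[2, 1], [2, 5], [6, 1], [6, 5], [10, 3]], float)
    print(plan_route((0, 0), trees))   # visiting order and total travel distance
    ```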

  11. An innovative information fusion method with adaptive Kalman filter for integrated INS/GPS navigation of autonomous vehicles

    NASA Astrophysics Data System (ADS)

    Liu, Yahui; Fan, Xiaoqian; Lv, Chen; Wu, Jian; Li, Liang; Ding, Dawei

    2018-02-01

    Information fusion for INS/GPS navigation systems based on filtering techniques is a current research focus. In order to improve the precision of navigation information, a navigation technology based on an Adaptive Kalman Filter with an attenuation factor is proposed in this paper to restrain noise. The algorithm continuously updates the measurement noise variance and process noise variance of the system by collecting the estimated and measured values, and this method can suppress white noise. Because a measured value closer to the current time more accurately reflects the characteristics of the noise, an attenuation factor is introduced to increase the weight of the current value, in order to deal with noise variance caused by environmental disturbances. To validate the effectiveness of the proposed algorithm, a series of road tests were carried out in an urban environment. The GPS and IMU data from the experiments were collected and processed with dSPACE and MATLAB/Simulink. Based on the test results, the accuracy of the proposed algorithm is 20% higher than that of a traditional Adaptive Kalman Filter. The results also show that the precision of the integrated navigation can be improved by reducing the influence of environmental noise.
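
    A minimal sketch of an adaptive Kalman filter with an exponential attenuation (fading) factor, in the spirit of the approach described above (illustrative only; the state model, factor value, and adaptation rule are assumptions, not the paper's implementation):

    ```python
    import numpy as np

    def adaptive_kf(measurements, q=0.01, r0=1.0, b=0.95):
        """1-D random-walk Kalman filter whose measurement-noise variance R is
        re-estimated online from the innovations, with an attenuation factor b
        (0 < b < 1) that weights recent innovations more heavily than old ones."""
        x, P, R = 0.0, 1.0, r0
        k = 0
        estimates = []
        for z in measurements:
            P = P + q                                         # predict (random-walk state)
            innov = z - x                                     # innovation
            d = (1 - b) / (1 - b ** (k + 1))                  # fading-memory weight
            R = max((1 - d) * R + d * (innov**2 - P), 1e-6)   # adapt measurement noise variance
            K = P / (P + R)                                   # Kalman gain
            x = x + K * innov                                 # update state estimate
            P = (1 - K) * P
            estimates.append(x)
            k += 1
        return np.array(estimates)

    # Example: a constant signal whose measurement noise grows halfway through
    z = np.concatenate([5 + 0.5 * np.random.randn(100), 5 + 2.0 * np.random.randn(100)])
    print(round(adaptive_kf(z)[-1], 2))                       # estimate should stay close to 5
    ```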

  12. Autonomous Flight Rules - A Concept for Self-Separation in U.S. Domestic Airspace

    NASA Technical Reports Server (NTRS)

    Wing, David J.; Cotton, William B.

    2011-01-01

    Autonomous Flight Rules (AFR) are proposed as a new set of operating regulations in which aircraft navigate on tracks of their choice while self-separating from traffic and weather. AFR would exist alongside Instrument and Visual Flight Rules (IFR and VFR) as one of three available flight options for any appropriately trained and qualified operator with the necessary certified equipment. Historically, ground-based separation services evolved by necessity as aircraft began operating in the clouds and were unable to see each other. Today, technologies for global navigation, airborne surveillance, and onboard computing enable the functions of traffic conflict management to be fully integrated with navigation procedures onboard the aircraft. By self-separating, aircraft can operate with more flexibility and fewer restrictions than are required when using ground-based separation. The AFR concept is described in detail and provides practical means by which self-separating aircraft could share the same airspace as IFR and VFR aircraft without disrupting the ongoing processes of Air Traffic Control.

  13. X-Ray Detection and Processing Models for Spacecraft Navigation and Timing

    NASA Technical Reports Server (NTRS)

    Sheikh, Suneel; Hanson, John

    2013-01-01

    The current primary method of deep space navigation is the NASA Deep Space Network (DSN). High-performance navigation is achieved using Delta Differential One-Way Range techniques that utilize simultaneous observations from multiple DSN sites, and incorporate observations of quasars near the line-of-sight to a spacecraft in order to improve the range and angle measurement accuracies. Over the past four decades, x-ray astronomers have identified a number of x-ray pulsars with pulsed emissions having stabilities comparable to atomic clocks. The x-ray pulsar-based navigation and time determination (XNAV) system uses phase measurements from these sources to establish autonomously the position of the detector, and thus the spacecraft, relative to a known reference frame, much as the Global Positioning System (GPS) uses phase measurements from radio signals from several satellites to establish the position of the user relative to an Earth-centered fixed frame of reference. While a GPS receiver uses an antenna to detect the radio signals, XNAV uses a detector array to capture the individual x-ray photons from the x-ray pulsars. The navigation solution relies on detailed x-ray source models, signal processing, navigation and timing algorithms, and analytical tools that form the basis of an autonomous XNAV system. Through previous XNAV development efforts, some techniques have been established to utilize a pulsar pulse time-of-arrival (TOA) measurement to correct a position estimate. One well-studied approach, based upon Kalman filter methods, optimally adjusts a dynamic orbit propagation solution based upon the offset in measured and predicted pulse TOA. In this delta position estimator scheme, previously estimated values of spacecraft position and velocity are utilized from an onboard orbit propagator. Using these estimated values, the detected arrival times at the spacecraft of pulses from a pulsar are compared to the predicted arrival times defined by the pulsar's pulse
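
    A minimal sketch of the pulse time-of-arrival (TOA) delta-position idea referenced above (an illustration under simplifying assumptions, not the cited estimator): the offset between the measured and predicted TOA, scaled by the speed of light, yields a position correction along the line of sight to the pulsar, which could then feed a filter update.

    ```python
    import numpy as np

    C = 299_792_458.0  # speed of light [m/s]

    def delta_position_along_pulsar(toa_measured, toa_predicted, pulsar_unit_vector):
        """Convert a pulse time-of-arrival offset into a position correction along
        the pulsar line of sight (first-order, single-pulsar illustration).

        A pulse arriving later than predicted (positive offset) means the spacecraft
        is farther from the pulsar than the propagated estimate, so the correction
        points away from the pulsar along its line of sight.
        """
        dt = toa_measured - toa_predicted
        return -C * dt * np.asarray(pulsar_unit_vector, dtype=float)

    # Example: the pulse arrives 1 microsecond late from a pulsar along +x
    print(delta_position_along_pulsar(1.0e-6, 0.0, [1.0, 0.0, 0.0]))   # ~[-300, 0, 0] m
    ```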

  14. Model-based software engineering for an optical navigation system for spacecraft

    NASA Astrophysics Data System (ADS)

    Franz, T.; Lüdtke, D.; Maibaum, O.; Gerndt, A.

    2017-09-01

    The project Autonomous Terrain-based Optical Navigation (ATON) at the German Aerospace Center (DLR) is developing an optical navigation system for future landing missions on celestial bodies such as the moon or asteroids. Image data obtained by optical sensors can be used for autonomous determination of the spacecraft's position and attitude. Camera-in-the-loop experiments in the Testbed for Robotic Optical Navigation (TRON) laboratory and flight campaigns with unmanned aerial vehicle (UAV) are performed to gather flight data for further development and to test the system in a closed-loop scenario. The software modules are executed in the C++ Tasking Framework that provides the means to concurrently run the modules in separated tasks, send messages between tasks, and schedule task execution based on events. Since the project is developed in collaboration with several institutes in different domains at DLR, clearly defined and well-documented interfaces are necessary. Preventing misconceptions caused by differences between various development philosophies and standards turned out to be challenging. After the first development cycles with manual Interface Control Documents (ICD) and manual implementation of the complex interactions between modules, we switched to a model-based approach. The ATON model covers a graphical description of the modules, their parameters and communication patterns. Type and consistency checks on this formal level help to reduce errors in the system. The model enables the generation of interfaces and unified data types as well as their documentation. Furthermore, the C++ code for the exchange of data between the modules and the scheduling of the software tasks is created automatically. With this approach, changing the data flow in the system or adding additional components (e.g., a second camera) have become trivial.

  16. Reconnaissance and Autonomy for Small Robots (RASR)

    DTIC Science & Technology

    2012-06-29

    The Reconnaissance and Autonomy for Small Robots (RASR) team developed a system for the coordination of groups of unmanned ground vehicles (UGVs)... development of a system that used 1) a relevant deployable platform; 2) a minimum set of relatively inexpensive navigation and LADAR sensors; 3) an... expandable and modular control system with innovative software algorithms to minimize computing footprint; and that minimized 4) required communications

  17. Global Precipitation Measurement (GPM) Orbit Design and Autonomous Maneuvers

    NASA Technical Reports Server (NTRS)

    Folta, David; Mendelsohn, Chad; Mailhe, Laurie

    2003-01-01

    The NASA Goddard Space Flight Center's Global Precipitation Measurement (GPM) mission must meet the challenge of measuring worldwide precipitation every three hours. The GPM core spacecraft, part of a constellation, will be required to maintain a circular orbit in a high drag environment at a near-critical inclination. Analysis shows that a mean orbit altitude of 407 km is necessary to prevent ground track repeating. Combined with goals to minimize maneuver operation impacts to science data collection and to enable reasonable long-term orbit predictions, the GPM project has decided to fly the GSFC autonomous maneuver system, AutoCon(TM). This system is a follow-up version of the highly successful New Millennium Program technology flown onboard the Earth Observing-1 formation flying mission. This paper presents the driving science requirements and goals of the GPM mission and shows how they will be met. Selection of the mean semi-major axis, eccentricity, and the ΔV budget for several ballistic properties are presented. The architecture of the autonomous maneuvering system to meet the goals and requirements is presented along with simulations using GPM parameters. Additionally, the use of the GPM autonomous system to mitigate possible collision avoidance and to aid other spacecraft systems during navigation outages is explored.

  18. Aspect-dependent radiated noise analysis of an underway autonomous underwater vehicle.

    PubMed

    Gebbie, John; Siderius, Martin; Allen, John S

    2012-11-01

    This paper presents an analysis of the acoustic emissions emitted by an underway REMUS-100 autonomous underwater vehicle (AUV) that were obtained near Honolulu Harbor, HI using a fixed, bottom-mounted horizontal line array (HLA). Spectral analysis, beamforming, and cross-correlation facilitate identification of independent sources of noise originating from the AUV. Fusion of navigational records from the AUV with acoustic data from the HLA allows for an aspect-dependent presentation of calculated source levels of the strongest propulsion tone.

  19. A Spatial Cognitive Map and a Human-Like Memory Model Dedicated to Pedestrian Navigation in Virtual Urban Environments

    NASA Astrophysics Data System (ADS)

    Thomas, Romain; Donikian, Stéphane

    Many articles dealing with agent navigation in an urban environment involve the use of various heuristics. Among them, one is prevalent: the search for the shortest path between two points. This strategy impairs the realism of the resulting behaviour. Indeed, psychological studies state that such navigation behaviour is conditioned by the knowledge the subject has of its environment. Furthermore, the path a city dweller follows may be influenced by many factors, such as daily habits or path simplicity in terms of the minimum number of direction changes. It therefore appeared interesting to us to investigate how to mimic human navigation behaviour with an autonomous agent. The solution we propose relies on an architecture based on a generic model of an informed environment and on a spatial cognitive map model merged with a human-like memory model, representing the temporal knowledge of the environment that the agent gains through its navigation experience.

  20. Terrain discovery and navigation of a multi-articulated linear robot using map-seeking circuits

    NASA Astrophysics Data System (ADS)

    Snider, Ross K.; Arathorn, David W.

    2006-05-01

    A significant challenge in robotics is providing a robot with the ability to sense its environment and then autonomously move while accommodating obstacles. The DARPA Grand Challenge, one of the most visible examples, set the goal of driving a vehicle autonomously for over a hundred miles avoiding obstacles along a predetermined path. Map-Seeking Circuits have shown their biomimetic capability in both vision and inverse kinematics and here we demonstrate their potential usefulness for intelligent exploration of unknown terrain using a multi-articulated linear robot. A robot that could handle any degree of terrain complexity would be useful for exploring inaccessible crowded spaces such as rubble piles in emergency situations, patrolling/intelligence gathering in tough terrain, tunnel exploration, and possibly even planetary exploration. Here we simulate autonomous exploratory navigation by an interaction of terrain discovery using the multi-articulated linear robot to build a local terrain map and exploitation of that growing terrain map to solve the propulsion problem of the robot.

  1. Loosely Coupled GPS-Aided Inertial Navigation System for Range Safety

    NASA Technical Reports Server (NTRS)

    Heatwole, Scott; Lanzi, Raymond J.

    2010-01-01

    The Autonomous Flight Safety System (AFSS) aims to replace the human element of range safety operations, as well as reduce reliance on expensive, downrange assets for launches of expendable launch vehicles (ELVs). The system consists of multiple navigation sensors and flight computers that provide a highly reliable platform. It is designed to ensure that single-event failures in a flight computer or sensor will not bring down the whole system. The flight computer uses a rules-based structure derived from range safety requirements to make decisions whether or not to destroy the rocket.

  2. Ranging Consistency Based on Ranging-Compensated Temperature-Sensing Sensor for Inter-Satellite Link of Navigation Constellation

    PubMed Central

    Meng, Zhijun; Yang, Jun; Guo, Xiye; Zhou, Yongbin

    2017-01-01

    Global Navigation Satellite System performance can be significantly enhanced by introducing inter-satellite links (ISLs) in navigation constellation. The improvement in position, velocity, and time accuracy as well as the realization of autonomous functions requires ISL distance measurement data as the original input. To build a high-performance ISL, the ranging consistency among navigation satellites is an urgent problem to be solved. In this study, we focus on the variation in the ranging delay caused by the sensitivity of the ISL payload equipment to the ambient temperature in space and propose a simple and low-power temperature-sensing ranging compensation sensor suitable for onboard equipment. The experimental results show that, after the temperature-sensing ranging compensation of the ISL payload equipment, the ranging consistency becomes less than 0.2 ns when the temperature change is 90 °C. PMID:28608809

  3. AltiVec performance increases for autonomous robotics for the MARSSCAPE architecture program

    NASA Astrophysics Data System (ADS)

    Gothard, Benny M.

    2002-02-01

    One of the main tall poles that must be overcome to develop a fully autonomous vehicle is the inability of the computer to understand its surrounding environment to the level required for the intended task. The military mission scenario requires a robot to interact in a complex, unstructured, dynamic environment (reference: A High Fidelity Multi-Sensor Scene Understanding System for Autonomous Navigation). The Mobile Autonomous Robot Software Self-Composing Adaptive Programming Environment (MarsScape) perception research addresses three aspects of the problem: sensor system design, processing architectures, and algorithm enhancements. A prototype perception system has been demonstrated on robotic High Mobility Multi-purpose Wheeled Vehicle and All Terrain Vehicle testbeds. This paper addresses the tall pole of processing requirements and the performance improvements based on the selected MarsScape processing architecture. The processor chosen is the Motorola AltiVec-enabled G4 PowerPC (PPC), a highly parallelized commercial Single Instruction Multiple Data processor. Both derived perception benchmarks and actual perception subsystem code are benchmarked and compared against previous Demo II / Semi-autonomous Surrogate Vehicle processing architectures, as well as against desktop personal computers (PCs). Performance gains are highlighted along with progress to date, and lessons learned and future directions are described.

  4. Results of the Magnetometer Navigation (MAGNAV) Inflight Experiment

    NASA Technical Reports Server (NTRS)

    Thienel, Julie K.; Harman, Richard R.; Bar-Itzhack, Itzhack Y.; Lambertson, Mike

    2004-01-01

    The Magnetometer Navigation (MAGNAV) algorithm is currently running as a flight experiment as part of the Wide Field Infrared Explorer (WIRE) Post-Science Engineering Testbed. Initialization of MAGNAV occurred on September 4, 2003. MAGNAV is designed to autonomously estimate the spacecraft orbit, attitude, and rate using magnetometer and sun sensor data. Since the Earth's magnetic field is a function of time and position, and since time is known quite precisely, the differences between the computed magnetic field and measured magnetic field components, as measured by the magnetometer throughout the entire spacecraft orbit, are a function of the spacecraft trajectory and attitude errors. Therefore, these errors are used to estimate both trajectory and attitude. In addition, the time rate of change of the magnetic field vector is used to estimate the spacecraft rotation rate. The estimation of the attitude and trajectory is augmented with the rate estimation into an Extended Kalman filter blended with a pseudo-linear Kalman filter. Sun sensor data is also used to improve the accuracy and observability of the attitude and rate estimates. This test serves to validate MAGNAV as a single low cost navigation system which utilizes reliable, flight qualified sensors. MAGNAV is intended as a backup algorithm, an initialization algorithm, or possibly a prime navigation algorithm for a mission with coarse requirements. Results from the first six months of operation are presented.

  5. Ship navigation using Navstar GPS - An application study

    NASA Technical Reports Server (NTRS)

    Mohan, S. N.

    1982-01-01

    Ocean current measurement applications in physical oceanography require knowledge of inertial ship velocity to a precision of 1-2 cm/sec over a typical five minute averaging interval. The navigation accuracy must be commensurate with data precision obtainable from ship borne acoustic profilers used in sensing ocean currents. The Navstar Global Positioning System is viewed as a step in user technological simplification, extension in coverage availability, and enhancement in performance accuracy as well as reliability over the existing systems, namely, Loran-C, Transit, and Omega. Error analyses have shown the possibility of attaining the 1-2 cm/sec accuracy during active GPS coverage at a data rate of four position fixes per minute under varying sea-states. This paper is intended to present results of data validation exercises leading to design of an experiment at sea for deployment of both a GPS y-set and a direct Doppler measurement system as the autonomous navigation system used in conjunction with an acoustic Doppler as the sensor for ocean current measurement.

  6. Towards Autonomous Agriculture: Automatic Ground Detection Using Trinocular Stereovision

    PubMed Central

    Reina, Giulio; Milella, Annalisa

    2012-01-01

    Autonomous driving is a challenging problem, particularly when the domain is unstructured, as in an outdoor agricultural setting. Thus, advanced perception systems are primarily required to sense and understand the surrounding environment recognizing artificial and natural structures, topology, vegetation and paths. In this paper, a self-learning framework is proposed to automatically train a ground classifier for scene interpretation and autonomous navigation based on multi-baseline stereovision. The use of rich 3D data is emphasized where the sensor output includes range and color information of the surrounding environment. Two distinct classifiers are presented, one based on geometric data that can detect the broad class of ground and one based on color data that can further segment ground into subclasses. The geometry-based classifier features two main stages: an adaptive training stage and a classification stage. During the training stage, the system automatically learns to associate geometric appearance of 3D stereo-generated data with class labels. Then, it makes predictions based on past observations. It serves as well to provide training labels to the color-based classifier. Once trained, the color-based classifier is able to recognize similar terrain classes in stereo imagery. The system is continuously updated online using the latest stereo readings, thus making it feasible for long range and long duration navigation, over changing environments. Experimental results, obtained with a tractor test platform operating in a rural environment, are presented to validate this approach, showing an average classification precision and recall of 91.0% and 77.3%, respectively.
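
    A minimal sketch of geometry-based ground/non-ground labeling of the kind the first classifier performs (illustrative only; the thresholds, features, and self-learning update are assumptions rather than the paper's method):

    ```python
    import numpy as np

    def label_ground(points, cell=0.2, spread_thresh=0.10, height_band=0.15):
        """Label 3D stereo points as ground (True) or non-ground (False), illustratively.

        Points are binned into a 2D grid.  A cell is labeled ground when the height
        spread inside it is small (locally flat) and its mean height lies within a
        band above the lowest observed surface, approximating the broad "ground"
        class that seeds the color-based classifier.
        """
        xy_idx = np.floor(points[:, :2] / cell).astype(int)
        ground_level = np.percentile(points[:, 2], 5)      # robust estimate of ground height
        labels = np.zeros(len(points), dtype=bool)
        for key in {tuple(k) for k in xy_idx}:
            mask = np.all(xy_idx == key, axis=1)
            z = points[mask, 2]
            locally_flat = (z.max() - z.min()) < spread_thresh
            near_ground = abs(z.mean() - ground_level) < height_band
            labels[mask] = locally_flat and near_ground
        return labels

    # Example: a flat patch plus a raised obstacle
    flat = np.column_stack([np.random.rand(200) * 2, np.random.rand(200) * 2, 0.02 * np.random.rand(200)])
    box = np.column_stack([1 + 0.1 * np.random.rand(50), 1 + 0.1 * np.random.rand(50), 0.5 + 0.05 * np.random.rand(50)])
    print(label_ground(np.vstack([flat, box])).mean())      # fraction of points labeled ground
    ```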

  7. Autonomous docking ground demonstration

    NASA Technical Reports Server (NTRS)

    Lamkin, Steve L.; Le, Thomas Quan; Othon, L. T.; Prather, Joseph L.; Eick, Richard E.; Baxter, Jim M.; Boyd, M. G.; Clark, Fred D.; Spehar, Peter T.; Teters, Rebecca T.

    1991-01-01

    The Autonomous Docking Ground Demonstration is an evaluation of the laser sensor system to support the docking phase (12 ft to contact) when operated in conjunction with the guidance, navigation, and control (GN&C) software. The docking mechanism being used was developed for the Apollo/Soyuz Test Program. This demonstration will be conducted using the 6-DOF Dynamic Test System (DTS). The DTS simulates the Space Station Freedom as the stationary or target vehicle and the Orbiter as the active or chase vehicle. For this demonstration, the laser sensor will be mounted on the target vehicle and the retroreflectors will be on the chase vehicle. This arrangement was chosen to prevent potential damage to the laser. The laser sensor system, GN&C, and 6-DOF DTS will be operated closed-loop. Initial conditions to simulate vehicle misalignments, translational and rotational, will be introduced within the constraints of the systems involved.

  8. GNC architecture for autonomous robotic capture of a non-cooperative target: Preliminary concept design

    NASA Astrophysics Data System (ADS)

    Jankovic, Marko; Paul, Jan; Kirchner, Frank

    2016-04-01

    Recent studies of the space debris population in low Earth orbit (LEO) have concluded that certain regions have already reached a critical density of objects. This will eventually lead to a cascading process called the Kessler syndrome. The time may have come to seriously consider active debris removal (ADR) missions as the only viable way of preserving the space environment for future generations. Among all objects in the current environment, the SL-8 (Kosmos 3M second stages) rocket bodies (R/Bs) are some of the most suitable targets for future robotic ADR missions. However, to date, autonomous relative navigation to, and capture of, a non-cooperative target has never been performed. Therefore, there is a need for more advanced, autonomous and modular systems that can cope with uncontrolled, tumbling objects. The guidance, navigation and control (GNC) system is one of the most critical ones. The main objective of this paper is to present a preliminary concept of a modular GNC architecture that should enable a safe and fuel-efficient capture of a known but uncooperative target, such as a Kosmos 3M R/B. In particular, the concept was developed having in mind the most critical part of an ADR mission, i.e. close range proximity operations, and state-of-the-art algorithms in the field of autonomous rendezvous and docking. Finally, a brief description is given of the hardware-in-the-loop (HIL) testing facility foreseen for the practical evaluation of the developed architecture.

  9. Navigation system for a mobile robot with a visual sensor using a fish-eye lens

    NASA Astrophysics Data System (ADS)

    Kurata, Junichi; Grattan, Kenneth T. V.; Uchiyama, Hironobu

    1998-02-01

    Various position sensing and navigation systems have been proposed for the autonomous control of mobile robots. Some of these systems have been installed with an omnidirectional visual sensor system that proved very useful in obtaining information on the environment around the mobile robot for position reckoning. In this article, this type of navigation system is discussed. The sensor is composed of one TV camera with a fish-eye lens, using a reference target on a ceiling and hybrid image processing circuits. The position of the robot, with respect to the floor, is calculated by integrating the information obtained from a visual sensor and a gyroscope mounted in the mobile robot, and the use of a simple algorithm based on PTP control for guidance is discussed. An experimental trial showed that the proposed system was both valid and useful for the navigation of an indoor vehicle.

  10. Autonomous navigation accuracy using simulated horizon sensor and sun sensor observations

    NASA Technical Reports Server (NTRS)

    Pease, G. E.; Hendrickson, H. T.

    1980-01-01

    A relatively simple autonomous system which would use horizon crossing indicators, a sun sensor, a quartz oscillator, and a microprogrammed computer is discussed. The sensor combination is required only to effectively measure the angle between the centers of the Earth and the Sun. Simulations for a particular orbit indicate that 2 km r.m.s. orbit determination uncertainties may be expected from a system with 0.06 deg measurement uncertainty. A key finding is that knowledge of the satellite orbit plane orientation can be maintained to this level because of the annual motion of the Sun and the predictable effects of Earth oblateness. The basic system described can be updated periodically by transits of the Moon through the IR horizon crossing indicator fields of view.

  11. Wind-Based Navigation of a Hot-air Balloon on Titan: A Feasibility Study

    NASA Technical Reports Server (NTRS)

    Furfaro, Roberto; Lunine, Jonathan I.; Elfes, Alberto; Reh, Kim

    2008-01-01

    Current analysis of data streamed back to Earth by the Cassini spacecraft features Titan as one of the most exciting places in the solar system. NASA centers and universities around the US, as well as the European Space Agency, are studying the possibility of sending, as part of the next mission to this giant moon of Saturn, a hot-air balloon (Montgolfier-type) for further and more in-depth exploration. The basic idea would be to design a reliable, semi-autonomous, and yet cheap Montgolfier capable of using continuous flow of waste heat from a power source to lift the balloon and sustain its altitude in the Titan environment. In this paper we study the problem of locally navigating a hot-air balloon in the nitrogen-based Titan atmosphere. The basic idea is to define a strategy (i.e. design of a suitable guidance system) that allows autonomous and semi-autonomous navigation of the balloon using the available (and partial) knowledge of the wind structure blowing on the saturnian satellite surface. Starting from first principles we determined the appropriate thermal and dynamical models describing (a) the vertical dynamics of the balloon and (b) the dynamics of the balloon moving on a vertical plane (2-D motion). Next, various non-linear fuzzy-based control strategies have been evaluated, analyzed and implemented in MATLAB to numerically simulate the capability of the system to simultaneously maintain altitude, as well as a scientifically desirable trajectory. We also looked at the ability of the balloon to perform station keeping. The results of the simulation are encouraging and show the effectiveness of such a system to cheaply and effectively perform semiautonomous exploration of Titan.

  12. UPenn Multi-Robot Unmanned Vehicle System (MAGIC)

    DTIC Science & Technology

    2014-05-05

    ... user interface, the Strategy/Plan operator allows the system to autonomously task the nearest available UGVs to plan and coordinate their movements and... threats in a dynamic urban environment with minimal human guidance. The custom hardware systems consist of robust and complementary sensors, integrated

  13. Autonomous and Autonomic Swarms

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Truszkowski, Walter F.; Rouff, Christopher A.; Sterritt, Roy

    2005-01-01

    A watershed in systems engineering is represented by the advent of swarm-based systems that accomplish missions through cooperative action by a (large) group of autonomous individuals each having simple capabilities and no global knowledge of the group's objective. Such systems, with individuals capable of surviving in hostile environments, pose unprecedented challenges to system developers. Design, testing, and verification at much higher levels will be required, together with the corresponding tools, to bring such systems to fruition. Concepts for possible future NASA space exploration missions include autonomous, autonomic swarms. Engineering swarm-based missions begins with understanding autonomy and autonomicity and how to design, test, and verify systems that have those properties and, simultaneously, the capability to accomplish prescribed mission goals. Formal methods-based technologies, both projected and in development, are described in terms of their potential utility to swarm-based system developers.

  14. Agent Based Software for the Autonomous Control of Formation Flying Spacecraft

    NASA Technical Reports Server (NTRS)

    How, Jonathan P.; Campbell, Mark; Dennehy, Neil (Technical Monitor)

    2003-01-01

    Distributed satellite systems is an enabling technology for many future NASA/DoD earth and space science missions, such as MMS, MAXIM, Leonardo, and LISA [1, 2, 3]. While formation flying offers significant science benefits, to reduce the operating costs for these missions it will be essential that these multiple vehicles effectively act as a single spacecraft by performing coordinated observations. Autonomous guidance, navigation, and control as part of a coordinated fleet-autonomy is a key technology that will help accomplish this complex goal. This is no small task, as most current space missions require significant input from the ground for even relatively simple decisions such as thruster burns. Work for the NMP DS1 mission focused on the development of the New Millennium Remote Agent (NMRA) architecture for autonomous spacecraft control systems. NMRA integrates traditional real-time monitoring and control with components for constraint-based planning, robust multi-threaded execution, and model-based diagnosis and reconfiguration. The complexity of using an autonomous approach for space flight software was evident when most of its capabilities were stripped off prior to launch (although more capability was uplinked subsequently, and the resulting demonstration was very successful).

  15. Differential GNSS and Vision-Based Tracking to Improve Navigation Performance in Cooperative Multi-UAV Systems.

    PubMed

    Vetrella, Amedeo Rodi; Fasano, Giancarmine; Accardo, Domenico; Moccia, Antonio

    2016-12-17

    Autonomous navigation of micro-UAVs is typically based on the integration of low cost Global Navigation Satellite System (GNSS) receivers and Micro-Electro-Mechanical Systems (MEMS)-based inertial and magnetic sensors to stabilize and control the flight. The resulting navigation performance in terms of position and attitude accuracy may not suffice for other mission needs, such as the ones relevant to fine sensor pointing. In this framework, this paper presents a cooperative UAV navigation algorithm that allows a chief vehicle, equipped with inertial and magnetic sensors, a Global Positioning System (GPS) receiver, and a vision system, to improve its navigation performance (in real time or in the post processing phase) exploiting formation flying deputy vehicles equipped with GPS receivers. The focus is set on outdoor environments and the key concept is to exploit differential GPS among vehicles and vision-based tracking (DGPS/Vision) to build a virtual additional navigation sensor whose information is then integrated in a sensor fusion algorithm based on an Extended Kalman Filter. The developed concept and processing architecture are described, with a focus on DGPS/Vision attitude determination algorithm. Performance assessment is carried out on the basis of both numerical simulations and flight tests. In the latter ones, navigation estimates derived from the DGPS/Vision approach are compared with those provided by the onboard autopilot system of a customized quadrotor. The analysis shows the potential of the developed approach, mainly deriving from the possibility to exploit magnetic- and inertial-independent accurate attitude information.

  16. Model-based Executive Control through Reactive Planning for Autonomous Rovers

    NASA Technical Reports Server (NTRS)

    Finzi, Alberto; Ingrand, Felix; Muscettola, Nicola

    2004-01-01

    This paper reports on the design and implementation of a real-time executive for a mobile rover that uses a model-based, declarative approach. The control system is based on the Intelligent Distributed Execution Architecture (IDEA), an approach to planning and execution that provides a unified representational and computational framework for an autonomous agent. The basic hypothesis of IDEA is that a large control system can be structured as a collection of interacting agents, each with the same fundamental structure. We show that planning and real-time response are compatible if the executive minimizes the size of the planning problem. We detail the implementation of this approach on an exploration rover (Gromit, an RWI ATRV Junior, at NASA Ames), presenting different IDEA controllers of the same domain and comparing them with more classical approaches. We demonstrate that the approach is scalable to complex coordination of functional modules needed for autonomous navigation and exploration.

  17. Initial results of centralized autonomous orbit determination of the new-generation BDS satellites with inter-satellite link measurements

    NASA Astrophysics Data System (ADS)

    Tang, Chengpan; Hu, Xiaogong; Zhou, Shanshi; Liu, Li; Pan, Junyang; Chen, Liucheng; Guo, Rui; Zhu, Lingfeng; Hu, Guangming; Li, Xiaojie; He, Feng; Chang, Zhiqiao

    2018-01-01

    Autonomous orbit determination is the ability of navigation satellites to estimate the orbit parameters on-board using inter-satellite link (ISL) measurements. This study mainly focuses on data processing of the ISL measurements as a new measurement type and its application, for the first time, to the centralized autonomous orbit determination of the new-generation Beidou navigation satellite system satellites. The ISL measurements are dual one-way measurements that follow a time division multiple access (TDMA) structure. The ranging error of the ISL measurements is less than 0.25 ns. This paper proposes a derivation approach to the satellite clock offsets and the geometric distances from TDMA dual one-way measurements without a loss of accuracy. The derived clock offsets are used for time synchronization, and the derived geometric distances are used for autonomous orbit determination. The clock offsets from the ISL measurements are consistent with the L-band two-way satellite time and frequency transfer clock measurements, and the detrended residuals vary within 0.5 ns. The centralized autonomous orbit determination is conducted in a batch mode on a ground-capable server for the feasibility study. Constant hardware delays are present in the geometric distances and become the largest source of error in the autonomous orbit determination. Therefore, the hardware delays are estimated simultaneously with the satellite orbits. To avoid uncertainties in the constellation orientation, a ground anchor station that "observes" the satellites with on-board ISL payloads is introduced into the orbit determination. The root-mean-square values of orbit determination residuals are within 10.0 cm, and the standard deviation of the estimated ISL hardware delays is within 0.2 ns. The accuracy of the autonomous orbits is evaluated by analysis of overlap comparison and the satellite laser ranging (SLR) residuals and is compared with the accuracy of the L-band orbits. The results indicate
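
    A minimal sketch of the dual one-way derivation mentioned above (illustrative; it ignores the TDMA epoch-reduction corrections, signal flight-time asymmetry, and hardware delays that the paper treats carefully): summing the two one-way pseudoranges cancels the clock offsets and yields the geometric distance, while differencing them cancels the geometry and yields the relative clock offset.

    ```python
    C = 299_792_458.0  # speed of light [m/s]

    def dual_one_way(rho_ab, rho_ba):
        """Split dual one-way ISL pseudoranges into geometry and clock terms.

        rho_ab : pseudorange measured at satellite B for the A->B link [m]
        rho_ba : pseudorange measured at satellite A for the B->A link [m]

        With rho_ab = d + C*(dt_b - dt_a) and rho_ba = d + C*(dt_a - dt_b),
        the sum isolates the geometric distance d and the difference isolates
        the relative clock offset (dt_b - dt_a).
        """
        distance = 0.5 * (rho_ab + rho_ba)               # -> orbit determination [m]
        clock_offset = 0.5 * (rho_ab - rho_ba) / C       # -> time synchronization [s]
        return distance, clock_offset

    # Example: a 40,000 km baseline, with B's clock 20 ns ahead of A's
    d, dt = 40.0e6, 20e-9
    print(dual_one_way(d + C * dt, d - C * dt))          # ~(40000000.0, 2e-08)
    ```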

  18. Markovian robots: Minimal navigation strategies for active particles

    NASA Astrophysics Data System (ADS)

    Nava, Luis Gómez; Großmann, Robert; Peruani, Fernando

    2018-04-01

    We explore minimal navigation strategies for active particles in complex, dynamical, external fields, introducing a class of autonomous, self-propelled particles which we call Markovian robots (MR). These machines are equipped with a navigation control system (NCS) that triggers random changes in the direction of self-propulsion of the robots. The internal state of the NCS is described by a Boolean variable that adopts two values. The temporal dynamics of this Boolean variable is dictated by a closed Markov chain—ensuring the absence of fixed points in the dynamics—with transition rates that may depend exclusively on the instantaneous, local value of the external field. Importantly, the NCS does not store past measurements of this value in continuous, internal variables. We show that despite the strong constraints, it is possible to conceive closed Markov chain motifs that lead to nontrivial motility behaviors of the MR in one, two, and three dimensions. By analytically reducing the complexity of the NCS dynamics, we obtain an effective description of the long-time motility behavior of the MR that allows us to identify the minimum requirements in the design of NCS motifs and transition rates to perform complex navigation tasks such as adaptive gradient following, detection of minima or maxima, or selection of a desired value in a dynamical, external field. We put these ideas into practice by assembling a robot that operates with the proposed minimalistic NCS to evaluate the robustness of MR, providing a proof of concept that it is possible to navigate through complex information landscapes with such a simple NCS whose internal state can be stored in one bit. These ideas may prove useful for the engineering of miniaturized robots.
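
    As a rough illustration of the idea, the sketch below simulates a single two-state Markovian robot in a hypothetical scalar field. The field, the rate functions, the reorientation-on-transition rule, and all parameter values are assumptions for illustration, not the motifs analyzed by the authors.

    ```python
    # Minimal 2-D sketch of a "Markovian robot": a self-propelled particle whose
    # navigation control system is a two-state Markov chain with field-dependent
    # transition rates, and which reorients at random when a transition fires.
    import numpy as np

    rng = np.random.default_rng(0)

    def field(x, y):
        # Hypothetical scalar landscape the robot senses locally.
        return np.exp(-(x**2 + y**2) / 10.0)

    def k01(c):  # rate for NCS state 0 -> 1 (assumed form)
        return 0.1 + 2.0 * c

    def k10(c):  # rate for NCS state 1 -> 0 (assumed form)
        return 0.1 + 2.0 * (1.0 - c)

    v0, dt, steps = 1.0, 0.01, 20000
    x, y, theta, state = 5.0, 0.0, 0.0, 0
    for _ in range(steps):
        c = field(x, y)
        rate = k01(c) if state == 0 else k10(c)
        if rng.random() < rate * dt:           # chain transition fires
            state = 1 - state
            theta = rng.uniform(0, 2 * np.pi)  # assumed: transition triggers a tumble
        x += v0 * np.cos(theta) * dt
        y += v0 * np.sin(theta) * dt
    print(f"final position ({x:.2f}, {y:.2f}), field value {field(x, y):.3f}")
    ```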

  19. Novel Intersection Type Recognition for Autonomous Vehicles Using a Multi-Layer Laser Scanner.

    PubMed

    An, Jhonghyun; Choi, Baehoon; Sim, Kwee-Bo; Kim, Euntai

    2016-07-20

    There are several types of intersections such as merge-roads, diverge-roads, plus-shape intersections and two types of T-shape junctions in urban roads. When an autonomous vehicle encounters new intersections, it is crucial to recognize the types of intersections for safe navigation. In this paper, a novel intersection type recognition method is proposed for an autonomous vehicle using a multi-layer laser scanner. The proposed method consists of two steps: (1) static local coordinate occupancy grid map (SLOGM) building and (2) intersection classification. In the first step, the SLOGM is built relative to the local coordinate using the dynamic binary Bayes filter. In the second step, the SLOGM is used as an attribute for the classification. The proposed method is applied to a real-world environment and its validity is demonstrated through experimentation.
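
    A minimal sketch of the log-odds binary Bayes update that underlies this kind of occupancy-grid building is given below. The grid size, the inverse-sensor-model constants, and the example cell updates are illustrative assumptions, not the SLOGM implementation from the paper.

    ```python
    # Binary Bayes (log-odds) occupancy update for individual grid cells.
    import numpy as np

    L_OCC, L_FREE, L_PRIOR = 0.85, -0.4, 0.0   # assumed log-odds increments
    grid = np.full((200, 200), L_PRIOR)         # assumed 200 x 200 cell grid

    def update_cell(grid, i, j, hit):
        """Binary Bayes filter in log-odds form for one cell."""
        grid[i, j] += (L_OCC if hit else L_FREE) - L_PRIOR

    def occupancy_prob(grid):
        """Convert log-odds back to occupancy probability."""
        return 1.0 - 1.0 / (1.0 + np.exp(grid))

    # A scan ray that passes through cell (110, 100) and ends in a hit at (120, 100).
    update_cell(grid, 110, 100, hit=False)
    update_cell(grid, 120, 100, hit=True)
    print(occupancy_prob(grid)[118:122, 99:102])
    ```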

  20. LIDAR-Aided Inertial Navigation with Extended Kalman Filtering for Pinpoint Landing

    NASA Technical Reports Server (NTRS)

    Busnardo, David M.; Aitken, Matthew L.; Tolson, Robert H.; Pierrottet, Diego; Amzajerdian, Farzin

    2011-01-01

    In support of NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) project, an extended Kalman filter routine has been developed for estimating the position, velocity, and attitude of a spacecraft during the landing phase of a planetary mission. The proposed filter combines measurements of acceleration and angular velocity from an inertial measurement unit (IMU) with range and Doppler velocity observations from an onboard light detection and ranging (LIDAR) system. These high-precision LIDAR measurements of distance to the ground and approach velocity will enable both robotic and manned vehicles to land safely and precisely at scientifically interesting sites. The filter has been extensively tested using a lunar landing simulation and shown to improve navigation over flat surfaces or rough terrain. Experimental results from a helicopter flight test performed at NASA Dryden in August 2008 demonstrate that LIDAR can be employed to significantly improve navigation based exclusively on IMU integration.
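
    The fusion step can be illustrated with a generic Kalman measurement update. The 6-state layout, the flat-ground altitude model, and the noise values below are simplifying assumptions for illustration, not the ALHAT filter itself.

    ```python
    # Minimal sketch: LIDAR-derived altitude and vertical velocity correcting an
    # IMU-propagated [position, velocity] state via a Kalman measurement update.
    import numpy as np

    def ekf_update(x, P, z, H, R):
        """Standard EKF/KF measurement update (linear H in this sketch)."""
        y = z - H @ x                          # innovation
        S = H @ P @ H.T + R                    # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
        x_new = x + K @ y
        P_new = (np.eye(len(x)) - K @ H) @ P
        return x_new, P_new

    # State after IMU propagation: [px, py, pz(up), vx, vy, vz].
    x = np.array([100.0, 50.0, 1200.0, -5.0, 0.0, -30.0])
    P = np.diag([25.0, 25.0, 100.0, 1.0, 1.0, 4.0])

    # Measurements: LIDAR altitude over assumed flat ground and vertical velocity.
    H = np.array([[0, 0, 1, 0, 0, 0],
                  [0, 0, 0, 0, 0, 1]], dtype=float)
    R = np.diag([0.04, 0.01])                  # assumed noise variances
    z = np.array([1195.8, -29.6])

    x, P = ekf_update(x, P, z, H, R)
    print(np.round(x, 2))
    ```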

  1. Visual identification and similarity measures used for on-line motion planning of autonomous robots in unknown environments

    NASA Astrophysics Data System (ADS)

    Martínez, Fredy; Martínez, Fernando; Jacinto, Edwar

    2017-02-01

    In this paper we propose an on-line motion planning strategy for autonomous robots in dynamic and locally observable environments. In this approach, we first visually identify geometric shapes in the environment by filtering images. Then, an ART-2 network is used to establish the similarity between patterns. The proposed algorithm allows a robot to establish its relative location in the environment and to define its navigation path based on images of the environment and their similarity to reference images. This is an efficient and minimalist method that uses the similarity of landmark view patterns to navigate to the desired destination. Laboratory tests on real prototypes demonstrate the performance of the algorithm.
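
    A toy version of the matching stage might look like the following. The cosine similarity and the vigilance threshold only gesture at ART-style matching and are assumptions for illustration; they are not the authors' ART-2 network.

    ```python
    # Vigilance-style similarity test between a current view descriptor and
    # stored reference patterns (illustrative stand-in for ART-2 matching).
    import numpy as np

    def similarity(a, b):
        """Cosine similarity between two feature vectors."""
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def best_match(view, references, vigilance=0.9):
        """Return (index, score) of the best reference, or (None, score) below vigilance."""
        scores = [similarity(view, r) for r in references]
        best = int(np.argmax(scores))
        return (best, scores[best]) if scores[best] >= vigilance else (None, scores[best])

    refs = [np.random.default_rng(i).random(64) for i in range(5)]  # stored landmark views
    view = refs[2] + 0.05 * np.random.default_rng(9).random(64)     # noisy re-observation
    print(best_match(view, refs))
    ```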

  2. An autonomous rendezvous and docking system using cruise missile technologies

    NASA Technical Reports Server (NTRS)

    Jones, Ruel Edwin

    1991-01-01

    In November 1990, the Autonomous Rendezvous & Docking (AR&D) system was first demonstrated for members of NASA's Strategic Avionics Technology Working Group. This simulation utilized prototype hardware from the Cruise Missile and Advanced Centaur Avionics systems. The objective was to show that all the accuracy, reliability, and operational requirements established for a spacecraft to dock with Space Station Freedom could be met by the proposed system. The rapid prototyping capabilities of the Advanced Avionics Systems Development Laboratory were used to evaluate the proposed system in a real-time, hardware-in-the-loop simulation of the rendezvous and docking reference mission. The simulation permits manual, supervised automatic, and fully autonomous operations to be evaluated. It is also being upgraded to test an Autonomous Approach and Landing (AA&L) system. The AA&L and AR&D systems are very similar. Both use inertial guidance and control systems supplemented by GPS. Both use an Image Processing System (IPS) for target recognition and tracking. The IPS includes a general-purpose multiprocessor computer and a selected suite of sensors that provide the required relative position and orientation data. Graphic displays can also be generated by the computer, providing the astronaut/operator with real-time guidance and navigation data with enhanced video or sensor imagery.

  3. HERMIES-I: a mobile robot for navigation and manipulation experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weisbin, C.R.; Barhen, J.; de Saussure, G.

    1985-01-01

    The purpose of this paper is to report the current status of investigations ongoing at the Center for Engineering Systems Advanced Research (CESAR) in the areas of navigation and manipulation in unstructured environments. The HERMIES-I mobile robot, a prototype of a series which contains many of the major features needed for remote work in hazardous environments, is discussed. Initial experimental work at CESAR has begun in the area of navigation. It briefly reviews some of the ongoing research in autonomous navigation and describes initial research with HERMIES-I and associated graphic simulation. Since the HERMIES robots will generally be composed of a variety of asynchronously controlled hardware components (such as manipulator arms, digital image sensors, sonars, etc.), it seems appropriate to consider future development of the HERMIES brain as a hypercube ensemble machine with concurrent computation and associated message passing. The basic properties of such a hypercube architecture are presented. Decision-making under uncertainty eventually permeates all of our work. Following a survey of existing analytical approaches, it was decided that a stronger theoretical basis is required. As such, this paper presents the framework for a recently developed hybrid uncertainty theory. 21 refs., 2 figs.

  4. Visual control of navigation in insects and its relevance for robotics.

    PubMed

    Srinivasan, Mandyam V

    2011-08-01

    Flying insects display remarkable agility, despite their diminutive eyes and brains. This review describes our growing understanding of how these creatures use visual information to stabilize flight, avoid collisions with objects, regulate flight speed, detect and intercept other flying insects such as mates or prey, navigate to a distant food source, and orchestrate flawless landings. It also outlines the ways in which these insights are now being used to develop novel, biologically inspired strategies for the guidance of autonomous, airborne vehicles. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Spatial filtering velocimeter for vehicle navigation with extended measurement range

    NASA Astrophysics Data System (ADS)

    He, Xin; Zhou, Jian; Nie, Xiaoming; Long, Xingwu

    2015-05-01

    A spatial filtering velocimeter (SFV) is proposed to provide accurate velocity information for a vehicle autonomous navigation system. The presented SFV is based on a CMOS linear image sensor, whose limited frame rate restricts high-speed measurement. To extend the measurement range of the velocimeter, a frequency-shifting method is put forward. Theoretical analysis shows that the frequency of the output signal can be reduced and the measurement range doubled when the shifting direction is set the same as that of the image velocity. A fast Fourier transform (FFT) is employed to obtain the power spectra of the spatially filtered signals. Because of the limited frequency resolution of the FFT, a frequency-spectrum correction algorithm, called energy centrobaric correction, is used to improve the frequency resolution, and its correction accuracy is analyzed. Experiments on the moving surface of a conveyor belt show that the maximum measurable velocity is about 800 deg/s without frequency shifting and 1600 deg/s with frequency shifting when the frame rate is about 8117 Hz; the measurement range is therefore doubled by frequency shifting. Furthermore, experiments were carried out to measure vehicle velocity simultaneously with the designed SFV and a laser Doppler velocimeter (LDV). The SFV results are consistent with those of the LDV, though with larger fluctuations. The SFV therefore has potential for application to vehicle autonomous navigation.
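
    The peak-refinement idea can be sketched in a few lines: estimate the dominant frequency of a synthetic spatially filtered signal with an FFT, then correct it with an energy centroid over the bins around the peak. The synthetic signal, the window, and the centroid width are assumptions for illustration; only the 8117 Hz frame rate is taken from the abstract.

    ```python
    # FFT peak estimate refined by an energy-centroid ("centrobaric") correction.
    import numpy as np

    fs = 8117.0                        # frame rate from the abstract, Hz
    t = np.arange(4096) / fs
    f_true = 612.3                     # Hz, synthetic signal frequency
    sig = np.sin(2 * np.pi * f_true * t) \
        + 0.2 * np.random.default_rng(1).standard_normal(t.size)

    spec = np.abs(np.fft.rfft(sig * np.hanning(t.size))) ** 2
    freqs = np.fft.rfftfreq(t.size, 1 / fs)

    k = int(np.argmax(spec))                        # coarse FFT peak bin
    lo, hi = max(k - 2, 0), min(k + 3, len(spec))   # small window around the peak
    f_est = np.sum(freqs[lo:hi] * spec[lo:hi]) / np.sum(spec[lo:hi])  # energy centroid
    print(f"coarse {freqs[k]:.2f} Hz, corrected {f_est:.2f} Hz, true {f_true:.2f} Hz")
    ```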

  6. Autonomous vehicles: from paradigms to technology

    NASA Astrophysics Data System (ADS)

    Ionita, Silviu

    2017-10-01

    Mobility is a basic necessity of contemporary society and a key factor in global economic development. The basic requirements for the transport of people and goods are safety and duration of travel, but a number of additional criteria are also very important: energy saving, pollution, and passenger comfort. Due to advances in hardware and software, automation has penetrated transport systems massively, both in infrastructure and in vehicles, but the human is still the key element in driving. However, the classic ‘human-in-the-loop’, hands-on concept of driving now competes with the self-driving startups working towards so-called ‘Level 4 autonomy’, defined as “a self-driving system that does not require human intervention in most scenarios”. In this paper, a conceptual synthesis of the autonomous vehicle issue is made in connection with the artificial intelligence paradigm. It presents a classification of the tasks that take place during driving and models them from the perspective of traditional control engineering and artificial intelligence. The issue of autonomous vehicle management is addressed on three levels: navigation, movement in traffic, and effective maneuvering and vehicle dynamics control. Each level is then described in terms of specific tasks, such as route selection, planning and reconfiguration, recognition of traffic signs and reaction to signaling and traffic events, as well as control of effective speed, distance, and direction. The approach leads to a better understanding of the direction in which technology is moving when talking about autonomous cars, smart/intelligent cars, or intelligent transport systems. Keywords: self-driving vehicle, artificial intelligence, deep learning, intelligent transport systems.

  7. Water Detection Based on Color Variation

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo L.

    2012-01-01

    This software has been designed to detect water bodies that are out in the open on cross-country terrain at close range (out to 30 meters), using imagery acquired from a stereo pair of color cameras mounted on a terrestrial, unmanned ground vehicle (UGV). This detector exploits the fact that the color variation across water bodies is generally larger and more uniform than that of other naturally occurring types of terrain, such as soil and vegetation. Non-traversable water bodies, such as large puddles, ponds, and lakes, are detected based on color variation, image intensity variance, image intensity gradient, size, and shape. At ranges beyond 20 meters, water bodies out in the open can be indirectly detected by detecting reflections of the sky below the horizon in color imagery. But at closer range, the color coming out of a water body dominates sky reflections, and the water cue from sky reflections is of marginal use. Since there may be times during UGV autonomous navigation when a water body does not come into a perception system's field of view until it is at close range, the ability to detect water bodies at close range is critical. Factors that influence the perceived color of a water body at close range are the amount and type of sediment in the water, the water's depth, and the angle of incidence to the water body. Developing a single model of the mixture ratio of light reflected off the water surface (to the camera) to light coming out of the water body (to the camera) for all water bodies would be fairly difficult. Instead, this software detects close water bodies based on local terrain features and the natural, uniform change in color that occurs across the surface from the leading edge to the trailing edge.
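
    The color-variation cue lends itself to a small illustrative score: a candidate region is flagged when its color changes smoothly and by a large amount from leading edge to trailing edge. The saturation feature, the linear trend fit, and the thresholds below are hypothetical; this is not the flight software's detector.

    ```python
    # Illustrative color-variation score for a candidate image region.
    import numpy as np

    def water_score(region_hsv):
        """region_hsv: (rows, cols, 3) HSV patch, rows ordered leading -> trailing edge."""
        sat = region_hsv[..., 1].astype(float)
        row_means = sat.mean(axis=1)                 # mean saturation per row
        rows = np.arange(len(row_means))
        slope, intercept = np.polyfit(rows, row_means, 1)
        residual = np.std(row_means - (slope * rows + intercept))
        span = row_means.max() - row_means.min()     # total colour variation
        # Large, smooth (low-residual) monotonic change suggests water.
        return span > 0.15 and abs(slope) > 0.002 and residual < 0.03

    patch = np.zeros((100, 60, 3))
    patch[..., 1] = np.linspace(0.2, 0.6, 100)[:, None]  # synthetic smooth change
    print(water_score(patch))
    ```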

  8. Evaluation of semiautonomous navigation assistance system for power wheelchairs with blindfolded nondisabled individuals.

    PubMed

    Sharma, Vinod; Simpson, Richard; Lopresti, Edmund; Schmeler, Mark

    2010-01-01

    Some individuals with disabilities are denied powered mobility because they lack the visual, motor, and/or cognitive skills required to safely operate a power wheelchair. The Drive-Safe System (DSS) is an add-on, distributed, shared-control navigation assistance system for power wheelchairs intended to provide safe and independent mobility to such individuals. The DSS is a human-machine system in which the user is responsible for high-level control of the wheelchair, such as choosing the destination, path planning, and basic navigation actions, while the DSS overrides unsafe maneuvers through autonomous collision avoidance, wall following, and door crossing. In this project, the DSS was clinically evaluated in a controlled laboratory with blindfolded, nondisabled individuals. Further, these individuals' performance with the DSS was compared with standard cane use for navigation assistance by people with visual impairments. Results indicate that compared with a cane, the DSS significantly reduced the number of collisions. Users rated the DSS favorably even though they took longer to navigate the same obstacle course than they would have using a standard long cane. Participants experienced less physical demand, effort, and frustration when using the DSS as compared with a cane. These findings suggest that the DSS can be a viable powered mobility solution for wheelchair users with visual impairments.

  9. NASA Office of Aeronautics and Space Technology Summer Workshop. Volume 3: Navigation, guidance and control panel

    NASA Technical Reports Server (NTRS)

    1975-01-01

    User technology requirements are identified in relation to needed technology advancement for future space missions in the areas of navigation, guidance, and control. Emphasis is placed on: reduction of mission support cost by 50% through autonomous operation, a ten-fold increase in mission output through improved pointing and control, and a hundred-fold increase in human productivity in space through large-scale teleoperator applications.

  10. Quantifying navigational information: The catchment volumes of panoramic snapshots in outdoor scenes.

    PubMed

    Murray, Trevor; Zeil, Jochen

    2017-01-01

    Panoramic views of natural environments provide visually navigating animals with two kinds of information: they define locations because image differences increase smoothly with distance from a reference location and they provide compass information, because image differences increase smoothly with rotation away from a reference orientation. The range over which a given reference image can provide navigational guidance (its 'catchment area') has to date been quantified from the perspective of walking animals by determining how image differences develop across the ground plane of natural habitats. However, to understand the information available to flying animals there is a need to characterize the 'catchment volumes' within which panoramic snapshots can provide navigational guidance. We used recently developed camera-based methods for constructing 3D models of natural environments and rendered panoramic views at defined locations within these models with the aim of mapping navigational information in three dimensions. We find that in relatively open woodland habitats, catchment volumes are surprisingly large, extending for metres depending on the sensitivity of the viewer to image differences. The size and the shape of catchment volumes depend on the distance of visual features in the environment. Catchment volumes are smaller for reference images close to the ground and become larger for reference images at some distance from the ground and in more open environments. Interestingly, catchment volumes become smaller when only above-horizon views are used and also when views include a 1 km distant panorama. We discuss the current limitations of mapping navigational information in natural environments and the relevance of our findings for our understanding of visual navigation in animals and autonomous robots.
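
    The underlying image-difference function is easy to sketch: a root-mean-square pixel difference between a reference panorama and rotated or displaced views, with the minimum marking the best match. The synthetic panoramas and the one-degree-per-column convention below are assumptions for illustration, not the authors' rendering pipeline.

    ```python
    # RMS image difference and a rotational image-difference function (IDF).
    import numpy as np

    def rms_image_difference(view, reference):
        """Root-mean-square intensity difference between equal-sized panoramas."""
        return float(np.sqrt(np.mean((view.astype(float) - reference.astype(float)) ** 2)))

    def rotational_idf(view, reference, step_px=1):
        """Image difference as the view is rotated (column-shifted) in azimuth."""
        return [rms_image_difference(np.roll(view, s, axis=1), reference)
                for s in range(0, view.shape[1], step_px)]

    rng = np.random.default_rng(3)
    reference = rng.random((60, 360))          # assumed 1-degree-per-column panorama
    view = np.roll(reference, -25, axis=1)     # same place, rotated by 25 degrees
    idf = rotational_idf(view, reference)
    print("best matching heading offset:", int(np.argmin(idf)), "degrees")
    ```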

  11. Quantifying navigational information: The catchment volumes of panoramic snapshots in outdoor scenes

    PubMed Central

    Zeil, Jochen

    2017-01-01

    Panoramic views of natural environments provide visually navigating animals with two kinds of information: they define locations because image differences increase smoothly with distance from a reference location and they provide compass information, because image differences increase smoothly with rotation away from a reference orientation. The range over which a given reference image can provide navigational guidance (its ‘catchment area’) has to date been quantified from the perspective of walking animals by determining how image differences develop across the ground plane of natural habitats. However, to understand the information available to flying animals there is a need to characterize the ‘catchment volumes’ within which panoramic snapshots can provide navigational guidance. We used recently developed camera-based methods for constructing 3D models of natural environments and rendered panoramic views at defined locations within these models with the aim of mapping navigational information in three dimensions. We find that in relatively open woodland habitats, catchment volumes are surprisingly large, extending for metres depending on the sensitivity of the viewer to image differences. The size and the shape of catchment volumes depend on the distance of visual features in the environment. Catchment volumes are smaller for reference images close to the ground and become larger for reference images at some distance from the ground and in more open environments. Interestingly, catchment volumes become smaller when only above-horizon views are used and also when views include a 1 km distant panorama. We discuss the current limitations of mapping navigational information in natural environments and the relevance of our findings for our understanding of visual navigation in animals and autonomous robots. PMID:29088300

  12. A learning-based semi-autonomous controller for robotic exploration of unknown disaster scenes while searching for victims.

    PubMed

    Doroodgar, Barzin; Liu, Yugang; Nejat, Goldie

    2014-12-01

    Semi-autonomous control schemes can address the limitations of both teleoperation and fully autonomous robotic control of rescue robots in disaster environments by allowing a human operator to cooperate with a rescue robot and share such tasks as navigation, exploration, and victim identification. In this paper, we present a unique hierarchical reinforcement learning (HRL)-based semi-autonomous control architecture for rescue robots operating in cluttered and unknown urban search and rescue (USAR) environments. The aim of the controller is to enable a rescue robot to continuously learn from its own experiences in an environment in order to improve its overall performance in exploration of unknown disaster scenes. A direction-based exploration technique is integrated into the controller to expand the search area of the robot via the classification of regions and the rubble piles within these regions. Both simulations and physical experiments in USAR-like environments verify the robustness of the proposed HRL-based semi-autonomous controller to unknown cluttered scenes with different sizes and varying types of configurations.
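
    A toy, tabular stand-in for the direction-selection idea is sketched below. The four-direction action set, the region-class state coding, and the reward signal are illustrative assumptions; the authors' hierarchical controller is considerably more elaborate.

    ```python
    # Tabular Q-learning over exploration directions, keyed by a region class.
    import random

    ACTIONS = ["north", "east", "south", "west"]
    q = {}  # (region_class, action) -> value

    def choose_direction(region_class, epsilon=0.1):
        """Epsilon-greedy choice of the next exploration direction."""
        if random.random() < epsilon:
            return random.choice(ACTIONS)
        return max(ACTIONS, key=lambda a: q.get((region_class, a), 0.0))

    def update(region_class, action, reward, next_region, alpha=0.3, gamma=0.9):
        """Standard one-step Q-learning update."""
        best_next = max(q.get((next_region, a), 0.0) for a in ACTIONS)
        old = q.get((region_class, action), 0.0)
        q[(region_class, action)] = old + alpha * (reward + gamma * best_next - old)

    # One illustrative step: exploring a "cluttered" region uncovered new area
    # (positive reward) and led to an "open" region.
    a = choose_direction("cluttered")
    update("cluttered", a, reward=1.0, next_region="open")
    print(a, q)
    ```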

  13. New Navigation Post-Processing Tools for Oceanographic Submersibles

    NASA Astrophysics Data System (ADS)

    Kinsey, J. C.; Whitcomb, L. L.; Yoerger, D. R.; Howland, J. C.; Ferrini, V. L.; Hegrenas, O.

    2006-12-01

    We report the development of Navproc, a new set of software tools for post-processing oceanographic submersible navigation data that exploits previously reported improvements in navigation sensing and estimation (e.g. Eos Trans. AGU, 84(46), Fall Meet. Suppl., Abstract OS32A-0225, 2003). The development of these tools is motivated by the need to have post-processing software that allows users to compensate for errors in vehicle navigation, recompute the vehicle position, and then save the results for use with quantitative science data (e.g. bathymetric sonar data) obtained during the mission. Navproc does not provide real-time navigation or display of data, nor is it capable of high-resolution, three-dimensional (3D) data display. Navproc supports the ASCII data formats employed by the vehicles of the National Deep Submergence Facility (NDSF) operated by the Woods Hole Oceanographic Institution (WHOI). Post-processing of navigation data with Navproc comprises three tasks. First, data are converted from the logged ASCII file to a binary Matlab file. When loaded into Matlab, each sensor has a data structure containing the time-stamped data sampled at the native update rate of the sensor. An additional structure contains the real-time vehicle navigation data. Second, the data can be displayed using a Graphical User Interface (GUI), allowing users to visually inspect the quality of the data and graphically extract portions of the data. Third, users can compensate for errors in the real-time vehicle navigation. Corrections include: (i) manual filtering and median filtering of long baseline (LBL) ranges; (ii) estimation of the Doppler/gyro alignment using previously reported methodologies; and (iii) sound velocity, tide, and LBL transponder corrections. Using these corrections, the Doppler and LBL positions can be recomputed to provide improved estimates of the vehicle position compared to those computed in real-time. The data can be saved in either binary or ASCII

  14. Differential GNSS and Vision-Based Tracking to Improve Navigation Performance in Cooperative Multi-UAV Systems

    PubMed Central

    Vetrella, Amedeo Rodi; Fasano, Giancarmine; Accardo, Domenico; Moccia, Antonio

    2016-01-01

    Autonomous navigation of micro-UAVs is typically based on the integration of low cost Global Navigation Satellite System (GNSS) receivers and Micro-Electro-Mechanical Systems (MEMS)-based inertial and magnetic sensors to stabilize and control the flight. The resulting navigation performance in terms of position and attitude accuracy may not suffice for other mission needs, such as the ones relevant to fine sensor pointing. In this framework, this paper presents a cooperative UAV navigation algorithm that allows a chief vehicle, equipped with inertial and magnetic sensors, a Global Positioning System (GPS) receiver, and a vision system, to improve its navigation performance (in real time or in the post-processing phase) by exploiting formation-flying deputy vehicles equipped with GPS receivers. The focus is set on outdoor environments and the key concept is to exploit differential GPS among vehicles and vision-based tracking (DGPS/Vision) to build a virtual additional navigation sensor whose information is then integrated in a sensor fusion algorithm based on an Extended Kalman Filter. The developed concept and processing architecture are described, with a focus on the DGPS/Vision attitude determination algorithm. Performance assessment is carried out on the basis of both numerical simulations and flight tests. In the latter, navigation estimates derived from the DGPS/Vision approach are compared with those provided by the onboard autopilot system of a customized quadrotor. The analysis shows the potential of the developed approach, mainly deriving from the possibility of exploiting accurate attitude information that is independent of magnetic and inertial sensors. PMID:27999318

  15. The Design of an Autonomous Underwater Vehicle for Water Quality Monitoring

    NASA Astrophysics Data System (ADS)

    Li, Yulong; Liu, Rong; Liu, Shujin

    2018-01-01

    This paper describes the development of a civilian autonomous underwater vehicle (AUV) for water quality monitoring at reservoirs and watercourses that can obtain real-time visual and locational information. The mechanical design was completed with the CAD software Solidworks. Four thrusters—two horizontal and two vertical—on board enable the vehicle to surge, heave, yaw, and pitch. A specialized water sample collection compartment is designed to perform water collection at target locations. The vehicle has a central controller—STM32—and a sub-coordinate controller—Arduino MEGA 2560—that coordinates multiple sensors including an inertial sensor, ultrasonic sensors, etc. Global Navigation Satellite System (GNSS) and the inertial sensor enable the vehicle’s localization. Remote operators monitor and control the vehicle via a host computer system. Operators choose either a semi-autonomous mode, in which they set target locations, or a manual mode. The experimental results show that the vehicle is able to perform well in either mode.

  16. Nonlinearity analysis of measurement model for vision-based optical navigation system

    NASA Astrophysics Data System (ADS)

    Li, Jianguo; Cui, Hutao; Tian, Yang

    2015-02-01

    In autonomous optical navigation systems based on line-of-sight vector observations, the nonlinearity of the measurement model is highly correlated with navigation performance. By quantitatively calculating the degree of nonlinearity of the focal plane model and the unit vector model, this paper focuses on determining which optical measurement model performs better. First, measurement equations and measurement noise statistics of these two line-of-sight measurement models are established based on the perspective projection co-linearity equation. Then the nonlinear effects of the measurement model on filter performance are analyzed within the framework of the Extended Kalman filter, and the degrees of nonlinearity of the two measurement models are compared using curvature measure theory from differential geometry. Finally, a simulation of star-tracker-based attitude determination is presented to confirm the superiority of the unit vector measurement model. Simulation results show that the magnitude of the curvature nonlinearity measure is consistent with the filter performance, and the unit vector measurement model yields higher estimation precision and faster convergence.
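
    The two measurement models being compared can be written down directly. The focal length, the camera-frame convention (z along the boresight), and the beacon position in the sketch below are illustrative assumptions.

    ```python
    # Focal-plane (perspective projection) vs. unit-vector line-of-sight models.
    import numpy as np

    f = 0.05  # assumed focal length in metres

    def focal_plane_model(p_cam):
        """Perspective projection of a camera-frame line-of-sight vector."""
        X, Y, Z = p_cam
        return np.array([f * X / Z, f * Y / Z])

    def unit_vector_model(p_cam):
        """Unit line-of-sight vector observation."""
        return p_cam / np.linalg.norm(p_cam)

    p = np.array([2.0, -1.0, 50.0])    # assumed beacon position in the camera frame
    print(focal_plane_model(p), unit_vector_model(p))
    ```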

  17. Novel Intersection Type Recognition for Autonomous Vehicles Using a Multi-Layer Laser Scanner

    PubMed Central

    An, Jhonghyun; Choi, Baehoon; Sim, Kwee-Bo; Kim, Euntai

    2016-01-01

    There are several types of intersections such as merge-roads, diverge-roads, plus-shape intersections and two types of T-shape junctions in urban roads. When an autonomous vehicle encounters new intersections, it is crucial to recognize the types of intersections for safe navigation. In this paper, a novel intersection type recognition method is proposed for an autonomous vehicle using a multi-layer laser scanner. The proposed method consists of two steps: (1) static local coordinate occupancy grid map (SLOGM) building and (2) intersection classification. In the first step, the SLOGM is built relative to the local coordinate using the dynamic binary Bayes filter. In the second step, the SLOGM is used as an attribute for the classification. The proposed method is applied to a real-world environment and its validity is demonstrated through experimentation. PMID:27447640

  18. Analysis and Compensation of Modulation Angular Rate Error Based on Missile-Borne Rotation Semi-Strapdown Inertial Navigation System

    PubMed Central

    Zhang, Jiayu; Li, Jie; Zhang, Xi; Che, Xiaorui; Huang, Yugang; Feng, Kaiqiang

    2018-01-01

    The Semi-Strapdown Inertial Navigation System (SSINS) provides a new solution to attitude measurement of a high-speed rotating missile. However, micro-electro-mechanical-systems (MEMS) inertial measurement unit (MIMU) outputs are corrupted by significant sensor errors. In order to improve the navigation precision, a rotation modulation technique called the Rotation Semi-Strapdown Inertial Navigation System (RSSINS) is introduced into the SINS. In fact, the stability of the modulation angular rate is difficult to achieve in a high-speed rotation environment. The changing rotary angular rate has an impact on the inertial sensor error self-compensation. In this paper, the influence of modulation angular rate error, including the acceleration-deceleration process and instability of the angular rate, on the navigation accuracy of the RSSINS is deduced, and the error characteristics of the reciprocating rotation scheme are analyzed. A new compensation method is proposed to remove or reduce sensor errors so as to make it possible to maintain high-precision autonomous navigation performance with the MIMU when there is no external aid. Experiments have been carried out to validate the performance of the method. In addition, the proposed method is applicable for modulation angular rate error compensation under various dynamic conditions. PMID:29734707

  19. Analysis and Compensation of Modulation Angular Rate Error Based on Missile-Borne Rotation Semi-Strapdown Inertial Navigation System.

    PubMed

    Zhang, Jiayu; Li, Jie; Zhang, Xi; Che, Xiaorui; Huang, Yugang; Feng, Kaiqiang

    2018-05-04

    The Semi-Strapdown Inertial Navigation System (SSINS) provides a new solution to attitude measurement of a high-speed rotating missile. However, micro-electro-mechanical-systems (MEMS) inertial measurement unit (MIMU) outputs are corrupted by significant sensor errors. In order to improve the navigation precision, a rotation modulation technique called the Rotation Semi-Strapdown Inertial Navigation System (RSSINS) is introduced into the SINS. In fact, the stability of the modulation angular rate is difficult to achieve in a high-speed rotation environment. The changing rotary angular rate has an impact on the inertial sensor error self-compensation. In this paper, the influence of modulation angular rate error, including the acceleration-deceleration process and instability of the angular rate, on the navigation accuracy of the RSSINS is deduced, and the error characteristics of the reciprocating rotation scheme are analyzed. A new compensation method is proposed to remove or reduce sensor errors so as to make it possible to maintain high-precision autonomous navigation performance with the MIMU when there is no external aid. Experiments have been carried out to validate the performance of the method. In addition, the proposed method is applicable for modulation angular rate error compensation under various dynamic conditions.

  20. Navigation.

    PubMed

    Wiltschko, Roswitha

    2017-07-01

    Experiments with migrating birds displaced during autumn migration outside their normal migration corridor reveal two different navigational strategies: adult migrants compensate for the displacement, and head towards their traditional wintering areas, whereas young first-time migrants continue in their migratory direction. Young birds are guided to their still unknown goal by a genetically coded migration program that indicates duration and direction(s) of the migratory flight by controlling the amount of migratory restlessness and the compass course(s) with respect to the geomagnetic field and celestial rotation. Adult migrants that have already wintered and are familiar with the goal area approach the goal by true navigation, specifically heading towards it and changing their course correspondingly after displacement. During their first journey, young birds experience the distribution of potential navigational factors en route and in their winter home, which allows them to truly navigate on their next migrations. The navigational factors used appear to include magnetic intensity as a component in their multi-modal navigational 'map'; olfactory input is also involved, even if it is not yet entirely clear in what way. The mechanisms of migratory birds for true navigation over long distances appear to be in principle similar to those discussed for homing pigeons.