Relative navigation requirements for automatic rendezvous and capture systems
NASA Technical Reports Server (NTRS)
Kachmar, Peter M.; Polutchko, Robert J.; Chu, William; Montez, Moises
1991-01-01
This paper will discuss in detail the relative navigation system requirements and sensor trade-offs for Automatic Rendezvous and Capture. Rendezvous navigation filter development will be discussed in the context of navigation performance requirements for a 'Phase One' AR&C system capability. Navigation system architectures and the resulting relative navigation performance for both cooperative and uncooperative target vehicles will be assessed. Relative navigation performance using rendezvous radar, star tracker, radiometric, laser and GPS navigation sensors during appropriate phases of the trajectory will be presented. The effect of relative navigation performance on the Integrated AR&C system performance will be addressed. Linear covariance and deterministic simulation results will be used. Evaluation of relative navigation and IGN&C system performance for several representative relative approach profiles will be presented in order to demonstrate the full range of system capabilities. A summary of the sensor requirements and recommendations for AR&C system capabilities for several programs requiring AR&C will be presented.
Automated Rendezvous and Docking Sensor Testing at the Flight Robotics Laboratory
NASA Technical Reports Server (NTRS)
Mitchell, J.; Johnston, A.; Howard, R.; Williamson, M.; Brewster, L.; Strack, D.; Cryan, S.
2007-01-01
The Exploration Systems Architecture defines missions that require rendezvous, proximity operations, and docking (RPOD) of two spacecraft both in Low Earth Orbit (LEO) and in Low Lunar Orbit (LLO). Uncrewed spacecraft must perform automated and/or autonomous rendezvous, proximity operations and docking operations (commonly known as Automated Rendezvous and Docking, AR&D). The crewed versions may also perform AR&D, possibly with a different level of automation and/or autonomy, and must also provide the crew with relative navigation information for manual piloting. The capabilities of the RPOD sensors are critical to the success of the Exploration Program. NASA has the responsibility to determine whether the Crew Exploration Vehicle (CEV) contractor-proposed relative navigation sensor suite will meet the CEV requirements. The relatively low technology readiness of relative navigation sensors for AR&D has been carried as one of the CEV Project's top risks. The AR&D Sensor Technology Project seeks to reduce this risk by increasing technology maturation of selected relative navigation sensor technologies through testing and simulation, and to allow the CEV Project to assess the relative navigation sensors.
Ilyas, Muhammad; Hong, Beomjin; Cho, Kuk; Baeg, Seung-Ho; Park, Sangdeok
2016-01-01
This paper provides algorithms to fuse relative and absolute microelectromechanical systems (MEMS) navigation sensors, suitable for micro planetary rovers, to provide a more accurate estimation of navigation information, specifically, attitude and position. Planetary rovers have extremely slow speed (~1 cm/s) and lack conventional navigation sensors/systems, hence the general methods of terrestrial navigation may not be applicable to these applications. While relative attitude and position can be tracked in a way similar to those for ground robots, absolute navigation information is hard to achieve on a remote celestial body, like Moon or Mars, in contrast to terrestrial applications. In this study, two absolute attitude estimation algorithms were developed and compared for accuracy and robustness. The estimated absolute attitude was fused with the relative attitude sensors in a framework of nonlinear filters. The nonlinear Extended Kalman filter (EKF) and Unscented Kalman filter (UKF) were compared in pursuit of better accuracy and reliability in this nonlinear estimation problem, using only on-board low cost MEMS sensors. Experimental results confirmed the viability of the proposed algorithms and the sensor suite, for low cost and low weight micro planetary rovers. It is demonstrated that integrating the relative and absolute navigation MEMS sensors reduces the navigation errors to the desired level. PMID:27223293
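The core of the fusion scheme described above is a filter that integrates a drifting relative sensor (e.g. a MEMS gyro) in the predict step and corrects it with an occasional absolute attitude fix. A minimal scalar-heading sketch of that idea is below; the dynamics, noise values, and update schedule are illustrative, not taken from the paper.

```python
import math

def fuse_heading(psi, P, gyro_rate, dt, q, z_abs=None, r=None):
    """One predict/update cycle of a scalar heading EKF.

    psi, P    : heading estimate (rad) and its variance
    gyro_rate : relative sensor (rate gyro), integrated in the predict step
    z_abs, r  : optional absolute heading measurement and its variance
    """
    # Predict: integrate the relative (gyro) measurement; variance grows by q.
    psi = psi + gyro_rate * dt
    P = P + q
    if z_abs is not None:
        # Update: blend in the absolute attitude fix (innovation wrapped to +/- pi).
        y = math.atan2(math.sin(z_abs - psi), math.cos(z_abs - psi))
        K = P / (P + r)
        psi = psi + K * y
        P = (1.0 - K) * P
    return psi, P

# Gyro integration alone drifts; a periodic absolute fix bounds the variance.
psi, P = 0.0, 0.1
for k in range(100):
    psi, P = fuse_heading(psi, P, gyro_rate=0.01, dt=1.0, q=1e-4,
                          z_abs=0.01 * (k + 1) if k % 10 == 9 else None, r=1e-3)
```

The same structure extends to full 3-D attitude, where the EKF linearizes the wrapped innovation and the UKF propagates sigma points instead.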
Unmanned Ground Vehicle Navigation and Coverage Hole Patching in Wireless Sensor Networks
ERIC Educational Resources Information Center
Zhang, Guyu
2013-01-01
This dissertation presents a study of an Unmanned Ground Vehicle (UGV) navigation and coverage hole patching in coordinate-free and localization-free Wireless Sensor Networks (WSNs). Navigation and coverage maintenance are related problems since coverage hole patching requires effective navigation in the sensor network environment. A…
NASA Technical Reports Server (NTRS)
Christian, John A.; Patangan, Mogi; Hinkel, Heather; Chevray, Keiko; Brazzel, Jack
2012-01-01
The Orion Multi-Purpose Crew Vehicle is a new spacecraft being designed by NASA and Lockheed Martin for future crewed exploration missions. The Vision Navigation Sensor is a Flash LIDAR that will be the primary relative navigation sensor for this vehicle. To obtain a better understanding of this sensor's performance, the Orion relative navigation team has performed both flight tests and ground tests. This paper summarizes and compares the performance results from the STS-134 flight test, called the Sensor Test for Orion RelNav Risk Mitigation (STORRM) Development Test Objective, and the ground tests at the Space Operations Simulation Center.
NASA Technical Reports Server (NTRS)
1970-01-01
The guidance and navigation requirements for unmanned missions to the outer planets, assuming constant low-thrust ion propulsion, are discussed. The navigational capability of the ground based Deep Space Network is compared to the improvements in navigational capability brought about by the addition of guidance and navigation related onboard sensors. Relevant onboard sensors include: (1) the optical onboard navigation sensor, (2) the attitude reference sensors, and (3) highly sensitive accelerometers. The totally ground based, and the combination ground based and onboard sensor systems are compared by means of the estimated errors in target planet ephemeris, and the spacecraft position with respect to the planet.
Embedded Relative Navigation Sensor Fusion Algorithms for Autonomous Rendezvous and Docking Missions
NASA Technical Reports Server (NTRS)
DeKock, Brandon K.; Betts, Kevin M.; McDuffie, James H.; Dreas, Christine B.
2008-01-01
bd Systems (a subsidiary of SAIC) has developed a suite of embedded relative navigation sensor fusion algorithms to enable NASA autonomous rendezvous and docking (AR&D) missions. Translational and rotational Extended Kalman Filters (EKFs) were developed for integrating measurements based on the vehicles' orbital mechanics and high-fidelity sensor error models, providing a solution with increased accuracy and robustness relative to any single relative navigation sensor. The filters were tested through stand-alone covariance analysis, closed-loop testing with a high-fidelity multi-body orbital simulation, and hardware-in-the-loop (HWIL) testing in the Marshall Space Flight Center (MSFC) Flight Robotics Laboratory (FRL).
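The claim that a fused solution beats any single relative navigation sensor follows from minimum-variance measurement combination, which is what a Kalman update performs. A toy sketch (sensor values and variances are made up):

```python
def fuse_two_sensors(z1, r1, z2, r2):
    """Minimum-variance (inverse-variance weighted) fusion of two independent
    measurements of the same relative state. The fused variance is always
    smaller than either sensor's own variance, which is the mechanism behind
    the 'more accurate than any single sensor' result."""
    w1, w2 = 1.0 / r1, 1.0 / r2
    z = (w1 * z1 + w2 * z2) / (w1 + w2)
    r = 1.0 / (w1 + w2)
    return z, r

# Two sensors measuring the same relative range (illustrative numbers):
z, r = fuse_two_sensors(100.2, 0.04, 99.9, 0.09)
```

In the full filters the same weighting happens through the Kalman gain, with the orbital-mechanics model propagating the state between measurements.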
NASA Astrophysics Data System (ADS)
Jeong, Junho; Kim, Seungkeun; Suk, Jinyoung
2017-12-01
In order to overcome the limited range of GPS-based techniques, vision-based relative navigation methods have recently emerged as alternative approaches for high Earth orbit (HEO) and deep space missions, and various vision-based relative navigation systems are used for proximity operations between two spacecraft. Implementing these systems raises a sensor placement problem on the exterior of the spacecraft, whose surface area is limited. To deal with the sensor placement, this paper proposes a novel methodology for vision-based relative navigation based on multiple position sensitive diode (PSD) sensors and multiple infrared beacon modules. The proposed method uses an iterated parametric study based on farthest point optimization (FPO) and a constrained extended Kalman filter (CEKF): FPO sets the locations of the sensors, and the CEKF estimates relative position and attitude for each combination of PSDs and beacons. Scores for the sensor placement are then calculated with respect to the number of PSDs, the number of beacons, and the accuracy of the relative estimates, and the best-scoring candidate is selected. Moreover, the results of the iterated estimation show that the accuracy improves dramatically as the number of PSDs increases from one to three.
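A common concrete form of the farthest-point idea mentioned above is greedy farthest-point sampling: each new sensor location is the candidate that maximizes its distance to the locations already chosen, spreading sensors over the available surface. A minimal sketch under that assumption (the candidate points are invented, and the paper's actual FPO step may differ):

```python
import math

def farthest_point_placement(candidates, k):
    """Greedy farthest-point selection: starting from the first candidate,
    repeatedly pick the candidate whose minimum distance to the already
    chosen locations is largest."""
    chosen = [candidates[0]]
    while len(chosen) < k:
        best = max(
            (c for c in candidates if c not in chosen),
            key=lambda c: min(math.dist(c, s) for s in chosen),
        )
        chosen.append(best)
    return chosen

# Hypothetical mounting points on a 1 m square spacecraft face:
pts = [(0, 0), (1, 0), (0, 1), (1, 1), (0.5, 0.5), (0.1, 0.1)]
placement = farthest_point_placement(pts, 3)
```

The iterated study then scores each such placement by running the CEKF over it and comparing estimation accuracy.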
NASA Technical Reports Server (NTRS)
Bishop, Robert H.; DeMars, Kyle; Trawny, Nikolas; Crain, Tim; Hanak, Chad; Carson, John M.; Christian, John
2016-01-01
The navigation filter architecture successfully deployed on the Morpheus flight vehicle is presented. The filter was developed as a key element of the NASA Autonomous Landing and Hazard Avoidance Technology (ALHAT) project and over the course of 15 free flights was integrated into the Morpheus vehicle, operations, and flight control loop. Flight testing was completed by demonstrating autonomous hazard detection and avoidance, integration of an altimeter, surface relative velocity (velocimeter) and hazard relative navigation (HRN) measurements into the onboard dual-state inertial estimator Kalman filter software, and landing within 2 meters of the vertical testbed GPS-based navigation solution at the safe landing site target. Morpheus followed a trajectory that included an ascent phase followed by a partial descent-to-landing, although the proposed filter architecture is applicable to more general planetary precision entry, descent, and landings. The main new contribution is the incorporation of a sophisticated hazard relative navigation sensor-originally intended to locate safe landing sites-into the navigation system and employed as a navigation sensor. The formulation of a dual-state inertial extended Kalman filter was designed to address the precision planetary landing problem when viewed as a rendezvous problem with an intended landing site. For the required precision navigation system that is capable of navigating along a descent-to-landing trajectory to a precise landing, the impact of attitude errors on the translational state estimation is included in a fully integrated navigation structure in which translation state estimation is combined with attitude state estimation. The map tie errors are estimated as part of the process, thereby creating a dual-state filter implementation. Also, the filter is implemented using inertial states rather than states relative to the target.
External measurements include altimeter, velocimeter, star camera, terrain relative navigation sensor, and a hazard relative navigation sensor providing information regarding hazards on a map generated on-the-fly.
A Bionic Camera-Based Polarization Navigation Sensor
Wang, Daobin; Liang, Huawei; Zhu, Hui; Zhang, Shuai
2014-01-01
Navigation and positioning technology is closely related to our routine life activities, from travel to aerospace. Recently it has been found that Cataglyphis (a kind of desert ant) is able to detect the polarization direction of skylight and navigate according to this information. This paper presents a real-time bionic camera-based polarization navigation sensor. This sensor has two work modes: one is a single-point measurement mode and the other is a multi-point measurement mode. An indoor calibration experiment of the sensor has been done under a beam of standard polarized light. The experiment results show that after noise reduction the accuracy of the sensor can reach up to 0.3256°. It is also compared with GPS and INS (Inertial Navigation System) in the single-point measurement mode through an outdoor experiment. Through time compensation and location compensation, the sensor can be a useful alternative to GPS and INS. In addition, the sensor also can measure the polarization distribution pattern when it works in multi-point measurement mode. PMID:25051029
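The physical principle behind such a sensor is Malus's law: intensity behind a linear polarizer varies as the square of the cosine of the angle between the polarizer and the light's polarization direction, so intensities at a few known polarizer angles determine that direction. A textbook sketch using the linear Stokes parameters (this is the generic relation, not the paper's calibrated pipeline):

```python
import math

def polarization_angle(i0, i45, i90):
    """Recover the polarization direction (degrees) from intensities measured
    behind linear polarizers at 0, 45 and 90 degrees, via the linear Stokes
    parameters S1 = I0 - I90 and S2 = 2*I45 - (I0 + I90)."""
    s1 = i0 - i90
    s2 = 2.0 * i45 - (i0 + i90)
    return 0.5 * math.degrees(math.atan2(s2, s1))

# Synthetic pixel: unit-intensity skylight polarized at 30 degrees,
# observed through polarizers at 0, 45 and 90 degrees (Malus's law).
phi = 30.0
meas = [0.5 * (1 + math.cos(2 * math.radians(phi - a))) for a in (0, 45, 90)]
```

A camera-based sensor evaluates this per pixel, which is what enables the multi-point measurement mode described in the abstract.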
NASA Technical Reports Server (NTRS)
Mitchell, Jennifer D.; Cryan, Scott P.; Baker, Kenneth; Martin, Toby; Goode, Robert; Key, Kevin W.; Manning, Thomas; Chien, Chiun-Hong
2008-01-01
The Exploration Systems Architecture defines missions that require rendezvous, proximity operations, and docking (RPOD) of two spacecraft both in Low Earth Orbit (LEO) and in Low Lunar Orbit (LLO). Uncrewed spacecraft must perform automated and/or autonomous rendezvous, proximity operations and docking operations (commonly known as Automated Rendezvous and Docking, AR&D). The crewed versions may also perform AR&D, possibly with a different level of automation and/or autonomy, and must also provide the crew with relative navigation information for manual piloting. The capabilities of the RPOD sensors are critical to the success of the Constellation Program; this is carried as one of the CEV Project top risks. The Exploration Technology Development Program (ETDP) AR&D Sensor Technology Project seeks to reduce this risk by increasing technology maturation of selected relative navigation sensor technologies through testing and simulation. One of the project activities is a series of "pathfinder" testing and simulation activities to integrate relative navigation sensors with the Johnson Space Center Six-Degree-of-Freedom Test System (SDTS). The SDTS will be the primary testing location for the Orion spacecraft's Low Impact Docking System (LIDS). Project team members have integrated the Orion simulation with the SDTS computer system so that real-time closed loop testing can be performed with relative navigation sensors and the docking system in the loop during docking and undocking scenarios. Two relative navigation sensors are being used as part of a "pathfinder" activity in order to pave the way for future testing with the actual Orion sensors. This paper describes the test configuration and test results.
Design and testing of a multi-sensor pedestrian location and navigation platform.
Morrison, Aiden; Renaudin, Valérie; Bancroft, Jared B; Lachapelle, Gérard
2012-01-01
Navigation and location technologies are continually advancing, allowing ever higher accuracies and operation under ever more challenging conditions. The development of such technologies requires the rapid evaluation of a large number of sensors and related utilization strategies. The integration of Global Navigation Satellite Systems (GNSSs) such as the Global Positioning System (GPS) with accelerometers, gyros, barometers, magnetometers and other sensors is allowing for novel applications, but is hindered by the difficulty of testing and comparing integrated solutions using multiple sensor sets. In order to achieve compatibility and flexibility in terms of multiple sensors, an advanced adaptable platform is required. This paper describes the design and testing of the NavCube, a multi-sensor navigation, location and timing platform. The system provides a research tool for pedestrian navigation, location and body motion analysis in an unobtrusive form factor that enables in situ data collections with minimal gait and posture impact. Testing and examples of applications of the NavCube are provided.
Multi-Sensor Testing for Automated Rendezvous and Docking Sensor Testing at the Flight Robotics Lab
NASA Technical Reports Server (NTRS)
Brewster, Linda L.; Howard, Richard T.; Johnston, A. S.; Carrington, Connie; Mitchell, Jennifer D.; Cryan, Scott P.
2008-01-01
The Exploration Systems Architecture defines missions that require rendezvous, proximity operations, and docking (RPOD) of two spacecraft both in Low Earth Orbit (LEO) and in Low Lunar Orbit (LLO). Uncrewed spacecraft must perform automated and/or autonomous rendezvous, proximity operations and docking operations (commonly known as AR&D). The crewed missions may also perform rendezvous and docking operations and may require different levels of automation and/or autonomy, and must provide the crew with relative navigation information for manual piloting. The capabilities of the RPOD sensors are critical to the success of the Exploration Program. NASA has the responsibility to determine whether the Crew Exploration Vehicle (CEV) contractor-proposed relative navigation sensor suite will meet the requirements. The relatively low technology readiness level of AR&D relative navigation sensors has been carried as one of the CEV Project's top risks. The AR&D Sensor Technology Project seeks to reduce the risk by the testing and analysis of selected relative navigation sensor technologies through hardware-in-the-loop testing and simulation. These activities will provide the CEV Project information to assess the relative navigation sensors' maturity as well as demonstrate test methods and capabilities. The first year of this project focused on a series of "pathfinder" testing tasks to develop the test plans, test facility requirements, trajectories, math model architecture, simulation platform, and processes that will be used to evaluate the Contractor-proposed sensors. Four candidate sensors were used in the first phase of the testing. The second phase of testing used four sensors simultaneously: two Marshall Space Flight Center (MSFC) Advanced Video Guidance Sensors (AVGS), a laser-based video sensor that uses retroreflectors attached to the target vehicle, and two commercial laser range finders.
The multi-sensor testing was conducted at MSFC's Flight Robotics Laboratory (FRL) using the FRL's 6-DOF gantry system, called the Dynamic Overhead Target System (DOTS). The target vehicle for "docking" in the laboratory was a mockup that was representative of the proposed CEV docking system, with added retroreflectors for the AVGS. The multi-sensor test configuration used 35 open-loop test trajectories covering three major objectives: (1) sensor characterization trajectories designed to test a wide range of performance parameters; (2) CEV-specific trajectories designed to test performance during CEV-like approach and departure profiles; and (3) sensor characterization tests designed for evaluating sensor performance under more extreme conditions as might be induced during a spacecraft failure or during contingency situations. This paper describes the test development, test facility, test preparations, test execution, and test results of the multi-sensor series of trajectories.
Navigation integrity monitoring and obstacle detection for enhanced-vision systems
NASA Astrophysics Data System (ADS)
Korn, Bernd; Doehler, Hans-Ullrich; Hecker, Peter
2001-08-01
Typically, Enhanced Vision (EV) systems consist of two main parts, sensor vision and synthetic vision. Synthetic vision usually generates a virtual out-the-window view using databases and accurate navigation data, e.g. provided by differential GPS (DGPS). The reliability of the synthetic vision highly depends on both the accuracy of the database used and the integrity of the navigation data. But especially in GPS based systems, the integrity of the navigation can't be guaranteed. Furthermore, only objects that are stored in the database can be displayed to the pilot. Consequently, unexpected obstacles are invisible and this might cause severe problems. Therefore, additional information has to be extracted from sensor data to overcome these problems. In particular, the sensor data analysis has to identify obstacles and has to monitor the integrity of databases and navigation. Furthermore, if a lack of integrity arises, navigation data, e.g. the relative position of runway and aircraft, has to be extracted directly from the sensor data. The main contribution of this paper is about the realization of these three sensor data analysis tasks within our EV system, which uses the HiVision 35 GHz MMW radar of EADS, Ulm as the primary EV sensor. For the integrity monitoring, objects extracted from radar images are registered with both database objects and objects (e.g. other aircraft) transmitted via data link. This results in a classification into known and unknown radar image objects and consequently, in a validation of the integrity of database and navigation. Furthermore, special runway structures are searched for in the radar image where they should appear. The outcome of this runway check contributes to the integrity analysis, too. Concurrent to this investigation a radar image based navigation is performed, without using either precision navigation or detailed database information, to determine the aircraft's position relative to the runway.
The performance of our approach is demonstrated with real data acquired during extensive flight tests to several airports in Northern Germany.
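The known/unknown classification described above reduces, in its simplest form, to gated nearest-neighbour matching: a radar-extracted object is "known" if some database object lies within a gating distance, else it is flagged as a potential obstacle. A sketch with invented coordinates (the actual EV system's registration is more involved):

```python
import math

def classify_objects(radar_objs, database_objs, gate):
    """Tag each radar-extracted object as 'known' if a database object lies
    within the gating distance, otherwise 'unknown' (potential obstacle)."""
    labels = []
    for r in radar_objs:
        d = min((math.dist(r, db) for db in database_objs), default=math.inf)
        labels.append("known" if d <= gate else "unknown")
    return labels

# Two charted objects and one uncharted obstacle on the approach (made up):
db = [(100.0, 0.0), (250.0, 40.0)]
radar = [(101.0, 1.0), (250.5, 39.0), (180.0, -20.0)]
labels = classify_objects(radar, db, gate=5.0)
```

A high fraction of "known" matches also validates the database and the navigation solution, since consistent registration requires both to be correct.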
Relative Navigation of Formation Flying Satellites
NASA Technical Reports Server (NTRS)
Long, Anne; Kelbel, David; Lee, Taesul; Leung, Dominic; Carpenter, Russell; Gramling, Cheryl; Bauer, Frank (Technical Monitor)
2002-01-01
The Guidance, Navigation, and Control Center (GNCC) at Goddard Space Flight Center (GSFC) has successfully developed high-accuracy autonomous satellite navigation systems using the National Aeronautics and Space Administration's (NASA's) space and ground communications systems and the Global Positioning System (GPS). In addition, an autonomous navigation system that uses celestial object sensor measurements is currently under development and has been successfully tested using real Sun and Earth horizon measurements. The GNCC has developed advanced spacecraft systems that provide autonomous navigation and control of formation flyers in near-Earth, high-Earth, and libration point orbits. To support this effort, the GNCC is assessing the relative navigation accuracy achievable for proposed formations using GPS, intersatellite crosslink, ground-to-satellite Doppler, and celestial object sensor measurements. This paper evaluates the performance of these relative navigation approaches for three proposed missions with two or more vehicles maintaining relatively tight formations. High-fidelity simulations were performed to quantify the absolute and relative navigation accuracy as a function of navigation algorithm and measurement type. Realistically-simulated measurements were processed using the extended Kalman filter implemented in the GPS Enhanced Onboard Navigation System (GEONS) flight software developed by GSFC GNCC. Solutions obtained by simultaneously estimating all satellites in the formation were compared with the results obtained using a simpler approach based on differencing independently estimated state vectors.
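The difference between the two approaches compared above comes down to cross-covariance: differencing independently estimated state vectors discards the correlation between the two filters' errors, while a joint filter accounts for it. A small sketch of the bookkeeping (the numbers are illustrative, not from the simulations):

```python
import numpy as np

def differenced_relative_state(x1, P1, x2, P2, P12=None):
    """Relative state by differencing two vehicle state estimates.
    When the vehicles are estimated independently, the cross-covariance P12
    is unknown and typically taken as zero, which overstates the relative
    uncertainty whenever common errors (e.g. shared GPS errors) correlate
    the two solutions; a joint filter estimates P12 and recovers it."""
    x_rel = x2 - x1
    P12 = np.zeros_like(P1) if P12 is None else P12
    P_rel = P1 + P2 - P12 - P12.T
    return x_rel, P_rel

x1, P1 = np.array([0.0, 0.0]), np.diag([4.0, 4.0])
x2, P2 = np.array([10.0, 2.0]), np.diag([4.0, 4.0])
# Cross-covariance that a joint (simultaneous) estimate would carry:
P12 = np.diag([3.0, 3.0])
```

With P12 ignored the relative variance is 8 per axis; accounting for it drops the true relative variance to 2, illustrating why joint estimation can report much tighter relative accuracy.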
NASA Technical Reports Server (NTRS)
Brewster, L.; Johnston, A.; Howard, R.; Mitchell, J.; Cryan, S.
2007-01-01
The Exploration Systems Architecture defines missions that require rendezvous, proximity operations, and docking (RPOD) of two spacecraft both in Low Earth Orbit (LEO) and in Low Lunar Orbit (LLO). Uncrewed spacecraft must perform automated and/or autonomous rendezvous, proximity operations and docking operations (commonly known as AR&D). The crewed missions may also perform rendezvous and docking operations and may require different levels of automation and/or autonomy, and must provide the crew with relative navigation information for manual piloting. The capabilities of the RPOD sensors are critical to the success of the Exploration Program. NASA has the responsibility to determine whether the Crew Exploration Vehicle (CEV) contractor-proposed relative navigation sensor suite will meet the requirements. The relatively low technology readiness level of AR&D relative navigation sensors has been carried as one of the CEV Project's top risks. The AR&D Sensor Technology Project seeks to reduce the risk by the testing and analysis of selected relative navigation sensor technologies through hardware-in-the-loop testing and simulation. These activities will provide the CEV Project information to assess the relative navigation sensors' maturity as well as demonstrate test methods and capabilities. The first year of this project focused on a series of "pathfinder" testing tasks to develop the test plans, test facility requirements, trajectories, math model architecture, simulation platform, and processes that will be used to evaluate the Contractor-proposed sensors. Four candidate sensors were used in the first phase of the testing. The second phase of testing used four sensors simultaneously: two Marshall Space Flight Center (MSFC) Advanced Video Guidance Sensors (AVGS), a laser-based video sensor that uses retroreflectors attached to the target vehicle, and two commercial laser range finders.
The multi-sensor testing was conducted at MSFC's Flight Robotics Laboratory (FRL) using the FRL's 6-DOF gantry system, called the Dynamic Overhead Target System (DOTS). The target vehicle for "docking" in the laboratory was a mockup that was representative of the proposed CEV docking system, with added retroreflectors for the AVGS. The multi-sensor test configuration used 35 open-loop test trajectories covering three major objectives: (1) sensor characterization trajectories designed to test a wide range of performance parameters; (2) CEV-specific trajectories designed to test performance during CEV-like approach and departure profiles; and (3) sensor characterization tests designed for evaluating sensor performance under more extreme conditions as might be induced during a spacecraft failure or during contingency situations. This paper describes the test development, test facility, test preparations, test execution, and test results of the multi-sensor series of trajectories.
Precision Landing and Hazard Avoidance Domain
NASA Technical Reports Server (NTRS)
Robertson, Edward A.; Carson, John M., III
2016-01-01
The Precision Landing and Hazard Avoidance (PL&HA) domain addresses the development, integration, testing, and spaceflight infusion of sensing, processing, and GN&C functions critical to the success and safety of future human and robotic exploration missions. PL&HA sensors also have applications to other mission events, such as rendezvous and docking. Autonomous PL&HA builds upon the core GN&C capabilities developed to enable soft, controlled landings on the Moon, Mars, and other solar system bodies. Through the addition of a Terrain Relative Navigation (TRN) function, precision landing within tens of meters of a map-based target is possible. The addition of a 3-D terrain mapping lidar sensor improves the probability of a safe landing via autonomous, real-time Hazard Detection and Avoidance (HDA). PL&HA significantly improves the probability of mission success and enhances access to sites of scientific interest located in challenging terrain. PL&HA can also utilize external navigation aids, such as navigation satellites and surface beacons. Advanced lidar sensors provide high-precision ranging, velocimetry, and 3-D terrain mapping. TRN compares onboard reconnaissance data with real-time terrain imaging data to update the spacecraft position estimate. HDA generates a high-resolution, 3-D terrain map in real time during the approach trajectory to identify safe landing targets. During terminal descent, high-precision surface-relative sensors enable accurate inertial navigation and a tightly controlled touchdown within meters of the selected safe landing target.
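The TRN position update described above, comparing onboard reconnaissance data against real-time terrain imaging, can be sketched in miniature as template matching: slide the sensed terrain patch over the stored map and take the offset that best matches. This toy version uses sum-of-squared-differences on synthetic terrain; real TRN must also handle scale, rotation, and illumination differences.

```python
import numpy as np

def trn_position_fix(recon_map, live_patch):
    """Return the (row, col) offset in the onboard reconnaissance map where
    the sensed terrain patch fits best (smallest sum of squared differences)."""
    H, W = recon_map.shape
    h, w = live_patch.shape
    best, best_rc = np.inf, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            ssd = np.sum((recon_map[r:r+h, c:c+w] - live_patch) ** 2)
            if ssd < best:
                best, best_rc = ssd, (r, c)
    return best_rc

rng = np.random.default_rng(0)
terrain = rng.standard_normal((40, 40))     # stored reconnaissance map
patch = terrain[12:20, 25:33].copy()        # terrain actually imaged in flight
```

The recovered offset corrects the inertial position estimate, which is how TRN bounds drift during approach.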
Navigation Aiding by a Hybrid Laser-Camera Motion Estimator for Micro Aerial Vehicles.
Atman, Jamal; Popp, Manuel; Ruppelt, Jan; Trommer, Gert F
2016-09-16
Micro Air Vehicles (MAVs) equipped with various sensors are able to carry out autonomous flights. However, the self-localization of autonomous agents is mostly dependent on Global Navigation Satellite Systems (GNSS). In order to provide an accurate navigation solution in the absence of GNSS signals, this article presents a hybrid sensor: a deep integration of a monocular camera and a 2D laser rangefinder that estimates the motion of the MAV. This realization is expected to be more flexible across environments than laser-scan-matching approaches. The estimated ego-motion is then integrated into the MAV's navigation system. First, however, the relative pose between the two sensors must be known; an improved calibration method is proposed to obtain it. For both calibration and ego-motion estimation, 3D-to-2D correspondences are used and the Perspective-3-Point (P3P) problem is solved. Moreover, the covariance estimation of the relative motion is presented. The experiments show very accurate calibration and navigation results.
Distributed Ship Navigation Control System Based on Dual Network
NASA Astrophysics Data System (ADS)
Yao, Ying; Lv, Wu
2017-10-01
The navigation system is essential to a ship's normal operation, and it contains many devices and sensors that keep the ship running regularly. In the past, these devices and sensors were usually connected via a CAN bus for high performance and reliability. However, as the related devices and sensors have developed, the navigation system now also needs high information throughput and remote data sharing. To meet these new requirements, we propose a communication method based on a dual network that combines the CAN bus with industrial Ethernet. We also introduce multiple distributed control terminals with a cooperative strategy, synchronizing status by multicasting UDP messages that carry operation timestamps, to make the system more efficient and reliable.
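The status-synchronization idea above (multicast UDP messages carrying operation timestamps, with the newest state winning) can be sketched as follows. The message format and field names are illustrative assumptions, not taken from the paper; the actual multicast socket plumbing is omitted in favor of the serialize/merge logic:

```python
# Sketch of timestamp-based status synchronization between distributed
# control terminals: each terminal multicasts its state stamped with an
# operation timestamp, and receivers keep whichever state is newest.
import json

def encode_status(terminal_id, state, op_timestamp):
    """Pack a status update as a UDP payload (bytes)."""
    return json.dumps({"id": terminal_id, "state": state,
                       "ts": op_timestamp}).encode("utf-8")

def merge_status(current, payload):
    """Apply an incoming update only if its timestamp is newer."""
    msg = json.loads(payload.decode("utf-8"))
    if current is None or msg["ts"] > current["ts"]:
        return msg
    return current

latest = None
latest = merge_status(latest, encode_status("helm", "auto", 100))
latest = merge_status(latest, encode_status("helm", "manual", 95))  # stale
print(latest["state"])  # -> auto
```

The "latest timestamp wins" rule makes out-of-order UDP delivery harmless, which is why the paper's cooperative strategy can tolerate an unreliable transport.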
FPGA-based real-time embedded system for RISS/GPS integrated navigation.
Abdelfatah, Walid Farid; Georgy, Jacques; Iqbal, Umar; Noureldin, Aboelmagd
2012-01-01
Navigation algorithms integrating measurements from multi-sensor systems overcome the problems that arise from using GPS navigation systems in standalone mode. Algorithms that integrate the data from a 2D low-cost reduced inertial sensor system (RISS), consisting of a gyroscope and an odometer or wheel encoders, with a GPS receiver via a Kalman filter have proved worthwhile, providing a more consistent and reliable navigation solution than standalone GPS receivers. They have also been shown to be beneficial, especially in GPS-denied environments such as urban canyons and tunnels. The main objective of this paper is to narrow the idea-to-implementation gap that follows algorithm development by realizing a low-cost real-time embedded navigation system capable of computing the data-fused positioning solution. The role of the developed system is to synchronize the measurements from the three sensors, relative to the pulse-per-second signal generated by the GPS, after which the navigation algorithm is applied to the synchronized measurements to compute the navigation solution in real time. Employing a customizable soft-core processor on an FPGA in the kernel of the navigation system provided the flexibility for communicating with the various sensors and the computation capability required by the Kalman filter integration algorithm.
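The Kalman-filter integration described above can be illustrated with a deliberately reduced scalar sketch: dead-reckon position from odometer-derived speed in the predict step, then correct with GPS fixes in the update step. A real RISS/GPS filter is multi-state (position, azimuth, sensor biases); this toy keeps only one position state:

```python
# Minimal scalar Kalman filter illustrating loosely coupled RISS/GPS:
# dead-reckon position from odometer speed, correct with GPS fixes.

def predict(x, p, v, dt, q):
    """Propagate position with odometer-derived velocity v; q is
    process-noise variance added per step."""
    return x + v * dt, p + q

def update(x, p, z, r):
    """Blend in a GPS position fix z with measurement variance r."""
    k = p / (p + r)                       # Kalman gain
    return x + k * (z - x), (1 - k) * p

x, p = 0.0, 1.0                  # initial position estimate and variance
for gps in (1.1, 2.0, 2.9):      # noisy GPS fixes while driving at ~1 m/s
    x, p = predict(x, p, v=1.0, dt=1.0, q=0.1)
    x, p = update(x, p, gps, r=0.5)
print(round(x, 1))               # -> 3.0 (close to the true position)
```

In a GPS-denied stretch (an urban canyon or tunnel), the update step is simply skipped and the filter coasts on the predict step alone, which is why the RISS quality dominates accuracy there.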
A Bionic Polarization Navigation Sensor and Its Calibration Method.
Zhao, Huijie; Xu, Wujian
2016-08-03
The polarization patterns of skylight which arise due to the scattering of sunlight in the atmosphere can be used by many insects for deriving compass information. Inspired by insects' polarized light compass, scientists have developed a new kind of navigation method. One of the key techniques in this method is the polarimetric sensor which is used to acquire direction information from skylight. In this paper, a polarization navigation sensor is proposed which imitates the working principles of the polarization vision systems of insects. We introduce the optical design and mathematical model of the sensor. In addition, a calibration method based on variable substitution and non-linear curve fitting is proposed. The results obtained from the outdoor experiments provide support for the feasibility and precision of the sensor. The sensor's signal processing can be well described using our mathematical model. A relatively high degree of accuracy in polarization measurement can be obtained without any error compensation.
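The direction-finding principle behind such a polarimetric sensor can be sketched with a common three-channel analyzer model: measure intensity behind polarizers at 0, 60, and 120 degrees and invert a Malus-type response for the angle of polarization. The exact channel layout and calibration model of the paper's sensor may differ; this is the textbook geometry only:

```python
# Recover the angle of polarization (AoP) of skylight from three
# analyzer channels at 0/60/120 degrees, using a Malus-type model
# I(theta) = I0/2 * (1 + d*cos(2*(phi - theta))).
import math

def malus(i0, d, phi, theta):
    """Intensity behind an analyzer at angle theta."""
    return 0.5 * i0 * (1.0 + d * math.cos(2.0 * (phi - theta)))

def solve_aop(i1, i2, i3):
    """AoP from 0/60/120-degree channel intensities."""
    c = (i1 + i2 + i3) / 3.0         # unpolarized offset
    a = i1 - c                       # proportional to cos(2*phi)
    b = (i2 - i3) / math.sqrt(3.0)   # proportional to sin(2*phi)
    return 0.5 * math.atan2(b, a)

phi_true = math.radians(25.0)
angles = [math.radians(t) for t in (0.0, 60.0, 120.0)]
meas = [malus(1.0, 0.8, phi_true, t) for t in angles]
print(round(math.degrees(solve_aop(*meas)), 3))  # -> 25.0
```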
Open-Loop Flight Testing of COBALT Navigation and Sensor Technologies for Precise Soft Landing
NASA Technical Reports Server (NTRS)
Carson, John M., III; Restrepo, Caroline I.; Seubert, Carl R.; Amzajerdian, Farzin; Pierrottet, Diego F.; Collins, Steven M.; O'Neal, Travis V.; Stelling, Richard
2017-01-01
An open-loop flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) payload was conducted onboard the Masten Xodiac suborbital rocket testbed. The payload integrates two complementary sensor technologies that together provide a spacecraft with the knowledge during planetary descent and landing to navigate precisely and touch down softly in close proximity to targeted surface locations. The two technologies are the Navigation Doppler Lidar (NDL), for high-precision velocity and range measurements, and the Lander Vision System (LVS), for map-relative state estimates. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a very precise Terrain Relative Navigation (TRN) solution suitable for future autonomous planetary landing systems that require precise and soft landing capabilities. During the open-loop flight campaign, the COBALT payload acquired measurements and generated a precise navigation solution, but the Xodiac vehicle planned and executed its maneuvers based on an independent, GPS-based navigation solution. This minimized the risk to the vehicle during the integration and testing of the new navigation sensing technologies within the COBALT payload.
FLASH LIDAR Based Relative Navigation
NASA Technical Reports Server (NTRS)
Brazzel, Jack; Clark, Fred; Milenkovic, Zoran
2014-01-01
Relative navigation remains the most challenging part of spacecraft rendezvous and docking. In recent years, flash LIDARs have increasingly been selected as the go-to sensors for proximity operations and docking. Flash LIDARs are generally lighter and require less power than scanning LIDARs. Flash LIDARs have no moving parts, and they are capable of tracking multiple targets as well as generating a 3D map of a given target. However, there are some significant drawbacks of flash LIDARs that must be resolved if their use is to be of long-term significance. Overcoming the challenges of flash LIDARs for navigation, namely low technology readiness level, lack of historical performance data, target identification, existence of false positives, and performance of vision processing algorithms as intermediaries between the raw sensor data and the Kalman filter, requires a world-class testing facility, such as the Lockheed Martin Space Operations Simulation Center (SOSC). Ground-based testing is a critical step for maturing next-generation flash LIDAR-based spacecraft relative navigation. This paper focuses on the tests of an integrated relative navigation system conducted at the SOSC in January 2014. The intent of the tests was to characterize and then improve the performance of relative navigation, while addressing many of the flash LIDAR challenges mentioned above. A section on navigation performance and future recommendations completes the discussion.
Cloud Absorption Radiometer Autonomous Navigation System - CANS
NASA Technical Reports Server (NTRS)
Kahle, Duncan; Gatebe, Charles; McCune, Bill; Hellwig, Dustan
2013-01-01
CAR (cloud absorption radiometer) acquires spatial reference data from host aircraft navigation systems. This poses various problems during CAR data reduction, including navigation data format, accuracy of position data, accuracy of airframe inertial data, and navigation data rate. By incorporating its own navigation system, which includes GPS (Global Positioning System), roll-axis inertia and rates, and three-axis acceleration, CANS expedites data reduction and increases the accuracy of the CAR end data product. CANS provides a self-contained navigation system for the CAR, using inertial reference and GPS positional information. The intent of the software application was to correct the sensor with respect to aircraft roll in real time based upon inputs from a precision navigation sensor. In addition, the navigation information (including GPS position), attitude data, and sensor position details are all streamed to a remote system for recording and later analysis. CANS comprises a commercially available inertial navigation system with integral GPS capability (Attitude Heading Reference System, AHRS) integrated into the CAR support structure and data system. The unit is attached to the bottom of the tripod support structure. The related GPS antenna is located on the P-3 radome immediately above the CAR. The AHRS unit provides an RS-232 data stream containing global position and inertial attitude and velocity data to the CAR, which is recorded concurrently with the CAR data. This independence from aircraft navigation input provides position and inertial state data that account for very small changes in aircraft attitude and position, sensed at the CAR location as opposed to aircraft state sensors typically installed close to the aircraft center of gravity. More accurate positional data enables quicker CAR data reduction with better resolution. The CANS software operates in two modes: initialization/calibration and operational.
In the initialization/calibration mode, the software aligns the precision navigation sensors and initializes the communications interfaces with the sensor and the remote computing system. It also monitors the navigation data state for quality and ensures that the system maintains the required fidelity for attitude and positional information. In the operational mode, the software runs at 12.5 Hz and gathers the required navigation/attitude data, computes the required sensor correction values, and then commands the sensor to the required roll correction. In this manner, the sensor will stay very near to vertical at all times, greatly improving the resulting collected data and imagery. CANS greatly improves quality of resulting imagery and data collected. In addition, the software component of the system outputs a concisely formatted, high-speed data stream that can be used for further science data processing. This precision, time-stamped data also can benefit other instruments on the same aircraft platform by providing extra information from the mission flight.
NASA Technical Reports Server (NTRS)
Galante, Joseph M.; Eepoel, John Van; Strube, Matt; Gill, Nat; Gonzalez, Marcelo; Hyslop, Andrew; Patrick, Bryan
2012-01-01
Argon is a flight-ready sensor suite with two visual cameras, a flash LIDAR, an onboard flight computer, and associated electronics. Argon was designed to provide sensing capabilities for relative navigation during proximity, rendezvous, and docking operations between spacecraft. A rigorous ground test campaign assessed the performance capability of the Argon navigation suite to measure the relative pose of high-fidelity satellite mock-ups during a variety of simulated rendezvous and proximity maneuvers facilitated by robot manipulators under a variety of lighting conditions representative of the orbital environment. A brief description of the Argon suite and test setup is given, as well as an analysis of the performance of the system in simulated proximity and rendezvous operations.
VLC-based indoor location awareness using LED light and image sensors
NASA Astrophysics Data System (ADS)
Lee, Seok-Ju; Yoo, Jong-Ho; Jung, Sung-Yoon
2012-11-01
Indoor LED lighting has recently been considered both for constructing green infrastructure with energy savings and for providing LED-IT convergence services such as visible light communication (VLC) based location awareness and navigation. In a large, complex shopping mall, for example, location awareness for navigating to a destination is an important issue, yet conventional GPS navigation does not work indoors. Alternative location services based on WLAN suffer from low position accuracy; in particular, it is difficult to estimate height exactly, and if the height error exceeds the floor-to-floor height, it can cause serious problems. Conventional navigation is therefore inappropriate indoors. A possible solution for indoor navigation is a VLC-based location awareness scheme: because indoor LED infrastructure will be installed anyway to provide lighting, LED lighting combined with VLC technology can offer relatively high position-estimation accuracy. In this paper, we provide a new VLC-based positioning system using visible LED lights and image sensors. Our system uses the location of the image sensor lens and the location of the reception plane. By using two or more image sensors, we can determine the transmitter position with less than 1 m of position error. Through simulation, we verify the validity of the proposed VLC-based positioning system using visible LED light and image sensors.
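The geometric core of positioning from two image sensors can be sketched as a bearing-intersection problem: each sensor yields a direction to a known LED, and the receiver position is where the two bearing lines meet. This planar sketch is a simplification of the paper's lens/reception-plane geometry, which works in 3D:

```python
# Locate a receiver from bearings (radians, receiver-to-LED) toward two
# LEDs at known 2D positions, by intersecting the two bearing lines.
# Each bearing gives a line x*sin(b) - y*cos(b) = xl*sin(b) - yl*cos(b).
import math

def locate(led1, b1, led2, b2):
    a11, a12 = math.sin(b1), -math.cos(b1)
    a21, a22 = math.sin(b2), -math.cos(b2)
    r1 = led1[0] * a11 + led1[1] * a12
    r2 = led2[0] * a21 + led2[1] * a22
    det = a11 * a22 - a12 * a21          # nonzero if bearings differ
    return ((r1 * a22 - r2 * a12) / det,
            (a11 * r2 - a21 * r1) / det)

# Receiver truly at (2, 1); LEDs at (0, 4) and (5, 5).
rx, ry = locate((0.0, 4.0), math.atan2(3.0, -2.0),
                (5.0, 5.0), math.atan2(4.0, 3.0))
print(round(rx, 6), round(ry, 6))  # -> 2.0 1.0
```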
Siegelaar, Sarah E; Barwari, Temo; Hermanides, Jeroen; van der Voort, Peter H J; Hoekstra, Joost B L; DeVries, J Hans
2013-11-01
Continuous glucose monitoring could be helpful for glucose regulation in critically ill patients; however, its accuracy is uncertain and might be influenced by microcirculation. We investigated the microcirculation and its relation to the accuracy of 2 continuous glucose monitoring devices in patients after cardiac surgery. The present prospective, observational study included 60 patients admitted for cardiac surgery. Two continuous glucose monitoring devices (Guardian Real-Time and FreeStyle Navigator) were placed before surgery. The relative absolute deviation between continuous glucose monitoring and the arterial reference glucose was calculated to assess the accuracy. Microcirculation was measured using the microvascular flow index, perfused vessel density, and proportion of perfused vessels using sublingual sidestream dark-field imaging, and tissue oxygenation using near-infrared spectroscopy. The associations were assessed using a linear mixed-effects model for repeated measures. The median relative absolute deviation of the Navigator was 11% (interquartile range, 8%-16%) and of the Guardian was 14% (interquartile range, 11%-18%; P = .05). Tissue oxygenation significantly increased during the intensive care unit admission (maximum 91.2% [3.9] after 6 hours) and decreased thereafter, stabilizing after 20 hours. A decrease in perfused vessel density accompanied the increase in tissue oxygenation. Microcirculatory variables were not associated with sensor accuracy. A lower peripheral temperature (Navigator, b = -0.008, P = .003; Guardian, b = -0.006, P = .048), and for the Navigator, also a higher Acute Physiology and Chronic Health Evaluation IV predicted mortality (b = 0.017, P < .001) and age (b = 0.002, P = .037) were associated with decreased sensor accuracy. The results of the present study have shown acceptable accuracy for both sensors in patients after cardiac surgery. 
The microcirculation was impaired to a limited extent compared with that in patients with sepsis and healthy controls. This impairment was not related to sensor accuracy; however, peripheral temperature (for both sensors) and, for the Navigator, patient age and Acute Physiology and Chronic Health Evaluation IV predicted mortality were. Copyright © 2013 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
Context-Aware Personal Navigation Using Embedded Sensor Fusion in Smartphones
Saeedi, Sara; Moussa, Adel; El-Sheimy, Naser
2014-01-01
Context-awareness is an interesting topic in mobile navigation scenarios where the context of the application is highly dynamic. Using context-aware computing, navigation services consider the user's situation not only in the design process but in real time while the device is in use. The basic idea is that mobile navigation services can provide different services based on different contexts, where contexts are related to the user's activity and the device placement. Context-aware systems face the following challenges, which are addressed in this paper: context acquisition, context understanding, and context-aware application adaptation. The approach proposed in this paper uses low-cost sensors in a multi-level fusion scheme to improve the accuracy and robustness of the context-aware navigation system. The experimental results demonstrate the capabilities of the context-aware Personal Navigation System (PNS) for outdoor personal navigation using a smartphone. PMID:24670715
Integrated polarization-dependent sensor for autonomous navigation
NASA Astrophysics Data System (ADS)
Liu, Ze; Zhang, Ran; Wang, Zhiwen; Guan, Le; Li, Bin; Chu, Jinkui
2015-01-01
Based on the navigation strategy of insects utilizing polarized skylight, an integrated polarization-dependent sensor for autonomous navigation is presented. The navigation sensor features a compact structure, high precision, strong robustness, and a simple manufacturing technique. The sensor is built by integrating a complementary metal-oxide-semiconductor sensor with a multiorientation nanowire-grid polarizer. By nanoimprint lithography, the multiorientation nanowire polarizer is fabricated in one step and the alignment error is eliminated. Statistical theory is added to the interval-division algorithm to calculate the polarization angle of the incident light. Laboratory and outdoor tests of the navigation sensor were implemented, and the errors of the measured angle are ±0.02 deg and ±1.3 deg, respectively. The results show that the proposed sensor has potential for application in autonomous navigation.
An Effective Terrain Aided Navigation for Low-Cost Autonomous Underwater Vehicles.
Zhou, Ling; Cheng, Xianghong; Zhu, Yixian; Dai, Chenxi; Fu, Jinbo
2017-03-25
Terrain-aided navigation is a potentially powerful solution for obtaining submerged position fixes for autonomous underwater vehicles. The application of terrain-aided navigation with high-accuracy inertial navigation systems has demonstrated meter-level navigation accuracy in sea trials. However, available sensors may be limited depending on the type of mission. Such limitations, especially with low-grade navigation sensors, not only degrade the accuracy of traditional navigation systems but also impair the ability to successfully employ terrain-aided navigation. To address this problem, a tightly coupled navigation approach is presented that estimates the critical sensor errors by incorporating raw sensor data directly into an augmented navigation system. Furthermore, three-dimensional distance errors are calculated, providing measurement updates through a particle filter for absolute and bounded position error. The development of the terrain-aided navigation system is elaborated for a vehicle equipped with a non-inertial-grade strapdown inertial navigation system, a 4-beam Doppler Velocity Log range sensor, and a sonar altimeter. Using experimental data for navigation performance evaluation in areas with different terrain characteristics, the results further show that the proposed method can be successfully applied to low-cost AUVs and significantly improves navigation performance.
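The particle-filter measurement update at the heart of terrain-aided navigation can be sketched in one dimension: each particle is a candidate along-track position, weighted by how well the stored bathymetric map at that position agrees with the altimeter measurement. The terrain profile and noise model below are synthetic stand-ins, not the paper's data:

```python
# 1D particle-filter measurement update for terrain-aided navigation:
# weight position hypotheses by map-vs-altimeter depth agreement.
import math, random

def terrain(x):
    """Synthetic seabed depth profile (stand-in for the stored map)."""
    return 50.0 + 10.0 * math.sin(0.1 * x)

def tan_update(particles, measured_depth, sigma):
    """Gaussian-likelihood weights; return the weighted-mean estimate."""
    weights = [math.exp(-0.5 * ((terrain(p) - measured_depth) / sigma) ** 2)
               for p in particles]
    total = sum(weights)
    return sum(p * w for p, w in zip(particles, weights)) / total

random.seed(1)
true_x = 30.0
particles = [true_x + random.uniform(-20.0, 20.0) for _ in range(2000)]
est = tan_update(particles, terrain(true_x), sigma=0.5)
print(abs(est - true_x) < 2.0)  # -> True
```

Note the dependence on terrain character: over flat seabed all particles weigh the same and the update is uninformative, which matches the paper's emphasis on evaluating areas with different terrain characteristics.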
An Imaging Sensor-Aided Vision Navigation Approach that Uses a Geo-Referenced Image Database.
Li, Yan; Hu, Qingwu; Wu, Meng; Gao, Yang
2016-01-28
Vision navigation, which determines position and attitude by real-time processing of imaging sensor data, offers advantages when a high-performance global positioning system (GPS) and inertial measurement unit (IMU) are unavailable. Vision navigation is widely used in indoor navigation, deep space navigation, and multi-sensor-integrated mobile mapping. This paper proposes a novel vision navigation approach, aided by imaging sensors and a high-accuracy geo-referenced image database (GRID), for high-precision navigation of multi-sensor platforms in environments with poor GPS coverage. First, the framework of GRID-aided vision navigation is developed with sequence images from land-based mobile mapping systems that integrate multiple sensors. Second, a highly efficient GRID storage management model is established, based on the linear index of a road segment, for fast image search and retrieval. Third, a robust image matching algorithm is presented to search and match a real-time image against the GRID. The image matched with the real-time scene is then used to calculate the 3D navigation parameters of the multi-sensor platform. Experimental results show that the proposed approach retrieves images efficiently and achieves navigation accuracies of 1.2 m in plan and 1.8 m in height during GPS outages of up to 5 min and 1500 m.
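The linear road-segment index mentioned above can be illustrated with a toy structure: reference images keyed by (segment, chainage along the segment), kept sorted so a nearest-image lookup is a binary search. The class and field names are illustrative assumptions; the paper's storage model is considerably richer:

```python
# Toy linear index for geo-referenced images: per road segment, keep a
# sorted list of (chainage_m, image_id) and binary-search for the image
# nearest a query chainage.
import bisect

class GridIndex:
    def __init__(self):
        self._segments = {}

    def add(self, segment, chainage, image_id):
        entries = self._segments.setdefault(segment, [])
        bisect.insort(entries, (chainage, image_id))

    def nearest(self, segment, chainage):
        entries = self._segments[segment]
        i = bisect.bisect_left(entries, (chainage,))
        candidates = entries[max(0, i - 1):i + 1]  # neighbors of the cut
        return min(candidates, key=lambda e: abs(e[0] - chainage))[1]

idx = GridIndex()
for c, img in [(0.0, "img_000"), (25.0, "img_025"), (50.0, "img_050")]:
    idx.add("road_7", c, img)
print(idx.nearest("road_7", 29.0))  # -> img_025
```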
Relative Navigation for Formation Flying of Spacecraft
NASA Technical Reports Server (NTRS)
Alonso, Roberto; Du, Ju-Young; Hughes, Declan; Junkins, John L.; Crassidis, John L.
2001-01-01
This paper presents a robust and efficient approach for relative navigation and attitude estimation of spacecraft flying in formation. The approach uses measurements from a new optical sensor that provides a line-of-sight vector from the master spacecraft to the secondary satellite. The overall system provides a novel, reliable, and autonomous relative navigation and attitude determination system that employs relatively simple electronic circuits with modest digital signal processing requirements and is fully independent of any external systems. Experimental calibration results are presented, which are used to achieve accurate line-of-sight measurements. State estimation for formation flying is achieved through an optimal observer design. Also, because the rotational and translational motions are coupled through the observation vectors, three approaches are suggested for separating the two signals for stability analysis. Simulation and experimental results indicate that the combined sensor/estimator approach provides accurate relative position and attitude estimates.
Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) Project Status as of May 2010
NASA Technical Reports Server (NTRS)
Striepe, Scott A.; Epp, Chirold D.; Robertson, Edward A.
2010-01-01
This paper presents the current status of NASA's Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) Project. The ALHAT team has completed several flight tests and two major design analysis cycles. These tests and analyses examine terrain relative navigation sensors, hazard detection and avoidance sensors and algorithms, hazard relative navigation algorithms, and the guidance and navigation system using these ALHAT functions. The next flight test is scheduled for July 2010. The paper contains results from completed flight tests and analysis cycles. ALHAT system status and upcoming tests and analyses are also addressed, along with the current ALHAT plans as of May 2010. Application of the ALHAT system to landing on bodies other than the Moon is included.
NASA Astrophysics Data System (ADS)
Tramutola, A.; Paltro, D.; Cabalo Perucha, M. P.; Paar, G.; Steiner, J.; Barrio, A. M.
2015-09-01
Vision Based Navigation (VBNAV) has been identified as a valid technology to support space exploration because it can improve autonomy and safety of space missions. Several mission scenarios can benefit from the VBNAV: Rendezvous & Docking, Fly-Bys, Interplanetary cruise, Entry Descent and Landing (EDL) and Planetary Surface exploration. For some of them VBNAV can improve the accuracy in state estimation as additional relative navigation sensor or as absolute navigation sensor. For some others, like surface mobility and terrain exploration for path identification and planning, VBNAV is mandatory. This paper presents the general avionic architecture of a Vision Based System as defined in the frame of the ESA R&T study “Multi-purpose Vision-based Navigation System Engineering Model - part 1 (VisNav-EM-1)” with special focus on the surface mobility application.
Open-Loop Performance of COBALT Precision Landing Payload on a Commercial Sub-Orbital Rocket
NASA Technical Reports Server (NTRS)
Restrepo, Carolina I.; Carson, John M., III; Amzajerdian, Farzin; Seubert, Carl R.; Lovelace, Ronney S.; McCarthy, Megan M.; Tse, Teming; Stelling, Richard; Collins, Steven M.
2018-01-01
An open-loop flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) platform was conducted onboard the Masten Xodiac suborbital rocket testbed. The COBALT platform integrates NASA Guidance, Navigation and Control (GN&C) sensing technologies for autonomous, precise soft landing, including the Navigation Doppler Lidar (NDL) velocity and range sensor and the Lander Vision System (LVS) Terrain Relative Navigation (TRN) system. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a navigation solution that is independent of GPS and suitable for future, autonomous, planetary, landing systems. COBALT was a passive payload during the open loop tests. COBALT's sensors were actively taking data and processing it in real time, but the Xodiac rocket flew with its own GPS-navigation system as a risk reduction activity in the maturation of the technologies towards space flight. A future closed-loop test campaign is planned where the COBALT navigation solution will be used to fly its host vehicle.
Baohua, Li; Wenjie, Lai; Yun, Chen; Zongming, Liu
2013-01-01
An autonomous navigation algorithm is presented for a sensor that integrates a star sensor (FOV1) and an ultraviolet earth sensor (FOV2). Star images are sampled by FOV1, and ultraviolet earth images are sampled by FOV2. The star identification and star tracking algorithms are executed at FOV1, and the optical axis direction of FOV1 in the J2000.0 coordinate system is calculated. The center vector of the earth in the FOV2 coordinate system is calculated from the coordinates of the ultraviolet earth image. The autonomous navigation data of the satellite are then computed by the integrated sensor from the optical axis direction of FOV1 and the earth center vector from FOV2. The position accuracy of autonomous navigation for the satellite is improved from 1000 meters to 300 meters, and the velocity accuracy is improved from 100 m/s to 20 m/s. At the same time, the periodic sine errors of the autonomous navigation solution are eliminated. Autonomous navigation for a satellite with a sensor that integrates an ultraviolet earth sensor and a star sensor is thus robust.
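As a rough illustration of the geometry the abstract describes, the sketch below combines a star-tracker attitude with the earth-center direction from the ultraviolet sensor to produce a position fix. It additionally assumes the apparent angular radius of the earth disc is available to fix the range; the function and variable names are invented, not taken from the paper, and a spherical Earth is assumed.

```python
import numpy as np

R_EARTH = 6371.0  # mean Earth radius, km (spherical-Earth assumption)

def position_fix(att_body_to_eci, u_earth_body, ang_radius_rad):
    """Single-shot position fix from an integrated star/earth sensor.

    att_body_to_eci : 3x3 rotation matrix from the star tracker (FOV1)
    u_earth_body    : unit vector toward Earth's center in body frame (FOV2)
    ang_radius_rad  : apparent angular radius of the Earth disc (FOV2)
    """
    rng = R_EARTH / np.sin(ang_radius_rad)   # slant range to Earth's center
    u_eci = att_body_to_eci @ u_earth_body   # Earth direction in the inertial frame
    return -rng * u_eci                      # satellite position in ECI

# Example: identity attitude, Earth seen along -z from a 7000 km orbit radius
rho = np.arcsin(R_EARTH / 7000.0)
r_est = position_fix(np.eye(3), np.array([0.0, 0.0, -1.0]), rho)
```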
Alsubaie, Naif M; Youssef, Ahmed A; El-Sheimy, Naser
2017-09-30
This paper introduces a new method which facilitates the use of smartphones as a handheld low-cost mobile mapping system (MMS). Smartphones are becoming more sophisticated and smarter, and are quickly closing the gap between computers and portable tablet devices. The current generation of smartphones is equipped with low-cost GPS receivers, high-resolution digital cameras, and micro-electro-mechanical systems (MEMS)-based navigation sensors (e.g., accelerometers, gyroscopes, magnetic compasses, and barometers). These sensors are in fact the essential components of a MMS. However, smartphone navigation sensors suffer from the poor accuracy of Global Navigation Satellite System (GNSS) positioning, accumulated drift, and high signal noise. These issues affect the accuracy of the initial Exterior Orientation Parameters (EOPs) that are input into the bundle adjustment algorithm, which then produces inaccurate 3D mapping solutions. This paper proposes new methodologies for increasing the accuracy of direct geo-referencing of smartphones using relative orientation and smartphone motion sensor measurements, as well as integrating geometric scene constraints into free-network bundle adjustment. The new methodologies fuse the relative orientations of the captured images and their corresponding motion sensor measurements to improve the initial EOPs. Then, the geometric features (e.g., horizontal and vertical linear lines) visible in each image are extracted and used as constraints in the bundle adjustment procedure, which correct the relative position and orientation of the 3D mapping solution.
Luo, Xiongbiao; Jayarathne, Uditha L; McLeod, A Jonathan; Mori, Kensaku
2014-01-01
Endoscopic navigation generally integrates different modalities of sensory information in order to continuously locate an endoscope relative to suspicious tissues in the body during interventions. Current electromagnetic tracking techniques for endoscopic navigation have limited accuracy due to tissue deformation and magnetic field distortion. To avoid these limitations and improve the endoscopic localization accuracy, this paper proposes a new endoscopic navigation framework that uses an optical mouse sensor to measure the endoscope movements along its viewing direction. We then enhance the differential evolution algorithm by modifying its mutation operation. Based on the enhanced differential evolution method, these movement measurements and image structural patches in endoscopic videos are fused to accurately determine the endoscope position. An evaluation on a dynamic phantom demonstrated that our method provides a more accurate navigation framework. Compared to state-of-the-art methods, it improved the navigation accuracy from 2.4 to 1.6 mm and reduced the processing time from 2.8 to 0.9 seconds.
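The paper's enhanced differential evolution is not specified in the abstract, but the standard DE/rand/1/bin optimizer whose mutation operation it modifies can be sketched as follows. This is illustrative only; the cost function and parameters are placeholders, not the endoscope-localization objective.

```python
import random

def differential_evolution(cost, bounds, pop_size=20, F=0.8, CR=0.9, iters=200, seed=1):
    """Minimal DE/rand/1/bin sketch (the paper's modified mutation is not reproduced)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(iters):
        for i in range(pop_size):
            # mutation: v = a + F * (b - c) from three distinct other members
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = list(pop[i])
            jrand = rng.randrange(dim)          # guarantee at least one crossed gene
            for j in range(dim):
                if rng.random() < CR or j == jrand:
                    lo, hi = bounds[j]
                    trial[j] = min(max(a[j] + F * (b[j] - c[j]), lo), hi)
            if cost(trial) <= cost(pop[i]):     # greedy selection
                pop[i] = trial
    return min(pop, key=cost)

# Toy cost: sphere function, minimum at the origin
best = differential_evolution(lambda x: sum(v * v for v in x), [(-5, 5), (-5, 5)])
```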
Xian, Zhiwen; Hu, Xiaoping; Lian, Junxiang; Zhang, Lilian; Cao, Juliang; Wang, Yujie; Ma, Tao
2014-09-15
Navigation plays a vital role in our daily life. As traditional and commonly used navigation technologies, the Inertial Navigation System (INS) and Global Navigation Satellite System (GNSS) can provide accurate location information, but they suffer from the accumulated error of inertial sensors and cannot be used in satellite-denied environments, respectively. The remarkable navigation ability of animals shows that the polarization pattern of the sky can be used for navigation. A bio-inspired POLarization Navigation Sensor (POLNS) is constructed to detect the polarization of skylight. Contrary to the previous approach, we utilize all the outputs of the POLNS to compute the input polarization angle based on least squares, which provides optimal angle estimation. In addition, a new sensor calibration algorithm is presented in which the installation angle errors and sensor biases are taken into consideration. The derivation and implementation of our calibration algorithm are discussed in detail. To evaluate the performance of our algorithms, simulation and real-data tests are performed to compare them with several existing algorithms. The comparison results indicate that our algorithms are superior to the others and are more feasible and effective in practice.
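The all-channel least-squares angle estimate the abstract describes can be sketched as follows, under a common Malus-type photodiode model (an assumption on our part; the POLNS's actual measurement model and the calibration terms for installation errors and biases are not reproduced):

```python
import numpy as np

def estimate_polarization_angle(analyzer_angles, intensities):
    """Least-squares polarization-angle estimate using all channels at once.
    Assumed model: I_i = a + b*cos(2*(phi - alpha_i)), which expands to
    I_i = a + x*cos(2*alpha_i) + y*sin(2*alpha_i) with x = b*cos(2*phi),
    y = b*sin(2*phi) -- linear in the unknowns (a, x, y)."""
    al = np.asarray(analyzer_angles, dtype=float)
    A = np.column_stack([np.ones_like(al), np.cos(2 * al), np.sin(2 * al)])
    a, x, y = np.linalg.lstsq(A, np.asarray(intensities, dtype=float), rcond=None)[0]
    return 0.5 * np.arctan2(y, x)   # recover phi from (x, y)

# Six analyzer orientations; noiseless synthetic readings at phi = 30 degrees
alphas = np.radians([0, 30, 60, 90, 120, 150])
phi_true = np.radians(30.0)
readings = 1.0 + 0.5 * np.cos(2 * (phi_true - alphas))
phi_hat = estimate_polarization_angle(alphas, readings)
```

Using every channel in one regression, rather than a single channel pair, is what gives the least-squares estimate its noise-averaging benefit.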
Acoustic Sensors for Air and Surface Navigation Applications
Kapoor, Rohan; Ramasamy, Subramanian; Schyndel, Ron Van
2018-01-01
This paper presents the state of the art and reviews the state of research of acoustic sensors used for a variety of navigation and guidance applications on air and surface vehicles. In particular, it focuses on echolocation, which is widely utilized in nature by certain mammals (e.g., cetaceans and bats). Although acoustic sensors have been extensively adopted in various engineering applications, their use in navigation and guidance systems is yet to be fully exploited. This technology has clear potential for applications in air and surface navigation/guidance for intelligent transport systems (ITS), especially considering air and surface operations indoors and in other environments where satellite positioning is not available. Propagation of sound in the atmosphere is discussed in detail, with all potential attenuation sources taken into account. The errors introduced in echolocation measurements due to Doppler, multipath and atmospheric effects are discussed, and an uncertainty analysis method is presented for ranging error budget prediction in acoustic navigation applications. Considering the design challenges associated with monostatic and multistatic sensor implementations and the performance predictions for different possible configurations, acoustic sensors show clear promise in navigation and proximity sensing, as well as obstacle detection and tracking. The integration of acoustic sensors in multi-sensor navigation systems is also considered towards the end of the paper, and a low Size, Weight and Power, and Cost (SWaP-C) sensor integration architecture is presented for possible introduction in air and surface navigation systems.
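A minimal sketch of acoustic time-of-flight ranging, and of its sensitivity to one atmospheric parameter (temperature), illustrates the kind of error-budget term discussed above. The formulas are the standard dry-air approximation, not the paper's model.

```python
import math

def speed_of_sound(temp_c):
    """Approximate speed of sound in dry air (m/s) at temperature temp_c (degC)."""
    return 331.3 * math.sqrt(1.0 + temp_c / 273.15)

def echo_range(tof_s, temp_c):
    """Monostatic echolocation: range from round-trip time of flight."""
    return 0.5 * speed_of_sound(temp_c) * tof_s

def range_error_per_degree(rng_m, temp_c):
    """First-order ranging error (m) caused by a 1 degC temperature misestimate."""
    c = speed_of_sound(temp_c)
    dc_dt = 331.3 / (2.0 * math.sqrt(1.0 + temp_c / 273.15) * 273.15)
    return rng_m * dc_dt / c

r = echo_range(0.0583, 20.0)   # ~58.3 ms round trip at 20 degC -> roughly 10 m
```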
Indoor integrated navigation and synchronous data acquisition method for Android smartphone
NASA Astrophysics Data System (ADS)
Hu, Chunsheng; Wei, Wenjian; Qin, Shiqiao; Wang, Xingshu; Habib, Ayman; Wang, Ruisheng
2015-08-01
Smartphones are widely used at present. Most smartphones have cameras and several kinds of sensors, such as a gyroscope, an accelerometer and a magnetometer. Indoor navigation based on the smartphone is very important and valuable. According to the features of the smartphone and of indoor navigation, a new indoor integrated navigation method is proposed which uses the MEMS (Micro-Electro-Mechanical Systems) IMU (Inertial Measurement Unit), camera and magnetometer of the smartphone. The proposed navigation method mainly involves data acquisition, camera calibration, image measurement, IMU calibration, initial alignment, strapdown integration, zero-velocity update and integrated navigation. Synchronous data acquisition from the sensors (gyroscope, accelerometer and magnetometer) and the camera is the basis of indoor navigation on the smartphone. A camera data acquisition method is introduced which uses the camera class of Android to record images and timestamps from the smartphone camera. Two kinds of sensor data acquisition methods are introduced and compared. The first method records sensor data and time with the SensorManager of Android. The second method implements the open, close, data receiving and saving functions in C, and calls the sensor functions from Java through the JNI interface. A data acquisition software package was developed with the JDK (Java Development Kit), Android ADT (Android Development Tools) and NDK (Native Development Kit). The software can record camera data, sensor data and time at the same time. Data acquisition experiments have been carried out with the developed software and a Samsung Note 2 smartphone. The experimental results show that the first method of sensor data acquisition is convenient but sometimes loses sensor data, while the second method has much better real-time performance and far less data loss. A checkerboard image is recorded, and the corner points of the checkerboard are detected with the Harris method.
The sensor data of the gyroscope, accelerometer and magnetometer were recorded for about 30 minutes, and the bias stability and noise characteristics of the sensors were analyzed. Besides indoor integrated navigation, the integrated navigation and synchronous data acquisition method can also be applied to outdoor navigation.
Integrated INS/GPS Navigation from a Popular Perspective
NASA Technical Reports Server (NTRS)
Omerbashich, Mensur
2002-01-01
Inertial navigation, blended with other navigation aids, the Global Positioning System (GPS) in particular, has gained significance due to enhanced navigation and inertial reference performance and to dissimilarity for fault tolerance and anti-jamming. Relatively new concepts based upon Differential GPS (DGPS) blended with Inertial (and visual) Navigation Sensors (INS) offer the possibility of low-cost, autonomous aircraft landing. The FAA has decided to implement the system in a sophisticated form as a new standard navigation tool during this decade. There have been a number of new inertial sensor concepts in the recent past that emphasize the increased accuracy of INS/GPS versus INS alone and the reliability of navigation, as well as lower size, weight, and power, higher fault tolerance, and long life. The principles of GPS are not discussed; rather, attention is directed towards general concepts and comparative advantages. A short introduction to the problems faced in kinematics is presented. The intention is to relate the basic principles of kinematics to probably the most used navigation method of the future, INS/GPS. An example of an airborne INS is presented, with emphasis on how it works. A discussion of the error types and sources in navigation, and of the role of filters in optimal estimation of the errors, then follows. The main question this paper tries to answer is: 'What are the benefits of the integration of INS and GPS, and how is this navigation concept of the future achieved in reality?' The main goal is to communicate the idea of what stands behind a modern navigation method.
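The role of a filter in blending drifting INS output with absolute GPS fixes can be illustrated with a toy one-dimensional Kalman filter. This is a deliberately simplified sketch: real INS/GPS filters estimate many more states (velocity, attitude, sensor biases), and all numbers below are invented.

```python
def ins_gps_fuse(ins_increments, gps_fixes, q=0.05, r=4.0):
    """Toy 1-D Kalman filter: dead-reckon with (biased) INS position
    increments, correct with a noisy absolute GPS fix when one exists.
    q: process-noise variance per step; r: GPS measurement variance."""
    x, p = 0.0, 100.0                  # state estimate and its variance
    estimates = []
    for dx, z in zip(ins_increments, gps_fixes):
        x, p = x + dx, p + q           # predict: INS propagation
        if z is not None:              # update: GPS fix available
            k = p / (p + r)            # Kalman gain
            x = x + k * (z - x)
            p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Truth moves 1 m/step; INS increments carry a +0.1 m/step bias;
# a GPS fix (noiseless here, for clarity) arrives every 5th step
true_pos = [float(i + 1) for i in range(50)]
ins = [1.1] * 50
gps = [t if i % 5 == 0 else None for i, t in enumerate(true_pos)]
est = ins_gps_fuse(ins, gps)
```

The point of the example: pure dead reckoning ends 5 m off after 50 steps, while the periodic GPS updates keep the blended error bounded.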
Onboard Navigation Systems Characteristics
NASA Technical Reports Server (NTRS)
1979-01-01
The space shuttle onboard navigation systems characteristics are described. A standard source of equations and numerical data for use in error analyses and mission simulations related to space shuttle development is reported. The sensor characteristics described are used for shuttle onboard navigation performance assessment. The use of complete models in the studies depends on the analyses to be performed, the capabilities of the computer programs, and the availability of computer resources.
Flight Testing ALHAT Precision Landing Technologies Integrated Onboard the Morpheus Rocket Vehicle
NASA Technical Reports Server (NTRS)
Carson, John M. III; Robertson, Edward A.; Trawny, Nikolas; Amzajerdian, Farzin
2015-01-01
A suite of prototype sensors, software, and avionics developed within the NASA Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) project were terrestrially demonstrated onboard the NASA Morpheus rocket-propelled Vertical Testbed (VTB) in 2014. The sensors included a LIDAR-based Hazard Detection System (HDS), a Navigation Doppler LIDAR (NDL) velocimeter, and a long-range Laser Altimeter (LAlt) that enable autonomous and safe precision landing of robotic or human vehicles on solid solar system bodies under varying terrain lighting conditions. The flight test campaign with the Morpheus vehicle involved a detailed integration and functional verification process, followed by tether testing and six successful free flights, including one night flight. The ALHAT sensor measurements were integrated into a common navigation solution through a specialized ALHAT Navigation filter that was employed in closed-loop flight testing within the Morpheus Guidance, Navigation and Control (GN&C) subsystem. Flight testing on Morpheus utilized ALHAT for safe landing site identification and ranking, followed by precise surface-relative navigation to the selected landing site. The successful autonomous, closed-loop flight demonstrations of the prototype ALHAT system have laid the foundation for the infusion of safe, precision landing capabilities into future planetary exploration missions.
On the Design of Attitude-Heading Reference Systems Using the Allan Variance.
Hidalgo-Carrió, Javier; Arnold, Sascha; Poulakis, Pantelis
2016-04-01
The Allan variance is a method to characterize stochastic random processes. The technique was originally developed to characterize the stability of atomic clocks and has also been successfully applied to the characterization of inertial sensors. Inertial navigation systems (INS) can provide accurate results in a short time, which tend to rapidly degrade in longer time intervals. During the last decade, the performance of inertial sensors has significantly improved, particularly in terms of signal stability, mechanical robustness, and power consumption. The mass and volume of inertial sensors have also been significantly reduced, offering system-level design and accommodation advantages. This paper presents a complete methodology for the characterization and modeling of inertial sensors using the Allan variance, with direct application to navigation systems. Although the concept of sensor fusion is relatively straightforward, accurate characterization and sensor-information filtering is not a trivial task, yet they are essential for good performance. A complete and reproducible methodology utilizing the Allan variance, including all the intermediate steps, is described. An end-to-end (E2E) process for sensor-error characterization and modeling up to the final integration in the sensor-fusion scheme is explained in detail. The strength of this approach is demonstrated with representative tests on novel, high-grade inertial sensors. Experimental navigation results are presented from two distinct robotic applications: a planetary exploration rover prototype and an autonomous underwater vehicle (AUV).
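The core Allan-variance computation on which such a characterization rests can be sketched as follows (a minimal overlapping-estimator sketch, not the authors' full end-to-end methodology):

```python
import math
import random

def allan_deviation(y, m):
    """Overlapping Allan deviation of rate samples y at cluster size m.
    The averaging time is tau = m * (sample period)."""
    n = len(y)
    # overlapping cluster means over windows of m samples
    avgs = [sum(y[i:i + m]) / m for i in range(n - m + 1)]
    # squared differences between cluster means spaced m apart
    diffs = [(avgs[i + m] - avgs[i]) ** 2 for i in range(len(avgs) - m)]
    return math.sqrt(0.5 * sum(diffs) / len(diffs))

# Synthetic white-noise "gyro" record: the deviation should fall
# roughly as 1/sqrt(m), the classic angle-random-walk slope of -1/2
rng = random.Random(0)
y = [rng.gauss(0.0, 1.0) for _ in range(20000)]
ad1 = allan_deviation(y, 1)
ad100 = allan_deviation(y, 100)
```

Plotting the deviation against averaging time on log-log axes and reading off the characteristic slopes (random walk, bias instability, rate ramp) is the modeling step the paper builds on.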
On Navigation Sensor Error Correction
NASA Astrophysics Data System (ADS)
Larin, V. B.
2016-01-01
The navigation problem for the simplest wheeled robotic vehicle is solved by measuring only kinematic parameters, doing without accelerometers and angular-rate sensors. It is supposed that the steerable-wheel angle sensor has a bias that must be corrected. The navigation parameters are corrected using GPS. The proposed approach regards the wheeled robot as a system with nonholonomic constraints. The performance of such a navigation system is demonstrated by way of an example.
Results of prototype software development for automation of shuttle proximity operations
NASA Technical Reports Server (NTRS)
Hiers, Hal; Olszweski, Oscar
1991-01-01
The effort involves demonstration of expert system technology applied to Shuttle rendezvous operations in a high-fidelity, real-time simulation environment. The JSC Systems Engineering Simulator (SES) served as the test bed for the demonstration. Rendezvous applications focused on crew procedures and monitoring of sensor health and trajectory status. Proximity operations applications focused on monitoring, crew advisory, and control of the approach trajectory. Guidance, Navigation, and Control areas of emphasis included the approach, transition and stationkeeping guidance, and laser docking sensor navigation. Operator interface displays for monitor and control functions were developed. A rule-based expert system was developed to manage the relative navigation system/sensors for nominal operations and simple failure contingencies. Testing resulted in the following findings: (1) the developed guidance is applicable for operations with LVLH-stabilized targets; (2) closing rates of less than 0.05 feet per second are difficult to maintain due to the Shuttle translational/rotational cross-coupling; (3) automated operations result in reduced propellant consumption and plume impingement effects on the target as compared to manual operations; and (4) braking gates are beneficial for trajectory management. A versatile guidance design was demonstrated. An accurate proximity operations sensor/navigation system providing relative attitude information within 30 feet is required, and redesign of the existing Shuttle digital autopilot should be considered to reduce the cross-coupling effects. This activity has demonstrated the feasibility of automated Shuttle proximity operations with the Space Station Freedom. Indications are that berthing operations as well as docking can be supported.
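Finding (4), the benefit of braking gates, can be illustrated with a simple range-dependent closing-rate limit. The gate ranges and limits below are invented for illustration and are not the Shuttle's actual values.

```python
def braking_gate_limit(range_ft, gates=((250, 1.0), (100, 0.5), (30, 0.2), (0, 0.1))):
    """Maximum allowed closing rate (ft/s) at the given range (ft), from a
    table of braking gates ordered by decreasing gate range."""
    for gate_range, limit in gates:
        if range_ft >= gate_range:
            return limit
    return gates[-1][1]

def gate_violation(range_ft, closing_rate_fps):
    """True if the approach is closing faster than the current gate allows."""
    return closing_rate_fps > braking_gate_limit(range_ft)
```

A trajectory monitor of this kind flags an approach that is still closing at 0.3 ft/s inside the 30 ft gate, while permitting the same rate far out.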
NASA Astrophysics Data System (ADS)
Pierrottet, Diego; Amzajerdian, Farzin; Petway, Larry; Barnes, Bruce; Lockard, George; Hines, Glenn
2011-06-01
An all fiber Navigation Doppler Lidar (NDL) system is under development at NASA Langley Research Center (LaRC) for precision descent and landing applications on planetary bodies. The sensor produces high-resolution line of sight range, altitude above ground, ground relative attitude, and high precision velocity vector measurements. Previous helicopter flight test results demonstrated the NDL measurement concepts, including measurement precision, accuracies, and operational range. This paper discusses the results obtained from a recent campaign to test the improved sensor hardware, and various signal processing algorithms applicable to real-time processing. The NDL was mounted in an instrumentation pod aboard an Erickson Air-Crane helicopter and flown over various terrains. The sensor was one of several sensors tested in this field test by NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) project.
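The velocity-vector measurement the NDL produces can be illustrated by solving for the 3-D velocity from line-of-sight Doppler speeds along several beams. This is a generic least-squares sketch; the beam geometry shown is invented, not the NDL's actual configuration.

```python
import numpy as np

def velocity_from_los_doppler(los_dirs, los_speeds):
    """Recover the 3-D velocity vector from Doppler speeds measured along
    three or more line-of-sight directions (least squares).
    los_dirs: Nx3 unit vectors; los_speeds: projections v . u_i."""
    A = np.asarray(los_dirs, dtype=float)
    b = np.asarray(los_speeds, dtype=float)
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v

# Three canted, downward-looking beams (illustrative geometry)
beams = np.array([[0.50, 0.000, -0.866],
                  [-0.25, 0.433, -0.866],
                  [-0.25, -0.433, -0.866]])
v_true = np.array([10.0, -2.0, -30.0])
v_hat = velocity_from_los_doppler(beams, beams @ v_true)
```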
NASA Technical Reports Server (NTRS)
Pierrottet, Diego F.; Lockhard, George; Amzajerdian, Farzin; Petway, Larry B.; Barnes, Bruce; Hines, Glenn D.
2011-01-01
An all fiber Navigation Doppler Lidar (NDL) system is under development at NASA Langley Research Center (LaRC) for precision descent and landing applications on planetary bodies. The sensor produces high resolution line of sight range, altitude above ground, ground relative attitude, and high precision velocity vector measurements. Previous helicopter flight test results demonstrated the NDL measurement concepts, including measurement precision, accuracies, and operational range. This paper discusses the results obtained from a recent campaign to test the improved sensor hardware, and various signal processing algorithms applicable to real-time processing. The NDL was mounted in an instrumentation pod aboard an Erickson Air-Crane helicopter and flown over vegetation free terrain. The sensor was one of several sensors tested in this field test by NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) project.
Collaborative WiFi Fingerprinting Using Sensor-Based Navigation on Smartphones.
Zhang, Peng; Zhao, Qile; Li, You; Niu, Xiaoji; Zhuang, Yuan; Liu, Jingnan
2015-07-20
This paper presents a method that trains the WiFi fingerprint database using sensor-based navigation solutions. Since micro-electromechanical systems (MEMS) sensors provide only a short-term accuracy but suffer from the accuracy degradation with time, we restrict the time length of available indoor navigation trajectories, and conduct post-processing to improve the sensor-based navigation solution. Different middle-term navigation trajectories that move in and out of an indoor area are combined to make up the database. Furthermore, we evaluate the effect of WiFi database shifts on WiFi fingerprinting using the database generated by the proposed method. Results show that the fingerprinting errors will not increase linearly according to database (DB) errors in smartphone-based WiFi fingerprinting applications.
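Once such a database exists, online positioning is typically a weighted nearest-neighbour match of observed received signal strengths against the stored fingerprints. Below is a minimal sketch of that lookup step; the matching scheme is a common one, not necessarily the authors', and the database layout is invented.

```python
import math

def wknn_position(fingerprint_db, observed_rss, k=3):
    """Weighted k-nearest-neighbour position estimate from a WiFi
    fingerprint database shaped as {(x, y): {ap_id: rss_dbm}}."""
    def dist(fp):
        shared = set(fp) & set(observed_rss)
        if not shared:
            return float("inf")
        return math.sqrt(sum((fp[ap] - observed_rss[ap]) ** 2 for ap in shared))
    # k closest fingerprints in signal space, weighted by inverse distance
    ranked = sorted(fingerprint_db.items(), key=lambda kv: dist(kv[1]))[:k]
    weights = [1.0 / (dist(fp) + 1e-6) for _, fp in ranked]
    wsum = sum(weights)
    x = sum(w * p[0] for (p, _), w in zip(ranked, weights)) / wsum
    y = sum(w * p[1] for (p, _), w in zip(ranked, weights)) / wsum
    return x, y

db = {(0, 0): {"ap1": -40, "ap2": -70},
      (5, 0): {"ap1": -55, "ap2": -60},
      (10, 0): {"ap1": -70, "ap2": -40}}
pos = wknn_position(db, {"ap1": -41, "ap2": -69}, k=2)
```

A shift in the stored RSS values (the database error studied in the paper) perturbs `dist` for every fingerprint, which is why the resulting positioning error need not grow linearly with the shift.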
2012-08-01
Active Safety Technology: Environmental Understanding and Navigation with Use of Low Cost Sensors
Simon, David; Bernard
Lockheed Martin MFC, Grand Prairie, TX
Autonomous Wheeled Robot Platform Testbed for Navigation and Mapping Using Low-Cost Sensors
NASA Astrophysics Data System (ADS)
Calero, D.; Fernandez, E.; Parés, M. E.
2017-11-01
This paper presents the concept of an architecture for a wheeled robot system that helps researchers in the field of geomatics speed up their daily research in kinematic geodesy, indoor navigation and indoor positioning. The presented ideas correspond to an extensible and modular hardware and software system aimed at the development of new low-cost mapping algorithms as well as at the evaluation of sensor performance. The concept, already implemented in the CTTC's system ARAS (Autonomous Rover for Automatic Surveying), is generic and extensible. This means that it is possible to incorporate new navigation algorithms or sensors at no maintenance cost; only the development effort required to create such algorithms needs to be taken into account. As a consequence, change poses a much smaller problem for research activities in this specific area. The system includes several standalone sensors that may be combined in different ways to accomplish several goals; that is, the system may be used to perform a variety of tasks, for instance evaluating the performance of positioning or mapping algorithms.
UGV navigation in wireless sensor and actuator network environments
NASA Astrophysics Data System (ADS)
Zhang, Guyu; Li, Jianfeng; Duncan, Christian A.; Kanno, Jinko; Selmic, Rastko R.
2012-06-01
We consider a navigation problem in a distributed, self-organized and coordinate-free Wireless Sensor and Actuator Network (WSAN). We first present navigation algorithms that are verified using simulation results. Considering more than one destination and multiple mobile Unmanned Ground Vehicles (UGVs), we introduce a distributed solution to the Multi-UGV, Multi-Destination navigation problem. The objective of the solution to this problem is to efficiently allocate UGVs to different destinations and carry out navigation in the network environment in a way that minimizes total travel distance. The main contribution of this paper is to develop a solution that does not attempt to localize either the UGVs or the sensor and actuator nodes. Other than some connectivity assumptions about the communication graph, we assume that no prior information about the WSAN is available. The solution presented here is distributed, and the UGV navigation is based solely on feedback from neighboring sensor and actuator nodes. One special case discussed in the paper, the Single-UGV, Multi-Destination navigation problem, is essentially equivalent to the well-known and difficult Traveling Salesman Problem (TSP). Simulation results are presented that illustrate the navigation distance traveled through the network. We also introduce an experimental testbed for the realization of coordinate-free and localization-free UGV navigation. We use the Cricket platform as the sensor and actuator network and a Pioneer 3-DX robot as the UGV. The experiments illustrate UGV navigation in a coordinate-free WSAN environment where the UGV successfully arrives at the assigned destinations.
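A coordinate-free navigation scheme of the kind described, where nodes know only their neighbours, can be sketched with a hop-count field flooded from the destination followed by greedy descent. This is illustrative only, not the paper's actual algorithm, and the graph below is invented.

```python
from collections import deque

def hop_counts(adj, destination):
    """BFS hop-count field flooded outward from the destination node;
    no node coordinates are ever used (coordinate-free)."""
    hops = {destination: 0}
    q = deque([destination])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in hops:
                hops[v] = hops[u] + 1
                q.append(v)
    return hops

def navigate(adj, start, destination):
    """Greedy descent: at each node, hand the UGV off toward the
    neighbour with the lowest hop count to the destination."""
    hops = hop_counts(adj, destination)
    path, node = [start], start
    while node != destination:
        node = min(adj[node], key=lambda v: hops.get(v, float("inf")))
        path.append(node)
    return path

# Small sensor-network graph given as adjacency lists
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
route = navigate(adj, 0, 4)
```

Greedy descent on the hop-count field always reaches the destination on a connected graph, since every non-destination node has a neighbour one hop closer.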
The Sensor Test for Orion RelNav Risk Mitigation Development Test Objective
NASA Technical Reports Server (NTRS)
Christian, John A.; Hinkel, Heather; Maguire, Sean
2011-01-01
The Sensor Test for Orion Relative-Navigation Risk Mitigation (STORRM) Development Test Objective (DTO) flew aboard the Space Shuttle Endeavour on STS-134, and was designed to characterize the performance of the flash LIDAR being developed for Orion. This flash LIDAR, called the Vision Navigation Sensor (VNS), will be the primary navigation instrument used by the Orion vehicle during rendezvous, proximity operations, and docking. This paper provides an overview of the STORRM test objectives and the concept of operations. It continues with a description of STORRM's major hardware components, which include the VNS and the docking camera. Next, an overview of crew and analyst training activities describes how the STORRM team prepared for flight, followed by an account of how the data collection and analysis actually went. Key findings and results from this project are summarized, including a description of "truth" data. Finally, the paper concludes with lessons learned from the STORRM DTO.
Flight Results from the HST SM4 Relative Navigation Sensor System
NASA Technical Reports Server (NTRS)
Naasz, Bo; Eepoel, John Van; Queen, Steve; Southward, C. Michael; Hannah, Joel
2010-01-01
On May 11, 2009, Space Shuttle Atlantis roared off of Launch Pad 39A en route to the Hubble Space Telescope (HST) to undertake its final servicing of HST, Servicing Mission 4. Onboard Atlantis was a small payload called the Relative Navigation Sensor experiment, which included three cameras of varying focal ranges, avionics to record images and estimate, in real time, the relative position and attitude (aka "pose") of the telescope during rendezvous and deploy. The avionics package, known as SpaceCube and developed at the Goddard Space Flight Center, performed image processing using field programmable gate arrays to accelerate this process, and in addition executed two different pose algorithms in parallel, the Goddard Natural Feature Image Recognition and the ULTOR Passive Pose and Position Engine (P3E) algorithms.
Validation of Inertial and Optical Navigation Techniques for Space Applications with UAVS
NASA Astrophysics Data System (ADS)
Montaño, J.; Wis, M.; Pulido, J. A.; Latorre, A.; Molina, P.; Fernández, E.; Angelats, E.; Colomina, I.
2015-09-01
PERIGEO is an R&D project, funded by the INNPRONTA 2011-2014 programme from Spanish CDTI, which aims to investigate the use of UAV technologies and processes for the validation of space oriented technologies. For this purpose, among different space missions and technologies, a set of activities for absolute and relative navigation are being carried out to deal with the attitude and position estimation problem from a temporal image sequence from a camera on the visible spectrum and/or Light Detection and Ranging (LIDAR) sensor. The process is covered entirely: from sensor measurements and data acquisition (images, LiDAR ranges and angles), data pre-processing (calibration and co-registration of camera and LIDAR data), features and landmarks extraction from the images and image/LiDAR-based state estimation. In addition to the image processing area, a classical navigation system based on inertial sensors is also included in the research. The reason for combining both approaches is to retain navigation capability in environments or missions where a radio beacon or reference signal, such as a GNSS satellite, is not available (for example, an atmospheric flight at Titan). The rationale behind the combination of those systems is that they complement each other. The INS is capable of providing accurate position, velocity and full attitude estimations at high data rates. However, it needs an absolute reference observation to compensate for the time-accumulative errors caused by inertial sensor inaccuracies. On the other hand, imaging observables can provide absolute and relative positioning and attitude estimations. However, they require the sensor head to point toward the ground (something that may not be possible if the carrying platform is maneuvering) to provide accurate estimations, and they cannot deliver the update rates of some hundreds of Hz that an INS can provide.
This mutual complementarity has been observed in PERIGEO, and because of this the two are combined into one system. The inertial navigation system implemented in PERIGEO is based on a classical loosely coupled INS/GNSS approach that is very similar to the implementation of the INS/imaging navigation system mentioned above. The activities envisaged in PERIGEO cover algorithm development and validation and technology testing on UAVs under representative conditions. Past activities have covered the design and development of the algorithms and systems. This paper presents the most recent activities and results in the area of image processing for robust estimation within PERIGEO, which are related to the hardware platform definition (including sensors) and its integration in UAVs. Results for the tests performed during the flight campaigns in representative outdoor environments will also be presented and analyzed (the tests will be performed by the time of the full paper submission), together with a roadmap definition for future developments.
Vision-Based 3D Motion Estimation for On-Orbit Proximity Satellite Tracking and Navigation
2015-06-01
Multi-Purpose Crew Vehicle (MPCV), which will be provided with a LIDAR sensor as primary relative navigation system [26, 33, 34]. A drawback of LIDAR...
NASA Technical Reports Server (NTRS)
Galante, Joseph M.; Van Eepoel, John; D'Souza, Chris; Patrick, Bryan
2016-01-01
The Raven ISS Hosted Payload will feature several pose measurement sensors on a pan/tilt gimbal which will be used to autonomously track resupply vehicles as they approach and depart the International Space Station. This paper discusses the derivation of a Relative Navigation Filter (RNF) to fuse measurements from the different pose measurement sensors to produce relative position and attitude estimates. The RNF relies on relative translation and orientation kinematics and careful pose sensor modeling to eliminate dependence on orbital position information and associated orbital dynamics models. The filter state is augmented with sensor biases to provide a mechanism for the filter to estimate and mitigate the offset between the measurements from different pose sensors.
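The benefit of bias augmentation can be seen in a toy problem (not the Raven RNF itself): one sensor observes the relative state directly while a second carries an unknown constant offset. Stacking that offset into an augmented state makes it estimable.

```python
import numpy as np

# Toy illustration of bias augmentation. Sensor A measures x directly;
# sensor B measures x plus an unknown constant bias b. Solving for the
# augmented state [x, b] recovers both, which is the mechanism the
# filter uses to mitigate offsets between pose sensors.

def estimate_state_and_bias(z_a, z_b):
    """Least-squares solve for [x_hat, b_hat] from paired measurements."""
    z = np.concatenate([z_a, z_b])
    n = len(z_a)
    H = np.vstack([
        np.hstack([np.ones((n, 1)), np.zeros((n, 1))]),  # z_a = x
        np.hstack([np.ones((n, 1)), np.ones((n, 1))]),   # z_b = x + b
    ])
    est, *_ = np.linalg.lstsq(H, z, rcond=None)
    return est
```

In a full filter the bias would be a random-walk state rather than a constant, but the observability argument is the same.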
An Agent-Based Model for Navigation Simulation in a Heterogeneous Environment
ERIC Educational Resources Information Center
Shanklin, Teresa A.
2012-01-01
Complex navigation (e.g. indoor and outdoor environments) can be studied as a system-of-systems problem. The model is made up of disparate systems that can aid a user in navigating from one location to another, utilizing whatever sensor system or information is available. By using intelligent navigation sensors and techniques (e.g. RFID, Wifi,…
NASA Astrophysics Data System (ADS)
Vinande, Eric T.
This research proposes several means to overcome challenges in the urban environment to ground vehicle global positioning system (GPS) receiver navigation performance through the integration of external sensor information. The effects of narrowband radio frequency interference and signal attenuation, both common in the urban environment, are examined with respect to receiver signal tracking processes. Low-cost microelectromechanical systems (MEMS) inertial sensors, suitable for the consumer market, are the focus of receiver augmentation as they provide an independent measure of motion and are independent of vehicle systems. A method for estimating the mounting angles of an inertial sensor cluster utilizing typical urban driving maneuvers is developed and is able to provide angular measurements within two degrees of truth. The integration of GPS and MEMS inertial sensors is developed utilizing a full state navigation filter. Appropriate statistical methods are developed to evaluate the urban environment navigation improvement due to the addition of MEMS inertial sensors. A receiver evaluation metric that combines accuracy, availability, and maximum error measurements is presented and evaluated over several drive tests. Following a description of proper drive test techniques, record and playback systems are evaluated as the optimal way of testing multiple receivers and/or integrated navigation systems in the urban environment as they simplify vehicle testing requirements.
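A combined metric of this kind could, for example, take the form below. The weights and reference scales are hypothetical, chosen only for illustration, and are not those of the thesis.

```python
import math

def receiver_score(errors_m, fix_available, w_rms=0.4, w_avail=0.4,
                   w_max=0.2, rms_ref=10.0, max_ref=50.0):
    """Hypothetical composite receiver metric in [0, 1], higher is better.
    errors_m: horizontal errors (meters) for epochs with a fix.
    fix_available: one boolean per epoch of the drive test.
    Weights/reference scales are illustrative assumptions."""
    avail = sum(fix_available) / len(fix_available)
    if errors_m:
        rms = math.sqrt(sum(e * e for e in errors_m) / len(errors_m))
        worst = max(errors_m)
    else:
        rms, worst = float("inf"), float("inf")
    rms_term = max(0.0, 1.0 - rms / rms_ref)    # accuracy contribution
    max_term = max(0.0, 1.0 - worst / max_ref)  # worst-case contribution
    return w_rms * rms_term + w_avail * avail + w_max * max_term
```

A receiver with continuous fixes and zero error scores 1.0; dropped fixes or large outliers pull the score down through the availability and maximum-error terms.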
An Automated Method for Navigation Assessment for Earth Survey Sensors Using Island Targets
NASA Technical Reports Server (NTRS)
Patt, F. S.; Woodward, R. H.; Gregg, W. W.
1997-01-01
An automated method has been developed for performing navigation assessment on satellite-based Earth sensor data. The method utilizes islands as targets which can be readily located in the sensor data and identified with reference locations. The essential elements are an algorithm for classifying the sensor data according to source, a reference catalogue of island locations, and a robust pattern-matching algorithm for island identification. The algorithms were developed and tested for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), an ocean colour sensor. This method will allow navigation error statistics to be automatically generated for large numbers of points, supporting analysis over large spatial and temporal ranges.
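The association step can be sketched as nearest-neighbor matching between detected island centroids and the reference catalogue. The flat-Earth small-angle approximation below is for brevity only, not the operational SeaWiFS algorithm.

```python
import math

def assess_navigation(detected, catalog, max_sep_km=20.0):
    """Match each detected island centroid (lat, lon in degrees) to its
    nearest catalogue island and return (north, east) offsets in km.
    Matches farther than max_sep_km are rejected as misidentifications."""
    KM_PER_DEG = 111.12
    offsets = []
    for lat, lon in detected:
        best = None
        for clat, clon in catalog:
            dlat = (lat - clat) * KM_PER_DEG
            dlon = (lon - clon) * KM_PER_DEG * math.cos(math.radians(clat))
            d = math.hypot(dlat, dlon)
            if best is None or d < best[0]:
                best = (d, dlat, dlon)
        if best[0] <= max_sep_km:
            offsets.append((best[1], best[2]))
    return offsets
```

Accumulating these offsets over many scenes yields the navigation error statistics described in the abstract.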
Inertial navigation sensor integrated obstacle detection system
NASA Technical Reports Server (NTRS)
Bhanu, Bir (Inventor); Roberts, Barry A. (Inventor)
1992-01-01
A system that incorporates inertial sensor information into optical flow computations to detect obstacles and to provide alternative navigational paths free from obstacles. The system is a maximally passive obstacle detection system that makes selective use of an active sensor. The active detection typically utilizes a laser. Passive sensor suite includes binocular stereo, motion stereo and variable fields-of-view. Optical flow computations involve extraction, derotation and matching of interest points from sequential frames of imagery, for range interpolation of the sensed scene, which in turn provides obstacle information for purposes of safe navigation.
Effects of Optical Artifacts in a Laser-Based Spacecraft Navigation Sensor
NASA Technical Reports Server (NTRS)
LeCroy, Jerry E.; Howard, Richard T.; Hallmark, Dean S.
2007-01-01
Testing of the Advanced Video Guidance Sensor (AVGS) used for proximity operations navigation on the Orbital Express ASTRO spacecraft exposed several unanticipated imaging system artifacts and aberrations that required correction to meet critical navigation performance requirements. Mitigation actions are described for a number of system error sources, including lens aberration, optical train misalignment, laser speckle, target image defects, and detector nonlinearity/noise characteristics. Sensor test requirements and protocols are described, along with a summary of test results from sensor confidence tests and system performance testing.
1999-08-01
Electro-Optic Sensor Integration Technology (NEOSIT) software application. The design is highly modular and based on COTS tools to facilitate integration with sensors, navigation and digital data sources already installed on different host
Lidar Systems for Precision Navigation and Safe Landing on Planetary Bodies
NASA Technical Reports Server (NTRS)
Amzajerdian, Farzin; Pierrottet, Diego F.; Petway, Larry B.; Hines, Glenn D.; Roback, Vincent E.
2011-01-01
The ability of lidar technology to provide three-dimensional elevation maps of the terrain, high precision distance to the ground, and approach velocity can enable safe landing of robotic and manned vehicles with a high degree of precision. Currently, NASA is developing novel lidar sensors aimed at needs of future planetary landing missions. These lidar sensors are a 3-Dimensional Imaging Flash Lidar, a Doppler Lidar, and a Laser Altimeter. The Flash Lidar is capable of generating elevation maps of the terrain that indicate hazardous features such as rocks, craters, and steep slopes. The elevation maps collected during the approach phase of a landing vehicle, at about 1 km above the ground, can be used to determine the most suitable safe landing site. The Doppler Lidar provides highly accurate ground relative velocity and distance data allowing for precision navigation to the landing site. Our Doppler lidar utilizes three laser beams pointed to different directions to measure line of sight velocities and ranges to the ground from altitudes of over 2 km. Throughout the landing trajectory starting at altitudes of about 20 km, the Laser Altimeter can provide very accurate ground relative altitude measurements that are used to improve the vehicle position knowledge obtained from the vehicle navigation system. At altitudes from approximately 15 km to 10 km, either the Laser Altimeter or the Flash Lidar can be used to generate contour maps of the terrain, identifying known surface features such as craters, to perform Terrain Relative Navigation thus further reducing the vehicle's relative position error. This paper describes the operational capabilities of each lidar sensor and provides a status of their development. Keywords: Laser Remote Sensing, Laser Radar, Doppler Lidar, Flash Lidar, 3-D Imaging, Laser Altimeter, Precision Landing, Hazard Detection
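The three-beam Doppler geometry reduces to a small linear solve: each beam measures the projection of the velocity vector onto its unit direction, so three non-coplanar beams determine the full vector. The beam directions below are illustrative, not the actual NASA Doppler lidar geometry.

```python
import numpy as np

def solve_velocity(beam_dirs, los_speeds):
    """Recover the 3-D velocity vector from three line-of-sight Doppler
    speeds. beam_dirs: 3x3 matrix whose rows are unit beam vectors
    (assumed non-coplanar); los_speeds[i] = v . u_i."""
    A = np.asarray(beam_dirs, dtype=float)
    return np.linalg.solve(A, np.asarray(los_speeds, dtype=float))
```

With beams canted about 20 degrees from nadir at three azimuths, the system is well conditioned and the true velocity is recovered exactly from noise-free projections.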
LIRIS flight database and its use toward noncooperative rendezvous
NASA Astrophysics Data System (ADS)
Mongrard, O.; Ankersen, F.; Casiez, P.; Cavrois, B.; Donnard, A.; Vergnol, A.; Southivong, U.
2018-06-01
ESA's fifth and last Automated Transfer Vehicle, ATV Georges Lemaître, tested new rendezvous technology before docking with the International Space Station (ISS) in August 2014. The technology demonstration called Laser Infrared Imaging Sensors (LIRIS) provides an unseen view of the ISS. During Georges Lemaître's rendezvous, LIRIS sensors, composed of two infrared cameras, one visible camera, and a scanning LIDAR (Light Detection and Ranging), were turned on two and a half hours and 3500 m from the Space Station. All sensors worked as expected and a large amount of data was recorded and stored within ATV-5's cargo hold before being returned to Earth with the Soyuz flight 38S in September 2014. As a part of the LIRIS postflight activities, the information gathered by all sensors is collected inside a flight database together with the reference ATV trajectory and attitude estimated by ATV main navigation sensors. Although decoupled from the ATV main computer, the LIRIS data were carefully synchronized with ATV guidance, navigation, and control (GNC) data. Hence, the LIRIS database can be used to assess the performance of various image processing algorithms to provide range and line-of-sight (LoS) navigation at long/medium range but also 6 degree-of-freedom (DoF) navigation at short range. The database also contains information related to the overall ATV position with respect to Earth and the Sun direction within ATV frame such that the effect of the environment on the sensors can also be investigated. This paper introduces the structure of the LIRIS database and provides some example of applications to increase the technology readiness level of noncooperative rendezvous.
Wang, Hao; Jiang, Jie; Zhang, Guangjun
2017-04-21
The simultaneous extraction of optical navigation measurements from a target celestial body and star images is essential for autonomous optical navigation. Generally, a single optical navigation sensor cannot simultaneously image the target celestial body and stars well-exposed because their irradiance difference is generally large. Multi-sensor integration or complex image processing algorithms are commonly utilized to solve the said problem. This study analyzes and demonstrates the feasibility of simultaneously imaging the target celestial body and stars well-exposed within a single exposure through a single field of view (FOV) optical navigation sensor using the well capacity adjusting (WCA) scheme. First, the irradiance characteristics of the celestial body are analyzed. Then, the celestial body edge model and star spot imaging model are established when the WCA scheme is applied. Furthermore, the effect of exposure parameters on the accuracy of star centroiding and edge extraction is analyzed using the proposed model. Optimal exposure parameters are also derived by conducting Monte Carlo simulation to obtain the best performance of the navigation sensor. Finally, laboratorial and night sky experiments are performed to validate the correctness of the proposed model and optimal exposure parameters.
Acoustic Communications and Navigation for Mobile Under-Ice Sensors
2017-02-04
Final report on the development and fielding of a new acoustic communications and navigation system for use on autonomous platforms (gliders and profiling floats) under the ice. Subject terms: Arctic Ocean, Undersea Workstations & Vehicles, Signal Processing, Navigation, Underwater Acoustics.
NA-241_Quarterly Report_SBLibby - 12.31.2017_v2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Libby, Stephen B.
This is an evaluation of candidate navigation solutions for GPS-free inspection tools that can be used in tours of large building interiors. In principle, COTS portable inertial motion unit (IMU) sensors with satisfactory accuracy, SWAP (size, weight, power), low error, and bias drift can provide sufficiently accurate dead reckoning navigation in a large building in the absence of GPS. To explore this assumption, the capabilities of representative IMU navigation sensors to meet these requirements will be evaluated, starting with a market survey, and then carrying out a basic analysis of these sensors using LLNL's navigation codes.
COBALT: Development of a Platform to Flight Test Lander GN&C Technologies on Suborbital Rockets
NASA Technical Reports Server (NTRS)
Carson, John M., III; Seubert, Carl R.; Amzajerdian, Farzin; Bergh, Chuck; Kourchians, Ara; Restrepo, Carolina I.; Villapando, Carlos Y.; O'Neal, Travis V.; Robertson, Edward A.; Pierrottet, Diego;
2017-01-01
The NASA COBALT Project (CoOperative Blending of Autonomous Landing Technologies) is developing and integrating new precision-landing Guidance, Navigation and Control (GN&C) technologies, along with developing a terrestrial flight-test platform for Technology Readiness Level (TRL) maturation. The current technologies include a third-generation Navigation Doppler Lidar (NDL) sensor for ultra-precise velocity and line-of-sight (LOS) range measurements, and the Lander Vision System (LVS) that provides passive-optical Terrain Relative Navigation (TRN) estimates of map-relative position. The COBALT platform is self-contained and includes the NDL and LVS sensors, blending filter, a custom compute element, power unit, and communication system. The platform incorporates a structural frame that has been designed to integrate with the payload frame onboard the new Masten Xodiac vertical takeoff, vertical landing (VTVL) terrestrial rocket vehicle. Ground integration and testing is underway, and terrestrial flight testing onboard Xodiac is planned for 2017 with two flight campaigns: one open-loop and one closed-loop.
Bio-inspired polarized skylight navigation: a review
NASA Astrophysics Data System (ADS)
Zhang, Xi; Wan, Yongqin; Li, Lijing
2015-12-01
The idea of using skylight polarization in navigation is learned from animals such as desert ants and honeybees. Various research groups have been working on the development of novel navigation systems inspired by polarized skylight. The research background of polarized skylight navigation is introduced, and the basic principle of insect navigation is explained. Then, domestic and international research progress on the skylight polarization pattern, three bio-inspired polarized skylight navigation sensors, and polarized skylight navigation is reviewed. Finally, the research focuses in the field of polarized skylight navigation are analyzed, and the trends and prospects for future development are predicted. It is believed that this review will help readers understand polarized skylight navigation and polarized skylight navigation sensors.
Multi-Sensor Fusion with Interacting Multiple Model Filter for Improved Aircraft Position Accuracy
Cho, Taehwan; Lee, Changho; Choi, Sangbang
2013-01-01
The International Civil Aviation Organization (ICAO) has decided to adopt Communications, Navigation, and Surveillance/Air Traffic Management (CNS/ATM) as the 21st century standard for navigation. Accordingly, ICAO members have provided an impetus to develop related technology and build sufficient infrastructure. For aviation surveillance with CNS/ATM, Ground-Based Augmentation System (GBAS), Automatic Dependent Surveillance-Broadcast (ADS-B), multilateration (MLAT) and wide-area multilateration (WAM) systems are being established. These sensors can track aircraft positions more accurately than existing radar and can compensate for the blind spots in aircraft surveillance. In this paper, we applied a novel sensor fusion method with Interacting Multiple Model (IMM) filter to GBAS, ADS-B, MLAT, and WAM data in order to improve the reliability of the aircraft position. Results of performance analysis show that the position accuracy is improved by the proposed sensor fusion method with the IMM filter. PMID:23535715
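The IMM filter itself blends multiple motion models, but the underlying benefit of combining GBAS, ADS-B, MLAT, and WAM position reports can be illustrated with simple inverse-variance fusion. This is a deliberate simplification of the paper's method, not the IMM algorithm.

```python
def fuse(positions, variances):
    """Inverse-variance weighted fusion of independent 2-D position
    estimates. positions: list of (x, y); variances: one per sensor.
    The fused variance is always smaller than any single input."""
    wsum = sum(1.0 / v for v in variances)
    x = sum(p[0] / v for p, v in zip(positions, variances)) / wsum
    y = sum(p[1] / v for p, v in zip(positions, variances)) / wsum
    return (x, y), 1.0 / wsum
```

Two equally accurate sensors reporting (0, 0) and (2, 0) fuse to (1, 0) with half the variance of either alone, which is the accuracy gain the paper quantifies with the full IMM machinery.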
Observability-Based Guidance and Sensor Placement
NASA Astrophysics Data System (ADS)
Hinson, Brian T.
Control system performance is highly dependent on the quality of sensor information available. In a growing number of applications, however, the control task must be accomplished with limited sensing capabilities. This thesis addresses these types of problems from a control-theoretic point-of-view, leveraging system nonlinearities to improve sensing performance. Using measures of observability as an information quality metric, guidance trajectories and sensor distributions are designed to improve the quality of sensor information. An observability-based sensor placement algorithm is developed to compute optimal sensor configurations for a general nonlinear system. The algorithm utilizes a simulation of the nonlinear system as the source of input data, and convex optimization provides a scalable solution method. The sensor placement algorithm is applied to a study of gyroscopic sensing in insect wings. The sensor placement algorithm reveals information-rich areas on flexible insect wings, and a comparison to biological data suggests that insect wings are capable of acting as gyroscopic sensors. An observability-based guidance framework is developed for robotic navigation with limited inertial sensing. Guidance trajectories and algorithms are developed for range-only and bearing-only navigation that improve navigation accuracy. Simulations and experiments with an underwater vehicle demonstrate that the observability measure allows tuning of the navigation uncertainty.
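The empirical observability Gramian at the heart of such an approach can be sketched as follows. Here `simulate` is a placeholder for the user's nonlinear system simulation, and the central-difference scheme is one common choice; larger Gramian eigenvalues indicate better-observed state directions.

```python
import numpy as np

def empirical_gramian(simulate, x0, eps=1e-4):
    """Empirical observability Gramian via central differences.
    simulate(x0) must return an (n_steps, n_outputs) output trajectory
    for initial state x0. Rows of the Gramian correspond to states;
    a zero eigenvalue flags an unobservable direction."""
    n = len(x0)
    cols = []
    for i in range(n):
        dx = np.zeros(n)
        dx[i] = eps
        dy = simulate(x0 + dx) - simulate(x0 - dx)
        cols.append(dy.ravel() / (2 * eps))   # output sensitivity to x0[i]
    J = np.column_stack(cols)
    return J.T @ J
```

Candidate sensor configurations can then be ranked by the smallest eigenvalue of their Gramians, which is one standard observability measure.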
Autonomous satellite navigation using starlight refraction angle measurements
NASA Astrophysics Data System (ADS)
Ning, Xiaolin; Wang, Longhua; Bai, Xinbei; Fang, Jiancheng
2013-05-01
An on-board autonomous navigation capability is required to reduce the operation costs and enhance the navigation performance of future satellites. Autonomous navigation by stellar refraction is a type of autonomous celestial navigation method that uses high-accuracy star sensors instead of Earth sensors to provide information regarding Earth's horizon. In previous studies, the refraction apparent height has typically been used for such navigation. However, the apparent height cannot be measured directly by a star sensor and can only be calculated by the refraction angle and an atmospheric refraction model. Therefore, additional errors are introduced by the uncertainty and nonlinearity of atmospheric refraction models, which result in reduced navigation accuracy and reliability. A new navigation method based on the direct measurement of the refraction angle is proposed to solve this problem. Techniques for the determination of the refraction angle are introduced, and a measurement model for the refraction angle is established. The method is tested and validated by simulations. When the starlight refraction height ranges from 20 to 50 km, a positioning accuracy of better than 100 m can be achieved for a low-Earth-orbit (LEO) satellite using the refraction angle, while the positioning accuracy of the traditional method using the apparent height is worse than 500 m under the same conditions. Furthermore, an analysis of the factors that affect navigation accuracy, including the measurement accuracy of the refraction angle, the number of visible refracted stars per orbit and the installation azimuth of the star sensor, is presented. This method is highly recommended for small satellites in particular, as no additional hardware besides two star sensors is required.
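The link between refraction angle and tangent height can be illustrated with a deliberately simplified exponential-atmosphere model. The constants below are illustrative orders of magnitude only, not a calibrated atmospheric refraction model of the kind the paper seeks to avoid depending on.

```python
import math

# Simplified exponential-atmosphere stellar refraction model:
# R(h) = R0 * exp(-(h - H0) / H_SCALE). All constants are assumptions
# for illustration, not fitted atmospheric parameters.

R0 = 1400.0e-6   # refraction angle (rad) at the reference height
H0 = 25.0        # reference tangent height, km
H_SCALE = 6.7    # density scale height, km

def refraction_angle(h_km):
    """Refraction angle of a star ray grazing tangent height h_km."""
    return R0 * math.exp(-(h_km - H0) / H_SCALE)

def tangent_height(R_rad):
    """Analytic inverse of the exponential model."""
    return H0 - H_SCALE * math.log(R_rad / R0)
```

The sensitivity visible here, with the angle decaying by a factor of e every ~6.7 km, is why refraction measurements in the 20-50 km band carry useful horizon information.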
Biomimetic MEMS sensor array for navigation and water detection
NASA Astrophysics Data System (ADS)
Futterknecht, Oliver; Macqueen, Mark O.; Karman, Salmah; Diah, S. Zaleha M.; Gebeshuber, Ille C.
2013-05-01
The focus of this study is biomimetic concept development for a MEMS sensor array for navigation and water detection. The MEMS sensor array is inspired by abstractions of the respective biological functions: polarized skylight-based navigation sensors in honeybees (Apis mellifera) and the ability of African elephants (Loxodonta africana) to detect water. The focus lies on how to navigate to and how to detect water sources in desert-like or remote areas. The goal is to develop a sensor that can provide both navigation cues and help in detecting nearby water sources. We basically use the information provided by the natural polarization pattern produced by the sunbeams scattered within the atmosphere combined with the capability of the honeybee's compound eye to extrapolate the navigation information. The detection device uses light-beam-reactive MEMS capable of detecting the skylight polarization based on the Rayleigh sky model. For water detection we present various possible approaches to realize the sensor. In the first approach, polarization is used: moisture-saturated areas near the ground have a small but distinctly different effect on scattering and polarizing light than less moist ones. Modified skylight polarization sensors (Karman, Diah and Gebeshuber, 2012) are used to visualize this small change in scattering. The second approach is inspired by the ability of elephants to detect infrasound produced by underground water reservoirs, and shall be used to determine the location of underground rivers and visualize their exact routes.
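The Rayleigh sky model referenced above gives the degree of linear polarization as a function of the scattering angle between the view direction and the Sun. A minimal sketch, with an illustrative maximum-polarization factor accounting for atmospheric depolarization:

```python
import math

def degree_of_polarization(gamma_rad, d_max=0.8):
    """Rayleigh sky model: degree of linear polarization versus the
    scattering angle gamma between view direction and Sun.
    d_max < 1 is an assumed depolarization factor (real skies never
    reach full polarization)."""
    s, c = math.sin(gamma_rad), math.cos(gamma_rad)
    return d_max * s * s / (1.0 + c * c)
```

Polarization peaks 90 degrees from the Sun and vanishes toward and away from it, which is the pattern a compound-eye-style sensor array can exploit as a compass.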
A Survey of LIDAR Technology and Its Use in Spacecraft Relative Navigation
NASA Technical Reports Server (NTRS)
Christian, John A.; Cryan, Scott P.
2013-01-01
This paper provides a survey of modern LIght Detection And Ranging (LIDAR) sensors from a perspective of how they can be used for spacecraft relative navigation. In addition to LIDAR technology commonly used in space applications today (e.g. scanning, flash), this paper reviews emerging LIDAR technologies gaining traction in other non-aerospace fields. The discussion will include an overview of sensor operating principles and specific pros/cons for each type of LIDAR. This paper provides a comprehensive review of LIDAR technology as applied specifically to spacecraft relative navigation. The problem of orbital rendezvous and docking has been a consistent challenge for complex space missions since before the Gemini 8 spacecraft performed the first successful on-orbit docking of two spacecraft in 1966. Over the years, a great deal of effort has been devoted to advancing technology associated with all aspects of the rendezvous, proximity operations, and docking (RPOD) flight phase. After years of perfecting the art of crewed rendezvous with the Gemini, Apollo, and Space Shuttle programs, NASA began investigating the problem of autonomous rendezvous and docking (AR&D) to support a host of different mission applications. Some of these applications include autonomous resupply of the International Space Station (ISS), robotic servicing/refueling of existing orbital assets, and on-orbit assembly [1]. The push towards a robust AR&D capability has led to an intensified interest in a number of different sensors capable of providing insight into the relative state of two spacecraft. The present work focuses on exploring the state-of-the-art in one of these sensors - LIght Detection And Ranging (LIDAR) sensors. It should be noted that the military community frequently uses the acronym LADAR (LAser Detection And Ranging) to refer to what this paper calls LIDARs.
A LIDAR is an active remote sensing device that is typically used in space applications to obtain the range to one or more points on a target spacecraft. As the name suggests, LIDAR sensors use light (typically a laser) to illuminate the target and measure the time it takes for the emitted signal to return to the sensor. Because the light must travel from the source, to
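The round-trip timing principle described above reduces to a one-line range equation. The sketch below is illustrative only; the function name and pulse timing are assumptions, not parameters of any flight LIDAR.

```python
# Sketch of the time-of-flight ranging principle described above.
# The timing value is illustrative, not from any flight sensor.

C = 299_792_458.0  # speed of light, m/s

def range_from_time_of_flight(round_trip_time_s: float) -> float:
    """Range to the target; division by two accounts for the light
    covering the sensor-to-target distance twice (out and back)."""
    return C * round_trip_time_s / 2.0

# A return pulse arriving 667 ns after emission corresponds to ~100 m.
print(range_from_time_of_flight(667e-9))  # ≈ 100.0 m
```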
Application of aircraft navigation sensors to enhanced vision systems
NASA Technical Reports Server (NTRS)
Sweet, Barbara T.
1993-01-01
In this presentation, the applicability of various aircraft navigation sensors to enhanced vision system design is discussed. First, the accuracy requirements of the FAA for precision landing systems are presented, followed by the current navigation systems and their characteristics. These systems include Instrument Landing System (ILS), Microwave Landing System (MLS), Inertial Navigation, Altimetry, and Global Positioning System (GPS). Finally, the use of navigation system data to improve enhanced vision systems is discussed. These applications include radar image rectification, motion compensation, and image registration.
Integrated communications and optical navigation system
NASA Astrophysics Data System (ADS)
Mueller, J.; Pajer, G.; Paluszek, M.
2013-12-01
The Integrated Communications and Optical Navigation System (ICONS) is a flexible navigation system for spacecraft that does not require global positioning system (GPS) measurements. The navigation solution is computed using an Unscented Kalman Filter (UKF) that can accept any combination of range, range-rate, planet chord width, landmark, and angle measurements using any celestial object. Both absolute and relative orbit determination are supported. The UKF employs a full nonlinear dynamical model of the orbit including gravity models and disturbance models. The ICONS package also includes attitude determination algorithms using the UKF algorithm with the Inertial Measurement Unit (IMU). The IMU is used as the dynamical base for the attitude determination algorithms. This makes the sensor a more capable plug-in replacement for a star tracker, thus reducing the integration and test cost of adding this sensor to a spacecraft. Recent additions include an integrated optical communications system which adds communications, and integrated range and range rate measurement and timing. The paper includes test results from trajectories based on the NASA New Horizons spacecraft.
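The Unscented Kalman Filter mentioned in this abstract propagates a set of deterministically chosen sigma points through the nonlinear dynamics instead of linearizing them. A minimal sketch of the standard scaled sigma-point generation follows; it is a generic UKF ingredient, not the ICONS implementation, and the tuning parameters are conventional defaults.

```python
import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """Generate the 2n+1 scaled sigma points used by a UKF.
    x: state mean (n,), P: state covariance (n, n). Illustrative only."""
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)  # matrix square root of scaled P
    # Mean point plus symmetric +/- excursions along each column of S.
    pts = [x] + [x + S[:, i] for i in range(n)] + [x - S[:, i] for i in range(n)]
    return np.array(pts)

pts = sigma_points(np.zeros(2), np.eye(2))
print(pts.shape)  # (5, 2)
```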
Vetrella, Amedeo Rodi; Fasano, Giancarmine; Accardo, Domenico; Moccia, Antonio
2016-12-17
Autonomous navigation of micro-UAVs is typically based on the integration of low cost Global Navigation Satellite System (GNSS) receivers and Micro-Electro-Mechanical Systems (MEMS)-based inertial and magnetic sensors to stabilize and control the flight. The resulting navigation performance in terms of position and attitude accuracy may not suffice for other mission needs, such as the ones relevant to fine sensor pointing. In this framework, this paper presents a cooperative UAV navigation algorithm that allows a chief vehicle, equipped with inertial and magnetic sensors, a Global Positioning System (GPS) receiver, and a vision system, to improve its navigation performance (in real time or in the post processing phase) exploiting formation flying deputy vehicles equipped with GPS receivers. The focus is set on outdoor environments and the key concept is to exploit differential GPS among vehicles and vision-based tracking (DGPS/Vision) to build a virtual additional navigation sensor whose information is then integrated in a sensor fusion algorithm based on an Extended Kalman Filter. The developed concept and processing architecture are described, with a focus on the DGPS/Vision attitude determination algorithm. Performance assessment is carried out on the basis of both numerical simulations and flight tests. In the latter, navigation estimates derived from the DGPS/Vision approach are compared with those provided by the onboard autopilot system of a customized quadrotor. The analysis shows the potential of the developed approach, mainly deriving from the possibility to exploit magnetic- and inertial-independent accurate attitude information.
3D Reconfigurable MPSoC for Unmanned Spacecraft Navigation
NASA Astrophysics Data System (ADS)
Dekoulis, George
2016-07-01
This paper describes the design of a new lightweight spacecraft navigation system for unmanned space missions. The system addresses the demands for more efficient autonomous navigation in the near-Earth environment or deep space. The proposed instrumentation is directly suitable for unmanned systems operation and testing of new airborne prototypes for remote sensing applications. The system features a new sensor technology and significant improvements over existing solutions. Fluxgate type sensors have been traditionally used in unmanned defense systems such as target drones, guided missiles, rockets and satellites; however, existing guidance sensor configurations exhibit lower specifications than the presented solution. The current implementation is based on a recently developed material in a reengineered optimum sensor configuration for unprecedented low-power consumption. The new sensor's performance characteristics qualify it for spacecraft navigation applications. A major advantage of the system is the efficiency in redundancy reduction achieved in terms of both hardware and software requirements.
Performance Evaluation and Requirements Assessment for Gravity Gradient Referenced Navigation
Lee, Jisun; Kwon, Jay Hyoun; Yu, Myeongjong
2015-01-01
In this study, simulation tests for gravity gradient referenced navigation (GGRN) are conducted to verify the effects of various factors such as database (DB) and sensor errors, flight altitude, DB resolution, initial errors, and measurement update rates on the navigation performance. Based on the simulation results, requirements for GGRN are established for position determination with certain target accuracies. It is found that DB and sensor errors and flight altitude have strong effects on the navigation performance. In particular, a DB and sensor with accuracies of 0.1 E and 0.01 E, respectively, are required to determine the position more accurately than or at a level similar to the navigation performance of terrain referenced navigation (TRN). In most cases, the horizontal position error of GGRN is less than 100 m. However, the navigation performance of GGRN is similar to or worse than that of a pure inertial navigation system when the DB and sensor errors are 3 E or 5 E each and the flight altitude is 3000 m. Considering that the accuracy of currently available gradiometers is about 3 E or 5 E, GGRN does not show much advantage over TRN at present. However, GGRN is expected to exhibit much better performance in the near future when accurate DBs and gravity gradiometers are available. PMID:26184212
Designing the STS-134 Re-Rendezvous: A Preparation for Future Crewed Rendezvous Missions
NASA Technical Reports Server (NTRS)
Stuit, Timothy D.
2011-01-01
In preparation to provide the capability for the Orion spacecraft, also known as the Multi-Purpose Crew Vehicle (MPCV), to rendezvous with the International Space Station (ISS) and future spacecraft, a new suite of relative navigation sensors is in development and was tested on one of the final Space Shuttle missions to ISS. The National Aeronautics and Space Administration (NASA) commissioned a flight test of prototypes of the Orion relative navigation sensors on STS-134, in order to test their performance in the space environment during the nominal rendezvous and docking, as well as a re-rendezvous dedicated to testing the prototype sensors following the undocking of the Space Shuttle orbiter at the end of the mission. Unlike the rendezvous and docking at the beginning of the mission, the re-rendezvous profile replicates the newly designed Orion coelliptic approach trajectory, something never before attempted with the shuttle orbiter. Therefore, there were a number of new parameters that needed to be conceived of, designed, and tested for this re-rendezvous to make the flight test successful. Additionally, all of this work had to be integrated with the normal operations of the ISS and shuttle and had to conform to the constraints of the mission and vehicles. The result of this work is a separation and re-rendezvous trajectory design that would not only prove the design of the relative navigation sensors for the Orion vehicle, but also would serve as a proof of concept for the Orion rendezvous trajectory itself. This document presents the analysis and decision making process involved in attaining the final STS-134 re-rendezvous design.
Use of Earth's magnetic field for mitigating gyroscope errors regardless of magnetic perturbation.
Afzal, Muhammad Haris; Renaudin, Valérie; Lachapelle, Gérard
2011-01-01
Most portable systems like smart-phones are equipped with low cost consumer grade sensors, making them useful as Pedestrian Navigation Systems (PNS). Measurements of these sensors are severely contaminated by errors caused by instrumentation and environmental issues, rendering the unaided navigation solution with these sensors of limited use. The overall navigation error budget associated with pedestrian navigation can be categorized into position/displacement errors and attitude/orientation errors. Most of the research is conducted for tackling and reducing the displacement errors, which either utilize Pedestrian Dead Reckoning (PDR) or special constraints like Zero velocity UPdaTes (ZUPT) and Zero Angular Rate Updates (ZARU). This article targets the orientation/attitude errors encountered in pedestrian navigation and develops a novel sensor fusion technique to utilize the Earth's magnetic field, even perturbed, for attitude and rate gyroscope error estimation in pedestrian navigation environments where it is assumed that Global Navigation Satellite System (GNSS) navigation is denied. As the Earth's magnetic field undergoes severe degradations in pedestrian navigation environments, a novel Quasi-Static magnetic Field (QSF) based attitude and angular rate error estimation technique is developed to effectively use magnetic measurements in highly perturbed environments. The QSF scheme is then used for generating the desired measurements for the proposed Extended Kalman Filter (EKF) based attitude estimator. Results indicate that the QSF measurements are capable of effectively estimating attitude and gyroscope errors, reducing the overall navigation error budget by over 80% in urban canyon environment.
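The quasi-static field idea described above can be illustrated with a simple detector that accepts magnetometer samples only where the field magnitude is locally stable. The window length, threshold, and sample values below are hypothetical, not the parameters used by the authors.

```python
import numpy as np

def quasi_static_mask(mag_samples, window=10, tol=0.5):
    """Flag samples where the magnetic field magnitude is locally stable.
    mag_samples: (N, 3) magnetometer readings in uT. Thresholds illustrative."""
    norms = np.linalg.norm(mag_samples, axis=1)
    mask = np.zeros(len(norms), dtype=bool)
    for i in range(len(norms) - window + 1):
        w = norms[i:i + window]
        if w.std() < tol:          # field steady -> usable for attitude update
            mask[i:i + window] = True
    return mask

steady = np.tile([20.0, 0.0, 45.0], (30, 1))                      # stable field
perturbed = steady + np.random.default_rng(0).normal(0, 5.0, (30, 3))
print(quasi_static_mask(steady).all())  # True
print(quasi_static_mask(perturbed).all())
```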
GPS free navigation inspired by insects through monocular camera and inertial sensors
NASA Astrophysics Data System (ADS)
Liu, Yi; Liu, J. G.; Cao, H.; Huang, Y.
2015-12-01
Navigation without GPS or other knowledge of the environment has been studied for many decades. Advances in technology have made sensors compact and subtle enough to be easily integrated into micro and hand-held devices. Recently, researchers found that bees and fruit flies have an effective and efficient navigation mechanism based on optical flow information, processed with only their miniature brains. We present a navigation system inspired by the study of insects, using a calibrated camera and other inertial sensors. The system utilizes SLAM theory and can operate in many GPS-denied environments. Simulation and experimental results are presented for validation and quantification.
NASA Astrophysics Data System (ADS)
Beaudoin, Yanick; Desbiens, André; Gagnon, Eric; Landry, René
2018-01-01
The navigation system of a satellite launcher is of paramount importance. In order to correct the trajectory of the launcher, the position, velocity and attitude must be known with the best possible precision. In this paper, the observability of four navigation solutions is investigated. The first one is the INS/GPS couple. Then, attitude reference sensors, such as magnetometers, are added to the INS/GPS solution. The authors have already demonstrated that the reference trajectory could be used to improve the navigation performance. This approach is added to the two previously mentioned navigation systems. For each navigation solution, the observability is analyzed with different sensor error models. First, sensor biases are neglected. Then, sensor biases are modelled as random walks and as first order Markov processes. The observability is tested with the rank and condition number of the observability matrix, the time evolution of the covariance matrix and sensitivity to measurement outlier tests. The covariance matrix is exploited to evaluate the correlation between states in order to detect structural unobservability problems. Finally, when an unobservable subspace is detected, the result is verified with theoretical analysis of the navigation equations. The results show that evaluating only the observability of a model does not guarantee the ability of the aiding sensors to correct the INS estimates within the mission time. The analysis of the covariance matrix time evolution could be a powerful tool to detect this situation, however in some cases, the problem is only revealed with a sensitivity to measurement outlier test. None of the tested solutions provide GPS position bias observability. For the considered mission, the modelling of the sensor biases as random walks or Markov processes gives equivalent results. Relying on the reference trajectory can improve the precision of the roll estimates. But, in the context of a satellite launcher, the roll estimation error and gyroscope bias are only observable if attitude reference sensors are present.
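The rank test of the observability matrix referred to in this abstract can be demonstrated on a toy linear system. The sketch below is a generic textbook construction, not the launcher model analyzed by the authors.

```python
import numpy as np

def observability_matrix(A, C):
    """Stack [C; CA; ...; CA^(n-1)] for the linear system x' = Ax, y = Cx.
    Full rank (= n) means the state is observable from the measurements."""
    n = A.shape[0]
    blocks = [C]
    for _ in range(n - 1):
        blocks.append(blocks[-1] @ A)
    return np.vstack(blocks)

# Toy 2-state example: measuring position makes velocity observable too.
A = np.array([[0.0, 1.0], [0.0, 0.0]])   # position/velocity kinematics
C = np.array([[1.0, 0.0]])               # position-only measurement
O = observability_matrix(A, C)
print(np.linalg.matrix_rank(O))  # 2 -> fully observable
```

Measuring only velocity instead (C = [0, 1]) drops the rank to 1, the kind of structural unobservability the covariance and rank analyses above are meant to expose.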
The Sensor Test for Orion RelNav Risk Mitigation (STORRM) Development Test Objective
NASA Technical Reports Server (NTRS)
Christian, John A.; Hinkel, Heather; D'Souza, Christopher N.; Maguire, Sean; Patangan, Mogi
2011-01-01
The Sensor Test for Orion Relative-Navigation Risk Mitigation (STORRM) Development Test Objective (DTO) flew aboard the Space Shuttle Endeavour on STS-134 in May-June 2011, and was designed to characterize the performance of the flash LIDAR and docking camera being developed for the Orion Multi-Purpose Crew Vehicle. The flash LIDAR, called the Vision Navigation Sensor (VNS), will be the primary navigation instrument used by the Orion vehicle during rendezvous, proximity operations, and docking. The docking camera (DC) will be used by the Orion crew for piloting cues during docking. This paper provides an overview of the STORRM test objectives and the concept of operations. It continues with a description of STORRM's major hardware components, which include the VNS, docking camera, and supporting avionics. Next, an overview of crew and analyst training activities will describe how the STORRM team prepared for flight. Then an overview of in-flight data collection and analysis is presented. Key findings and results from this project are summarized. Finally, the paper concludes with lessons learned from the STORRM DTO.
Quantitative knowledge acquisition for expert systems
NASA Technical Reports Server (NTRS)
Belkin, Brenda L.; Stengel, Robert F.
1991-01-01
A common problem in the design of expert systems is the definition of rules from data obtained in system operation or simulation. While it is relatively easy to collect data and to log the comments of human operators engaged in experiments, generalizing such information to a set of rules has not previously been a direct task. A statistical method is presented for generating rule bases from numerical data, motivated by an example based on aircraft navigation with multiple sensors. The specific objective is to design an expert system that selects a satisfactory suite of measurements from a dissimilar, redundant set, given an arbitrary navigation geometry and possible sensor failures. The systematic development of a Navigation Sensor Management (NSM) Expert System from Kalman Filter covariance data is described. The method invokes two statistical techniques: Analysis of Variance (ANOVA) and the ID3 Algorithm. The ANOVA technique indicates whether variations of problem parameters give statistically different covariance results, and the ID3 algorithm identifies the relationships between the problem parameters using probabilistic knowledge extracted from a simulation example set. Both are detailed.
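The ID3 algorithm mentioned above splits on the attribute with the highest information gain. A minimal sketch of that criterion follows; the navigation-flavored attribute names and labels are invented for illustration and are not from the NSM example set.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

def information_gain(examples, attr, label="label"):
    """ID3 split criterion: entropy reduction from splitting on attr."""
    base = entropy([e[label] for e in examples])
    remainder = 0.0
    for v in {e[attr] for e in examples}:
        subset = [e[label] for e in examples if e[attr] == v]
        remainder += len(subset) / len(examples) * entropy(subset)
    return base - remainder

# Hypothetical examples: accept/reject a sensor suite by navigation geometry.
data = [
    {"geometry": "good", "failure": "none", "label": "accept"},
    {"geometry": "good", "failure": "gps",  "label": "accept"},
    {"geometry": "poor", "failure": "none", "label": "reject"},
    {"geometry": "poor", "failure": "gps",  "label": "reject"},
]
print(information_gain(data, "geometry"))  # 1.0: geometry fully splits the labels
```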
A Dynamic Precision Evaluation Method for the Star Sensor in the Stellar-Inertial Navigation System.
Lu, Jiazhen; Lei, Chaohua; Yang, Yanqiang
2017-06-28
Integrating the advantages of INS (inertial navigation system) and the star sensor, the stellar-inertial navigation system has been used for a wide variety of applications. The star sensor is a high-precision attitude measurement instrument; therefore, determining how to validate its accuracy is critical in guaranteeing its practical precision. The dynamic precision evaluation of the star sensor is more difficult than a static precision evaluation because of dynamic reference values and other impacts. This paper proposes a dynamic precision verification method for the star sensor, with the aid of an inertial navigation device, to realize real-time attitude accuracy measurement. Based on the gold-standard reference generated by the star simulator, the altitude and azimuth angle errors of the star sensor are calculated as evaluation criteria. With the goal of diminishing the impacts of factors such as sensor drift and device effects, the innovative aspect of this method is to employ static accuracy for comparison. If the dynamic results are as good as the static results, which have accuracy comparable to the single star sensor's precision, the practical precision of the star sensor is sufficiently high to meet the requirements of the system specification. The experiments demonstrate the feasibility and effectiveness of the proposed method.
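The altitude and azimuth angle errors used above as evaluation criteria can be illustrated by comparing a measured boresight direction against a reference. The axis conventions and offset in this sketch are assumptions, not the paper's definitions.

```python
import math

def alt_az(v):
    """Altitude and azimuth (rad) of a unit pointing vector v = (x, y, z),
    with z up and azimuth measured from the x axis. Conventions illustrative."""
    x, y, z = v
    return math.asin(z), math.atan2(y, x)

def alt_az_errors(measured, reference):
    """Angle errors of a measured direction relative to a reference."""
    alt_m, az_m = alt_az(measured)
    alt_r, az_r = alt_az(reference)
    return alt_m - alt_r, az_m - az_r

ref = (1.0, 0.0, 0.0)
meas = (math.cos(1e-4), math.sin(1e-4), 0.0)   # 100 urad azimuth offset
d_alt, d_az = alt_az_errors(meas, ref)
print(round(d_az * 1e6))  # 100 (microradians)
```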
Learning for Autonomous Navigation
NASA Technical Reports Server (NTRS)
Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric
2005-01-01
Robotic ground vehicles for outdoor applications have achieved some remarkable successes, notably in autonomous highway following (Dickmanns, 1987), planetary exploration (1), and off-road navigation on Earth (1). Nevertheless, major challenges remain to enable reliable, high-speed, autonomous navigation in a wide variety of complex, off-road terrain. 3-D perception of terrain geometry with imaging range sensors is the mainstay of off-road driving systems. However, the stopping distance at high speed exceeds the effective lookahead distance of existing range sensors. Prospects for extending the range of 3-D sensors are strongly limited by sensor physics, eye safety of lasers, and related issues. Range sensor limitations also allow vehicles to enter large cul-de-sacs even at low speed, leading to long detours. Moreover, sensing only terrain geometry fails to reveal mechanical properties of terrain that are critical to assessing its traversability, such as potential for slippage, sinkage, and the degree of compliance of potential obstacles. Rovers in the Mars Exploration Rover (MER) mission have become stuck in sand dunes and experienced significant downhill slippage in the vicinity of large rock hazards. Earth-based off-road robots today have very limited ability to discriminate traversable vegetation from non-traversable vegetation or rough ground. It is impossible today to preprogram a system with knowledge of these properties for all types of terrain and weather conditions that might be encountered.
Concept of AHRS Algorithm Designed for Platform Independent Imu Attitude Alignment
NASA Astrophysics Data System (ADS)
Tomaszewski, Dariusz; Rapiński, Jacek; Pelc-Mieczkowska, Renata
2017-12-01
Nowadays, along with the advancement of technology, one can notice the rapid development of various types of navigation systems. Satellite navigation, so far the most popular, is now supported by positioning results calculated with the use of other measurement systems. The method and manner of integration depend directly on the destination of the system being developed. To increase the frequency of readings and improve the operation of outdoor navigation systems, satellite navigation systems (GPS, GLONASS, etc.) are supported with inertial navigation. Such a method of navigation consists of several steps. The first stage is the determination of the initial orientation of the inertial measurement unit, called INS alignment. During this process, on the basis of acceleration and angular velocity readings, values of the Euler angles (pitch, roll, yaw) are calculated, allowing for unambiguous orientation of the sensor coordinate system relative to an external coordinate system. The following study presents the concept of an AHRS (Attitude and Heading Reference System) algorithm for determining the Euler angles. The study was conducted with the use of readings from low-cost MEMS cell phone sensors. Subsequently, the results of the study were analyzed to determine the accuracy of the featured algorithm. On the basis of the performed experiments, the validity of the developed algorithm was confirmed.
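The INS alignment step described above, recovering pitch and roll from static accelerometer readings and yaw from tilt-compensated magnetometer readings, can be sketched as follows. The axis convention (x forward, y right, z down) and the sensor values are assumptions, not the paper's algorithm.

```python
import math

def coarse_align(ax, ay, az, mx, my, mz):
    """Initial Euler angles from static accelerometer and magnetometer readings.
    Axis convention (x forward, y right, z down) is an assumption."""
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    # Tilt-compensate the magnetometer, then take the heading.
    mxh = (mx * math.cos(pitch)
           + my * math.sin(roll) * math.sin(pitch)
           + mz * math.cos(roll) * math.sin(pitch))
    myh = my * math.cos(roll) - mz * math.sin(roll)
    yaw = math.atan2(-myh, mxh)
    return roll, pitch, yaw

# Level, at rest, field pointing north-and-down: all angles ~0.
print(coarse_align(0.0, 0.0, 9.81, 20.0, 0.0, 45.0))
```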
Development of Navigation Doppler Lidar for Future Landing Mission
NASA Technical Reports Server (NTRS)
Amzajerdian, Farzin; Hines, Glenn D.; Petway, Larry B.; Barnes, Bruce W.; Pierrottet, Diego F.; Carson, John M., III
2016-01-01
A coherent Navigation Doppler Lidar (NDL) sensor has been developed under the Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) project to support future NASA missions to planetary bodies. This lidar sensor provides accurate surface-relative altitude and vector velocity data during the descent phase that can be used by an autonomous Guidance, Navigation, and Control (GN&C) system to precisely navigate the vehicle from a few kilometers above the ground to a designated location and execute a controlled soft touchdown. The operation and performance of the NDL was demonstrated through closed-loop flights onboard the rocket-propelled Morpheus vehicle in 2014. In Morpheus flights, conducted at the NASA Kennedy Space Center, the NDL data was used by an autonomous GN&C system to navigate and land the vehicle precisely at the selected location surrounded by hazardous rocks and craters. Since then, development efforts for the NDL have shifted toward enhancing performance, optimizing design, and addressing spaceflight size and mass constraints and environmental and reliability requirements. The next generation NDL, with expanded operational envelope and significantly reduced size, will be demonstrated in 2017 through a new flight test campaign onboard a commercial rocket-propelled test vehicle.
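A coherent Doppler lidar such as the NDL infers line-of-sight velocity from the frequency shift of the return signal. The relation can be sketched in a few lines; the wavelength and shift values below are illustrative, not NDL hardware parameters.

```python
# Line-of-sight velocity from the Doppler shift of a coherent lidar return.
# Wavelength and shift values are illustrative, not NDL hardware parameters.

def los_velocity(doppler_shift_hz: float, wavelength_m: float) -> float:
    """v = lambda * f_d / 2; the factor of two reflects the round trip
    to the moving surface and back."""
    return wavelength_m * doppler_shift_hz / 2.0

# A 1.29 MHz shift at a 1.55 um wavelength corresponds to ~1 m/s along the beam.
print(los_velocity(1.29e6, 1.55e-6))  # ≈ 1.0 m/s
```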
Garg, Satish K; Smith, James; Beatson, Christie; Lopez-Baca, Benita; Voelmle, Mary; Gottlieb, Peter A
2009-02-01
This study evaluated the accuracy and safety of two continuous glucose monitoring (CGM) systems, the SEVEN (DexCom, San Diego, CA) and the Navigator (Abbott Diabetes Care, Alameda, CA), with the YSI laboratory measurements of blood glucose (blood glucose meter manufactured by YSI, Yellow Springs, OH), when worn concurrently in adults with type 1 diabetes. Fourteen subjects with type 1 diabetes, 33 +/- 6 (mean +/- SD) years old, were enrolled in this study. All subjects wore both sensors concurrently over three consecutive 5-day CGM sessions (15-day wear). On Days 5, 10, and 15, subjects participated in an 8-h in-clinic session where measurements from the CGM systems were collected and compared with YSI measurements every 15 min. At the end of Day 5 and 10 in-clinic sessions, the sensors were removed, and new sensors were inserted for the following CGM session despite the SEVEN system's recommended use for up to 7 days. The mean absolute relative difference (ARD) for the two CGM devices versus YSI was not different: 16.8% and 16.1% for SEVEN and Navigator, respectively (P = 0.38). In the hypoglycemic region (YSI value <80 mg/dL), the mean ARD for SEVEN was lower than for Navigator (21.5% vs. 29.8%, respectively; P = 0.001). The data analyses were similar when compared with self-monitoring of blood glucose (SMBG) values. Thirteen additional Navigator replacement devices were issued, compared to two for the SEVEN. A total of three skin reactions were reported at the SEVEN insertion area versus 14 at the Navigator insertion area. Glucose measurements with the SEVEN and Navigator were found to be similar compared with YSI and SMBG measurements, with the exception of the hypoglycemic range, where the SEVEN performed better. However, the Navigator caused more skin area reactions.
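The mean absolute relative difference (ARD) statistic reported in this study can be computed as below. The glucose pairs are invented for illustration, not study data.

```python
def mean_ard(cgm_values, reference_values):
    """Mean absolute relative difference (%) of CGM readings vs. reference.
    Each pair contributes |cgm - ref| / ref, averaged and expressed in percent."""
    ards = [abs(c - r) / r * 100.0 for c, r in zip(cgm_values, reference_values)]
    return sum(ards) / len(ards)

# Hypothetical mg/dL pairs: each reading is off by 10% of the YSI reference.
cgm = [110.0, 72.0, 180.0]
ysi = [100.0, 80.0, 200.0]
print(round(mean_ard(cgm, ysi), 1))  # 10.0
```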
2009-09-01
This document addresses Hazard Relative Navigation (HRN) and Hazard Detection and Avoidance (HDA), in addition to the Terrain Relative Navigation (TRN) and HDA sensors used during these phases. During the HDA phase, the expected landing site is examined and evaluated, and a new site may be selected.
Progress in Insect-Inspired Optical Navigation Sensors
NASA Technical Reports Server (NTRS)
Thakoor, Sarita; Chahl, Javaan; Zometzer, Steve
2005-01-01
Progress has been made in continuing efforts to develop optical flight-control and navigation sensors for miniature robotic aircraft. The designs of these sensors are inspired by the designs and functions of the vision systems and brains of insects. Two types of sensors of particular interest are polarization compasses and ocellar horizon sensors. The basic principle of polarization compasses was described (but without using the term "polarization compass") in "Insect-Inspired Flight Control for Small Flying Robots" (NPO-30545), NASA Tech Briefs, Vol. 29, No. 1 (January 2005), page 61. To recapitulate: Bees use sky polarization patterns in ultraviolet (UV) light, caused by Rayleigh scattering of sunlight by atmospheric gas molecules, as direction references relative to the apparent position of the Sun. A robotic direction-finding technique based on this concept would be more robust in comparison with a technique based on the direction to the visible Sun because the UV polarization pattern is distributed across the entire sky and, hence, is redundant and can be extrapolated from a small region of clear sky in an elsewhere cloudy sky that hides the Sun.
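A polarization compass estimates the e-vector orientation of scattered skylight, which for linear polarization follows from the Stokes parameters Q and U measured in a patch of sky. A minimal sketch of that relation, with illustrative values:

```python
import math

def e_vector_orientation(q: float, u: float) -> float:
    """Angle of polarization (rad) from Stokes parameters Q and U.
    In a sky polarization compass, the e-vector orientation measured in a
    patch of clear sky constrains the solar azimuth. Values illustrative."""
    return 0.5 * math.atan2(u, q)

# Pure U polarization -> e-vector at 45 degrees.
print(round(math.degrees(e_vector_orientation(0.0, 1.0)), 6))  # 45.0
```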
Navigation system for a mobile robot with a visual sensor using a fish-eye lens
NASA Astrophysics Data System (ADS)
Kurata, Junichi; Grattan, Kenneth T. V.; Uchiyama, Hironobu
1998-02-01
Various position sensing and navigation systems have been proposed for the autonomous control of mobile robots. Some of these systems have been installed with an omnidirectional visual sensor system that proved very useful in obtaining information on the environment around the mobile robot for position reckoning. In this article, this type of navigation system is discussed. The sensor is composed of one TV camera with a fish-eye lens, using a reference target on a ceiling and hybrid image processing circuits. The position of the robot, with respect to the floor, is calculated by integrating the information obtained from a visual sensor and a gyroscope mounted in the mobile robot, and the use of a simple algorithm based on PTP control for guidance is discussed. An experimental trial showed that the proposed system was both valid and useful for the navigation of an indoor vehicle.
Analysis of navigation and guidance requirements for commercial VTOL operations
NASA Technical Reports Server (NTRS)
Hoffman, W. C.; Zvara, J.; Hollister, W. M.
1975-01-01
The paper presents some results of a program undertaken to define navigation and guidance requirements for commercial VTOL operations in the takeoff, cruise, terminal and landing phases of flight in weather conditions up to and including Category III. Quantitative navigation requirements are given for the parameters range, coverage, operation near obstacles, horizontal accuracy, multiple landing aircraft, multiple pad requirements, inertial/radio-inertial requirements, reliability/redundancy, update rate, and data link requirements in all flight phases. A multi-configuration straw-man navigation and guidance system for commercial VTOL operations is presented. Operation of the system is keyed to a fully automatic approach for navigation, guidance and control, with pilot as monitor-manager. The system is a hybrid navigator using a relatively low-cost inertial sensor with DME updates and MLS in the approach/departure phases.
Bio-Inspired Polarized Skylight-Based Navigation Sensors: A Review
Karman, Salmah B.; Diah, S. Zaleha M.; Gebeshuber, Ille C.
2012-01-01
Animal senses cover a broad range of signal types and signal bandwidths and have inspired various sensors and bioinstrumentation devices for biological and medical applications. Insects, such as desert ants and honeybees, for example, utilize polarized skylight pattern-based information in their navigation activities. They reliably return to their nests and hives from places many kilometers away. The insect navigation system involves the dorsal rim area in their compound eyes and the corresponding polarization sensitive neurons in the brain. The dorsal rim area is equipped with photoreceptors, which have orthogonally arranged small hair-like structures termed microvilli. These are the specialized sensors for the detection of polarized skylight patterns (e-vector orientation). Various research groups have been working on the development of novel navigation systems inspired by polarized skylight-based navigation in animals. Their major contributions are critically reviewed. One focus of current research activities is on imitating the integration path mechanism in desert ants. The potential for simple, high performance miniaturized bioinstrumentation that can assist people in navigation will be explored. PMID:23202158
Application of Vehicle Dynamic Modeling in Uavs for Precise Determination of Exterior Orientation
NASA Astrophysics Data System (ADS)
Khaghani, M.; Skaloud, J.
2016-06-01
Advances in unmanned aerial vehicle (UAV) and especially micro aerial vehicle (MAV) technology, together with the increasing quality and decreasing price of imaging devices, have resulted in the growing use of MAVs in photogrammetry. The practicality of MAV mapping is greatly enhanced by the ability to determine the parameters of exterior orientation (EO) with sufficient accuracy, in both the absolute and relative senses (change of attitude between successive images). While differential carrier-phase GNSS satisfies cm-level positioning accuracy, precise attitude determination is essential for both direct sensor orientation (DiSO) and integrated sensor orientation (ISO) in corridor mapping or in block-configuration imaging over surfaces with low texture. The limited cost, size, and weight of MAVs constrain the quality of onboard navigation sensors and put emphasis on exploiting the full capacity of available resources. Typically short flying times (10-30 minutes) also limit the possibility of estimating and/or correcting factors such as sensor misalignment and poor attitude initialization of the inertial navigation system (INS). This research aims at increasing the accuracy of attitude determination in both the absolute and relative senses with no extra sensors onboard. In comparison to the classical INS/GNSS setup, a novel approach to integrated state estimation is presented here, in which a vehicle dynamic model (VDM) is used as the main process model. Such a system benefits from the information available from the autopilot and the physical properties of the platform, consequently enhancing the determination of the trajectory and the parameters of exterior orientation. The navigation system employs a differential carrier-phase GNSS receiver and a micro electro-mechanical system (MEMS) grade inertial measurement unit (IMU), together with the MAV control input from the autopilot. Monte Carlo simulation has been performed on trajectories for typical corridor mapping and block imaging.
Results reveal considerable reduction in attitude errors with respect to conventional INS/GNSS system, in both absolute and relative senses. This eventually translates into higher redundancy and accuracy for photogrammetry applications.
Multi-sensor Navigation System Design
DOT National Transportation Integrated Search
1971-03-01
This report treats the design of navigation systems that collect data from two or more on-board measurement subsystems and process this data in an on-board computer. Such systems are called Multi-sensor Navigation Systems. The design begins with t...
Flight Test Result for the Ground-Based Radio Navigation System Sensor with an Unmanned Air Vehicle
Jang, Jaegyu; Ahn, Woo-Guen; Seo, Seungwoo; Lee, Jang Yong; Park, Jun-Pyo
2015-01-01
The Ground-based Radio Navigation System (GRNS) is an alternative/backup navigation system based on time synchronized pseudolites. It has been studied for some years due to the potential vulnerability issue of satellite navigation systems (e.g., GPS or Galileo). In the framework of our study, a periodic pulsed sequence was used instead of the randomized pulse sequence recommended as the RTCM (radio technical commission for maritime services) SC (special committee)-104 pseudolite signal, as a randomized pulse sequence with a long dwell time is not suitable for applications requiring high dynamics. This paper introduces a mathematical model of the post-correlation output in a navigation sensor, showing that the aliasing caused by the additional frequency term of a periodic pulsed signal leads to a false lock (i.e., Doppler frequency bias) during the signal acquisition process or in the carrier tracking loop of the navigation sensor. We suggest algorithms to resolve the frequency false lock issue in this paper, relying on the use of a multi-correlator. A flight test with an unmanned helicopter was conducted to verify the implemented navigation sensor. The results of this analysis show that there were no false locks during the flight test and that outliers stem from bad dilution of precision (DOP) or fluctuations in the received signal quality. PMID:26569251
SPARTAN: A High-Fidelity Simulation for Automated Rendezvous and Docking Applications
NASA Technical Reports Server (NTRS)
Turbe, Michael A.; McDuffie, James H.; DeKock, Brandon K.; Betts, Kevin M.; Carrington, Connie K.
2007-01-01
bd Systems (a subsidiary of SAIC) has developed the Simulation Package for Autonomous Rendezvous Test and ANalysis (SPARTAN), a high-fidelity on-orbit simulation featuring multiple six-degree-of-freedom (6DOF) vehicles. SPARTAN has been developed in a modular fashion in Matlab/Simulink to test next-generation automated rendezvous and docking guidance, navigation, and control algorithms for NASA's new Vision for Space Exploration. SPARTAN includes autonomous state-based mission manager algorithms responsible for sequencing the vehicle through various flight phases based on on-board sensor inputs and closed-loop guidance algorithms, including Lambert transfers, Clohessy-Wiltshire maneuvers, and glideslope approaches. The guidance commands are implemented using an integrated translation and attitude control system to provide 6DOF control of each vehicle in the simulation. SPARTAN also includes high-fidelity representations of a variety of absolute and relative navigation sensors that may be used for NASA missions, including radio frequency, lidar, and video-based rendezvous sensors. Proprietary navigation sensor fusion algorithms have been developed that allow the integration of these sensor measurements through an extended Kalman filter framework to create a single optimal estimate of the relative state of the vehicles. SPARTAN provides capability for Monte Carlo dispersion analysis, allowing for rigorous evaluation of the performance of the complete proposed AR&D system, including software, sensors, and mechanisms. SPARTAN also supports hardware-in-the-loop testing through conversion of the algorithms to C code using Real-Time Workshop in order to be hosted in a mission computer engineering development unit running an embedded real-time operating system. SPARTAN also contains both a runtime TCP/IP socket interface and post-processing compatibility with bdStudio, a visualization tool developed by bd Systems, allowing for intuitive evaluation of simulation results. 
A description of the SPARTAN architecture and capabilities is provided, along with details on the models and algorithms utilized and results from representative missions.
A LEO Satellite Navigation Algorithm Based on GPS and Magnetometer Data
NASA Technical Reports Server (NTRS)
Deutschmann, Julie; Bar-Itzhack, Itzhack; Harman, Rick; Bauer, Frank H. (Technical Monitor)
2000-01-01
The Global Positioning System (GPS) has become a standard method for low cost onboard satellite orbit determination. The use of a GPS receiver as an attitude and rate sensor has also been developed in the recent past. Additionally, focus has been given to attitude and orbit estimation using the magnetometer, a low cost, reliable sensor. Combining measurements from both GPS and a magnetometer can provide a robust navigation system that takes advantage of the estimation qualities of both measurements. Ultimately a low cost, accurate navigation system can result, potentially eliminating the need for more costly sensors, including gyroscopes.
INS/GNSS Integration for Aerobatic Flight Applications and Aircraft Motion Surveying.
V Hinüber, Edgar L; Reimer, Christian; Schneider, Tim; Stock, Michael
2017-04-26
This paper presents field tests of challenging flight applications obtained with a new family of lightweight, low-power INS/GNSS (inertial navigation system/global navigation satellite system) solutions based on MEMS (micro-electro-mechanical sensor) machined sensors, used for UAV (unmanned aerial vehicle) navigation and control as well as for aircraft motion dynamics analysis and trajectory surveying. One key is a powerful 42+ state extended-Kalman-filter-based data fusion, which also allows the estimation and correction of parameters that are typically affected by sensor aging, especially when applying MEMS-based inertial sensors, and which is not yet deeply considered in the literature. The paper presents the general system architecture, which allows iMAR Navigation to integrate all classes of inertial sensors and GNSS (global navigation satellite system) receivers, from very-low-cost MEMS and high-performance MEMS over FOG (fiber optic gyro) and RLG (ring laser gyro) up to HRG (hemispherical resonator gyro) technology, and presents detailed flight test results obtained under extreme flight conditions. As a real-world example, the aerobatic maneuvers of the World Champion 2016 (Red Bull Air Race) are presented. Short consideration is also given to surveying applications, where the ultimate performance of the same data fusion, applied to gravimetric surveying, is discussed.
Tawk, Youssef; Tomé, Phillip; Botteron, Cyril; Stebler, Yannick; Farine, Pierre-André
2014-01-01
The use of global navigation satellite system receivers for navigation still presents many challenges in urban canyon and indoor environments, where satellite availability is typically reduced and received signals are attenuated. To improve the navigation performance in such environments, several enhancement methods can be implemented. For instance, external aid provided through coupling with other sensors has proven to contribute substantially to enhancing navigation performance and robustness. Within this context, coupling a very simple GPS receiver with an Inertial Navigation System (INS) based on low-cost micro-electro-mechanical systems (MEMS) inertial sensors is considered in this paper. In particular, we propose a GPS/INS Tightly Coupled Assisted PLL (TCAPLL) architecture, and present most of the associated challenges that need to be addressed when dealing with very-low-performance MEMS inertial sensors. In addition, we propose a data monitoring system in charge of checking the quality of the measurement flow in the architecture. The implementation of the TCAPLL is discussed in detail, and its performance under different scenarios is assessed. Finally, the architecture is evaluated through a test campaign using a vehicle that is driven in urban environments, with the purpose of highlighting the pros and cons of combining MEMS inertial sensors with GPS over GPS alone. PMID:24569773
COBALT Flight Demonstrations Fuse Technologies
2017-06-07
This 5-minute, 50-second video shows how the CoOperative Blending of Autonomous Landing Technologies (COBALT) system pairs new landing sensor technologies that promise to yield the highest precision navigation solution ever tested for NASA space landing applications. The technologies included a navigation doppler lidar (NDL), which provides ultra-precise velocity and line-of-sight range measurements, and the Lander Vision System (LVS), which provides terrain-relative navigation. Through flight campaigns conducted in March and April 2017 aboard Masten Space Systems' Xodiac, a rocket-powered vertical takeoff, vertical landing (VTVL) platform, the COBALT system was flight tested to collect sensor performance data for NDL and LVS and to check the integration and communication between COBALT and the rocket. The flight tests provided excellent performance data for both sensors, as well as valuable information on the integrated performance with the rocket that will be used for subsequent COBALT modifications prior to follow-on flight tests. Based at NASA’s Armstrong Flight Research Center in Edwards, CA, the Flight Opportunities program funds technology development flight tests on commercial suborbital space providers of which Masten is a vendor. The program has previously tested the LVS on the Masten rocket and validated the technology for the Mars 2020 rover.
A Novel Online Data-Driven Algorithm for Detecting UAV Navigation Sensor Faults.
Sun, Rui; Cheng, Qi; Wang, Guanyu; Ochieng, Washington Yotto
2017-09-29
The use of Unmanned Aerial Vehicles (UAVs) has increased significantly in recent years. On-board integrated navigation sensors are a key component of UAVs' flight control systems and are essential for flight safety. In order to ensure flight safety, timely and effective navigation sensor fault detection capability is required. In this paper, a novel data-driven Adaptive Neuro-Fuzzy Inference System (ANFIS)-based approach is presented for the detection of on-board navigation sensor faults in UAVs. Contrary to the classic UAV sensor fault detection algorithms, based on predefined or modelled faults, the proposed algorithm combines an online data training mechanism with the ANFIS-based decision system. The main advantage of this algorithm is that it combines real-time, model-free residual analysis of Kalman Filter (KF) estimates with the ANFIS to build a reliable fault detection system. In addition, it allows fast and accurate detection of faults, which makes it suitable for real-time applications. Experimental results have demonstrated the effectiveness of the proposed fault detection method in terms of accuracy and misdetection rate.
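The residual-based idea in this abstract can be sketched in a few lines (a minimal illustration under assumed models: a one-state Kalman filter with a constant-state process, and a plain 3-sigma gate standing in for the paper's ANFIS decision system; the function name and tuning values are hypothetical):

```python
def kf_residual_monitor(measurements, q=1e-3, r=0.5, gate=3.0):
    """Minimal 1-state Kalman filter whose innovation (residual)
    sequence is monitored for sensor faults. A plain 3-sigma gate
    stands in here for the paper's ANFIS classifier.

    Returns (estimates, fault_flags)."""
    x, p = measurements[0], 1.0
    estimates, flags = [], []
    for z in measurements:
        p = p + q                       # predict (constant-state model)
        s = p + r                       # innovation covariance
        nu = z - x                      # innovation (residual)
        flags.append(abs(nu) > gate * s ** 0.5)
        if not flags[-1]:               # skip the update on a suspected fault
            k = p / s
            x = x + k * nu
            p = (1.0 - k) * p
        estimates.append(x)
    return estimates, flags
```

A spike in an otherwise steady measurement stream produces a large normalized residual and is flagged, while the state estimate coasts through the faulty sample.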
NASA Astrophysics Data System (ADS)
Rhodes, Andrew P.; Christian, John A.; Evans, Thomas
2017-12-01
With the availability and popularity of 3D sensors, it is advantageous to re-examine the use of point cloud descriptors for the purpose of pose estimation and spacecraft relative navigation. One popular descriptor is the oriented unique repeatable clustered viewpoint feature histogram (
All Source Sensor Integration Using an Extended Kalman Filter
2012-03-22
Navigation equations are developed for sensor-preprocessed measurements, and these navigation equations are not dependent upon the integrating filter.
An Improved Multi-Sensor Fusion Navigation Algorithm Based on the Factor Graph
Zeng, Qinghua; Chen, Weina; Liu, Jianye; Wang, Huizhe
2017-01-01
An integrated navigation system coupled with additional sensors can be used in Micro Unmanned Aerial Vehicle (MUAV) applications because the multi-sensor information is redundant and complementary, which can markedly improve the system accuracy. How to deal with the information gathered from different sensors efficiently is an important problem. The fact that different sensors provide measurements asynchronously may complicate the processing of these measurements. In addition, the output signals of some sensors appear to have a non-linear character. In order to incorporate these measurements and calculate a navigation solution in real time, a multi-sensor fusion algorithm based on the factor graph is proposed. The global optimum solution is factorized according to the chain structure of the factor graph, which allows for a more general form of the conditional probability density. This converts the fusion problem into connecting the factors defined by these measurements to the graph, without considering the relationship between the sensor update frequency and the fusion period. An experimental MUAV system has been built and some experiments have been performed to prove the effectiveness of the proposed method. PMID:28335570
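For a static state with Gaussian unary factors, the factor-graph formulation reduces to an inverse-variance weighted mean, which illustrates why asynchronous sensors pose no difficulty: each measurement simply attaches a factor at its own timestamp. This is a minimal sketch under those simplifying assumptions (the function and factor format are hypothetical; the paper's system uses a dynamic state and a full graph solver):

```python
def fuse_factors(factors):
    """MAP estimate of a static scalar state from unary Gaussian
    'factors'. Each factor is (timestamp, value, variance), added by
    any sensor whenever it happens to produce a measurement; sensors
    with different, asynchronous rates all constrain the same state
    without resampling. For Gaussian unary factors the MAP solution
    is the inverse-variance weighted mean."""
    num = sum(z / var for _, z, var in factors)
    den = sum(1.0 / var for _, z, var in factors)
    return num / den
```

Note that the timestamps never enter the static-state solution; they matter only once the state evolves between factors, which is where the factor graph's generality over a fixed-period filter shows.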
Precise relative navigation using augmented CDGPS
NASA Astrophysics Data System (ADS)
Park, Chan-Woo
2001-10-01
Autonomous formation flying of multiple vehicles is a revolutionary enabling technology for many future space and earth science missions that require distributed measurements, such as sparse aperture radars and stellar interferometry. The techniques developed for the space applications will also have a significant impact on many terrestrial formation flying missions. One of the key requirements of formation flying is accurate knowledge of the relative positions and velocities between the vehicles. Several researchers have shown that the GPS is a viable sensor to perform this relative navigation. However, there are several limitations in the use of GPS because it requires adequate visibility to the NAVSTAR constellation. For some mission scenarios, such as MEO, GEO and tight formation missions, the visibility/geometry of the constellation may not be sufficient to accurately estimate the relative states. One solution to these problems is to include an RF ranging device onboard the vehicles in the formation and form a local constellation that augments the existing NAVSTAR constellation. These local range measurements, combined with the GPS measurements, can provide a sufficient number of measurements and adequate geometry to solve for the relative states. Furthermore, these RF ranging devices can be designed to provide substantially more accurate measures of the vehicle relative states than the traditional GPS pseudolites. The local range measurements also allow relative vehicle motion to be used to efficiently solve for the cycle ambiguities in real-time. This dissertation presents the development of an onboard ranging sensor and the extension of several related algorithms for a formation of vehicles with both GPS and local transmitters. Key among these are a robust cycle ambiguity estimation method and a decentralized relative navigation filter. 
The efficient decentralized approach to the GPS-only relative navigation problem is extended to an iterative cascade extended Kalman filtering (ICEKF) algorithm when the vehicles have onboard transmitters. Several ground testbeds were developed to demonstrate the feasibility of the augmentation concept and the relative navigation algorithms. The testbed includes the Stanford Pseudolite Transceiver Crosslink (SPTC), which was developed and extensively tested with a formation of outdoor ground vehicles.
A new method for determining which stars are near a star sensor field-of-view
NASA Technical Reports Server (NTRS)
Yates, Russell E., Jr.; Vedder, John D.
1991-01-01
A new method is described for determining which stars in a navigation star catalog are near a star sensor field of view (FOV). This method assumes that an estimate of spacecraft inertial attitude is known. Vector component ranges for the star sensor FOV are computed, so that stars whose vector components lie within these ranges are near the star sensor FOV. This method requires no presorting of the navigation star catalog and is more efficient than traditional methods.
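The component-range test summarized here can be illustrated with a short sketch (an illustrative reconstruction, not the paper's code; the function name, catalog format, and the chord-length bound used for the component ranges are assumptions). If the angular separation between two unit vectors is at most θ, no Cartesian component can differ by more than the chord length sqrt(2 − 2 cos θ), which gives a cheap prefilter before the exact dot-product test:

```python
import math

def stars_near_fov(boresight, catalog, half_angle_rad):
    """Return names of catalog stars within half_angle_rad of the boresight.

    boresight: unit 3-vector (from the estimated spacecraft attitude).
    catalog:   list of (name, unit 3-vector) pairs.
    """
    cos_t = math.cos(half_angle_rad)
    # Chord-length bound: angle <= theta implies each component of the
    # difference of two unit vectors is at most sqrt(2 - 2*cos(theta)).
    chord = math.sqrt(max(0.0, 2.0 - 2.0 * cos_t))
    bx, by, bz = boresight
    near = []
    for name, (ux, uy, uz) in catalog:
        # Component-range prefilter: three comparisons, no trigonometry.
        if abs(ux - bx) > chord or abs(uy - by) > chord or abs(uz - bz) > chord:
            continue
        # Exact test: separation <= theta  <=>  dot product >= cos(theta).
        if bx * ux + by * uy + bz * uz >= cos_t:
            near.append(name)
    return near
```

The prefilter rejects most catalog stars with simple range checks, which is the efficiency gain over evaluating every star's angular separation directly.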
Gao, Wei; Zhang, Ya; Wang, Jianguo
2014-01-01
The integrated navigation system with strapdown inertial navigation system (SINS), Beidou (BD) receiver and Doppler velocity log (DVL) can be used in marine applications owing to the fact that the redundant and complementary information from different sensors can markedly improve the system accuracy. However, the existence of multisensor asynchrony will introduce errors into the system. In order to deal with the problem, conventionally the sampling interval is subdivided, which increases the computational complexity. In this paper, an innovative integrated navigation algorithm based on a Cubature Kalman filter (CKF) is proposed correspondingly. A nonlinear system model and observation model for the SINS/BD/DVL integrated system are established to more accurately describe the system. By taking multi-sensor asynchronization into account, a new sampling principle is proposed to make the best use of each sensor's information. Further, CKF is introduced in this new algorithm to enable the improvement of the filtering accuracy. The performance of this new algorithm has been examined through numerical simulations. The results have shown that the positional error can be effectively reduced with the new integrated navigation algorithm. Compared with the traditional algorithm based on EKF, the accuracy of the SINS/BD/DVL integrated navigation system is improved, making the proposed nonlinear integrated navigation algorithm feasible and efficient. PMID:24434842
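The cubature rule at the heart of a CKF can be sketched as follows (an illustrative fragment, not the paper's algorithm: it only generates the 2n equally weighted cubature points x ± √n·Sᵢ from a mean and a covariance square root; the prediction and update steps, and the SINS/BD/DVL models, are omitted):

```python
import math

def cubature_points(x, sqrt_p):
    """Generate the 2n cubature points of a CKF.

    x:      state mean as a list of n floats.
    sqrt_p: an n x n square-root factor S of the covariance (P = S S^T),
            as a list of row lists.
    Points are x +/- sqrt(n) * (i-th column of S), each with weight 1/(2n).
    """
    n = len(x)
    scale = math.sqrt(n)
    pts = []
    for i in range(n):
        col = [row[i] for row in sqrt_p]          # i-th column of S
        pts.append([xj + scale * cj for xj, cj in zip(x, col)])
        pts.append([xj - scale * cj for xj, cj in zip(x, col)])
    return pts
```

Propagating these points through the nonlinear system and observation models, then re-averaging, is what lets the CKF handle the nonlinear SINS/BD/DVL models more accurately than an EKF's linearization.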
Tele-auscultation support system with mixed reality navigation.
Hori, Kenta; Uchida, Yusuke; Kan, Tsukasa; Minami, Maya; Naito, Chisako; Kuroda, Tomohiro; Takahashi, Hideya; Ando, Masahiko; Kawamura, Takashi; Kume, Naoto; Okamoto, Kazuya; Takemura, Tadamasa; Yoshihara, Hiroyuki
2013-01-01
The aim of this research is to develop an information support system for tele-auscultation. In auscultation, a doctor needs to understand the conditions under which the stethoscope is applied, in addition to hearing the auscultatory sounds. The proposed system adds an intuitive navigation system for stethoscope operation to a conventional audio streaming system for auscultatory sounds and a conventional video conferencing system for telecommunication. Mixed reality technology is applied for intuitive navigation of the stethoscope. Information, such as position, contact condition and breath, is overlaid on a view of the patient's chest. The contact condition of the stethoscope is measured by e-textile contact sensors. The breath is measured by a band-type breath sensor. In a simulated tele-auscultation experiment, the stethoscope with the contact sensors and the breath sensor was evaluated. The results show that the presentation of the contact condition was not clear enough to guide stethoscope handling. The time series of breath phases was usable for the remote doctor to understand the breath condition of the patient.
Flight test results of the strapdown hexad inertial reference unit (SIRU). Volume 2: Test report
NASA Technical Reports Server (NTRS)
Hruby, R. J.; Bjorkman, W. S.
1977-01-01
Results of flight tests of the Strapdown Inertial Reference Unit (SIRU) navigation system are presented. The fault tolerant SIRU navigation system features a redundant inertial sensor unit and dual computers. System software provides for detection and isolation of inertial sensor failures and continued operation in the event of failures. Flight test results include assessments of the system's navigational performance and fault tolerance. Performance shortcomings are analyzed.
Space Shuttle Earth Observation sensors pointing and stabilization requirements study
NASA Technical Reports Server (NTRS)
1976-01-01
The shuttle orbiter inertial measurement unit (IMU), located in the orbiter cabin, is used to supply inertial attitude reference signals; and, in conjunction with the onboard navigation system, can provide a pointing capability of the navigation base accurate to within plus or minus 0.5 deg for earth viewing missions. This pointing accuracy can degrade to approximately plus or minus 2.0 deg for payloads located in the aft bay due to structural flexure of the shuttle vehicle, payload structural and mounting misalignments, and calibration errors with respect to the navigation base. Drawbacks to obtaining pointing accuracy by using the orbiter RCS jets are discussed. Supplemental electromechanical pointing systems are developed to provide independent pointing for individual sensors, or sensor groupings. The missions considered and the sensors required for these missions and the parameters of each sensor are described. Assumptions made to derive pointing and stabilization requirements are delineated.
Using neuromorphic optical sensors for spacecraft absolute and relative navigation
NASA Astrophysics Data System (ADS)
Shake, Christopher M.
We develop a novel attitude determination system (ADS) for use on nano spacecraft using neuromorphic optical sensors. The ADS is intended to support nano-satellite operations by providing low-cost, low-mass, low-volume, low-power, and redundant attitude determination capabilities with quick and straightforward onboard programmability for real-time spacecraft operations. The ADS is experimentally validated with commercial off-the-shelf optical devices that perform sensing and image processing on the same circuit board and are biologically inspired by insects' vision systems, which measure optical flow while navigating in the environment. The firmware on the devices is modified both to perform the additional biologically inspired task of tracking objects and to communicate with a PC/104 form-factor embedded computer running Real Time Application Interface Linux used on a spacecraft simulator. Algorithms are developed for operations using optical flow, point tracking, and hybrid modes with the sensors, and the performance of the system in all three modes is assessed using a spacecraft simulator in the Advanced Autonomous Multiple Spacecraft (ADAMUS) laboratory at Rensselaer. An existing relative state determination method is identified to be combined with the novel ADS to create a self-contained navigation system for nano spacecraft. The performance of the method is assessed in simulation and found not to match the authors' published results when using only the conditions and equations already published. An improved target inertia tensor method is proposed as an update to the existing relative state method; although it too did not perform as expected, it is presented for others to build upon.
NASA Technical Reports Server (NTRS)
1971-01-01
The guidance and navigation requirements are defined for a set of impulsive-thrust missions involving one or more outer planets or comets. Specific missions considered include two Jupiter entry missions of 800 and 1200 day duration, two multiple-swingby missions with the sequences Jupiter-Uranus-Neptune and Jupiter-Saturn-Pluto, and two comet rendezvous missions involving the short-period comets P/Tempel 2 and P/Tuttle-Giacobini-Kresak. Results show the relative utility of onboard and Earth-based DSN navigation. The effects of parametric variations in navigation accuracy, measurement rate, and miscellaneous constraints are determined. The utility of a TV-type onboard navigation sensor, sighting on planetary satellites and comets, is examined. Velocity corrections required for the nominal and parametrically varied cases are tabulated.
Navigation through unknown and dynamic open spaces using topological notions
NASA Astrophysics Data System (ADS)
Miguel-Tomé, Sergio
2018-04-01
Until now, most algorithms used for navigation have had the purpose of directing a system toward one point in space. However, humans communicate tasks by specifying spatial relations among elements or places. In addition, the environments in which humans develop their activities are extremely dynamic. The only option that allows for successful navigation in dynamic and unknown environments is making real-time decisions. Therefore, robots capable of collaborating closely with human beings must be able to make decisions based on the local information registered by the sensors and interpret and express spatial relations. Furthermore, when one person is asked to perform a task in an environment, this task is communicated given a category of goals so the person does not need to be supervised. Thus, two problems appear when one wants to create multifunctional robots: how to navigate in dynamic and unknown environments using spatial relations and how to accomplish this without supervision. In this article, a new architecture to address the two cited problems is presented, called the topological qualitative navigation architecture. In previous works, a qualitative heuristic called the heuristic of topological qualitative semantics (HTQS) has been developed to establish and identify spatial relations. However, that heuristic only allows for establishing one spatial relation with a specific object. In contrast, navigation requires a temporal sequence of goals with different objects. The new architecture attains continuous generation of goals and resolves them using HTQS. Thus, the new architecture achieves autonomous navigation in dynamic or unknown open environments.
Multi-Sensor Fusion with Interaction Multiple Model and Chi-Square Test Tolerant Filter.
Yang, Chun; Mohammadi, Arash; Chen, Qing-Wei
2016-11-02
Motivated by the key importance of multi-sensor information fusion algorithms in the state-of-the-art integrated navigation systems due to recent advancements in sensor technologies, telecommunication, and navigation systems, the paper proposes an improved and innovative fault-tolerant fusion framework. An integrated navigation system is considered consisting of four sensory sub-systems, i.e., Strap-down Inertial Navigation System (SINS), Global Positioning System (GPS), the Bei-Dou2 (BD2) and Celestial Navigation System (CNS) navigation sensors. In such multi-sensor applications, on the one hand, the design of an efficient fusion methodology is extremely constrained, especially when no information regarding the system's error characteristics is available. On the other hand, the development of an accurate fault detection and integrity monitoring solution is both challenging and critical. The paper addresses the sensitivity issues of conventional fault detection solutions and the unavailability of a precisely known system model by jointly designing fault detection and information fusion algorithms. In particular, by using ideas from Interacting Multiple Model (IMM) filters, the uncertainty of the system is adjusted adaptively by model probabilities using the proposed fuzzy-based fusion framework. The paper also addresses the problem of using corrupted measurements for fault detection purposes by designing a two state propagator chi-square test jointly with the fusion algorithm. Two IMM predictors, running in parallel, are used and alternately reactivated based on the information received from the fusion filter to increase the reliability and accuracy of the proposed detection solution. With the combination of the IMM and the proposed fusion method, we increase the failure sensitivity of the detection system and, thereby, significantly increase the overall reliability and accuracy of the integrated navigation system.
Simulation results indicate that the proposed fault tolerant fusion framework provides superior performance over its traditional counterparts.
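As an illustration of the innovation-based chi-square fault test the abstract describes, here is a minimal sketch; the function name, covariance values, and gating threshold are hypothetical, not taken from the paper.

```python
import numpy as np

def chi_square_test(innovation, S, threshold):
    """Normalized innovation squared (NIS) fault check.

    innovation : residual vector z - H x_pred
    S          : innovation covariance H P H^T + R
    Returns (statistic, is_faulty); statistic ~ chi2(dim) when healthy.
    """
    nu = np.asarray(innovation, dtype=float)
    stat = float(nu @ np.linalg.solve(S, nu))  # nu^T S^-1 nu
    return stat, stat > threshold

# Example: 2-D position residual gated at the 95% chi-square(2) point (5.99)
S = np.diag([4.0, 4.0])            # innovation covariance (m^2)
healthy = np.array([1.0, -1.5])    # small residual -> passes
faulty = np.array([10.0, 8.0])     # large residual -> flagged as fault
stat_ok, flag_ok = chi_square_test(healthy, S, 5.99)
stat_bad, flag_bad = chi_square_test(faulty, S, 5.99)
```

In a two-state-propagator arrangement, the same statistic would be computed against two independently propagated predictions so that a corrupted measurement cannot contaminate both references at once.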
Simulation analysis of a microcomputer-based, low-cost Omega navigation system
NASA Technical Reports Server (NTRS)
Lilley, R. W.; Salter, R. J., Jr.
1976-01-01
The current status of research on a proposed micro-computer-based, low-cost Omega Navigation System (ONS) is described. The design approach emphasizes minimum hardware, maximum software, and the use of a low-cost, commercially-available microcomputer. Currently under investigation is the implementation of a low-cost navigation processor and its interface with an omega sensor to complete the hardware-based ONS. Sensor processor functions are simulated to determine how many of the sensor processor functions can be handled by innovative software. An input data base of live Omega ground and flight test data was created. The Omega sensor and microcomputer interface modules used to collect the data are functionally described. Automatic synchronization to the Omega transmission pattern is described as an example of the algorithms developed using this data base.
Space shuttle onboard navigation console expert/trainer system
NASA Technical Reports Server (NTRS)
Wang, Lui; Bochsler, Dan
1987-01-01
A software system for use in enhancing operational performance as well as training ground controllers in monitoring onboard Space Shuttle navigation sensors is described. The Onboard Navigation (ONAV) development reflects a trend toward following a structured and methodical approach to development. The ONAV system must deal with integrated conventional and expert system software, complex interfaces, and implementation limitations due to the target operational environment. An overview of the onboard navigation sensor monitoring function is presented, along with a description of guidelines driving the development effort, requirements that the system must meet, current progress, and future efforts.
A simulation of GPS and differential GPS sensors
NASA Technical Reports Server (NTRS)
Rankin, James M.
1993-01-01
The Global Positioning System (GPS) is a revolutionary advance in navigation. Users can determine latitude, longitude, and altitude by receiving range information from at least four satellites. The statistical accuracy of the user's position is directly proportional to the statistical accuracy of the range measurement. Range errors are caused by clock errors, ephemeris errors, atmospheric delays, multipath errors, and receiver noise. Selective Availability, which the military uses to intentionally degrade accuracy for non-authorized users, is a major error source. The proportionality constant relating position errors to range errors is the Dilution of Precision (DOP) which is a function of the satellite geometry. Receivers separated by relatively short distances have the same satellite and atmospheric errors. Differential GPS (DGPS) removes these errors by transmitting pseudorange corrections from a fixed receiver to a mobile receiver. The corrected pseudorange at the moving receiver is now corrupted only by errors from the receiver clock, multipath, and measurement noise. This paper describes a software package that models position errors for various GPS and DGPS systems. The error model is used in the Real-Time Simulator and Cockpit Technology workstation simulations at NASA-LaRC. The GPS/DGPS sensor can simulate enroute navigation, instrument approaches, or on-airport navigation.
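The differential correction and DOP scaling described above can be sketched as follows; all numeric values are illustrative, not outputs of the NASA-LaRC simulator.

```python
import numpy as np

# Base station at a surveyed position computes pseudorange corrections:
# correction = geometric range to each satellite - measured pseudorange.
true_ranges = np.array([20000105.0, 21000230.0, 22000310.0, 23000150.0])
base_measured = true_ranges + 25.0           # common-mode error: SA, ephemeris, atmosphere
corrections = true_ranges - base_measured    # -25 m on each channel

# The rover applies the broadcast corrections to its own pseudoranges;
# only receiver-local errors (clock, multipath, noise) remain.
local_noise = np.array([1.2, -0.8, 0.5, -0.3])
rover_measured = true_ranges + 25.0 + local_noise
rover_corrected = rover_measured + corrections

residual = rover_corrected - true_ranges     # common-mode 25 m removed

# Position error scales with range error through the Dilution of Precision:
# sigma_pos = DOP * sigma_range
sigma_range = 1.0   # m, corrected pseudorange noise
gdop = 2.5          # from satellite geometry
sigma_pos = gdop * sigma_range
```

The residual after correction contains only the rover-local noise terms, which is exactly the behavior the DGPS error model in the paper exploits.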
High-Fidelity Flash Lidar Model Development
NASA Technical Reports Server (NTRS)
Hines, Glenn D.; Pierrottet, Diego F.; Amzajerdian, Farzin
2014-01-01
NASA's Autonomous Landing and Hazard Avoidance Technologies (ALHAT) project is currently developing the critical technologies to safely and precisely navigate and land crew, cargo and robotic spacecraft vehicles on and around planetary bodies. One key element of this project is a high-fidelity Flash Lidar sensor that can generate three-dimensional (3-D) images of the planetary surface. These images are processed with hazard detection and avoidance and hazard relative navigation algorithms, and then are subsequently used by the Guidance, Navigation and Control subsystem to generate an optimal navigation solution. A complex, high-fidelity model of the Flash Lidar was developed in order to evaluate the performance of the sensor and its interaction with the interfacing ALHAT components on vehicles with different configurations and under different flight trajectories. The model contains a parameterized, general approach to Flash Lidar detection and reflects physical attributes such as range and electronic noise sources, and laser pulse temporal and spatial profiles. It also provides the realistic interaction of the laser pulse with terrain features that include varying albedo, boulders, craters, slopes, and shadows. This paper gives a description of the Flash Lidar model and presents results from the Lidar operating under different scenarios.
Fuzzy adaptive interacting multiple model nonlinear filter for integrated navigation sensor fusion.
Tseng, Chien-Hao; Chang, Chih-Wen; Jwo, Dah-Jing
2011-01-01
In this paper, the application of the fuzzy interacting multiple model unscented Kalman filter (FUZZY-IMMUKF) approach to integrated navigation processing for the maneuvering vehicle is presented. The unscented Kalman filter (UKF) employs a set of sigma points through deterministic sampling, such that a linearization process is not necessary, and therefore the errors caused by linearization as in the traditional extended Kalman filter (EKF) can be avoided. The nonlinear filters naturally suffer, to some extent, the same problem as the EKF for which the uncertainty of the process noise and measurement noise will degrade the performance. As a structural adaptation (model switching) mechanism, the interacting multiple model (IMM), which describes a set of switching models, can be utilized for determining the adequate value of process noise covariance. The fuzzy logic adaptive system (FLAS) is employed to determine the lower and upper bounds of the system noise through the fuzzy inference system (FIS). The resulting sensor fusion strategy can efficiently deal with the nonlinear problem for the vehicle navigation. The proposed FUZZY-IMMUKF algorithm shows remarkable improvement in the navigation estimation accuracy as compared to the relatively conventional approaches such as the UKF and IMMUKF.
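The IMM mode-probability update underlying the model-switching mechanism can be sketched in its generic textbook form; this is not the paper's FUZZY-IMMUKF implementation, and the transition matrix and likelihood values below are invented for illustration.

```python
import numpy as np

def imm_probability_update(mu, Pi, likelihoods):
    """One IMM cycle of mode-probability prediction and update.

    mu          : prior model probabilities
    Pi          : Markov model-transition matrix (rows sum to 1)
    likelihoods : measurement likelihood produced by each model's filter
    """
    mu = np.asarray(mu, dtype=float)
    c = Pi.T @ mu                    # predicted mode probabilities
    mu_new = likelihoods * c         # Bayes update with filter likelihoods
    return mu_new / mu_new.sum()     # normalize

# Two models: low process noise (cruise) vs high process noise (maneuver)
Pi = np.array([[0.95, 0.05],
               [0.05, 0.95]])
mu = np.array([0.5, 0.5])
# A large innovation makes the high-noise model far more likely
mu = imm_probability_update(mu, Pi, np.array([0.01, 0.30]))
```

In the paper's scheme, a fuzzy inference system additionally bounds the process-noise covariance between the lower and upper levels that the two models represent.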
NASA Astrophysics Data System (ADS)
Griesbach, J.; Westphal, J. J.; Roscoe, C.; Hawes, D. R.; Carrico, J. P.
2013-09-01
The Proximity Operations Nano-Satellite Flight Demonstration (PONSFD) program aims to demonstrate rendezvous proximity operations (RPO), formation flying, and docking with a pair of 3U CubeSats. The program is sponsored by NASA Ames via the Office of the Chief Technologist (OCT) in support of its Small Spacecraft Technology Program (SSTP). The goal of the mission is to demonstrate complex RPO and docking operations with a pair of low-cost 3U CubeSat satellites using passive navigation sensors. The program encompasses the entire system evolution including system design, acquisition, satellite construction, launch, mission operations, and final disposal. The satellite is scheduled for launch in Fall 2015 with a 1-year mission lifetime. This paper provides a brief mission overview but will then focus on the current design and driving trade study results for the RPO mission specific processor and relevant ground software. The current design involves multiple on-board processors, each specifically tasked with providing mission critical capabilities. These capabilities range from attitude determination and control to image processing. The RPO system processor is responsible for absolute and relative navigation, maneuver planning, attitude commanding, and abort monitoring for mission safety. A low power processor running a Linux operating system has been selected for implementation. Navigation is one of the RPO processor's key tasks. This entails processing data obtained from the on-board GPS unit as well as the on-board imaging sensors. To do this, Kalman filters will be hosted on the processor to ingest and process measurements for maintenance of position and velocity estimates with associated uncertainties. While each satellite carries a GPS unit, it will be used sparsely to conserve power. As such, absolute navigation will mainly consist of propagating past known states, and relative navigation will be considered to be of greater importance.
For relative observations, each spacecraft hosts 3 electro-optical sensors dedicated to imaging the companion satellite. The image processor will analyze the images to obtain estimates for range, bearing, and pose, with associated rates and uncertainties. These observations will be fed to the RPO processor's relative Kalman filter to perform relative navigation updates. This paper includes estimates for expected navigation accuracies for both absolute and relative position and velocity. Another key task for the RPO processor is maneuver planning. This includes automation to plan maneuvers to achieve a desired formation configuration or trajectory (including docking), as well as automation to safely react to potentially dangerous situations. This will allow each spacecraft to autonomously plan fuel-efficient maneuvers to achieve a desired trajectory as well as compute adjustment maneuvers to correct for thrusting errors. This paper discusses results from a trade study that has been conducted to examine maneuver targeting algorithms required on-board the spacecraft. Ground software will also work in conjunction with the on-board software to validate and approve maneuvers as necessary.
Rigorous Performance Evaluation of Smartphone GNSS/IMU Sensors for ITS Applications
Gikas, Vassilis; Perakis, Harris
2016-01-01
With the rapid growth in smartphone technologies and improvement in their navigation sensors, an increasing amount of location information is now available, opening the road to the provision of new Intelligent Transportation System (ITS) services. Current smartphone devices embody miniaturized Global Navigation Satellite System (GNSS), Inertial Measurement Unit (IMU) and other sensors capable of providing user position, velocity and attitude. However, it is hard to characterize their actual positioning and navigation performance capabilities due to the disparate sensor and software technologies adopted among manufacturers and the high influence of environmental conditions, and therefore, a unified certification process is missing. This paper presents the analysis results obtained from the assessment of two modern smartphones regarding their positioning accuracy (i.e., precision and trueness) capabilities (i.e., potential and limitations) based on a practical but rigorous methodological approach. Our investigation relies on the results of several vehicle tracking (i.e., cruising and maneuvering) tests realized through comparing smartphone obtained trajectories and kinematic parameters to those derived using a high-end GNSS/IMU system and advanced filtering techniques. Performance testing is undertaken for the HTC One S (Android) and iPhone 5s (iOS). Our findings indicate that the deviation of the smartphone locations from ground truth (trueness) deteriorates by a factor of two in obscured environments compared to those derived in open sky conditions. Moreover, it appears that iPhone 5s produces relatively smaller and less dispersed error values compared to those computed for HTC One S. Also, the navigation solution of the HTC One S appears to adapt faster to changes in environmental conditions, suggesting a somewhat different data filtering approach for the iPhone 5s. 
Testing the accuracy of the accelerometer and gyroscope sensors for a number of maneuvering (speeding, turning, etc.) events reveals high consistency between smartphones, whereas the small deviations from ground truth verify their high potential even for critical ITS safety applications. PMID:27527187
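The trueness/precision decomposition used in this kind of evaluation can be illustrated with a small sketch on synthetic data; these are not the authors' test trajectories, and the bias and noise levels are invented.

```python
import numpy as np

def trueness_precision(test_positions, reference_positions):
    """Trueness = magnitude of the mean offset from ground truth;
    precision = dispersion (combined std) of per-epoch errors."""
    err = test_positions - reference_positions
    trueness = float(np.linalg.norm(err.mean(axis=0)))
    precision = float(np.sqrt(err.var(axis=0).sum()))
    return trueness, precision

rng = np.random.default_rng(0)
truth = rng.uniform(0.0, 100.0, size=(200, 2))       # reference trajectory (m)

# Open-sky run: small bias, 1 m noise; obscured run: double bias and noise
open_sky = truth + np.array([1.0, 0.5]) + rng.normal(0.0, 1.0, size=(200, 2))
obscured = truth + np.array([2.0, 1.0]) + rng.normal(0.0, 2.0, size=(200, 2))

t_open, p_open = trueness_precision(open_sky, truth)
t_obs, p_obs = trueness_precision(obscured, truth)
```

Separating the systematic offset (trueness) from the scatter (precision) is what lets the paper state that obscured environments degrade trueness by roughly a factor of two relative to open sky.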
Enhanced Pedestrian Navigation Based on Course Angle Error Estimation Using Cascaded Kalman Filters
Park, Chan Gook
2018-01-01
An enhanced pedestrian dead reckoning (PDR) based navigation algorithm, which uses two cascaded Kalman filters (TCKF) for the estimation of course angle and navigation errors, is proposed. The proposed algorithm uses a foot-mounted inertial measurement unit (IMU), waist-mounted magnetic sensors, and a zero velocity update (ZUPT) based inertial navigation technique with TCKF. The first stage filter estimates the course angle error of a human, which is closely related to the heading error of the IMU. In order to obtain the course measurements, the filter uses magnetic sensors and a position-trace based course angle. For preventing magnetic disturbance from contaminating the estimation, the magnetic sensors are attached to the waistband. Because the course angle error is mainly due to the heading error of the IMU, and the characteristic error of the heading angle is highly dependent on that of the course angle, the estimated course angle error is used as a measurement for estimating the heading error in the second stage filter. At the second stage, an inertial navigation system-extended Kalman filter-ZUPT (INS-EKF-ZUPT) method is adopted. As the heading error is estimated directly by using course-angle error measurements, the estimation accuracy for the heading and yaw gyro bias can be enhanced, compared with the ZUPT-only case, which eventually enhances the position accuracy more efficiently. The performance enhancements are verified via experiments, and the way-point position error for the proposed method is compared with those for the ZUPT-only case and with other cases that use ZUPT and various types of magnetic heading measurements. The results show that the position errors are reduced by a maximum of 90% compared with the conventional ZUPT based PDR algorithms. PMID:29690539
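A common form of the zero-velocity (stance) detector that underlies such ZUPT-based PDR systems can be sketched as follows; the thresholds are illustrative and not taken from the paper.

```python
import numpy as np

def detect_stance(accel, gyro, gravity=9.81, acc_tol=0.5, gyro_tol=0.2):
    """Stance-phase detector for a foot-mounted IMU.

    A sample is flagged as stance when the specific-force magnitude is
    close to gravity and the angular rate is small; during stance the
    EKF is fed a pseudo-measurement of zero velocity (ZUPT).
    """
    acc_mag = np.linalg.norm(accel, axis=1)
    gyro_mag = np.linalg.norm(gyro, axis=1)
    return (np.abs(acc_mag - gravity) < acc_tol) & (gyro_mag < gyro_tol)

# Two synthetic samples: foot at rest, then mid-swing
accel = np.array([[0.0, 0.0, 9.80],     # stationary: |f| ~ g, low rotation
                  [3.0, 1.0, 12.0]])    # swing phase: large dynamics
gyro = np.array([[0.01, 0.02, 0.00],
                 [1.50, 0.40, 0.30]])
stance = detect_stance(accel, gyro)
```

The cascaded structure in the paper then uses the ZUPT-corrected position trace, together with the waist magnetometers, to observe the course-angle error that plain ZUPT leaves unobservable.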
Luo, Xiongbiao
2014-06-01
Various bronchoscopic navigation systems are developed for diagnosis, staging, and treatment of lung and bronchus cancers. To construct electromagnetically navigated bronchoscopy systems, registration of preoperative images and an electromagnetic tracker must be performed. This paper proposes a new marker-free registration method, which uses the centerlines of the bronchial tree and the center of a bronchoscope tip where an electromagnetic sensor is attached, to align preoperative images and electromagnetic tracker systems. The chest computed tomography (CT) volume (preoperative images) was segmented to extract the bronchial centerlines. An electromagnetic sensor was fixed at the bronchoscope tip surface. A model was designed and printed using a 3D printer to calibrate the relationship between the fixed sensor and the bronchoscope tip center. For each sensor measurement that includes sensor position and orientation information, its corresponding bronchoscope tip center position was calculated. By minimizing the distance between each bronchoscope tip center position and the bronchial centerlines, the spatial alignment of the electromagnetic tracker system and the CT volume was determined. After obtaining the spatial alignment, an electromagnetic navigation bronchoscopy system was established to track or locate a bronchoscope in real time inside the bronchial tree during bronchoscopic examinations. The electromagnetic navigation bronchoscopy system was validated on a dynamic bronchial phantom that can simulate respiratory motion with a breath rate range of 0-10 min⁻¹. The fiducial and target registration errors of this navigation system were evaluated. The average fiducial registration error was reduced from 8.7 to 6.6 mm. The average target registration error, which indicates all tracked or navigated bronchoscope position accuracy, was much reduced from 6.8 to 4.5 mm compared to previous registration methods.
An electromagnetically navigated bronchoscopy system was constructed with accurate registration of an electromagnetic tracker and the CT volume on the basis of an improved marker-free registration approach that uses the bronchial centerlines and bronchoscope tip center information. The fiducial and target registration errors of our electromagnetic navigation system were about 6.6 and 4.5 mm in dynamic bronchial phantom validation.
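The registration cost being minimized here, the distance from transformed tip-center positions to the bronchial centerlines, can be sketched with toy geometry; this is not the phantom data or the authors' optimizer, just the objective they describe.

```python
import numpy as np

def centerline_cost(R, t, tip_centers, centerline_pts):
    """Mean distance from transformed bronchoscope-tip centers to the
    nearest bronchial-centerline point; registration seeks R, t that
    minimize this value."""
    mapped = tip_centers @ R.T + t                # sensor frame -> CT frame
    d = np.linalg.norm(mapped[:, None, :] - centerline_pts[None, :, :], axis=2)
    return float(d.min(axis=1).mean())

# Toy data: a straight centerline along z; tip samples offset by t_true
centerline = np.stack([np.zeros(50), np.zeros(50), np.linspace(0, 50, 50)], axis=1)
t_true = np.array([5.0, -3.0, 0.0])
tips = centerline[::5] - t_true                   # samples in the sensor frame

I = np.eye(3)
cost_identity = centerline_cost(I, np.zeros(3), tips, centerline)  # unregistered
cost_aligned = centerline_cost(I, t_true, tips, centerline)        # registered
```

An optimizer over the six rigid-transform parameters drives this cost toward zero, at which point the tracker frame is aligned with the CT volume.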
2001 Flight Mechanics Symposium
NASA Technical Reports Server (NTRS)
Lynch, John P. (Editor)
2001-01-01
This conference publication includes papers and abstracts presented at the Flight Mechanics Symposium held on June 19-21, 2001. Sponsored by the Guidance, Navigation and Control Center of Goddard Space Flight Center, this symposium featured technical papers on a wide range of issues related to attitude/orbit determination, prediction and control; attitude simulation; attitude sensor calibration; theoretical foundation of attitude computation; dynamics model improvements; autonomous navigation; constellation design and formation flying; estimation theory and computational techniques; Earth environment mission analysis and design; and, spacecraft re-entry mission design and operations.
Ganji, Yusof; Janabi-Sharifi, Farrokh; Cheema, Asim N
2011-12-01
Despite the recent advances in catheter design and technology, intra-cardiac navigation during electrophysiology procedures remains challenging. Incorporation of imaging along with magnetic or robotic guidance may improve navigation accuracy and procedural safety. In the present study, the in vivo performance of a novel remote controlled Robot Assisted Cardiac Navigation System (RACN) was evaluated in a porcine model. The navigation catheter and target sensor were advanced to the right atrium using fluoroscopic and intra-cardiac echo guidance. The target sensor was positioned at three target locations in the right atrium (RA) and the navigation task was completed by an experienced physician using both manual and RACN guidance. The navigation time, final distance between the catheter tip and target sensor, and variability in final catheter tip position were determined and compared for manual and RACN guided navigation. The experiments were completed in three animals and five measurements recorded for each target location. The mean distance (mm) between catheter tip and target sensor at the end of the navigation task was significantly less using RACN guidance compared with manual navigation (5.02 ± 0.31 vs. 9.66 ± 2.88, p = 0.050 for high RA, 9.19 ± 1.13 vs. 13.0 ± 1.00, p = 0.011 for low RA and 6.77 ± 0.59 vs. 15.66 ± 2.51, p = 0.003 for tricuspid valve annulus). The average time (s) needed to complete the navigation task was significantly longer by RACN guided navigation compared with manual navigation (43.31 ± 18.19 vs. 13.54 ± 1.36, p = 0.047 for high RA, 43.71 ± 11.93 vs. 22.71 ± 3.79, p = 0.043 for low RA and 37.84 ± 3.71 vs. 16.13 ± 4.92, p = 0.003 for tricuspid valve annulus). RACN guided navigation resulted in greater consistency in performance compared with manual navigation as evidenced by lower variability in final distance measurements (0.41 vs. 0.99 mm, p = 0.04).
This study demonstrated the safety and feasibility of the RACN system for cardiac navigation. The results demonstrated that RACN performed comparably with manual navigation, with improved precision and consistency for targets located in and near the right atrial chamber. Copyright © 2011 John Wiley & Sons, Ltd.
Terrain matching image pre-process and its format transform in autonomous underwater navigation
NASA Astrophysics Data System (ADS)
Cao, Xuejun; Zhang, Feizhou; Yang, Dongkai; Yang, Bogang
2007-06-01
Underwater passive navigation technology is one of the important development orientations in the field of modern navigation. With the advantage of high self-determination, stealth at sea, anti-jamming and high precision, passive navigation is completely meet with actual navigation requirements. Therefore passive navigation has become a specific navigating method for underwater vehicles. The scientists and researchers in the navigating field paid more attention to it. The underwater passive navigation can provide accurate navigation information with main Inertial Navigation System (INS) for a long period, such as location and speed. Along with the development of micro-electronics technology, the navigation of AUV is given priority to INS assisted with other navigation methods, such as terrain matching navigation. It can provide navigation ability for a long period, correct the errors of INS and make AUV not emerge from the seabed termly. With terrain matching navigation technique, in the assistance of digital charts and ocean geographical characteristics sensors, we carry through underwater image matching assistant navigation to obtain the higher location precision, therefore it is content with the requirement of underwater, long-term, high precision and all-weather of the navigation system for Autonomous Underwater Vehicles. Tertian-assistant navigation (TAN) is directly dependent on the image information (map information) in the navigating field to assist the primary navigation system according to the path appointed in advance. In TAN, a factor coordinative important with the system operation is precision and practicability of the storable images and the database which produce the image data. If the data used for characteristics are not suitable, the system navigation precision will be low. 
Compared with terrain matching assisted navigation, image matching navigation is a high-precision, low-cost aiding system, and its matching precision directly determines the final precision of the integrated navigation system. Image matching assisted navigation spatially registers two underwater scene images of the same scene acquired by two different sensors in order to determine the relative displacement between them. The vehicle's location can then be obtained in the known geographic frame of the reference image, and the precise position fix from image matching is fed to the INS to eliminate its position error and greatly enhance the navigation precision of the vehicle. Digital image analysis and processing for image matching are therefore central to underwater passive navigation. For underwater geographic data analysis, we focus on the acquisition, processing, analysis, representation and measurement of database information. These analyses form an important part of underwater terrain matching and help characterize the seabed terrain of the navigation area, so that the most favourable seabed terrain districts and a reliable navigation algorithm can be selected, improving the precision and reliability of the terrain-aided navigation system. This paper describes the pre-processing and format transformation of digital images for underwater image matching. The terrain of the navigation areas needs further study to provide reliable terrain-characteristic and underwater-cover data for navigation. Through sea-route selection, danger-district prediction and navigation-algorithm analysis, TAN can achieve higher positioning precision and probability, providing technological support for image matching in underwater passive navigation.
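The abstract does not spell out the matching algorithm itself; a common way to confirm the relative displacement of two images of the same scene is brute-force normalized cross-correlation. A minimal sketch (the function name and NumPy-based layout are illustrative, not taken from the paper):

```python
import numpy as np

def ncc_displacement(reference, template):
    """Estimate the (x, y) offset of `template` inside `reference`
    by exhaustive normalized cross-correlation search."""
    rh, rw = reference.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_xy = -np.inf, (0, 0)
    for y in range(rh - th + 1):
        for x in range(rw - tw + 1):
            w = reference[y:y + th, x:x + tw]
            w = w - w.mean()
            denom = np.sqrt((w ** 2).sum()) * t_norm
            if denom == 0:
                continue  # flat window, correlation undefined
            score = (w * t).sum() / denom
            if score > best_score:
                best_score, best_xy = score, (x, y)
    return best_xy, best_score
```

In practice the reference image carries known geographic coordinates, so the recovered pixel offset translates directly into a position fix for the INS correction described above.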
Performance Characteristic MEMS-Based IMUs for UAVs Navigation
NASA Astrophysics Data System (ADS)
Mohamed, H. A.; Hansen, J. M.; Elhabiby, M. M.; El-Sheimy, N.; Sesay, A. B.
2015-08-01
Accurate 3D reconstruction has become essential for non-traditional mapping applications such as urban planning, the mining industry, environmental monitoring, navigation, surveillance, pipeline inspection, infrastructure monitoring, landslide hazard analysis, indoor localization, and military simulation. The needs of these applications cannot be satisfied by traditional mapping, which is based on dedicated data acquisition systems designed for mapping purposes. Recent advances in hardware and software have made it possible to conduct accurate 3D mapping without costly, high-end data acquisition systems: low-cost digital cameras, laser scanners, and navigation systems can provide accurate mapping if they are properly integrated at the hardware and software levels. Unmanned Aerial Vehicles (UAVs) are emerging as a mobile mapping platform that offers additional economical and practical advantages. However, meeting these economical and practical requirements calls for navigation systems that can provide an uninterrupted navigation solution, so testing the performance characteristics of Micro-Electro-Mechanical Systems (MEMS) or other low-cost navigation sensors for various UAV applications is an important research task. This work studies those performance characteristics under different manoeuvres using inertial measurements integrated with single-point positioning, Real-Time Kinematic (RTK) positioning, and additional navigational aiding sensors. Furthermore, the performance of the inertial sensors is tested during Global Positioning System (GPS) signal outages.
NASA Technical Reports Server (NTRS)
Pines, S.; Hueschen, R. M.
1978-01-01
This paper describes the navigation and guidance system developed for the TCV B-737, a Langley Field NASA research aircraft, and presents the results of an evaluation during final approach, landing, rollout, and turnoff obtained through a nonlinear digital simulation. A Kalman filter (implemented in square-root form) and a third-order complementary filter were developed and compared for navigation. The Microwave Landing System (MLS) is used for navigation and guidance in all phases of the flight. In addition, for rollout and turnoff, a three-coil sensor detects the magnetic field induced by a wire buried in the runway (magnetic leader cable); the sensor outputs are processed into measurements of position and heading deviation from the wire. The results show the concept to be both feasible and practical for commercial-type aircraft terminal area control.
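The abstract does not reproduce the third-order complementary filter equations; the blending idea can be sketched at first order, where accelerometer dead reckoning is pulled toward the MLS position fix by a fixed gain (the function name and the gain `k` are illustrative assumptions, not the paper's filter):

```python
def complementary_update(pos_est, vel_est, accel, pos_meas, dt, k):
    """One step of a first-order complementary filter:
    dead-reckon with the accelerometer, then pull the estimate
    toward the (noisy but unbiased) position fix with gain k."""
    vel_est = vel_est + accel * dt                   # integrate acceleration
    pos_pred = pos_est + vel_est * dt                # dead-reckoned position
    pos_est = pos_pred + k * (pos_meas - pos_pred)   # blend in the position fix
    return pos_est, vel_est
```

The complementary structure keeps the high-frequency content of the inertial path and the low-frequency content of the MLS path, which is why it was a plausible lightweight alternative to the square-root Kalman filter compared in the paper.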
Kotze, Ben; Jordaan, Gerrit
2014-08-25
Automatic Guided Vehicles (AGVs) are navigated utilising multiple types of sensors for detecting the environment. In this investigation such sensors are replaced and/or minimized by the use of a single omnidirectional camera picture stream. An area of interest is extracted, and by using image processing the vehicle is navigated on a set path. Reconfigurability is added to the route layout by signs incorporated in the navigation process. The result is the possible manipulation of a number of AGVs, each on its own designated colour-signed path. This route is reconfigurable by the operator with no programming alteration or intervention. A low resolution camera and a Matlab® software development platform are utilised. The use of Matlab® lends itself to speedy evaluation and implementation of image processing options on the AGV, but its functioning in such an environment needs to be assessed.
A navigation system for the visually impaired an intelligent white cane.
Fukasawa, A Jin; Magatani, Kazusihge
2012-01-01
In this paper, we describe a navigation system we developed to support independent walking by the visually impaired in indoor spaces. The instrument consists of a navigation system and a map information system, both installed on a white cane. The navigation system can follow a colored navigation line laid on the floor: a color sensor installed on the tip of the white cane senses the color of the navigation line, and the system informs the user by vibration that he or she is walking along it. This color recognition system is controlled by a one-chip microprocessor. RFID tags and a tag receiver are used in the map information system: the tags are placed on the colored navigation line, and an antenna and tag receiver, also installed on the white cane, pick up the area information as a tag number and announce the map information to the user by pre-recorded MP3 voice. We also developed a direction identification technique that detects the user's walking direction using a triaxial acceleration sensor. Three normal subjects blindfolded with an eye mask tested the navigation system, and all of them were able to walk along the navigation line perfectly. We consider the performance of the system good; it should therefore be extremely valuable in supporting the activities of the visually impaired.
NASA Technical Reports Server (NTRS)
Dennehy, Cornelius J.
2013-01-01
The NASA Engineering and Safety Center (NESC) received a request from the NASA Associate Administrator (AA) for the Human Exploration and Operations Mission Directorate (HEOMD) to quantitatively evaluate the individual performance of three light detection and ranging (LIDAR) rendezvous sensors flown as an orbiter development test objective on Space Transportation System (STS)-127, STS-133, STS-134, and STS-135. This document contains the outcome of the NESC assessment.
Zhang, Jiayu; Li, Jie; Zhang, Xi; Che, Xiaorui; Huang, Yugang; Feng, Kaiqiang
2018-05-04
The Semi-Strapdown Inertial Navigation System (SSINS) provides a new solution to attitude measurement of a high-speed rotating missile. However, micro-electro-mechanical-systems (MEMS) inertial measurement unit (MIMU) outputs are corrupted by significant sensor errors. In order to improve the navigation precision, a rotation modulation method called Rotation Semi-Strapdown Inertial Navigation System (RSSINS) is introduced into SINS. In practice, however, a stable modulation angular rate is difficult to achieve in a high-speed rotation environment, and the changing rotary angular rate affects the self-compensation of inertial sensor errors. In this paper, the influence of modulation angular rate error, including the acceleration-deceleration process and the instability of the angular rate, on the navigation accuracy of RSSINS is derived, and the error characteristics of the reciprocating rotation scheme are analyzed. A new compensation method is proposed to remove or reduce sensor errors so as to maintain high-precision autonomous navigation by the MIMU when no external aid is available. Experiments have been carried out to validate the performance of the method. In addition, the proposed method is applicable to modulation angular rate error compensation under various dynamic conditions.
Multi-Sensor Optimal Data Fusion Based on the Adaptive Fading Unscented Kalman Filter
Gao, Bingbing; Hu, Gaoge; Gao, Shesheng; Gu, Chengfan
2018-01-01
This paper presents a new optimal data fusion methodology based on the adaptive fading unscented Kalman filter for multi-sensor nonlinear stochastic systems. This methodology has a two-level fusion structure: at the bottom level, an adaptive fading unscented Kalman filter based on the Mahalanobis distance is developed and serves as local filters to improve the adaptability and robustness of local state estimations against process-modeling error; at the top level, an unscented transformation-based multi-sensor optimal data fusion for the case of N local filters is established according to the principle of linear minimum variance to calculate the globally optimal state estimation by fusion of the local estimations. The proposed methodology effectively suppresses the influence of process-modeling error on the fusion solution, leading to improved adaptability and robustness of data fusion for multi-sensor nonlinear stochastic systems, and it achieves globally optimal fusion results based on the principle of linear minimum variance. Simulation and experimental results demonstrate the efficacy of the proposed methodology for INS/GNSS/CNS (inertial navigation system/global navigation satellite system/celestial navigation system) integrated navigation. PMID:29415509
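The abstract gives no formulas; a common construction of a Mahalanobis-distance fading factor inflates the predicted covariance whenever the squared Mahalanobis distance of the innovation exceeds a chi-square gate. A hypothetical sketch, using a linear Kalman step as a stand-in for the unscented filter (all names and the gate value are illustrative):

```python
import numpy as np

def adaptive_fading_kf_step(x, P, z, F, H, Q, R, chi2_thresh=3.84):
    """One Kalman step with a Mahalanobis-distance fading factor:
    if the innovation is statistically too large, inflate the
    predicted covariance before computing the gain."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    innov = z - H @ x_pred
    S = H @ P_pred @ H.T + R
    d2 = float(innov @ np.linalg.inv(S) @ innov)  # squared Mahalanobis distance
    if d2 > chi2_thresh:                          # process-model mismatch suspected
        lam = d2 / chi2_thresh                    # fading factor > 1
        P_pred = lam * P_pred
        S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ innov
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

The inflation makes the filter weight the measurement more heavily exactly when the process model appears wrong, which is the adaptability property the paper claims for its local filters.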
Calibration Of An Omnidirectional Vision Navigation System Using An Industrial Robot
NASA Astrophysics Data System (ADS)
Oh, Sung J.; Hall, Ernest L.
1989-09-01
The characteristics of an omnidirectional vision navigation system were studied to determine position accuracy for the navigation and path control of a mobile robot. Experiments for calibration and other parameters were performed using an industrial robot to conduct repetitive motions. The accuracy and repeatability of the experimental setup and the alignment between the robot and the sensor gave errors of less than 1 pixel on each axis. Linearity between zenith angle and image location was tested at four different locations; angular error of less than 1° and radial error of less than 1 pixel were observed at moderate speed variations. The experimental information and the test of coordinated operation of the equipment provide an understanding of the system's characteristics as well as insight into the evaluation and improvement of the prototype dynamic omnivision system. Calibration of the sensor is important since the accuracy of navigation influences the accuracy of robot motion. This sensor system is currently being developed for a robot lawn mower; however, wider applications are obvious. The significance of this work is that it adds to the knowledge of the omnivision sensor.
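The linearity tested here between zenith angle and image location corresponds to the equidistant fisheye model r = k·θ; under that assumption (function and parameter names are illustrative, not from the paper), an image point maps back to a viewing direction as:

```python
import math

def pixel_to_direction(x, y, cx, cy, pixels_per_radian):
    """Invert the equidistant fisheye model r = k * zenith:
    map an image point to the (zenith, azimuth) of its viewing ray."""
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy)                  # radial distance from image center
    zenith = r / pixels_per_radian          # linear model: radius ∝ zenith angle
    azimuth = math.atan2(dy, dx)
    return zenith, azimuth
```

Calibration then amounts to estimating the image center (cx, cy) and the scale k, which is what the linearity experiments with the industrial robot verify.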
Doppler lidar sensor for precision navigation in GPS-deprived environment
NASA Astrophysics Data System (ADS)
Amzajerdian, F.; Pierrottet, D. F.; Hines, G. D.; Petway, L. B.; Barnes, B. W.
2013-05-01
Landing mission concepts that are being developed for exploration of solar system bodies are increasingly ambitious in their implementations and objectives. Most of these missions require accurate position and velocity data during their descent phase in order to ensure safe, soft landing at the pre-designated sites. Data from the vehicle's Inertial Measurement Unit will not be sufficient due to significant drift error after extended travel time in space. Therefore, an onboard sensor is required to provide the necessary data for landing in the GPS-deprived environment of space. For this reason, NASA Langley Research Center has been developing an advanced Doppler lidar sensor capable of providing accurate and reliable data suitable for operation in the highly constrained environment of space. The Doppler lidar transmits three laser beams in different directions toward the ground. The signal from each beam provides the platform velocity and range to the ground along the laser line-of-sight (LOS). The six LOS measurements are then combined in order to determine the three components of the vehicle velocity vector, and to accurately measure altitude and attitude angles relative to the local ground. These measurements are used by an autonomous Guidance, Navigation, and Control system to accurately navigate the vehicle from a few kilometers above the ground to the designated location and to execute a gentle touchdown. A prototype version of our lidar sensor has been completed for a closed-loop demonstration onboard a rocket-powered terrestrial free-flyer vehicle.
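With each line-of-sight measurement satisfying v_i = u_i · v, the three-component velocity vector follows from a small linear solve; a sketch assuming known beam unit vectors (the function and data layout are illustrative, not the flight algorithm):

```python
import numpy as np

def velocity_from_los(unit_vectors, los_velocities):
    """Recover the 3-component platform velocity from three (or more)
    line-of-sight Doppler measurements v_i = u_i . v, by least squares."""
    U = np.asarray(unit_vectors, dtype=float)     # one beam unit vector per row
    v_los = np.asarray(los_velocities, dtype=float)
    v, *_ = np.linalg.lstsq(U, v_los, rcond=None)
    return v
```

With exactly three non-coplanar beams the system is square and the solve is exact; the same ranges along each beam give altitude and attitude relative to the local ground by analogous geometry.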
Guidance Of A Mobile Robot Using An Omnidirectional Vision Navigation System
NASA Astrophysics Data System (ADS)
Oh, Sung J.; Hall, Ernest L.
1987-01-01
Navigation and visual guidance are key topics in the design of a mobile robot. Omnidirectional vision using a very wide angle or fisheye lens provides a hemispherical view at a single instant that permits target location without mechanical scanning. The inherent image distortion with this view and the numerical errors accumulated from vision components can be corrected to provide accurate position determination for navigation and path control. The purpose of this paper is to present the experimental results and analyses of the imaging characteristics of the omnivision system including the design of robot-oriented experiments and the calibration of raw results. Errors less than one picture element on each axis were observed by testing the accuracy and repeatability of the experimental setup and the alignment between the robot and the sensor. Similar results were obtained for four different locations using corrected results of the linearity test between zenith angle and image location. Angular error of less than one degree and radial error of less than one Y picture element were observed at moderate relative speed. The significance of this work is that the experimental information and the test of coordinated operation of the equipment provide a greater understanding of the dynamic omnivision system characteristics, as well as insight into the evaluation and improvement of the prototype sensor for a mobile robot. Also, the calibration of the sensor is important, since the results provide a cornerstone for future developments. This sensor system is currently being developed for a robot lawn mower.
Huang, Haoqian; Chen, Xiyuan; Zhang, Bo; Wang, Jian
2017-01-01
The underwater navigation system, consisting mainly of MEMS inertial sensors, is a key technology for the wide application of underwater gliders and plays an important role in achieving high-accuracy navigation and positioning over long periods. However, navigation errors accumulate over time because of the inherent errors of the inertial sensors, especially for the MEMS-grade IMU (Inertial Measurement Unit) generally used in gliders, so a dead-reckoning module is added to compensate for them. In the complicated underwater environment, the performance of MEMS sensors degrades sharply and the errors grow much larger, and it is difficult to establish an accurate, fixed error model for the inertial sensors; it is therefore very hard to improve the accuracy of the navigation information they produce. To solve this problem, a more suitable filter integrating the multi-model method with an EKF approach can be designed according to the different error models, giving an optimal estimate of the state; the key parameters of the error models determine the corresponding filter. The Adams explicit formula, which offers high-precision prediction, is fused into this filter to further improve attitude estimation accuracy. The proposed algorithm has been proved through theoretical analysis and tested in both vehicle experiments and lake trials. Results show that the proposed method is more accurate and effective in attitude estimation than the other methods discussed in the paper for inertial navigation applied to underwater gliders.
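The "Adams explicit formula" is the classical explicit Adams-Bashforth predictor; the abstract does not give the fusion details, but the 4-step formula itself, bootstrapped with RK4, can be sketched as (a generic integrator, not the paper's filter):

```python
def adams_bashforth4(f, t0, y0, dt, n_steps):
    """Integrate y' = f(t, y) with the 4-step explicit Adams formula,
    bootstrapping the first three steps with classical RK4."""
    def rk4_step(t, y):
        k1 = f(t, y)
        k2 = f(t + dt / 2, y + dt * k1 / 2)
        k3 = f(t + dt / 2, y + dt * k2 / 2)
        k4 = f(t + dt, y + dt * k3)
        return y + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6

    ts, ys = [t0], [y0]
    for _ in range(3):                       # RK4 start-up values
        ys.append(rk4_step(ts[-1], ys[-1]))
        ts.append(ts[-1] + dt)
    fs = [f(t, y) for t, y in zip(ts, ys)]
    while len(ys) <= n_steps:
        # 4-step explicit Adams-Bashforth predictor
        y_next = ys[-1] + dt / 24 * (55 * fs[-1] - 59 * fs[-2]
                                     + 37 * fs[-3] - 9 * fs[-4])
        ys.append(y_next)
        ts.append(ts[-1] + dt)
        fs.append(f(ts[-1], ys[-1]))
    return ts, ys
```

Its appeal for the attitude-prediction step is that it reuses previously evaluated derivatives, so each new prediction costs only one function evaluation while retaining fourth-order accuracy.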
Integration of Kinect and Low-Cost Gnss for Outdoor Navigation
NASA Astrophysics Data System (ADS)
Pagliaria, D.; Pinto, L.; Reguzzoni, M.; Rossi, L.
2016-06-01
Since its launch on the market, the Microsoft Kinect sensor has represented a great revolution in the field of low-cost navigation, especially for indoor robotic applications. The system couples a depth camera with a visual RGB camera at a cost of about 200. The characteristics and potential of the Kinect sensor have been widely studied for indoor applications, and the second generation of the sensor has been announced to be capable of acquiring data even outdoors, under direct sunlight. Navigating across the transition from an indoor to an outdoor environment (and vice versa) is very demanding, because sensors that work properly in one environment are typically unsuitable in the other; in this sense the Kinect could be an interesting device for bridging the navigation solution between outdoor and indoor. In this work the accuracy and the field of application of the new generation of Kinect sensor have been tested outdoors, considering different lighting conditions and the reflective properties of different materials under the emitted ray. Moreover, an integrated system with a low-cost GNSS receiver has been studied, with the aim of exploiting GNSS positioning whenever satellite visibility is good enough. A kinematic test performed outdoors using a Kinect sensor and a GNSS receiver is presented here.
Achieving Real-Time Tracking Mobile Wireless Sensors Using SE-KFA
NASA Astrophysics Data System (ADS)
Kadhim Hoomod, Haider, Dr.; Al-Chalabi, Sadeem Marouf M.
2018-05-01
Nowadays, real-time performance is important in many fields: automatic transport control, some medical applications, celestial body tracking, agent movement control, detection and monitoring, and the like. It can be achieved with different kinds of detection devices, known as sensors, such as infrared sensors, ultrasonic sensors, radar, and laser-light sensors. The ultrasonic sensor is the most fundamental of these, and it poses great challenges compared with the others, especially when navigating as an agent. In this paper, ultrasonic sensors detect and delimit an area by themselves and then navigate inside it, estimating position in real time using the speed equation together with the Kalman filter algorithm as an intelligent estimator; the error is then computed against the factual tracking rate. The implementation uses an Ultrasonic Sensor HC-SR04 with an Arduino UNO as microcontroller.
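The paper's exact combination of the speed equation with the Kalman filter is not given in the abstract; a standard constant-velocity Kalman filter over ultrasonic range readings illustrates the idea (function name and the noise parameters q and r are illustrative guesses):

```python
import numpy as np

def track_range(measurements, dt, q=0.5, r=4.0):
    """Track [range, speed] from noisy ultrasonic range readings with a
    constant-velocity Kalman filter; the 'speed equation' is the
    kinematic model range_k = range_{k-1} + speed * dt."""
    F = np.array([[1.0, dt], [0.0, 1.0]])       # state transition
    H = np.array([[1.0, 0.0]])                  # only range is measured
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],   # process noise
                      [dt**2 / 2, dt]])
    R = np.array([[r]])                         # measurement noise
    x = np.array([measurements[0], 0.0])
    P = np.eye(2) * 10.0
    estimates = []
    for z in measurements:
        x = F @ x                               # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                     # update
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([z]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x.copy())
    return estimates
```

The same structure runs comfortably on a microcontroller, since each step is a handful of 2x2 matrix operations.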
Attitude Estimation in Fractionated Spacecraft Cluster Systems
NASA Technical Reports Server (NTRS)
Hadaegh, Fred Y.; Blackmore, James C.
2011-01-01
Attitude estimation was examined for fractionated free-flying spacecraft. Instead of a single, monolithic spacecraft, a fractionated free-flying spacecraft uses multiple spacecraft modules connected only through wireless communication links and, potentially, wireless power links. The key advantage of this concept is the ability to respond to uncertainty: for example, if a single spacecraft module in the cluster fails, a new one can be launched at lower cost and risk than would be incurred with on-orbit servicing or replacement of a monolithic spacecraft. In order to create such a system, however, it is essential to know the navigation capabilities of the fractionated system as a function of the capabilities of the individual modules, and to have an algorithm that can estimate the attitudes and relative positions of the modules with fractionated sensing capabilities. Looking specifically at fractionated attitude estimation with star trackers and optical relative attitude sensors, a set of mathematical tools has been developed that specifies the set of sensors necessary to ensure that the attitude of the entire cluster ("cluster attitude") can be observed, along with a navigation filter that can estimate the cluster attitude when these conditions are satisfied. Each module in the cluster may have a star tracker, a relative attitude sensor, or both, and an extended Kalman filter can be used to estimate the attitude of all modules. A range of estimation performances can be achieved depending on the sensors used and the topology of the sensing network.
Song, Tianxiao; Wang, Xueyun; Liang, Wenwei; Xing, Li
2018-05-14
Benefiting from its frame structure, a Rotational Inertial Navigation System (RINS) can improve navigation accuracy by modulating the inertial sensor errors with a proper rotation scheme. In the traditional motor control method, the measurements of the photoelectric encoder are adopted to drive the rotation of the inertial measurement unit (IMU). However, when the carrier conducts heading motion, the inertial sensor errors may no longer be zero-mean in the navigation coordinate frame. Meanwhile, some high-speed carriers such as aircraft need to roll to a certain angle to balance the centrifugal force during heading motion, which may result in non-negligible coupling errors caused by the fiber optic gyroscope (FOG) installation errors and scale factor errors. Moreover, the error parameters of the FOG are susceptible to temperature and magnetic field, and pre-calibration is a time-consuming process that cannot completely suppress the FOG-related errors. In this paper, an improved motor control method using the measurements of the FOG is proposed to address these problems: the outer frame insulates the carrier's roll motion while the inner frame simultaneously achieves rotary modulation on the basis of insulating the heading motion. The results of turntable experiments indicate that the navigation performance of the dual-axis RINS is significantly improved over the traditional method and is maintained even with large FOG installation errors and scale factor errors, proving that the proposed method relaxes the requirements on the accuracy of the FOG-related error parameters.
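The premise shared by these rotary-modulation abstracts, that rotation averages out inertial sensor errors, can be illustrated directly: a constant bias fixed in the rotating IMU frame maps through the rotation into the navigation frame, and its components average to zero over a full revolution (a toy 2-D sketch, not the paper's dual-axis scheme):

```python
import numpy as np

def modulated_bias_average(b_xy, n=360):
    """Average, over one full revolution sampled at n angles, of a
    constant IMU-frame bias b_xy rotated into the navigation frame."""
    thetas = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    total = np.zeros(2)
    for th in thetas:
        R = np.array([[np.cos(th), -np.sin(th)],
                      [np.sin(th),  np.cos(th)]])
        total += R @ b_xy          # bias as seen in the navigation frame
    return total / n
```

A constant rotation rate makes this cancellation exact over each revolution, which is why an unstable or interrupted modulation rate (the problem both papers attack) degrades the compensation.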
NASA Technical Reports Server (NTRS)
Pastor, P. Rick; Bishop, Robert H.; Striepe, Scott A.
2000-01-01
A first-order simulation analysis of the navigation accuracy expected from various Navigation Quick-Look data sets is performed. Here, quick-look navigation data are observations obtained from hypothetical telemetered data transmitted on the fly during a Mars probe's atmospheric entry. In this simulation study, the navigation data consist of 3-axis accelerometer sensor data and attitude information. Three entry vehicle guidance types are studied: I. a maneuvering entry vehicle (as with Mars '01 guidance, where angle of attack and bank angle are controlled); II. a zero angle-of-attack controlled entry vehicle (as with Mars '98); and III. a ballistic, or spin-stabilized, entry vehicle (as with Mars Pathfinder). For each type, sensitivity to progressively under-sampled navigation data and to the inclusion of sensor errors is characterized. Attempts to mitigate the reconstructed trajectory errors, including smoothing, interpolation, and changing integrator characteristics, are also studied.
Analysis of a novel device-level SINS/ACFSS deeply integrated navigation method
NASA Astrophysics Data System (ADS)
Zhang, Hao; Qin, Shiqiao; Wang, Xingshu; Jiang, Guangwen; Tan, Wenfeng; Wu, Wei
2017-02-01
The combination of the strap-down inertial navigation system (SINS) and the celestial navigation system (CNS) is one of the popular measures to constitute an integrated navigation system. A star sensor (SS) is used as a precise attitude determination device in CNS. To solve the problem that the star image obtained by the SS is motion-blurred under dynamic conditions, the attitude-correlated frames (ACF) approach is presented, and a star sensor that works on the ACF approach is named ACFSS. Based on the ACF approach, a novel device-level SINS/ACFSS deeply integrated navigation method is proposed in this paper. Feedback to the ACF process from the gyro error is one of the typical characteristics of the SINS/CNS deeply integrated navigation method. Simulation results verify the method's validity and efficiency in improving gyro accuracy and show that it is feasible.
Prol, Fabricio dos Santos; El Issaoui, Aimad; Hakala, Teemu
2018-01-01
The use of Personal Mobile Terrestrial System (PMTS) has increased considerably for mobile mapping applications because these systems offer dynamic data acquisition with ground perspective in places where the use of wheeled platforms is unfeasible, such as forests and indoor buildings. PMTS has become more popular with emerging technologies, such as miniaturized navigation sensors and off-the-shelf omnidirectional cameras, which enable low-cost mobile mapping approaches. However, most of these sensors have not been developed for high-accuracy metric purposes and therefore require rigorous methods of data acquisition and data processing to obtain satisfactory results for some mapping applications. To contribute to the development of light, low-cost PMTS and potential applications of these off-the-shelf sensors for forest mapping, this paper presents a low-cost PMTS approach comprising an omnidirectional camera with off-the-shelf navigation systems and its evaluation in a forest environment. Experimental assessments showed that the integrated sensor orientation approach using navigation data as the initial information can increase the trajectory accuracy, especially in covered areas. The point cloud generated with the PMTS data had accuracy consistent with the Ground Sample Distance (GSD) range of omnidirectional images (3.5–7 cm). These results are consistent with those obtained for other PMTS approaches. PMID:29522467
Performance analysis of device-level SINS/ACFSS deeply integrated navigation method
NASA Astrophysics Data System (ADS)
Zhang, Hao; Qin, Shiqiao; Wang, Xingshu; Jiang, Guangwen; Tan, Wenfeng
2016-10-01
The Strap-Down Inertial Navigation System (SINS) is a widely used navigation system. The combination of SINS and the Celestial Navigation System (CNS) is one of the popular measures to constitute an integrated navigation system. A Star Sensor (SS) is used as a precise attitude determination device in CNS. To solve the problem that the star image obtained by an SS under dynamic conditions is motion-blurred, the Attitude-Correlated Frames (ACF) approach is presented; a star sensor that works based on the ACF approach is named an ACFSS. Building on the ACF approach, a novel device-level SINS/ACFSS deeply integrated navigation method is proposed in this paper. Feedback to the ACF process from the gyro error is one of the distinguishing characteristics of the SINS/CNS deeply integrated navigation method. Simulation results verify the method's validity and its efficiency in improving gyro accuracy, demonstrating that the method is feasible in theory.
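The core ACF idea, co-adding short, sharp exposures after undoing the gyro-measured frame-to-frame motion, can be illustrated with a one-dimensional shift-and-add toy. This is only a sketch of the deblurring principle, not the actual ACF correlation algorithm; the frame length, pixel shifts, and noise level are invented for the example.

```python
import numpy as np

def acf_coadd(frames, shifts):
    """Co-add short exposures after removing the known (gyro-derived)
    shift of each frame, so the star peak stays sharp."""
    stacked = np.zeros_like(frames[0])
    for f, s in zip(frames, shifts):
        stacked += np.roll(f, -s)
    return stacked

rng = np.random.default_rng(2)
star = np.zeros(64)
star[32] = 100.0                               # single star on a dark field
shifts = [0, 1, 2, 3, 4]                       # pixels of drift per frame
frames = [np.roll(star, s) + rng.normal(0, 1, 64) for s in shifts]

blurred = np.sum(frames, axis=0)               # naive sum smears the star
corrected = acf_coadd(frames, shifts)          # shift-and-add keeps it sharp
print(corrected.argmax(), corrected.max() > blurred.max())
```

With the shifts undone, the five 100-count peaks stack at the same pixel instead of being smeared across five pixels.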
NASA Technical Reports Server (NTRS)
Schenker, Paul S. (Editor)
1991-01-01
The volume on data fusion from multiple sources discusses fusing multiple views, temporal analysis and 3D motion interpretation, sensor fusion and eye-to-hand coordination, and integration in human shape perception. Attention is given to surface reconstruction, statistical methods in sensor fusion, fusing sensor data with environmental knowledge, computational models for sensor fusion, and evaluation and selection of sensor fusion techniques. Topics addressed include the structure of a scene from two and three projections, optical flow techniques for moving target detection, tactical sensor-based exploration in a robotic environment, and the fusion of human and machine skills for remote robotic operations. Also discussed are K-nearest-neighbor concepts for sensor fusion, surface reconstruction with discontinuities, a sensor-knowledge-command fusion paradigm for man-machine systems, coordinating sensing and local navigation, and terrain map matching using multisensing techniques for applications to autonomous vehicle navigation.
Integrated GNSS Attitude Determination and Positioning for Direct Geo-Referencing
Nadarajah, Nandakumaran; Paffenholz, Jens-André; Teunissen, Peter J. G.
2014-01-01
Direct geo-referencing is an efficient methodology for the fast acquisition of 3D spatial data. It requires the fusion of spatial data acquisition sensors with navigation sensors, such as Global Navigation Satellite System (GNSS) receivers. In this contribution, we consider an integrated GNSS navigation system to provide estimates of the position and attitude (orientation) of a 3D laser scanner. The proposed multi-sensor system (MSS) consists of multiple GNSS antennas rigidly mounted on the frame of a rotating laser scanner and a reference GNSS station with known coordinates. Precise GNSS navigation requires the resolution of the carrier phase ambiguities. The proposed method uses the multivariate constrained integer least-squares (MC-LAMBDA) method for the estimation of rotating frame ambiguities and attitude angles. MC-LAMBDA makes use of the known antenna geometry to strengthen the underlying attitude model and, hence, to enhance the reliability of rotating frame ambiguity resolution and attitude determination. The reliable estimation of rotating frame ambiguities is consequently utilized to enhance the relative positioning of the rotating frame with respect to the reference station. This integrated (array-aided) method improves ambiguity resolution, as well as positioning accuracy between the rotating frame and the reference station. Numerical analyses of GNSS data from a real-data campaign confirm the improved performance of the proposed method over the existing method. In particular, the integrated method yields reliable ambiguity resolution and reduces position standard deviation by a factor of about 0.8, matching the theoretical gain of √(3/4) for two antennas on the rotating frame and a single antenna at the reference station. PMID:25036330
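The quoted gain of about 0.8 follows from simple error propagation: averaging two equally noisy rover-antenna solutions against one reference antenna reduces the baseline error variance from 2σ² to 1.5σ², a standard-deviation ratio of √(3/4) ≈ 0.866. A quick Monte Carlo check (noise level and trial count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, n = 0.01, 200_000           # per-antenna noise (m) and trial count

ref  = rng.normal(0.0, sigma, n)   # reference-station position error
rov1 = rng.normal(0.0, sigma, n)   # rover antenna 1 error
rov2 = rng.normal(0.0, sigma, n)   # rover antenna 2 error

single  = rov1 - ref                   # baseline using one rover antenna
arrayed = 0.5 * (rov1 + rov2) - ref    # baseline using the antenna average

gain = arrayed.std() / single.std()    # expected: sqrt(1.5/2) = sqrt(3/4)
print(abs(gain - (3.0 / 4.0) ** 0.5) < 0.01)
```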
INS integrated motion analysis for autonomous vehicle navigation
NASA Technical Reports Server (NTRS)
Roberts, Barry; Bazakos, Mike
1991-01-01
The use of inertial navigation system (INS) measurements to enhance the quality and robustness of motion analysis techniques used for obstacle detection is discussed with particular reference to autonomous vehicle navigation. The approach to obstacle detection used here employs motion analysis of imagery generated by a passive sensor. Motion analysis of imagery obtained during vehicle travel is used to generate range measurements to points within the field of view of the sensor, which can then be used to provide obstacle detection. Results obtained with an INS integrated motion analysis approach are reviewed.
Autonomous Robot Navigation in Human-Centered Environments Based on 3D Data Fusion
NASA Astrophysics Data System (ADS)
Steinhaus, Peter; Strand, Marcus; Dillmann, Rüdiger
2007-12-01
Efficient navigation of mobile platforms in dynamic human-centered environments is still an open research topic. We have already proposed an architecture (MEPHISTO) for a navigation system that is able to fulfill the main requirements of efficient navigation: fast and reliable sensor processing, extensive global world modeling, and distributed path planning. Our architecture uses a distributed system of sensor processing, world modeling, and path planning units. In this article, we present implemented methods in the context of data fusion algorithms for 3D world modeling and real-time path planning. We also show results of the prototypic application of the system at the museum ZKM (Center for Art and Media) in Karlsruhe.
Vision Based Navigation for Autonomous Cooperative Docking of CubeSats
NASA Astrophysics Data System (ADS)
Pirat, Camille; Ankersen, Finn; Walker, Roger; Gass, Volker
2018-05-01
A realistic rendezvous and docking navigation solution applicable to CubeSats is investigated. A scalability analysis of the ESA Automated Transfer Vehicle Guidance, Navigation & Control (GNC) performance and of the Russian docking system shows that docking two CubeSats would require lateral control performance of the order of 1 cm. Line-of-sight constraints and multipath effects affecting Global Navigation Satellite System (GNSS) measurements in close proximity prevent the use of this sensor for the final approach. This consideration, together with the high control accuracy requirement, led to the use of vision sensors for the final 10 m of the rendezvous and docking sequence. A single monocular camera on the chaser satellite and various sets of Light-Emitting Diodes (LEDs) on the target vehicle ensure the observability of the system throughout the approach trajectory. The simple and novel formulation of the measurement equations unambiguously differentiates rotations from translations between the target and chaser docking ports and enables navigation performance better than 1 mm at docking. Furthermore, the non-linear measurement equations can be solved to provide an analytic navigation solution. This solution can be used to monitor the navigation filter solution and ensure its stability, adding an extra layer of robustness for autonomous rendezvous and docking. The navigation filter initialization is addressed in detail. The proposed method is able to differentiate LED signals from Sun reflections, as demonstrated with experimental data. The navigation filter uses comprehensive linearised coupled rotation/translation dynamics describing the chaser-to-target docking port motion. The handover between GNSS and vision sensor measurements is assessed, and the performance of the navigation function along the approach trajectory is discussed.
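For intuition about camera-based ranging during such an approach: under a pinhole-camera assumption, range can be recovered from the apparent separation of two LEDs of known spacing. This one-line model is only a geometric sketch, not the paper's coupled rotation/translation formulation; the focal length, LED baseline, and pixel separation below are invented numbers.

```python
def led_range(focal_px: float, baseline_m: float, pixel_sep: float) -> float:
    """Pinhole model: pixel_sep = focal_px * baseline_m / range,
    so range = focal_px * baseline_m / pixel_sep."""
    return focal_px * baseline_m / pixel_sep

# Hypothetical numbers: 1000 px focal length, LEDs 10 cm apart,
# imaged 50 px apart -> 2 m range.
print(led_range(1000.0, 0.10, 50.0))   # 2.0
```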
Optical Displacement Sensor for Sub-Hertz Applications
NASA Technical Reports Server (NTRS)
Abramovici, Alexander; Chiao, Meng P.; Dekens, Frank G.
2008-01-01
A document discusses a sensor made from off-the-shelf electro-optical photodiodes and electronics that achieves 20 nm/√Hz displacement sensitivity at 1 mHz. This innovation was created using a fiber-coupled laser diode (or Nd:YAG) through a collimator and an aperture as the illumination source. Together with a germanium quad photodiode, the above-mentioned displacement sensor sensitivities have been achieved. This system was designed to aid the Laser Interferometer Space Antenna (LISA) with microthruster tests and to be a backup sensor for monitoring the relative position between a proof mass and a spacecraft for drag-free navigation. The optical displacement sensor can be used to monitor any small displacement from a remote location with minimal invasion on the system.
NASA Technical Reports Server (NTRS)
Strube, Matthew; Henry, Ross; Skelton, Eugene; Van Eepoel, John; Gill, Nat; McKenna, Reed
2015-01-01
Since the last Hubble Servicing Mission five years ago, the Satellite Servicing Capabilities Office (SSCO) at the NASA Goddard Space Flight Center (GSFC) has been focusing on maturing the technologies necessary to robotically service orbiting legacy assets-spacecraft not necessarily designed for in-flight service. Raven, SSCO's next orbital experiment to the International Space Station (ISS), is a real-time autonomous non-cooperative relative navigation system that will mature the estimation algorithms required for rendezvous and proximity operations for a satellite-servicing mission. Raven will fly as a hosted payload as part of the Space Test Program's STP-H5 mission, which will be mounted on an external ExPRESS Logistics Carrier (ELC) and will image the many visiting vehicles arriving and departing from the ISS as targets for observation. Raven will host multiple sensors: a visible camera with a variable field of view lens, a long-wave infrared camera, and a short-wave flash lidar. This sensor suite can be pointed via a two-axis gimbal to provide a wide field of regard to track the visiting vehicles as they make their approach. Various real-time vision processing algorithms will produce range, bearing, and six degree of freedom pose measurements that will be processed in a relative navigation filter to produce an optimal relative state estimate. In this overview paper, we will cover top-level requirements, experimental concept of operations, system design, and the status of Raven integration and test activities.
Navigation Algorithms for the SeaWiFS Mission
NASA Technical Reports Server (NTRS)
Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Patt, Frederick S.; McClain, Charles R. (Technical Monitor)
2002-01-01
The navigation algorithms for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) were designed to meet the requirement of 1-pixel accuracy, i.e., a standard deviation (sigma) of 2. The objective has been to extract the best possible accuracy from the spacecraft telemetry and avoid the need for costly manual renavigation or geometric rectification. The requirement is addressed by postprocessing of both the Global Positioning System (GPS) receiver and Attitude Control System (ACS) data in the spacecraft telemetry stream. The navigation algorithms described are separated into four areas: orbit processing, attitude sensor processing, attitude determination, and final navigation processing. There has been substantial modification during the mission of the attitude determination and attitude sensor processing algorithms. For the former, the basic approach was completely changed during the first year of the mission, from a single-frame deterministic method to a Kalman smoother. This was done for several reasons: a) to improve the overall accuracy of the attitude determination, particularly near the sub-solar point; b) to reduce discontinuities; c) to support the single-ACS-string spacecraft operation that was started after the first mission year, which causes gaps in attitude sensor coverage; and d) to handle data quality problems (which became evident after launch) in the direct-broadcast data. The changes to the attitude sensor processing algorithms primarily involved the development of a model for the Earth horizon height, also needed for single-string operation; the incorporation of improved sensor calibration data; and improved data quality checking and smoothing to handle the data quality issues. The attitude sensor alignments have also been revised multiple times, generally in conjunction with the other changes. The orbit and final navigation processing algorithms have remained largely unchanged during the mission, aside from refinements to data quality checking.
Although further improvements are certainly possible, future evolution of the algorithms is expected to be limited to refinements of the methods presented here, and no substantial changes are anticipated.
Zhang, Jiayu; Li, Jie; Zhang, Xi; Che, Xiaorui; Huang, Yugang; Feng, Kaiqiang
2018-01-01
The Semi-Strapdown Inertial Navigation System (SSINS) provides a new solution to attitude measurement of a high-speed rotating missile. However, micro-electro-mechanical-systems (MEMS) inertial measurement unit (MIMU) outputs are corrupted by significant sensor errors. In order to improve the navigation precision, a rotation modulation method called the Rotation Semi-Strapdown Inertial Navigation System (RSSINS) is introduced into SINS. In practice, however, a stable modulation angular rate is difficult to achieve in a high-speed rotation environment, and the changing rotary angular rate affects the inertial sensor error self-compensation. In this paper, the influence of modulation angular rate error, including the acceleration-deceleration process and the instability of the angular rate, on the navigation accuracy of RSSINS is derived, and the error characteristics of the reciprocating rotation scheme are analyzed. A new compensation method is proposed to remove or reduce sensor errors so as to maintain high-precision autonomous navigation performance with the MIMU when no external aid is available. Experiments have been carried out to validate the performance of the method. In addition, the proposed method is applicable for modulation angular rate error compensation under various dynamic conditions. PMID:29734707
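The benefit of rotation modulation, and why the modulation rate matters, can be seen in a small numerical sketch: a constant body-frame sensor bias, projected into the navigation frame through an ideal constant-rate rotation, integrates to nearly zero over whole revolutions, whereas without rotation it accumulates linearly. The bias value, spin rate, and time step are arbitrary illustration values, not those of the paper's missile scenario.

```python
import numpy as np

def modulated_bias_drift(bias_xy, rate_rad_s, t_end, dt=0.001):
    """Accumulate a constant body-frame bias into the navigation frame
    while the sensor block spins at a constant rate."""
    t = np.arange(0.0, t_end, dt)
    c, s = np.cos(rate_rad_s * t), np.sin(rate_rad_s * t)
    bx, by = bias_xy
    drift_x = np.sum(c * bx - s * by) * dt
    drift_y = np.sum(s * bx + c * by) * dt
    return drift_x, drift_y

# No rotation: a 0.01 (unit/s) bias integrates linearly over 10 s.
dx, dy = modulated_bias_drift((0.01, 0.0), 0.0, 10.0)
# Spinning at 1 rev/s: the same bias averages out over whole revolutions.
mx, my = modulated_bias_drift((0.01, 0.0), 2.0 * np.pi, 10.0)
print(round(dx, 6), abs(mx) < 1e-6)
```

An unstable spin rate breaks the exact cancellation over each revolution, which is the error source the paper's compensation method targets.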
Coordinating sensing and local navigation
NASA Technical Reports Server (NTRS)
Slack, Marc G.
1991-01-01
Based on Navigation Templates (or NaTs), this work presents a new paradigm for local navigation which addresses the noisy and uncertain nature of sensor data. Rather than creating a new navigation plan each time the robot's perception of the world changes, the technique incorporates perceptual changes directly into the existing navigation plan. In this way, the robot's navigation plan is quickly and continuously modified, resulting in actions that remain coordinated with its changing perception of the world.
Preliminary Design of the Guidance, Navigation, and Control System of the Altair Lunar Lander
NASA Technical Reports Server (NTRS)
Lee, Allan Y.; Ely, Todd; Sostaric, Ronald; Strahan, Alan; Riedel, Joseph E.; Ingham, Mitch; Wincentsen, James; Sarani, Siamak
2010-01-01
Guidance, Navigation, and Control (GN&C) is the measurement and control of spacecraft position, velocity, and attitude in support of mission objectives. This paper provides an overview of a preliminary design of the GN&C system of the Lunar Lander Altair. Key functions performed by the GN&C system in various mission phases will first be described. A set of placeholder GN&C sensors that is needed to support these functions is next described. To meet Crew safety requirements, there must be high degrees of redundancy in the selected sensor configuration. Two sets of thrusters, one on the Ascent Module (AM) and the other on the Descent Module (DM), will be used by the GN&C system. The DM thrusters will be used, among other purposes, to perform course correction burns during the Trans-lunar Coast. The AM thrusters will be used, among other purposes, to perform precise angular and translational controls of the ascent module in order to dock the ascent module with Orion. Navigation is the process of measurement and control of the spacecraft's "state" (both the position and velocity vectors of the spacecraft). Tracking data from the Earth-Based Ground System (tracking antennas) as well as data from onboard optical sensors will be used to estimate the vehicle state. A driving navigation requirement is to land Altair on the Moon with a landing accuracy that is better than 1 km (radial 95%). Preliminary performance of the Altair GN&C design, relative to this and other navigation requirements, will be given. Guidance is the onboard process that uses the estimated state vector, crew inputs, and pre-computed reference trajectories to guide both the rotational and the translational motions of the spacecraft during powered flight phases. Design objectives of reference trajectories for various mission phases vary. 
For example, the reference trajectory for the descent "approach" phase (the last 3-4 minutes before touchdown) will sacrifice fuel utilization efficiency in order to provide landing site visibility for both the crew and the terrain hazard detection sensor system. One output of Guidance is the steering angle commands sent to the 2 degree-of-freedom (dof) gimbal actuation system of the descent engine. The engine gimbal actuation system is controlled by a Thrust Vector Control algorithm that is designed taking into account the large quantities of sloshing liquids in tanks mounted on Altair. In this early design phase of Altair, the GN&C system is described only briefly in this paper and the emphasis is on the GN&C architecture (that is still evolving). Multiple companion papers will provide details that are related to navigation, optical navigation, guidance, fuel sloshing, rendezvous and docking, machine-pilot interactions, and others. The similarities and differences of GN&C designs for Lunar and Mars landers are briefly compared.
NASA Technical Reports Server (NTRS)
Kanning, G.; Cicolani, L. S.; Schmidt, S. F.
1983-01-01
Translational state estimation in terminal area operations, using a set of commonly available position, air data, and acceleration sensors, is described. Kalman filtering is applied to obtain maximum estimation accuracy from the sensors but feasibility in real-time computations requires a variety of approximations and devices aimed at minimizing the required computation time with only negligible loss of accuracy. Accuracy behavior throughout the terminal area, its relation to sensor accuracy, its effect on trajectory tracking errors and control activity in an automatic flight control system, and its adequacy in terms of existing criteria for various terminal area operations are examined. The principal investigative tool is a simulation of the system.
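A minimal one-dimensional version of the filtering idea, fusing noisy position fixes with acceleration measurements used as a control input, can be sketched as follows. The model, noise levels, and tuning are invented for illustration and are far simpler than the terminal-area filter described above.

```python
import numpy as np

def kf_track(zs, accels, dt=0.1, r=1.0, q=0.01):
    """1-D Kalman filter: state [position, velocity], measured
    acceleration as control input, noisy position fixes as updates."""
    F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])            # acceleration input map
    H = np.array([[1.0, 0.0]])                 # we observe position only
    Q = q * np.eye(2)                          # process noise (tuning)
    x, P = np.zeros(2), np.eye(2)
    for z, a in zip(zs, accels):
        x = F @ x + B * a                      # predict with accel input
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + r                    # innovation covariance
        K = (P @ H.T) / S                      # Kalman gain
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x

rng = np.random.default_rng(1)
t = np.arange(0.0, 20.0, 0.1)
true_pos = 0.5 * 0.2 * t**2                    # constant 0.2 m/s^2 motion
zs = true_pos + rng.normal(0.0, 1.0, t.size)   # 1 m noise on position fixes
est = kf_track(zs, np.full(t.size, 0.2))
print(abs(est[0] - true_pos[-1]) < 2.0)
```

The approximations mentioned in the abstract (simplified gains, reduced update rates) trade a little of this optimal accuracy for real-time feasibility.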
Integrating Terrain Maps Into a Reactive Navigation Strategy
NASA Technical Reports Server (NTRS)
Howard, Ayanna; Werger, Barry; Seraji, Homayoun
2006-01-01
An improved method of processing information for autonomous navigation of a robotic vehicle across rough terrain involves the integration of terrain maps into a reactive navigation strategy. Somewhat more precisely, the method involves the incorporation, into navigation logic, of data equivalent to regional traversability maps. The terrain characteristic is mapped using a fuzzy-logic representation of the difficulty of traversing the terrain. The method is robust in that it integrates a global path-planning strategy with sensor-based regional and local navigation strategies to ensure a high probability of success in reaching a destination and avoiding obstacles along the way. The sensor-based strategies use cameras aboard the vehicle to observe the regional terrain, defined as the area of the terrain that covers the immediate vicinity near the vehicle to a specified distance a few meters away.
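The fuzzy-logic representation of traversal difficulty can be sketched with ramp membership functions combined by a min (a common fuzzy AND). The slope and roughness thresholds below are invented illustration values, not those of the cited navigation strategy.

```python
def traversability(slope_deg: float, roughness: float) -> float:
    """Fuzzy-style traversability in [0, 1]: 1 = easy, 0 = impassable."""
    def safe(x: float, lo: float, hi: float) -> float:
        # Membership: fully safe below lo, ramps down to 0 at hi.
        if x <= lo:
            return 1.0
        if x >= hi:
            return 0.0
        return (hi - x) / (hi - lo)
    # Fuzzy AND via min: terrain is only as traversable as its worst factor.
    return min(safe(slope_deg, 5.0, 30.0), safe(roughness, 0.1, 0.5))

print(traversability(0.0, 0.0))    # 1.0: flat and smooth
print(traversability(40.0, 0.0))   # 0.0: too steep regardless of roughness
```

A regional traversability map is then just this index evaluated over terrain cells, which the planner can consult when scoring candidate paths.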
Strapdown cost trend study and forecast
NASA Technical Reports Server (NTRS)
Eberlein, A. J.; Savage, P. G.
1975-01-01
The potential cost advantages offered by advanced strapdown inertial technology in future commercial short-haul aircraft are summarized. The initial procurement cost and six-year cost of ownership, which includes spares and direct maintenance cost, were calculated for kinematic and inertial navigation systems so that traditional and strapdown mechanization costs could be compared. Cost results for the inertial navigation systems showed that the initial cost and cost of ownership for traditional triple-redundant gimbaled inertial navigators are three times those of the equivalent skewed-redundant strapdown inertial navigator. The net cost advantage for the strapdown kinematic system is directly attributable to the reduction in sensor count for strapdown. The strapdown kinematic system has the added advantage of providing a fail-operational inertial navigation capability at no additional cost, owing to the use of inertial-grade sensors and attitude reference computers.
NASA Astrophysics Data System (ADS)
Welch, Sharon S.
Topics discussed in this volume include aircraft guidance and navigation, optics for visual guidance of aircraft, spacecraft and missile guidance and navigation, lidar and ladar systems, microdevices, gyroscopes, cockpit displays, and automotive displays. Papers are presented on optical processing for range and attitude determination, aircraft collision avoidance using a statistical decision theory, a scanning laser aircraft surveillance system for carrier flight operations, star sensor simulation for astroinertial guidance and navigation, autonomous millimeter-wave radar guidance systems, and a 1.32-micron long-range solid state imaging ladar. Attention is also given to a microfabricated magnetometer using Young's modulus changes in magnetoelastic materials, an integrated microgyroscope, a pulsed diode ring laser gyroscope, self-scanned polysilicon active-matrix liquid-crystal displays, the history and development of coated contrast enhancement filters for cockpit displays, and the effect of the display configuration on the attentional sampling performance. (For individual items see A93-28152 to A93-28176, A93-28178 to A93-28180)
ARK: Autonomous mobile robot in an industrial environment
NASA Technical Reports Server (NTRS)
Nickerson, S. B.; Jasiobedzki, P.; Jenkin, M.; Jepson, A.; Milios, E.; Down, B.; Service, J. R. R.; Terzopoulos, D.; Tsotsos, J.; Wilkes, D.
1994-01-01
This paper describes research on the ARK (Autonomous Mobile Robot in a Known Environment) project. The technical objective of the project is to build a robot that can navigate in a complex industrial environment using maps with permanent structures. The environment is not altered in any way by adding easily identifiable beacons; instead, the robot relies on naturally occurring objects to use as visual landmarks for navigation. The robot is equipped with various sensors that can detect unmapped obstacles, landmarks and objects. In this paper we describe the robot's industrial environment, its architecture, a novel combined range and vision sensor, and our recent results on controlling the robot, on real-time detection of objects using their color, and on processing the robot's range and vision sensor data for navigation.
A Leo Satellite Navigation Algorithm Based on GPS and Magnetometer Data
NASA Technical Reports Server (NTRS)
Deutschmann, Julie; Harman, Rick; Bar-Itzhack, Itzhack
2001-01-01
The Global Positioning System (GPS) has become a standard method for low cost onboard satellite orbit determination. The use of a GPS receiver as an attitude and rate sensor has also been developed in the recent past. Additionally, focus has been given to attitude and orbit estimation using the magnetometer, a low cost, reliable sensor. Combining measurements from both GPS and a magnetometer can provide a robust navigation system that takes advantage of the estimation qualities of both measurements. Ultimately, a low cost, accurate navigation system can result, potentially eliminating the need for more costly sensors, including gyroscopes. This work presents the development of a technique to eliminate numerical differentiation of the GPS phase measurements and also compares the use of one versus two GPS satellites.
NASA Technical Reports Server (NTRS)
Carson, John M., III; Johnson, Andrew E.; Anderson, F. Scott; Condon, Gerald L.; Nguyen, Louis H.; Olansen, Jon B.; Devolites, Jennifer L.; Harris, William J.; Hines, Glenn D.; Lee, David E.;
2016-01-01
The Lunar MARE (Moon Age and Regolith Explorer) Discovery Mission concept targets delivery of a science payload to the lunar surface for sample collection and dating. The mission science is within a 100-meter radius region of smooth lunar maria terrain near Aristarchus crater. The location has several small, sharp craters and rocks that present landing hazards to the spacecraft. For successful delivery of the science payload to the surface, the vehicle Guidance, Navigation and Control (GN&C) subsystem requires safe and precise landing capability, so the design infuses the NASA Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) and a gimbaled, throttleable LOX/LCH4 main engine. The ALHAT system implemented for Lunar MARE is a specialization of prototype technologies in work within NASA for the past two decades, including a passive optical Terrain Relative Navigation (TRN) sensor, a Navigation Doppler Lidar (NDL) velocity and range sensor, and a Lidar-based Hazard Detection (HD) sensor. The landing descent profile is from a retrograde orbit over lighted terrain with landing near lunar dawn. The GN&C subsystem with ALHAT capabilities will deliver the science payload to the lunar surface within a 20-meter landing ellipse of the target location and at a site having greater than 99% safety probability, which minimizes risk to safe landing and delivery of the MARE science payload to the intended terrain region.
A Navigation System for the Visually Impaired: A Fusion of Vision and Depth Sensor
Kanwal, Nadia; Bostanci, Erkan; Currie, Keith; Clark, Adrian F.
2015-01-01
For a number of years, scientists have been trying to develop aids that can make visually impaired people more independent and aware of their surroundings. Computer-based automatic navigation tools are one example of this, motivated by the increasing miniaturization of electronics and the improvement in processing power and sensing capabilities. This paper presents a complete navigation system based on low cost and physically unobtrusive sensors such as a camera and an infrared sensor. The system is based around corners and depth values from Kinect's infrared sensor. Obstacles are found in images from a camera using corner detection, while input from the depth sensor provides the corresponding distance. The combination is both efficient and robust. The system not only identifies hurdles but also suggests a safe path (if available) to the left or right side and tells the user to stop, move left, or move right. The system has been tested in real time by both blindfolded and blind people at different indoor and outdoor locations, demonstrating that it operates adequately. PMID:27057135
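The stop/move-left/move-right guidance described above can be sketched as a simple decision rule over nearest-obstacle distances in three image regions. The region split, safety threshold, and tie-breaking below are assumptions for illustration, not the paper's exact rule.

```python
def navigate(depth_left: float, depth_center: float, depth_right: float,
             safe_m: float = 1.5) -> str:
    """Suggest an action from nearest-obstacle distances (metres) in
    three image regions, echoing the stop/move-left/move-right idea."""
    if depth_center >= safe_m:
        return "forward"
    if depth_left >= safe_m and depth_left >= depth_right:
        return "move left"
    if depth_right >= safe_m:
        return "move right"
    return "stop"

print(navigate(2.0, 0.8, 1.0))   # move left
print(navigate(0.5, 0.7, 0.9))   # stop
```

In the paper's system, the corner detector supplies the obstacle locations and the Kinect depth sensor supplies the distances that feed a rule of this kind.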
Georeferencing in Gnss-Challenged Environment: Integrating Uwb and Imu Technologies
NASA Astrophysics Data System (ADS)
Toth, C. K.; Koppanyi, Z.; Navratil, V.; Grejner-Brzezinska, D.
2017-05-01
Acquiring geospatial data in GNSS-compromised environments remains a problem in mapping and positioning in general. Urban canyons, heavily vegetated areas and indoor environments represent different levels of GNSS signal availability, from weak to no signal reception. Even outdoors, with multiple GNSS constellations and an ever-increasing number of satellites, there are many situations with limited or no access to GNSS signals. Independent navigation sensors, such as an IMU, can provide high-data-rate information, but their initial accuracy degrades quickly as the measurement data drift over time unless positioning fixes are provided from another source. At The Ohio State University's Satellite Positioning and Inertial Navigation (SPIN) Laboratory, as one feasible solution, Ultra-Wideband (UWB) radio units are used to aid positioning and navigation in GNSS-compromised environments, including indoor and outdoor scenarios. Here we report on experience with georeferencing a pushcart-based sensor system under canopied areas. The positioning system is based on UWB and IMU sensor integration, and provides sensor platform orientation for an electromagnetic induction (EMI) sensor. Performance evaluation results are provided for various test scenarios, confirming acceptable results for applications where high accuracy is not required.
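A common building block of UWB-aided positioning, and a plausible component of such a system, is linearized least-squares trilateration: subtracting the first range equation cancels the quadratic terms and leaves a linear system in the unknown position. The anchor layout below is invented for the example; the paper's actual UWB/IMU integration is more involved.

```python
import numpy as np

def uwb_fix(anchors, ranges):
    """2-D position from ranges to known anchors, via the standard
    linearization: subtracting the first range equation cancels |x|^2."""
    anchors = np.asarray(anchors, dtype=float)
    r = np.asarray(ranges, dtype=float)
    a0, r0 = anchors[0], r[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (r0**2 - r[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
true = np.array([3.0, 4.0])
ranges = [np.linalg.norm(np.asarray(a) - true) for a in anchors]
print(np.round(uwb_fix(anchors, ranges), 3))   # [3. 4.]
```

Such fixes would then bound the IMU drift between updates, which is the role GNSS normally plays.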
Insect-Inspired Optical-Flow Navigation Sensors
NASA Technical Reports Server (NTRS)
Thakoor, Sarita; Morookian, John M.; Chahl, Javan; Soccol, Dean; Hines, Butler; Zornetzer, Steven
2005-01-01
Integrated circuits that exploit optical flow to sense motions of computer mice on or near surfaces ("optical mouse chips") are used as navigation sensors in a class of small flying robots now undergoing development for potential use in such applications as exploration, search, and surveillance. The basic principles of these robots were described briefly in "Insect-Inspired Flight Control for Small Flying Robots" (NPO-30545), NASA Tech Briefs, Vol. 29, No. 1 (January 2005), page 61. To recapitulate from the cited prior article: The concept of optical flow can be defined, loosely, as the use of texture in images as a source of motion cues. The flight-control and navigation systems of these robots are inspired largely by the designs and functions of the vision systems and brains of insects, which have been demonstrated to utilize optical flow (as detected by their eyes and brains) resulting from their own motions in the environment. Optical flow has been shown to be very effective as a means of avoiding obstacles and controlling speeds and altitudes in robotic navigation. Prior systems used in experiments on navigating by means of optical flow have involved the use of panoramic optics, high-resolution image sensors, and programmable image-data-processing computers.
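A classic insect-inspired use of optical flow is corridor centring: steer away from the side whose image motion is larger, since nearer surfaces produce faster flow at a given speed. A minimal sketch, with an arbitrary sign convention and gain:

```python
def steer_from_flow(flow_left: float, flow_right: float,
                    gain: float = 1.0) -> float:
    """Turn command from left/right optical-flow magnitudes.
    Positive output = turn left (away from the faster, nearer right side)."""
    return gain * (flow_right - flow_left)

print(steer_from_flow(2.0, 5.0))   # 3.0: right wall closer, turn left
print(steer_from_flow(4.0, 4.0))   # 0.0: centred in the corridor
```

The same balance principle extends to altitude hold (ventral flow) and speed regulation, which is how the robots described above use their mouse-chip sensors.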
Calibration of Magnetometers with GNSS Receivers and Magnetometer-Aided GNSS Ambiguity Fixing
Henkel, Patrick
2017-01-01
Magnetometers provide compass information and are widely used for navigation, orientation, and alignment of objects. As magnetometers are affected by sensor biases and possibly by systematic distortions of the Earth's magnetic field, a calibration is needed. In this paper, a method for calibration of magnetometers with three Global Navigation Satellite System (GNSS) receivers is presented. We perform a least-squares estimation of the magnetic flux and sensor biases using GNSS-based attitude information. The attitude is obtained from the relative positions between the GNSS receivers in the North-East-Down coordinate frame and prior knowledge of these relative positions in the platform's coordinate frame. The relative positions and integer ambiguities of the periodic carrier phase measurements are determined with an integer least-squares estimation using an integer decorrelation and sequential tree search. Prior knowledge of the relative positions is used to increase the success rate of ambiguity fixing. We have validated the proposed method with low-cost magnetometers and GNSS receivers on a vehicle in a test drive. The calibration enabled a consistent heading determination with an accuracy of five degrees. This precise magnetometer-based attitude information allows an instantaneous GNSS integer ambiguity fixing. PMID:28594369
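The least-squares estimation of field and biases that the abstract describes becomes a plain linear problem once per-epoch attitude is available. The sketch below is illustrative, not the paper's implementation: it assumes a simplified measurement model m_i = R_i · B_nav + b (constant bias only, no soft-iron distortion), and the function name `calibrate_magnetometer` is hypothetical.

```python
import numpy as np

def calibrate_magnetometer(rotations, measurements):
    """Least-squares estimate of the Earth field in the navigation
    frame and the constant sensor bias, given per-epoch attitude
    matrices R_i (navigation-to-sensor) and magnetometer readings m_i.

    Assumed measurement model: m_i = R_i @ B_nav + b.
    Stacking all epochs gives the linear system [R_i | I] [B; b] = m_i.
    """
    n = len(rotations)
    A = np.zeros((3 * n, 6))
    y = np.zeros(3 * n)
    for i, (R, m) in enumerate(zip(rotations, measurements)):
        A[3 * i:3 * i + 3, 0:3] = R          # rotates B_nav into the sensor frame
        A[3 * i:3 * i + 3, 3:6] = np.eye(3)  # bias enters additively
        y[3 * i:3 * i + 3] = m
    x, *_ = np.linalg.lstsq(A, y, rcond=None)
    return x[:3], x[3:]  # (B_nav estimate, bias estimate)
```

Note that the bias is only separable from the field if the attitude varies about more than one axis; with rotations about a single axis, the field component along that axis and the corresponding bias component are indistinguishable.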
Indoor Navigation using Direction Sensor and Beacons
NASA Technical Reports Server (NTRS)
Shields, Joel; Jeganathan, Muthu
2004-01-01
A system for indoor navigation of a mobile robot includes (1) modulated infrared beacons at known positions on the walls and ceiling of a room and (2) a cameralike sensor, comprising a wide-angle lens with a position-sensitive photodetector at the focal plane, mounted in a known position and orientation on the robot. The system also includes a computer running special-purpose software that processes the sensor readings to obtain the position and orientation of the robot in all six degrees of freedom in a coordinate system embedded in the room.
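The position part of such a beacon fix can be illustrated with a small least-squares sketch. Assuming the orientation has already been resolved, so that each measured unit bearing vector to a beacon is expressed in the room frame, every bearing constrains the robot to a ray, and summing the orthogonal projectors yields a 3x3 linear system. The function name and geometry below are illustrative, not taken from the system described above.

```python
import numpy as np

def position_from_bearings(beacons, bearings):
    """Least-squares position from unit bearing vectors u_i pointing
    from the (unknown) position x toward beacons at known positions p_i.
    Each bearing constrains x to the line x = p_i - t * u_i, i.e.
    (I - u_i u_i^T)(p_i - x) = 0; summing these normal equations gives
    a single 3x3 solve.
    """
    A = np.zeros((3, 3))
    rhs = np.zeros(3)
    for p, u in zip(beacons, bearings):
        P = np.eye(3) - np.outer(u, u)  # projector orthogonal to the ray
        A += P
        rhs += P @ p
    return np.linalg.solve(A, rhs)
```

At least two non-parallel bearings are needed for A to be invertible; real systems would weight each ray by its measurement accuracy.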
GPS compound eye attitude and navigation sensor and method
NASA Technical Reports Server (NTRS)
Quinn, David A. (Inventor)
2003-01-01
The present invention is a GPS system for navigation and attitude determination, comprising a sensor array including a convex hemispherical mounting structure having a plurality of mounting surfaces, and a plurality of antennas mounted to the mounting surfaces for receiving signals from space vehicles of a GPS constellation. The present invention also includes a receiver for collecting the signals and making navigation and attitude determinations. In an alternate embodiment the present invention may include two opposing convex hemispherical mounting structures, each of the mounting structures having a plurality of mounting surfaces, and a plurality of antennas mounted to the mounting surfaces.
Unstructured Facility Navigation by Applying the NIST 4D/RCS Architecture
2006-07-01
[Abstract garbled in extraction; recoverable content: the system applies the NIST 4D/RCS architecture, with wireless data and emergency-stop radios, a GPS receiver and antenna, an inertial navigation unit, dual stereo color cameras, and infrared sensors; the sensory processing module uses the two pairs of stereo color cameras together with physical and infrared bumper sensors, with wheel motors and camera controls as actuators.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
EISLER, G. RICHARD
This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled ''Robust Planning for Autonomous Navigation of Mobile Robots In Unstructured, Dynamic Environments (AutoNav)''. The project goal was to develop an algorithmic-driven, multi-spectral approach to point-to-point navigation characterized by: segmented on-board trajectory planning, self-contained operation without human support for mission duration, and the development of appropriate sensors and algorithms to navigate unattended. The project was partially successful in achieving gains in sensing, path planning, navigation, and guidance. One of three experimental platforms, the Minimalist Autonomous Testbed, used a repetitive sense-and-re-plan combination to demonstrate the majority of elements necessary for autonomous navigation. However, a critical goal for overall success in arbitrary terrain, that of developing a sensor that is able to distinguish true obstacles that need to be avoided as a function of vehicle scale, still needs substantial research to bring to fruition.
Przemyslaw, Baranski; Pawel, Strumillo
2012-01-01
The paper presents an algorithm for estimating a pedestrian location in an urban environment. The algorithm is based on the particle filter and uses different data sources: a GPS receiver, inertial sensors, probability maps and a stereo camera. Inertial sensors are used to estimate a relative displacement of a pedestrian. A gyroscope estimates a change in the heading direction. An accelerometer is used to count a pedestrian's steps and their lengths. The so-called probability maps help to limit GPS inaccuracy by imposing constraints on pedestrian kinematics, e.g., it is assumed that a pedestrian cannot cross buildings, fences etc. This limits position inaccuracy to ca. 10 m. Incorporation of depth estimates derived from a stereo camera that are compared to the 3D model of an environment has enabled further reduction of positioning errors. As a result, for 90% of the time, the algorithm is able to estimate a pedestrian location with an error smaller than 2 m, compared to an error of 6.5 m for navigation based solely on GPS. PMID:22969321
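The core predict/update/resample cycle of such a filter can be sketched compactly. The toy version below is illustrative only, not the paper's full system: it dead-reckons 2D particles from a step length and heading, weights them by a Gaussian GPS likelihood, and resamples systematically; the probability-map constraints and stereo-depth corrections are omitted, and all noise levels are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, step_len, heading, gps_xy, gps_sigma):
    """One predict/update/resample cycle of a toy 2D pedestrian
    particle filter. particles: (N, 2) positions; weights: (N,),
    normalized to sum to 1.
    """
    n = len(particles)
    # Predict: dead-reckon each particle with noisy step length/heading.
    noisy_len = step_len + rng.normal(0.0, 0.1, n)
    noisy_hdg = heading + rng.normal(0.0, 0.05, n)
    particles = particles + np.column_stack(
        (noisy_len * np.cos(noisy_hdg), noisy_len * np.sin(noisy_hdg)))
    # Update: reweight by the Gaussian likelihood of the GPS fix.
    d2 = np.sum((particles - gps_xy) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / gps_sigma ** 2)
    weights = weights / weights.sum()
    # Systematic resampling when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < n / 2:
        positions = (rng.random() + np.arange(n)) / n
        idx = np.searchsorted(np.cumsum(weights), positions).clip(max=n - 1)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights
```

The weighted mean of the particles serves as the position estimate; map constraints would enter by zeroing the weights of particles that cross walls or fences.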
Demonstration of coherent Doppler lidar for navigation in GPS-denied environments
NASA Astrophysics Data System (ADS)
Amzajerdian, Farzin; Hines, Glenn D.; Pierrottet, Diego F.; Barnes, Bruce W.; Petway, Larry B.; Carson, John M.
2017-05-01
A coherent Doppler lidar has been developed to address NASA's need for a high-performance, compact, and cost-effective velocity and altitude sensor onboard its landing vehicles. Future robotic and manned missions to solar system bodies require precise ground-relative velocity vector and altitude data to execute complex descent maneuvers and safe, soft landing at a pre-designated site. This lidar sensor, referred to as a Navigation Doppler Lidar (NDL), meets the required performance of the landing missions while complying with vehicle size, mass, and power constraints. Operating from altitudes of up to four kilometers, the NDL obtains velocity and range measurements with precisions reaching 2 cm/sec and 2 meters, respectively, dominated by the vehicle motion. Terrestrial aerial vehicles will also benefit from NDL data products as an enhancement or replacement for GPS systems when GPS is unavailable or redundancy is needed. The NDL offers a viable option for aircraft navigation in areas where the GPS signal can be blocked or jammed by intentional or unintentional interference. The NDL transmits three laser beams at different pointing angles toward the ground to measure range and velocity along each beam using a frequency modulated continuous wave (FMCW) technique. The three line-of-sight measurements are then combined in order to determine the three components of the vehicle velocity vector and its altitude relative to the ground. This paper describes the performance and capabilities that the NDL demonstrated through extensive ground tests, helicopter flight tests, and onboard an autonomous rocket-powered test vehicle while operating in closed loop with a guidance, navigation, and control (GN&C) system.
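The step of combining the three line-of-sight measurements into a velocity vector is a small linear solve: each beam measures the projection of the vehicle velocity onto its unit direction, so stacking the three unit vectors gives a 3x3 system. The beam geometry below is an illustrative assumption, not the NDL's actual pointing angles.

```python
import numpy as np

def velocity_from_los(beam_units, los_speeds):
    """Recover the 3-component vehicle velocity from three
    line-of-sight Doppler speeds, one per lidar beam.
    Each measurement is d_i = u_i . v, so the stacked unit
    vectors form the linear system U v = d.
    """
    U = np.asarray(beam_units, dtype=float)
    d = np.asarray(los_speeds, dtype=float)
    return np.linalg.solve(U, d)
```

The solve is well conditioned as long as the three beams are not coplanar, which is why such sensors cant the beams at different azimuths around nadir.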
Development of Tools and Techniques for Processing STORRM Flight Data
NASA Technical Reports Server (NTRS)
Robinson, Shane; D'Souza, Christopher
2011-01-01
While at JSC for the summer of 2011, I was assigned to work on the sensor test for Orion relative-navigation risk mitigation (STORRM) development test objective (DTO). The STORRM DTO was flown on-board Endeavour during STS-134. The objective of the STORRM DTO is to test the visual navigation system (VNS), which will be used as the primary relative navigation sensor for the Orion spacecraft. The VNS is a flash lidar system intended to provide both line-of-sight and range information during rendezvous and proximity operations. The STORRM DTO also serves as a testbed for the high-resolution docking camera. This docking camera will be used to provide piloting cues for the crew during proximity operations. These instruments were mounted next to the trajectory control sensor (TCS) in Endeavour's payload bay. My principal objective for the summer was to generate a best estimated trajectory (BET) for Endeavour using the flight data collected by the VNS during rendezvous and the unprecedented re-rendezvous with the ISS. I processed the raw images from the VNS to produce range and bearing measurements. I then aggregated these measurements and extracted the measurements corresponding to individual reflectors. I combined the information contained in these measurements with data from Endeavour's inertial sensors using Kalman smoothing techniques to ultimately produce a BET. This work culminated with a final presentation of the results to division management. Development of this tool required that traditional linear smoothing techniques be modified in a novel fashion to permit the inclusion of non-linear measurements. This internship has greatly helped me further my career by providing exposure to real engineering projects. I also have benefited immensely from the mentorship of the engineers working on these projects. Many of the lessons I learned and experiences I had are of particular value because they can only be found in a place like JSC.
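The forward-filter / backward-smoother pattern behind a best estimated trajectory can be sketched with a fixed-interval Rauch-Tung-Striebel smoother. The version below is a stripped-down illustration, not the STORRM tool: a 1D constant-velocity state with position-only measurements and assumed noise values, omitting the non-linear measurement handling the abstract mentions.

```python
import numpy as np

def rts_smoother(zs, dt=1.0, q=1e-3, r=0.25):
    """Fixed-interval Rauch-Tung-Striebel smoother for a 1D
    constant-velocity state [position, velocity] with position-only
    measurements (model and noise values are illustrative).
    Returns the smoothed state history as an (N, 2) array.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    x, P = np.zeros(2), np.eye(2) * 10.0
    xs, Ps, xps, Pps = [], [], [], []
    for z in zs:                                # forward Kalman filter
        xp, Pp = F @ x, F @ P @ F.T + Q
        K = Pp @ H.T / (H @ Pp @ H.T + r)       # scalar innovation: no matrix inverse
        x = xp + (K * (z - H @ xp)).ravel()
        P = (np.eye(2) - K @ H) @ Pp
        xs.append(x); Ps.append(P); xps.append(xp); Pps.append(Pp)
    xs_s = [xs[-1]]
    for k in range(len(zs) - 2, -1, -1):        # backward RTS pass
        C = Ps[k] @ F.T @ np.linalg.inv(Pps[k + 1])
        xs_s.insert(0, xs[k] + C @ (xs_s[0] - xps[k + 1]))
    return np.array(xs_s)
```

Because the backward pass folds in all later measurements, the smoothed positions are generally closer to the truth than either the raw measurements or the forward-only filter estimates.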
ULTOR(Registered TradeMark) Passive Pose and Position Engine For Spacecraft Relative Navigation
NASA Technical Reports Server (NTRS)
Hannah, S. Joel
2008-01-01
The ULTOR(Registered TradeMark) Passive Pose and Position Engine (P3E) technology, developed by Advanced Optical Systems, Inc (AOS), uses real-time image correlation to provide relative position and pose data for spacecraft guidance, navigation, and control. Potential data sources include a wide variety of sensors, including visible and infrared cameras. ULTOR(Registered TradeMark) P3E has been demonstrated on a number of host processing platforms. NASA is integrating ULTOR(Registered TradeMark) P3E into its Relative Navigation System (RNS), which is being developed for the upcoming Hubble Space Telescope (HST) Servicing Mission 4 (SM4). During SM4 ULTOR(Registered TradeMark) P3E will perform real-time pose and position measurements during both the approach and departure phases of the mission. This paper describes the RNS implementation of ULTOR(Registered TradeMark) P3E, and presents results from NASA's hardware-in-the-loop simulation testing against the HST mockup.
Results from a GPS Shuttle Training Aircraft flight test
NASA Technical Reports Server (NTRS)
Saunders, Penny E.; Montez, Moises N.; Robel, Michael C.; Feuerstein, David N.; Aerni, Mike E.; Sangchat, S.; Rater, Lon M.; Cryan, Scott P.; Salazar, Lydia R.; Leach, Mark P.
1991-01-01
A series of Global Positioning System (GPS) flight tests were performed on a National Aeronautics and Space Administration's (NASA's) Shuttle Training Aircraft (STA). The objective of the tests was to evaluate the performance of GPS-based navigation during simulated Shuttle approach and landings for possible replacement of the current Shuttle landing navigation aid, the Microwave Scanning Beam Landing System (MSBLS). In particular, varying levels of sensor data integration would be evaluated to determine the minimum amount of integration required to meet the navigation accuracy requirements for a Shuttle landing. Four flight tests consisting of 8 to 9 simulation runs per flight test were performed at White Sands Space Harbor in April 1991. Three different GPS receivers were tested. The STA inertial navigation, tactical air navigation, and MSBLS sensor data were also recorded during each run. C-band radar aided laser trackers were utilized to provide the STA 'truth' trajectory.
Navigation studies based on the ubiquitous positioning technologies
NASA Astrophysics Data System (ADS)
Ye, Lei; Mi, Weijie; Wang, Defeng
2007-11-01
This paper summarizes present-day positioning technologies, such as absolute and relative positioning methods, indoor and outdoor positioning, and active and passive positioning. Global Navigation Satellite System (GNSS) technologies are introduced as the omnipresent outdoor positioning technologies, including GPS, GLONASS, Galileo and BD-1/2. After analysis of the shortcomings of GNSS, indoor positioning technologies are discussed and compared, including A-GPS, cellular network, infrared, electromagnetism, computer vision cognition, embedded pressure sensor, ultrasonic, RFID (Radio Frequency IDentification), Bluetooth, WLAN, etc. Then the concept and characteristics of Ubiquitous Positioning are proposed. After contrast and selection of the ubiquitous positioning technologies following a system engineering methodology, a navigation system model based on an Incorporated Indoor-Outdoor Positioning Solution is proposed. This model was simulated in the Galileo Demonstration for World Expo Shanghai project. In conclusion, the prospects of ubiquitous-positioning-based navigation are shown, especially for satisfying the public's requirement to acquire location information.
Assessment of navigation cues with proximal force sensing during endovascular catheterization.
Rafii-Tari, Hedyeh; Payne, Christopher J; Riga, Celia; Bicknell, Colin; Lee, Su-Lin; Yang, Guang-Zhong
2012-01-01
Despite increased use of robotic catheter navigation systems for endovascular intervention procedures, current master-slave platforms have not yet taken into account dexterous manipulation skill used in traditional catheterization procedures. Information on tool forces applied by operators is often limited. A novel force/torque sensor is developed in this paper to obtain behavioural data across different experience levels and identify underlying factors that affect overall operator performance. The miniature device can be attached to any part of the proximal end of the catheter, together with a position sensor attached to the catheter tip, for relating tool forces to catheter dynamics and overall performance. The results show clear differences in manipulation skills between experience groups, thus providing insights into different patterns and range of forces applied during routine endovascular procedures. They also provide important design specifications for ergonomically optimized catheter manipulation platforms with added haptic feedback while maintaining natural skills of the operators.
Space Shuttle Navigation in the GPS Era
NASA Technical Reports Server (NTRS)
Goodman, John L.
2001-01-01
The Space Shuttle navigation architecture was originally designed in the 1970s. A variety of on-board and ground-based navigation sensors and computers are used during the ascent, orbit coast, rendezvous (including proximity operations and docking), and entry flight phases. With the advent of GPS navigation and tightly coupled GPS/INS units employing strapdown sensors, opportunities to improve and streamline the Shuttle navigation process are being pursued. These improvements can potentially result in increased safety, reliability, and cost savings in maintenance through the replacement of older technologies and elimination of ground support systems (such as Tactical Air Navigation (TACAN), Microwave Landing System (MLS) and ground radar). Selection and missionization of "off the shelf" GPS and GPS/INS units pose a unique challenge since the units in question were not originally designed for the Space Shuttle application. Various options for integrating GPS and GPS/INS units with the existing orbiter avionics system were considered in light of budget constraints, software quality concerns, and schedule limitations. An overview of Shuttle navigation methodology from 1981 to the present is given, along with how GPS and GPS/INS technology will change, or not change, the way Space Shuttle navigation is performed in the 21st century.
NASA Astrophysics Data System (ADS)
Uijt de Haag, Maarten; Campbell, Jacob; van Graas, Frank
2005-05-01
Synthetic Vision Systems (SVS) provide pilots with a virtual visual depiction of the external environment. When using SVS for aircraft precision approach guidance systems, accurate positioning relative to the runway with a high level of integrity is required. Precision approach guidance systems in use today require ground-based electronic navigation components with at least one installation at each airport, and in many cases multiple installations to service approaches to all qualifying runways. A terrain-referenced approach guidance system is envisioned to provide precision guidance to an aircraft without the use of ground-based electronic navigation components installed at the airport. This autonomy makes it a good candidate for integration with an SVS. At the Ohio University Avionics Engineering Center (AEC), work has been underway in the development of such a terrain-referenced navigation system. When used in conjunction with an Inertial Measurement Unit (IMU) and a high accuracy/resolution terrain database, this terrain-referenced navigation system can provide navigation and guidance information to the pilot on an SVS or conventional instruments. The terrain-referenced navigation system, under development at AEC, operates on similar principles as other terrain navigation systems: a ground-sensing sensor (in this case an airborne laser scanner) gathers range measurements to the terrain; this data is then matched in some fashion with an onboard terrain database to find the most likely position solution and used to update an inertial sensor-based navigator. AEC's system design differs from today's common terrain navigators in its use of a high-resolution terrain database (~1 meter post spacing) in conjunction with an airborne laser scanner, which is capable of providing tens of thousands of independent terrain elevation measurements per second with centimeter-level accuracies.
When combined with data from an inertial navigator the high resolution terrain database and laser scanner system is capable of providing near meter-level horizontal and vertical position estimates. Furthermore, the system under development capitalizes on 1) The position and integrity benefits provided by the Wide Area Augmentation System (WAAS) to reduce the initial search space size and; 2) The availability of high accuracy/resolution databases. This paper presents results from flight tests where the terrain reference navigator is used to provide guidance cues for a precision approach.
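The "match measurements against an onboard database" step common to such terrain navigators can be illustrated with a brute-force correlation search. This toy version slides a measured elevation profile along one row of a gridded terrain database and picks the offset with the smallest sum-of-squared error; the real system searches in 2D (with the search space bounded by WAAS) and fuses the resulting fix with the inertial navigator. All names and sizes here are assumptions.

```python
import numpy as np

def terrain_match(db, profile, row):
    """Brute-force 1D terrain matching along one database row:
    slide the measured elevation profile across the stored terrain
    and return the along-track offset (in grid posts) with the
    smallest sum-of-squared elevation error.
    """
    m = len(profile)
    errs = [np.sum((db[row, i:i + m] - profile) ** 2)
            for i in range(db.shape[1] - m + 1)]
    return int(np.argmin(errs))
```

The match is unambiguous only over terrain with enough elevation variation, which is one reason such systems fall back on the inertial solution over flat or featureless ground.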
Navigation in Difficult Environments: Multi-Sensor Fusion Techniques
2010-03-01
[Abstract garbled in extraction; recoverable content: a core INS is aided by incoming signal measurements, with estimation of the INS drift terms performed using a complementary Kalman filter, the idea being that a signal parameter can generally be expressed in terms of the navigation solution; cited references include Brown and Hwang, Introduction to Random Signals and Applied Kalman Filtering, 3rd ed., John Wiley & Sons, New York, 1997, and J. L. Farrell on GPS/INS.]
Design and Calibration of a Novel Bio-Inspired Pixelated Polarized Light Compass.
Han, Guoliang; Hu, Xiaoping; Lian, Junxiang; He, Xiaofeng; Zhang, Lilian; Wang, Yujie; Dong, Fengliang
2017-11-14
Animals, such as Savannah sparrows and North American monarch butterflies, are able to obtain compass information from skylight polarization patterns to help them navigate effectively and robustly. Inspired by excellent navigation ability of animals, this paper proposes a novel image-based polarized light compass, which has the advantages of having a small size and being light weight. Firstly, the polarized light compass, which is composed of a Charge Coupled Device (CCD) camera, a pixelated polarizer array and a wide-angle lens, is introduced. Secondly, the measurement method of a skylight polarization pattern and the orientation method based on a single scattering Rayleigh model are presented. Thirdly, the error model of the sensor, mainly including the response error of CCD pixels and the installation error of the pixelated polarizer, is established. A calibration method based on iterative least squares estimation is proposed. In the outdoor environment, the skylight polarization pattern can be measured in real time by our sensor. The orientation accuracy of the sensor increases with the decrease of the solar elevation angle, and the standard deviation of orientation error is 0.15° at sunset. Results of outdoor experiments show that the proposed polarization navigation sensor can be used for outdoor autonomous navigation.
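The per-pixel measurement underlying such a compass is the angle of polarization, computed from the four intensities of a 2x2 pixelated-polarizer super-pixel via the linear Stokes parameters. The relation below is the standard one; the paper's sensor additionally calibrates per-pixel response and polarizer installation errors, which are omitted here.

```python
import numpy as np

def angle_of_polarization(i0, i45, i90, i135):
    """Angle of polarization from the four intensities measured by a
    2x2 pixelated-polarizer super-pixel (analyzers at 0/45/90/135 deg).
    Linear Stokes parameters: S1 = I0 - I90, S2 = I45 - I135, and
    AoP = 0.5 * atan2(S2, S1), in radians.
    """
    s1 = i0 - i90
    s2 = i45 - i135
    return 0.5 * np.arctan2(s2, s1)
```

Mapping the per-pixel angles across the sky and fitting them to the single-scattering Rayleigh pattern is what then yields the heading relative to the sun.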
Schwein, Adeline; Kramer, Ben; Chinnadurai, Ponraj; Walker, Sean; O'Malley, Marcia; Lumsden, Alan; Bismuth, Jean
2017-02-01
One limitation of the use of robotic catheters is the lack of real-time three-dimensional (3D) localization and position updating: they are still navigated based on two-dimensional (2D) X-ray fluoroscopic projection images. Our goal was to evaluate whether incorporating an electromagnetic (EM) sensor on a robotic catheter tip could improve endovascular navigation. Six users were tasked to navigate using a robotic catheter with incorporated EM sensors in an aortic aneurysm phantom. All users cannulated two anatomic targets (left renal artery and posterior "gate") using four visualization modes: (1) standard fluoroscopy mode (control), (2) 2D fluoroscopy mode showing real-time virtual catheter orientation from EM tracking, (3) 3D model of the phantom with anteroposterior and endoluminal view, and (4) 3D model with anteroposterior and lateral view. Standard X-ray fluoroscopy was always available. Cannulation and fluoroscopy times were noted for every mode. 3D positions of the EM tip sensor were recorded at 4 Hz to establish kinematic metrics. The EM sensor-incorporated catheter navigated as expected according to all users. The success rate for cannulation was 100%. For the posterior gate target, mean cannulation times in minutes:seconds were 8:12, 4:19, 4:29, and 3:09, respectively, for modes 1, 2, 3 and 4 (P = .013), and mean fluoroscopy times were 274, 20, 29, and 2 seconds, respectively (P = .001). 3D path lengths, spectral arc length, root mean dimensionless jerk, and number of submovements were significantly improved when EM tracking was used (P < .05), showing higher quality of catheter movement with EM navigation. The EM tracked robotic catheter allowed better real-time 3D orientation, facilitating navigation, with a reduction in cannulation and fluoroscopy times and improvement of motion consistency and efficiency. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
2014-01-01
Background People with severe disabilities, e.g. due to neurodegenerative disease, depend on technology that allows for accurate wheelchair control. For those who cannot operate a wheelchair with a joystick, brain-computer interfaces (BCI) may offer a valuable option. Technology depending on visual or auditory input may not be feasible as these modalities are dedicated to processing of environmental stimuli (e.g. recognition of obstacles, ambient noise). Herein we thus validated the feasibility of a BCI based on tactually-evoked event-related potentials (ERP) for wheelchair control. Furthermore, we investigated use of a dynamic stopping method to improve speed of the tactile BCI system. Methods Positions of four tactile stimulators represented navigation directions (left thigh: move left; right thigh: move right; abdomen: move forward; lower neck: move backward) and N = 15 participants delivered navigation commands by focusing their attention on the desired tactile stimulus in an oddball-paradigm. Results Participants navigated a virtual wheelchair through a building and eleven participants successfully completed the task of reaching 4 checkpoints in the building. The virtual wheelchair was equipped with simulated shared-control sensors (collision avoidance), yet these sensors were rarely needed. Conclusion We conclude that most participants achieved tactile ERP-BCI control sufficient to reliably operate a wheelchair and dynamic stopping was of high value for tactile ERP classification. Finally, this paper discusses feasibility of tactile ERPs for BCI based wheelchair control. PMID:24428900
Sabatini, Angelo Maria; Genovese, Vincenzo
2014-07-24
A sensor fusion method was developed for vertical channel stabilization by fusing inertial measurements from an Inertial Measurement Unit (IMU) and pressure altitude measurements from a barometric altimeter integrated in the same device (baro-IMU). An Extended Kalman Filter (EKF) estimated the quaternion from the sensor frame to the navigation frame; the sensed specific force was rotated into the navigation frame and compensated for gravity, yielding the vertical linear acceleration; finally, a complementary filter driven by the vertical linear acceleration and the measured pressure altitude produced estimates of height and vertical velocity. A method was also developed to condition the measured pressure altitude using a whitening filter, which helped to remove the short-term correlation due to environment-dependent pressure changes from raw pressure altitude. The sensor fusion method was implemented to work on-line using data from a wireless baro-IMU and tested for the capability of tracking low-frequency small-amplitude vertical human-like motions that can be critical for stand-alone inertial sensor measurements. Validation tests were performed in different experimental conditions, namely no motion, free-fall motion, forced circular motion and squatting. Accurate on-line tracking of height and vertical velocity was achieved, giving confidence to the use of the sensor fusion method for tracking typical vertical human motions: velocity Root Mean Square Error (RMSE) was in the range 0.04-0.24 m/s; height RMSE was in the range 5-68 cm, with statistically significant performance gains when the whitening filter was used by the sensor fusion method to track relatively high-frequency vertical motions.
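The complementary-filter stage of this vertical-channel fusion can be sketched in a few lines. The toy version below integrates gravity-compensated vertical acceleration and corrects the drift with barometric altitude; the gains are illustrative, not the paper's tuning, and the quaternion-based gravity compensation and whitening pre-filter are omitted.

```python
def complementary_vertical(acc_z, baro_h, dt, k1=2.0, k2=1.0, h0=0.0, v0=0.0):
    """Toy complementary filter for the vertical channel.
    acc_z: gravity-compensated vertical accelerations (m/s^2),
    baro_h: pressure altitudes (m), dt: sample period (s).
    The baro innovation corrects both the height and velocity
    integrators, so the accelerometer supplies the high-frequency
    motion while the barometer bounds the low-frequency drift.
    Returns a list of (height, vertical velocity) estimates.
    """
    h, v = h0, v0
    out = []
    for a, hb in zip(acc_z, baro_h):
        err = hb - h               # baro innovation
        v += (a + k2 * err) * dt   # acceleration integration + drift correction
        h += (v + k1 * err) * dt
        out.append((h, v))
    return out
```

With k1 = 2 and k2 = 1 the error dynamics are critically damped with a time constant of about one second, a plausible crossover between accelerometer noise and baro drift.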
Gyroscope-reduced inertial navigation system for flight vehicle motion estimation
NASA Astrophysics Data System (ADS)
Wang, Xin; Xiao, Lu
2017-01-01
In this paper, a novel configuration of strategically distributed accelerometer sensors, with the aid of one gyro, to infer a flight vehicle's angular motion is presented. The MEMS accelerometer and gyro sensors are integrated to form a gyroscope-reduced inertial measurement unit (GR-IMU). The motivation for the gyro-aided accelerometer array is to have direct measurements of angular rates, which is an improvement over the traditional gyroscope-free inertial system that employs only direct measurements of specific force. Some technical issues regarding error calibration of the accelerometers and gyro in the GR-IMU are put forward. The GR-IMU-based inertial navigation system can be used to find a complete attitude solution for flight vehicle motion estimation. Results of numerical simulation are given to illustrate the effectiveness of the proposed configuration. The gyroscope-reduced inertial navigation system based on distributed accelerometer sensors can be developed into a cost-effective solution for a fast-reaction, MEMS-based motion capture system. Future work will include aiding from external navigation references (e.g., GPS) to improve long-duration mission performance.
Sensor image prediction techniques
NASA Astrophysics Data System (ADS)
Stenger, A. J.; Stone, W. R.; Berry, L.; Murray, T. J.
1981-02-01
The preparation of prediction imagery is a complex, costly, and time consuming process. Image prediction systems which produce a detailed replica of the image area require the extensive Defense Mapping Agency data base. The purpose of this study was to analyze the use of image predictions in order to determine whether a reduced set of more compact image features contains enough information to produce acceptable navigator performance. A job analysis of the navigator's mission tasks was performed. It showed that the cognitive and perceptual tasks he performs during navigation are identical to those performed for the targeting mission function. In addition, the results of the analysis of his performance when using a particular sensor can be extended to the analysis of these mission tasks using any sensor. An experimental approach was used to determine the relationship between navigator performance and the type and amount of information in the prediction image. A number of subjects were given image predictions containing varying levels of scene detail and different image features, and then asked to identify the predicted targets in corresponding dynamic flight sequences over scenes of cultural, terrain, and mixed (both cultural and terrain) content.
Autonomous Navigation Using Celestial Objects
NASA Technical Reports Server (NTRS)
Folta, David; Gramling, Cheryl; Leung, Dominic; Belur, Sheela; Long, Anne
1999-01-01
In the twenty-first century, National Aeronautics and Space Administration (NASA) Enterprises envision frequent low-cost missions to explore the solar system, observe the universe, and study our planet. Satellite autonomy is a key technology required to reduce satellite operating costs. The Guidance, Navigation, and Control Center (GNCC) at the Goddard Space Flight Center (GSFC) currently sponsors several initiatives associated with the development of advanced spacecraft systems to provide autonomous navigation and control. Autonomous navigation has the potential both to increase spacecraft navigation system performance and to reduce total mission cost. By eliminating the need for routine ground-based orbit determination and special tracking services, autonomous navigation can streamline spacecraft ground systems. Autonomous navigation products can be included in the science telemetry and forwarded directly to the scientific investigators. In addition, autonomous navigation products are available onboard to enable other autonomous capabilities, such as attitude control, maneuver planning and orbit control, and communications signal acquisition. Autonomous navigation is required to support advanced mission concepts such as satellite formation flying. GNCC has successfully developed high-accuracy autonomous navigation systems for near-Earth spacecraft using NASA's space and ground communications systems and the Global Positioning System (GPS). Recently, GNCC has expanded its autonomous navigation initiative to include satellite orbits that are beyond the regime in which use of GPS is possible. Currently, GNCC is assessing the feasibility of using standard spacecraft attitude sensors and communication components to provide autonomous navigation for missions including: libration point, gravity assist, high-Earth, and interplanetary orbits. 
The concept being evaluated uses a combination of star, Sun, and Earth sensor measurements along with forward-link Doppler measurements from the command link carrier to autonomously estimate the spacecraft's orbit and reference oscillator's frequency. To support autonomous attitude determination and control and maneuver planning and control, the orbit determination accuracy should be on the order of kilometers in position and centimeters per second in velocity. A less accurate solution (one hundred kilometers in position) could be used for acquisition purposes for command and science downloads. This paper provides performance results for both libration point orbiting and high Earth orbiting satellites as a function of sensor measurement accuracy, measurement types, measurement frequency, initial state errors, and dynamic modeling errors.
Design and Development of a Mobile Sensor Based the Blind Assistance Wayfinding System
NASA Astrophysics Data System (ADS)
Barati, F.; Delavar, M. R.
2015-12-01
The blind and visually impaired face a number of challenges in their daily life. One of the major challenges is finding their way, both indoors and outdoors. For this reason, independent routing and navigation, especially in urban areas, are important for the blind. Most blind people undertake route finding and navigation with the help of a guide; other aids such as a cane, guide dog or electronic devices are also used. However, in some cases these aids are not efficient enough for wayfinding around obstacles and dangerous areas. As a result, effective non-visual decision-support methods are needed to improve the quality of life of the blind through increased mobility and independence. In this study, we designed and implemented an outdoor mobile sensor-based wayfinding system for the blind. The objectives of this study are to guide the blind in obstacle recognition and to design and implement a mobile sensor system for their wayfinding and navigation. An ultrasonic sensor is used to detect obstacles, and GPS is employed for positioning and navigation. This type of ultrasonic sensor measures the interval between sending a wave and receiving its echo and, given the speed of sound in the environment, estimates the distance to the obstacle. The coordinates and characteristics of all the obstacles in the study area are stored in advance in a GIS database, and all of these obstacles were labeled on the map. The ultrasonic sensor designed and constructed in this study can detect obstacles at distances of 2 cm to 400 cm. The implementation, together with interviews with a number of blind persons who used the sensor, verified that the designed mobile wayfinding system was very satisfactory.
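The time-of-flight ranging described above can be sketched in a few lines; the temperature model for the speed of sound and the 10 ms echo value are illustrative assumptions, while the 2-400 cm envelope is the paper's stated sensor range:

```python
def ultrasonic_range_cm(echo_time_s, temp_c=20.0):
    """Distance to an obstacle from an ultrasonic echo time (sketch)."""
    # speed of sound in air, temperature-corrected (m/s) -- assumed model
    c = 331.3 + 0.606 * temp_c
    # the wave travels out to the obstacle and back, hence the divide by two
    return 100.0 * c * echo_time_s / 2.0

def in_sensor_envelope(d_cm):
    # the prototype in the paper detects obstacles between 2 cm and 400 cm
    return 2.0 <= d_cm <= 400.0

d = ultrasonic_range_cm(0.01)  # a 10 ms echo at 20 C
print(round(d, 2), in_sensor_envelope(d))  # 171.71 True
```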
Open-Loop Flight Testing of COBALT GN&C Technologies for Precise Soft Landing
NASA Technical Reports Server (NTRS)
Carson, John M., III; Amzajerdian, Farzin; Seubert, Carl R.; Restrepo, Carolina I.
2017-01-01
A terrestrial, open-loop (OL) flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) platform was conducted onboard the Masten Xodiac suborbital rocket testbed, with support through the NASA Advanced Exploration Systems (AES), Game Changing Development (GCD), and Flight Opportunities (FO) Programs. The COBALT platform integrates NASA Guidance, Navigation and Control (GN&C) sensing technologies for autonomous, precise soft landing, including the Navigation Doppler Lidar (NDL) velocity and range sensor and the Lander Vision System (LVS) Terrain Relative Navigation (TRN) system. A specialized navigation filter running onboard COBALT fuzes the NDL and LVS data in real time to produce a precise navigation solution that is independent of the Global Positioning System (GPS) and suitable for future, autonomous planetary landing systems. The OL campaign tested COBALT as a passive payload, with COBALT data collection and filter execution, but with the Xodiac vehicle Guidance and Control (G&C) loops closed on a Masten GPS-based navigation solution. The OL test was performed as a risk reduction activity in preparation for an upcoming 2017 closed-loop (CL) flight campaign in which Xodiac G&C will act on the COBALT navigation solution and the GPS-based navigation will serve only as a backup monitor.
Yang, Yanqiang; Zhang, Chunxi; Lu, Jiazhen
2017-01-16
Strapdown inertial navigation system/celestial navigation system (SINS/CNS) integrated navigation is a fully autonomous, high-precision method that has been widely used to improve the hitting accuracy and quick-reaction capability of near-Earth flight vehicles. The installation errors between the SINS and the star sensors have been one of the main factors restricting the actual accuracy of SINS/CNS. In this paper, an integration algorithm based on star vector observations is derived that accounts for the star sensor installation error, which is then accurately estimated with a Kalman filter (KF). Meanwhile, a local observability analysis is performed on the rank of the observability matrix obtained from the linearized observation equation, and the observability conditions are presented and validated: the number of star vectors should be greater than or equal to 2, and the number of attitude adjustments should also be greater than or equal to 2. Simulations indicate that the star sensor installation error is readily observable under the maneuvering condition; moreover, the attitude errors of the SINS are less than 7 arc-seconds. This analysis method and conclusion are useful in the ballistic trajectory design of near-Earth flight vehicles.
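The observability condition on the star vectors can be illustrated with a small rank check. The sketch below is an assumption-laden simplification, not the paper's full linearized model: it stacks the skew-symmetric matrices [s x] that appear in linearized star-vector attitude observations. One star leaves rank 2 (rotation about the star's own direction is unobservable); two non-collinear stars give full rank 3:

```python
import numpy as np

def skew(v):
    # cross-product matrix [v x]
    x, y, z = v
    return np.array([[0.0, -z, y], [z, 0.0, -x], [-y, x, 0.0]])

def attitude_observable(star_vectors):
    # stack the linearized star-vector observation matrices and test the rank
    H = np.vstack([skew(s) for s in star_vectors])
    return int(np.linalg.matrix_rank(H)) == 3

one_star = [np.array([1.0, 0.0, 0.0])]
two_stars = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
print(attitude_observable(one_star))   # False
print(attitude_observable(two_stars))  # True
```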
Mixed-mode VLSI optic flow sensors for micro air vehicles
NASA Astrophysics Data System (ADS)
Barrows, Geoffrey Louis
We develop practical, compact optic flow sensors. To achieve the desired weight of 1--2 grams, mixed-mode and mixed-signal VLSI techniques are used to develop compact circuits that directly perform computations necessary to measure optic flow. We discuss several implementations, including a version fully integrated in VLSI, and several "hybrid sensors" in which the front end processing is performed with an analog chip and the back end processing is performed with a microcontroller. We extensively discuss one-dimensional optic flow sensors based on the linear competitive feature tracker (LCFT) algorithm. Hardware implementations of this algorithm are shown to be able to measure visual motion with contrast levels on the order of several percent. We argue that the development of one-dimensional optic flow sensors is therefore reduced to a problem of engineering. We also introduce two related two-dimensional optic flow algorithms that are amenable to implementation in VLSI: the planar competitive feature tracker (PCFT) algorithm and the trajectory method. These sensors are being developed to solve small-scale navigation problems in micro air vehicles, which are autonomous aircraft whose maximum dimension is on the order of 15 cm. We obtain a proof-of-principle of small-scale navigation by mounting a prototype sensor onto a toy glider and programming the sensor to control a rudder or an elevator to affect the glider's path during flight. We demonstrate the determination of altitude by measuring optic flow in the downward direction. We also demonstrate steering to avoid a collision with a wall, when the glider is tossed towards the wall at a shallow angle, by measuring the optic flow in the direction of the glider's left and right side.
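As a generic stand-in (not the LCFT algorithm, which is feature-based), a least-squares gradient method shows how one-dimensional optic flow can be estimated from two frames of a photoreceptor line:

```python
import numpy as np

def flow_1d(frame0, frame1, dt=1.0, pitch=1.0):
    """Least-squares 1-D optic flow: solve Ix * u + It = 0 over the line.
    pitch is the photoreceptor spacing in scene units."""
    Ix = np.gradient(np.asarray(frame0), pitch)            # spatial gradient
    It = (np.asarray(frame1) - np.asarray(frame0)) / dt    # temporal gradient
    return float(-(Ix @ It) / (Ix @ Ix))

# synthetic check: a sinusoidal pattern shifted by 0.05 scene units per frame
x = np.arange(0.0, 6.28, 0.01)
est = flow_1d(np.sin(x), np.sin(x - 0.05), pitch=0.01)
print(abs(est - 0.05) < 0.005)  # True
```

Gradient methods like this degrade at low contrast, which is exactly the regime where the competitive feature tracker circuits described above are claimed to hold up.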
NASA Technical Reports Server (NTRS)
Trube, Matthew J.; Hyslop, Andrew M.; Carignan, Craig R.; Easley, Joseph W.
2012-01-01
A hardware-in-the-loop ground system was developed for simulating a robotic servicer spacecraft tracking a target satellite at short range. A relative navigation sensor package "Argon" is mounted on the end-effector of a Fanuc 430 manipulator, which functions as the base platform of the robotic spacecraft servicer. Machine vision algorithms estimate the pose of the target spacecraft, mounted on a Rotopod R-2000 platform, relay the solution to a simulation of the servicer spacecraft running in "Freespace", which performs guidance, navigation and control functions, integrates dynamics, and issues motion commands to a Fanuc platform controller so that it tracks the simulated servicer spacecraft. Results will be reviewed for several satellite motion scenarios at different ranges. Key words: robotics, satellite, servicing, guidance, navigation, tracking, control, docking.
Hub, Andreas; Hartter, Tim; Kombrink, Stefan; Ertl, Thomas
2008-01-01
PURPOSE.: This study describes the development of a multi-functional assistant system for the blind which combines localisation, real and virtual navigation within modelled environments and the identification and tracking of fixed and movable objects. The approximate position of buildings is determined with a global positioning sensor (GPS), then the user establishes exact position at a specific landmark, like a door. This location initialises indoor navigation, based on an inertial sensor, a step recognition algorithm and map. Tracking of movable objects is provided by another inertial sensor and a head-mounted stereo camera, combined with 3D environmental models. This study developed an algorithm based on shape and colour to identify objects and used a common face detection algorithm to inform the user of the presence and position of others. The system allows blind people to determine their position with approximately 1 metre accuracy. Virtual exploration of the environment can be accomplished by moving one's finger on a touch screen of a small portable tablet PC. The name of rooms, building features and hazards, modelled objects and their positions are presented acoustically or in Braille. Given adequate environmental models, this system offers blind people the opportunity to navigate independently and safely, even within unknown environments. Additionally, the system facilitates education and rehabilitation by providing, in several languages, object names, features and relative positions.
Altair Navigation During Trans-Lunar Cruise, Lunar Orbit, Descent and Landing
NASA Technical Reports Server (NTRS)
Ely, Todd A.; Heyne, Martin; Riedel, Joseph E.
2010-01-01
The Altair lunar lander navigation system is driven by a set of requirements that not only specify a need to land within 100 m of a designated spot on the Moon, but also be capable of a safe return to an orbiting Orion capsule in the event of loss of Earth ground support. These requirements lead to the need for a robust and capable on-board navigation system that works in conjunction with an Earth ground navigation system that uses primarily ground-based radiometric tracking. The resulting system relies heavily on combining a multiplicity of data types including navigation state updates from the ground based navigation system, passive optical imaging from a gimbaled camera, a stable inertial measurement unit, and a capable radar altimeter and velocimeter. The focus of this paper is on navigation performance during the trans-lunar cruise, lunar orbit, and descent/landing mission phases with the goal of characterizing knowledge and delivery errors to key mission events, bound the statistical delta V costs for executing the mission, as well as the determine the landing dispersions due to navigation. This study examines the nominal performance that can be obtained using the current best estimate of the vehicle, sensor, and environment models. Performance of the system under a variety sensor outages and parametric trades is also examined.
Laser-Camera Vision Sensing for Spacecraft Mobile Robot Navigation
NASA Technical Reports Server (NTRS)
Maluf, David A.; Khalil, Ahmad S.; Dorais, Gregory A.; Gawdiak, Yuri
2002-01-01
The advent of spacecraft mobile robots-free-flyng sensor platforms and communications devices intended to accompany astronauts or remotely operate on space missions both inside and outside of a spacecraft-has demanded the development of a simple and effective navigation schema. One such system under exploration involves the use of a laser-camera arrangement to predict relative positioning of the mobile robot. By projecting laser beams from the robot, a 3D reference frame can be introduced. Thus, as the robot shifts in position, the position reference frame produced by the laser images is correspondingly altered. Using normalization and camera registration techniques presented in this paper, the relative translation and rotation of the robot in 3D are determined from these reference frame transformations.
Spoofing Detection Using GNSS/INS/Odometer Coupling for Vehicular Navigation
Broumandan, Ali; Lachapelle, Gérard
2018-01-01
Location information is among the most vital information required to achieve intelligent and context-aware capability in applications such as driverless cars. However, related security and privacy threats are a major holdback. With increasing focus on using Global Navigation Satellite Systems (GNSS) for autonomous navigation and related applications, it is important to provide robust navigation solutions; yet signal spoofing, whether for illegal or covert transportation or for misleading receiver timing, is increasingly frequent. Hence, detection and mitigation of spoofing attacks has become an important topic. Several contributions on spoofing detection have been made, focusing on different layers of a GNSS receiver. This paper focuses on spoofing detection utilizing self-contained sensors, namely inertial measurement units (IMUs) and vehicle odometer outputs. A spoofing detection approach based on a consistency check between GNSS and IMU/odometer mechanization is proposed. To detect a spoofing attack, the method analyses GNSS and IMU/odometer measurements independently during a pre-selected observation window and cross-checks the solutions provided by GNSS and the inertial navigation system (INS)/odometer mechanization. The performance of the proposed method is verified in real vehicular environments. Mean spoofing detection time and detection performance in terms of receiver operating characteristics (ROC) in suburban and dense urban environments are evaluated. PMID:29695064
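The consistency check can be caricatured in a few lines. This sketch (illustrative threshold and trajectories, not the paper's detector) compares the relative motion reported by GNSS against the INS/odometer dead-reckoned track over an observation window and flags large disagreement:

```python
import numpy as np

def spoof_flag(gnss_pos, dr_pos, threshold_m=10.0):
    """Windowed consistency test between two independent position series.
    gnss_pos, dr_pos : (N, 2) GNSS and INS/odometer tracks over the window."""
    g = np.asarray(gnss_pos, float)
    d = np.asarray(dr_pos, float)
    # compare relative motion only: both tracks start from the same initial fix
    separation = np.linalg.norm((g - g[0]) - (d - d[0]), axis=1)
    return bool(separation.mean() > threshold_m)

t = np.linspace(0.0, 60.0, 61)
truth = np.c_[10.0 * t, np.zeros_like(t)]           # 10 m/s straight line
print(spoof_flag(truth, truth))                     # False: consistent
spoofed = truth + np.c_[2.0 * t, np.zeros_like(t)]  # GNSS dragged off-track
print(spoof_flag(spoofed, truth))                   # True: inconsistent
```

In practice the threshold must account for IMU/odometer drift over the window, which is why the paper evaluates detection time and ROC curves rather than a fixed cutoff.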
An Environment for Hardware-in-the-Loop Formation Navigation and Control
NASA Technical Reports Server (NTRS)
Burns, Rich; Naasz, Bo; Gaylor, Dave; Higinbotham, John
2004-01-01
Recent interest in formation flying satellite systems has spurred a considerable amount of research in the relative navigation and control of satellites. Development in this area has included new estimation and control algorithms as well as sensor and actuator development specifically geared toward the relative control problem. This paper describes a simulation facility, the Formation Flying Test Bed (FFTB) at NASA Goddard Space Flight Center, which allows engineers to test new algorithms for the formation flying problem with relevant GN&C hardware in a closed loop simulation. The FFTB currently supports the inclusion of GPS receiver hardware in the simulation loop. Support for satellite crosslink ranging technology is at a prototype stage. This closed-loop, hardware inclusive simulation capability permits testing of navigation and control software in the presence of the actual hardware with which the algorithms must interact. This capability provides the navigation or control developer with a perspective on how the algorithms perform as part of the closed-loop system. In this paper, the overall design and evolution of the FFTB are presented. Each component of the FFTB is then described. Interfaces between the components of the FFTB are shown and the interfaces to and between navigation and control software are described. Finally, an example of closed-loop formation control with GPS receivers in the loop is presented.
Advanced Integration of WiFi and Inertial Navigation Systems for Indoor Mobile Positioning
NASA Astrophysics Data System (ADS)
Evennou, Frédéric; Marx, François
2006-12-01
This paper presents an aided dead-reckoning navigation structure and signal processing algorithms for self-localization of an autonomous mobile device by fusing pedestrian dead reckoning and WiFi signal strength measurements. WiFi and inertial navigation systems (INS) are used for positioning and attitude determination in a wide range of applications. Over the last few years, a number of low-cost inertial sensors have become available. Although they exhibit large errors, WiFi measurements can be used to correct the drift that weakens navigation based on this technology. Conversely, the INS sensors complement the WiFi positioning system by providing high-rate, real-time navigation between WiFi fixes. A structure based on a Kalman filter and a particle filter is proposed, fusing the heterogeneous information coming from these two independent technologies. Finally, the benefits of the proposed architecture are evaluated and compared with pure WiFi and pure INS positioning systems.
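A one-dimensional Kalman-filter sketch of this fusion (with assumed noise variances and an idealized WiFi fix, not the paper's full Kalman/particle-filter structure) shows how WiFi updates bound dead-reckoning drift:

```python
def kf_pdr_wifi(steps, wifi_fixes, q=0.04, r=9.0):
    """1-D fusion sketch: dead-reckoned displacements corrected by WiFi fixes.
    steps      : per-epoch displacement from inertial dead reckoning (m)
    wifi_fixes : WiFi position fix at each epoch (m)
    q, r       : assumed process / WiFi measurement variances (r=9 ~ 3 m noise)
    """
    x, p = 0.0, 1.0
    track = []
    for u, z in zip(steps, wifi_fixes):
        x, p = x + u, p + q                  # predict with the PDR step
        k = p / (p + r)                      # Kalman gain
        x, p = x + k * (z - x), (1 - k) * p  # WiFi update bounds the drift
        track.append(x)
    return track

# biased steps (true 1.0 m, measured 1.1 m): raw PDR drifts to 110 m,
# while the WiFi-corrected estimate stays near the true 100 m
steps = [1.1] * 100
wifi = [float(i + 1) for i in range(100)]   # idealized fixes at true positions
print(round(kf_pdr_wifi(steps, wifi)[-1], 1))
```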
Indoor Pedestrian Navigation Using Foot-Mounted IMU and Portable Ultrasound Range Sensors
Girard, Gabriel; Côté, Stéphane; Zlatanova, Sisi; Barette, Yannick; St-Pierre, Johanne; van Oosterom, Peter
2011-01-01
Many solutions have been proposed for indoor pedestrian navigation. Some rely on pre-installed sensor networks, which offer good accuracy but are limited to areas that have been prepared for that purpose, thus requiring an expensive and possibly time-consuming process. Such methods are therefore inappropriate for navigation in emergency situations since the power supply may be disturbed. Other types of solutions track the user without requiring a prepared environment. However, they may have low accuracy. Offline tracking has been proposed to increase accuracy, however this prevents users from knowing their position in real time. This paper describes a real time indoor navigation system that does not require prepared building environments and provides tracking accuracy superior to previously described tracking methods. The system uses a combination of four techniques: foot-mounted IMU (Inertial Motion Unit), ultrasonic ranging, particle filtering and model-based navigation. The very purpose of the project is to combine these four well-known techniques in a novel way to provide better indoor tracking results for pedestrians. PMID:22164034
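The interplay of IMU propagation, model-based navigation and particle filtering can be illustrated with a one-dimensional toy (hypothetical wall position and noise levels, not the paper's tracker): particles that would pass through a wall in the building model are discarded, so the map itself corrects IMU drift:

```python
import random

def pf_step(particles, step, wall_x, sigma=0.1):
    """One particle-filter update for model-based indoor navigation (1-D toy).
    Particles move by the IMU-derived step plus noise; the building model
    (a wall at wall_x) vetoes impossible motion."""
    moved = [p + step + random.gauss(0.0, sigma) for p in particles]
    alive = [p for p in moved if p < wall_x]   # cannot pass through the wall
    if not alive:                              # degenerate case: clamp
        alive = [wall_x - 1e-6] * len(particles)
    # resample back to the original particle count
    return [random.choice(alive) for _ in particles]

random.seed(1)
particles = [0.0] * 500
for _ in range(20):                 # walk toward a wall at x = 5 m
    particles = pf_step(particles, 0.4, wall_x=5.0)
estimate = sum(particles) / len(particles)
print(estimate < 5.0)               # True: the map keeps the estimate feasible
```

An unconstrained dead-reckoned track would report 20 x 0.4 = 8 m; the map constraint keeps the estimate on the feasible side of the wall.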
Autonomous navigation accuracy using simulated horizon sensor and sun sensor observations
NASA Technical Reports Server (NTRS)
Pease, G. E.; Hendrickson, H. T.
1980-01-01
A relatively simple autonomous system which would use horizon crossing indicators, a sun sensor, a quartz oscillator, and a microprogrammed computer is discussed. The sensor combination is required only to effectively measure the angle between the centers of the Earth and the Sun. Simulations for a particular orbit indicate that 2 km r.m.s. orbit determination uncertainties may be expected from a system with 0.06 deg measurement uncertainty. A key finding is that knowledge of the satellite orbit plane orientation can be maintained to this level because of the annual motion of the Sun and the predictable effects of Earth oblateness. The basic system described can be updated periodically by transits of the Moon through the IR horizon crossing indicator fields of view.
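A quick sketch of the underlying geometry (the 0.06 deg figure is from the abstract; the orbit radius and the sensitivity formula are back-of-envelope assumptions, not the paper's filter):

```python
import math

def angle_between(u, v):
    # angle between two vectors, e.g. spacecraft-to-Earth and spacecraft-to-Sun
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

print(round(math.degrees(angle_between([1, 0, 0], [0, 1, 0])), 1))  # 90.0

# single-measurement sensitivity at an assumed LEO radius of ~6778 km:
# a 0.06 deg angle error maps to roughly r * dtheta of position error
r_km = 6778.0
dtheta = math.radians(0.06)
print(round(r_km * dtheta, 1))  # ~7.1 km per shot
```

The single-shot figure is several times the 2 km r.m.s. quoted above, which is consistent with the filter averaging many measurements over the orbit.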
NASA Technical Reports Server (NTRS)
Crain, Timothy P.; Bishop, Robert H.; Carson, John M., III; Trawny, Nikolas; Hanak, Chad; Sullivan, Jacob; Christian, John; DeMars, Kyle; Campbell, Tom; Getchius, Joel
2016-01-01
The Morpheus Project began in late 2009 as an ambitious effort code-named Project M to integrate three ongoing multi-center NASA technology developments: humanoid robotics, liquid oxygen/liquid methane (LOX/LCH4) propulsion and Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) into a single engineering demonstration mission to be flown to the Moon by 2013. The humanoid robot effort was redirected to a deployment of Robonaut 2 on the International Space Station in February of 2011, while Morpheus continued as a terrestrial field test project integrating the existing ALHAT Project's technologies into a sub-orbital flight system using the world's first LOX/LCH4 main propulsion and reaction control system fed from the same blowdown tanks. A series of 33 tethered tests with the Morpheus 1.0 vehicle and Morpheus 1.5 vehicle were conducted from April 2011 to December 2013 before successful, sustained free flights with the primary Vertical Testbed (VTB) navigation configuration began with Free Flight 3 on December 10, 2013. Over the course of the following 12 free flights and 3 tethered flights, components of the ALHAT navigation system were integrated into the Morpheus vehicle, operations, and flight control loop. The ALHAT navigation system was integrated and run concurrently with the VTB navigation system as a reference and fail-safe option in flight (see touchdown position estimate comparisons in Fig. 1). Flight testing completed with Free Flight 15 on December 15, 2014 with completely autonomous Hazard Detection and Avoidance (HDA), integration of surface relative and Hazard Relative Navigation (HRN) measurements into the onboard dual-state inertial estimator Kalman filter software, and landing within 2 meters of the VTB GPS-based navigation solution at the safe landing site target.
This paper describes the Morpheus joint VTB/ALHAT navigation architecture, the sensors utilized during the terrestrial flight campaign, issues resolved during testing, and the navigation results from the flight tests.
An Adaptive Technique for a Redundant-Sensor Navigation System. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chien, T. T.
1972-01-01
An on-line adaptive technique is developed to provide a self-contained redundant-sensor navigation system with a capability to utilize its full potentiality in reliability and performance. The gyro navigation system is modeled as a Gauss-Markov process, with degradation modes defined as changes in characteristics specified by parameters associated with the model. The adaptive system is formulated as a multistage stochastic process: (1) a detection system, (2) an identification system and (3) a compensation system. It is shown that the sufficient statistics for the partially observable process in the detection and identification system is the posterior measure of the state of degradation, conditioned on the measurement history.
Noguchi during STORRM Reflector Relocation
2010-04-16
S131-E-010335 (16 April 2010) --- Japan Aerospace Exploration Agency (JAXA) astronaut Soichi Noguchi, Expedition 23 flight engineer, works to relocate a reflective element on the PMA-2 docking target in support of the Sensor Test for Orion Relative Navigation Risk Mitigation (STORRM) on the International Space Station while space shuttle Discovery (STS-131) remains docked with the station.
Real-time Terrain Relative Navigation Test Results from a Relevant Environment for Mars Landing
NASA Technical Reports Server (NTRS)
Johnson, Andrew E.; Cheng, Yang; Montgomery, James; Trawny, Nikolas; Tweddle, Brent; Zheng, Jason
2015-01-01
Terrain Relative Navigation (TRN) is an on-board GN&C function that generates a position estimate of a spacecraft relative to a map of a planetary surface. When coupled with a divert, the position estimate enables access to more challenging landing sites through pin-point landing or large hazard avoidance. The Lander Vision System (LVS) is a smart sensor system that performs terrain relative navigation by matching descent camera imagery to a map of the landing site and then fusing this with inertial measurements to obtain high rate map relative position, velocity and attitude estimates. A prototype of the LVS was recently tested in a helicopter field test over Mars analog terrain at altitudes representative of Mars Entry Descent and Landing conditions. TRN ran in real-time on the LVS during the flights without human intervention or tuning. The system was able to compute estimates accurate to 40m (3 sigma) in 10 seconds on a flight like processing system. This paper describes the Mars operational test space definition, how the field test was designed to cover that operational envelope, the resulting TRN performance across the envelope and an assessment of test space coverage.
Real-time MRI-guided needle intervention for cryoablation: a phantom study
NASA Astrophysics Data System (ADS)
Gao, Wenpeng; Jiang, Baichuan; Kacher, Dan F.; Fetics, Barry; Nevo, Erez; Lee, Thomas C.; Jayender, Jagadeesan
2017-03-01
MRI-guided needle intervention for cryoablation is a promising way to relieve pain and treat cancer. However, the limited size of the MRI bore makes it impossible for clinicians to perform the operation inside the bore: the patient has to be moved into the bore for scanning to verify the position of the needle tip, and out of it to adjust the needle's trajectory. Real-time tracking of the needle, displayed in MR images, is therefore important for clinicians to perform the operation more efficiently. In this paper, we have instrumented the cryotherapy needle with an MRI-safe electromagnetic (EM) sensor and an optical sensor to measure the needle's position and orientation. To overcome the line-of-sight limitation of the optical sensor and the poor dynamic performance of the EM sensor, Kalman filter based data fusion is developed. Further, we developed a navigation system in the open-source software 3D Slicer to provide accurate visualization of the needle and the surrounding anatomy. A simulated needle intervention at the entry point was performed with a realistic spine phantom to quantify the accuracy of the navigation using a retrospective analysis method. Eleven trials of needle insertion were performed independently. The target accuracy of navigation using only the EM sensor, only the optical sensor, and data fusion was 2.27 +/- 1.60 mm, 4.11 +/- 1.77 mm and 1.91 +/- 1.10 mm, respectively.
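A static snapshot of the sensor fusion idea (inverse-variance weighting with made-up variances; the paper itself uses a Kalman filter, which additionally handles optical line-of-sight dropouts and EM dynamics):

```python
def fuse(em, opt, var_em, var_opt):
    """Inverse-variance (maximum-likelihood) fusion of two position reads.
    em, opt : position estimates from the EM and optical sensors (mm)
    var_em, var_opt : assumed measurement variances for each sensor
    """
    w_em, w_opt = 1.0 / var_em, 1.0 / var_opt
    pos = [(w_em * a + w_opt * b) / (w_em + w_opt) for a, b in zip(em, opt)]
    var = 1.0 / (w_em + w_opt)   # fused variance is below either input's
    return pos, var

pos, var = fuse([1.0, 1.0, 1.0], [3.0, 3.0, 3.0], var_em=1.0, var_opt=1.0)
print(pos, var)  # [2.0, 2.0, 2.0] 0.5
```

The fused variance being smaller than either sensor's alone mirrors the reported result that fusion (1.91 mm) beat both EM-only (2.27 mm) and optical-only (4.11 mm) accuracy.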
Sabatini, Angelo Maria; Genovese, Vincenzo
2014-01-01
A sensor fusion method was developed for vertical channel stabilization by fusing inertial measurements from an Inertial Measurement Unit (IMU) and pressure altitude measurements from a barometric altimeter integrated in the same device (baro-IMU). An Extended Kalman Filter (EKF) estimated the quaternion from the sensor frame to the navigation frame; the sensed specific force was rotated into the navigation frame and compensated for gravity, yielding the vertical linear acceleration; finally, a complementary filter driven by the vertical linear acceleration and the measured pressure altitude produced estimates of height and vertical velocity. A method was also developed to condition the measured pressure altitude using a whitening filter, which helped to remove the short-term correlation due to environment-dependent pressure changes from raw pressure altitude. The sensor fusion method was implemented to work on-line using data from a wireless baro-IMU and tested for the capability of tracking low-frequency small-amplitude vertical human-like motions that can be critical for stand-alone inertial sensor measurements. Validation tests were performed in different experimental conditions, namely no motion, free-fall motion, forced circular motion and squatting. Accurate on-line tracking of height and vertical velocity was achieved, giving confidence to the use of the sensor fusion method for tracking typical vertical human motions: velocity Root Mean Square Error (RMSE) was in the range 0.04–0.24 m/s; height RMSE was in the range 5–68 cm, with statistically significant performance gains when the whitening filter was used by the sensor fusion method to track relatively high-frequency vertical motions. PMID:25061835
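The complementary-filter stage can be sketched as follows; the gains and the constant-altitude test signal are illustrative assumptions, not values from the paper:

```python
def complementary_height(accel_z, baro_h, dt=0.01, k1=2.0, k2=1.0):
    """Complementary filter sketch: integrate gravity-compensated vertical
    acceleration for smoothness, and trim with baro altitude for stability.
    k1, k2 are illustrative gains (critically damped for these values)."""
    h, v = 0.0, 0.0
    out = []
    for a, z in zip(accel_z, baro_h):
        err = z - h                  # baro innovation
        v += (a + k2 * err) * dt     # acceleration drives velocity
        h += (v + k1 * err) * dt     # integrate velocity; baro trims height
        out.append(h)
    return out

# 20 s at 100 Hz: zero vertical acceleration, constant 10 m baro altitude
est = complementary_height([0.0] * 2000, [10.0] * 2000)
print(abs(est[-1] - 10.0) < 1e-3)    # True: converges to the baro level
```

The whitening filter described above would precondition `baro_h` before this stage to remove short-term correlated pressure disturbances.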
Distributed sensor management for space situational awareness via a negotiation game
NASA Astrophysics Data System (ADS)
Jia, Bin; Shen, Dan; Pham, Khanh; Blasch, Erik; Chen, Genshe
2015-05-01
Space situational awareness (SSA) is critical to many space missions serving weather analysis, communications, and navigation. However, the number of sensors available for space situational awareness is limited, which hinders collision avoidance prediction, debris assessment, and efficient routing. Hence, it is critical to use such sensor resources efficiently, and it is desirable to develop SSA sensor management algorithms in a distributed manner. In this paper, a distributed sensor management approach using a negotiation game (NG-DSM) is proposed for SSA. Specifically, the negotiation game is played by each sensor and its neighboring sensors, with bargaining strategies developed for each sensor based on negotiating to accurately track desired targets (e.g., satellites, debris). The proposed NG-DSM method is tested in a scenario that includes eight space objects and three different sensor modalities: a space-based optical sensor, a ground radar, and a ground electro-optic sensor. The geometric relation between the sensor, the Sun, and the space object is also considered. The simulation results demonstrate the effectiveness of the proposed NG-DSM sensor management method, which facilitates the application of multiple-sensor multiple-target tracking for space situational awareness.
Overview of Fiber-Optical Sensors
NASA Technical Reports Server (NTRS)
Depaula, Ramon P.; Moore, Emery L.
1987-01-01
Design, development, and sensitivity of sensors using fiber optics reviewed. State-of-the-art and probable future developments of sensors using fiber optics described in report including references to work in field. Serves to update previously published surveys. Systems incorporating fiber-optic sensors used in medical diagnosis, navigation, robotics, sonar, power industry, and industrial controls.
Systematic methods for knowledge acquisition and expert system development
NASA Technical Reports Server (NTRS)
Belkin, Brenda L.; Stengel, Robert F.
1991-01-01
Nine cooperating rule-based systems, collectively called AUTOCREW, were designed to automate functions and decisions associated with a combat aircraft's subsystem. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base, and to assess the cooperation between the rule-bases. Each AUTOCREW subsystem is composed of several expert systems that perform specific tasks. AUTOCREW's NAVIGATOR was analyzed in detail to understand the difficulties involved in designing the system and to identify tools and methodologies that ease development. The NAVIGATOR determines optimal navigation strategies from a set of available sensors. A Navigation Sensor Management (NSM) expert system was systematically designed from Kalman filter covariance data; four ground-based, a satellite-based, and two on-board INS-aiding sensors were modeled and simulated to aid an INS. The NSM Expert was developed using the Analysis of Variance (ANOVA) and the ID3 algorithm. Navigation strategy selection is based on an RSS position error decision metric, which is computed from the covariance data. Results show that the NSM Expert predicts position error correctly between 45 and 100 percent of the time for a specified navaid configuration and aircraft trajectory. The NSM Expert adapts to new situations, and provides reasonable estimates of hybrid performance. The systematic nature of the ANOVA/ID3 method makes it broadly applicable to expert system design when experimental or simulation data is available.
Yang, Yanqiang; Zhang, Chunxi; Lu, Jiazhen
2017-01-01
Strapdown inertial navigation system/celestial navigation system (SINS/CNS) integrated navigation is a fully autonomous, high-precision method that has been widely used to improve the hitting accuracy and quick-reaction capability of near-Earth flight vehicles. The installation errors between the SINS and the star sensors are one of the main factors restricting the actual accuracy of SINS/CNS. In this paper, an integration algorithm based on star vector observations is derived that accounts for the star sensor installation error, which is then accurately estimated by Kalman filtering (KF). Meanwhile, a local observability analysis is performed on the rank of the observability matrix obtained from the linearized observation equation, and the observability conditions are presented and validated: the number of star vectors should be greater than or equal to 2, and the number of attitude adjustments should also be greater than or equal to 2. Simulations indicate that the star sensor installation error is readily observable under the maneuvering condition; moreover, the attitude errors of the SINS are less than 7 arc-seconds. This analysis method and conclusion are useful in the ballistic trajectory design of near-Earth flight vehicles. PMID:28275211
Radar range data signal enhancement tracker
NASA Technical Reports Server (NTRS)
1975-01-01
The design, fabrication, and performance characteristics are described of two digital data signal enhancement filters which are capable of being inserted between the Space Shuttle Navigation Sensor outputs and the guidance computer. Commonality of interfaces has been stressed so that the filters may be evaluated through operation with simulated sensors or with actual prototype sensor hardware. The filters will provide both a smoothed range and range rate output. Different conceptual approaches are utilized for each filter. The first filter is based on a combination low pass nonrecursive filter and a cascaded simple average smoother for range and range rate, respectively. Filter number two is a tracking filter which is capable of following transient data of the type encountered during burn periods. A test simulator was also designed which generates typical shuttle navigation sensor data.
Meng, Zhijun; Yang, Jun; Guo, Xiye; Zhou, Yongbin
2017-01-01
Global Navigation Satellite System performance can be significantly enhanced by introducing inter-satellite links (ISLs) in navigation constellation. The improvement in position, velocity, and time accuracy as well as the realization of autonomous functions requires ISL distance measurement data as the original input. To build a high-performance ISL, the ranging consistency among navigation satellites is an urgent problem to be solved. In this study, we focus on the variation in the ranging delay caused by the sensitivity of the ISL payload equipment to the ambient temperature in space and propose a simple and low-power temperature-sensing ranging compensation sensor suitable for onboard equipment. The experimental results show that, after the temperature-sensing ranging compensation of the ISL payload equipment, the ranging consistency becomes less than 0.2 ns when the temperature change is 90 °C. PMID:28608809
Absolute Navigation Performance of the Orion Exploration Flight Test 1
NASA Technical Reports Server (NTRS)
Zanetti, Renato; Holt, Greg; Gay, Robert; D'Souza, Christopher; Sud, Jastesh
2016-01-01
Launched in December 2014 atop a Delta IV Heavy from the Kennedy Space Center, the Orion vehicle's Exploration Flight Test-1 (EFT-1) successfully completed the objective to stress the system by placing the un-crewed vehicle on a high-energy parabolic trajectory replicating conditions similar to those that would be experienced when returning from an asteroid or a lunar mission. Unique challenges associated with designing the navigation system for EFT-1 are presented with an emphasis on how redundancy and robustness influenced the architecture. Two Inertial Measurement Units (IMUs), one GPS receiver and three barometric altimeters (BALTs) comprise the navigation sensor suite. The sensor data is multiplexed using conventional integration techniques and the state estimate is refined by the GPS pseudorange and deltarange measurements in an Extended Kalman Filter (EKF) that employs UDU factorization. The performance of the navigation system during flight is presented to substantiate the design.
Applications of Payload Directed Flight
NASA Technical Reports Server (NTRS)
Ippolito, Corey; Fladeland, Matthew M.; Yeh, Yoo Hsiu
2009-01-01
Next generation aviation flight control concepts require autonomous and intelligent control system architectures that close control loops directly around payload sensors in a manner more integrated and cohesive than in traditional autopilot designs. Research into payload directed flight control at NASA Ames Research Center is investigating new and novel architectures that can satisfy the requirements for next generation control and automation concepts for aviation. Tighter integration between sensor and machine requires definition of specific sensor-directed control modes that tie the sensor data directly into the vehicle control structures throughout the entire control architecture, from low-level stability and control loops to higher-level mission planning and scheduling reasoning systems. Payload directed flight systems can thus provide guidance, navigation, and control for vehicle platforms hosting a suite of onboard payload sensors. This paper outlines related research into the field of payload directed flight, and outlines requirements and operating concepts for payload directed flight systems based on identified needs from the scientific literature.
2018-04-02
iss055e008318 (April 2, 2018) --- Expedition 55 Flight Engineer Drew Feustel works inside the Japanese Kibo laboratory module with tiny internal satellites known as SPHERES, or Synchronized Position Hold, Engage, Reorient, Experimental Satellites. Feustel was operating the SPHERES for the Smoothing-Based Relative Navigation (SmoothNav) experiment which is developing an algorithm to obtain the most probable estimate of the relative positions and velocities between all spacecraft using all available sensor information, including past measurements.
Nature-Inspired Acoustic Sensor Projects
1999-08-24
Among the projects described are a wearable aid whose pager motors are worn on the wrists, and Yago (the Yale Autonomous Go-Cart) of the Yale Intelligent Sensors Lab: Yago operated autonomously by avoiding obstacles, with a proximity sensor detecting close-by objects missed by the sonars.
NASA Astrophysics Data System (ADS)
Chu, Q. P.; Van Woerkom, P. Th. L. M.
The Global Positioning System or GPS has been developed for the purpose of enabling accurate positioning and navigation anywhere on or near the surface of the Earth. In addition to the US system GPS-NAVSTAR, the Russian GLONASS system is also in place and operational. Other such systems are under study. The key measurement involved is the time of travel of signals from a particular GPS spacecraft to the navigating receiver. Navigation accuracies of the order of tenths of meters are achievable, and accuracies at the centimeter level can also be obtained with special enhancement techniques. In recent years spacecraft have already been exploring the use of GPS for in-orbit navigation. As the receiver is solid state, rugged, power-lean, and cheap, GPS for autonomous navigation will be an objective even for low-cost spacecraft of only modest sophistication. When the GPS receiver is equipped with multiple antennas with baselines even as low as about one meter, it can also give attitude information. In this case, the position of the spacecraft needs to be known with only very moderate accuracy. However, the phase differences between signals received by the different antennas now constitute the key measurements. In this case a centimeter level accuracy of range difference can be obtained. Receivers carrying out the processing of such measurements are already on the market, even in space-qualified versions. For spacecraft maneuvering at low rates, accuracies of the order of tenths of a degree are achievable. There are reasons for maintaining classical attitude sensor suites on a spacecraft even when a GPS receiver is added. In this case the classical sensors may be allowed to be of modest quality only, as subsequent fusion of their data with those from the GPS receiver may restore the accuracy of the final estimate again to an acceptable level. 
Hence, low-cost attitude sensors combined with a low-cost GPS receiver can still satisfy non-trivial attitude reconstitution accuracy requirements. As carrier phase difference measurements are ambiguous because of the unknown number of GPS signal cycles received, the estimated attitude is in principle ambiguous as well. Therefore, resolution of the GPS signal cycle ambiguity becomes a necessary task before determining the attitude for a stand-alone GPS attitude sensing system. This problem may be solved by introducing additional low-cost reference attitude sensors like three-axis magnetometers. This is also one of the advantages of integrated sensor systems. The paper is organized as follows. Global Positioning System and GPS observables are described in the first two sections. The main attitude determination concepts are presented in the next section. For small spacecraft, GPS integrated with other low-cost attitude sensors results in a data fusion concept, to be discussed next. The last section highlights experiences and on-going projects related to the spacecraft attitude determination using GPS.
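The carrier-phase attitude principle described above can be illustrated for a single two-antenna baseline and one satellite: once the integer cycle ambiguity is resolved, the angle between the baseline and the satellite line of sight follows directly from the measured phase difference. The following is a hedged sketch of that geometry only; the numeric example values are assumed.

```python
import math

# Single-baseline illustration of GPS carrier-phase attitude sensing.
# With the integer cycle ambiguity N resolved, the differential path
# length between two antennas gives the angle between the baseline and
# the line of sight to the satellite.

L1_WAVELENGTH = 0.1903  # m, GPS L1 carrier (c / 1575.42 MHz)

def baseline_angle(phase_diff_cycles, n_ambiguity, baseline_m,
                   wavelength=L1_WAVELENGTH):
    """Angle (rad) between the baseline and the satellite line of sight."""
    path_diff = wavelength * (phase_diff_cycles + n_ambiguity)  # metres
    ratio = path_diff / baseline_m
    if abs(ratio) > 1.0:
        raise ValueError("inconsistent ambiguity or baseline length")
    return math.acos(ratio)

# Example (assumed values): 1 m baseline, fractional phase difference of
# 0.37 cycles, resolved ambiguity N = 2.
angle = baseline_angle(0.37, 2, 1.0)
print(round(math.degrees(angle), 1))
```

A full three-axis attitude solution requires at least two non-parallel baselines and several satellites, plus the ambiguity resolution step that the abstract notes can be aided by low-cost reference sensors such as magnetometers.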
Institute of Navigation, Annual Meeting, 47th, Williamsburg, VA, June 10-12, 1991, Proceedings
NASA Astrophysics Data System (ADS)
1991-11-01
The present volume of navigation and exploration discusses space exploration, mapping and geodesy, aircraft navigation, undersea navigation, land and vehicular location, international and legal aspects of navigation, the history of navigation technology and applications, Loran development and implementation, GPS and GLONASS developments, and search and rescue. Topics addressed include stabilization of low orbiting spacecraft using GPS, the employment of laser navigation for automatic rendezvous and docking systems, enhanced pseudostatic processing, and the expanding role of sensor fusion. Attention is given to a gravity-aided inertial navigation system, recent developments in aviation products liability and navigation, the ICAO future air navigation system, and Loran's implementation in NAS. Also discussed are Inmarsat integrated navigation/communication activities, the GPS program status, the evolution of military GPS technology into the Navcore V receiver engine, and Sarsat location algorithms.
Compact autonomous navigation system (CANS)
NASA Astrophysics Data System (ADS)
Hao, Y. C.; Ying, L.; Xiong, K.; Cheng, H. Y.; Qiao, G. D.
2017-11-01
Autonomous navigation of satellites and constellations has a series of benefits, such as reducing operation cost and ground station workload, avoiding the crises of war and natural disaster, and increasing spacecraft autonomy. An autonomous navigation satellite is independent of ground station support. Many systems have been developed for autonomous navigation of satellites in the past 20 years. Among them, the American MANS (Microcosm Autonomous Navigation System) [1] of Microcosm Inc. and ERADS [2][3] (Earth Reference Attitude Determination System) of Honeywell Inc. are well known. These systems anticipate a series of good features of autonomous navigation and aim at low cost, integrated structure, low power consumption and compact layout. The ERADS is an integrated small 3-axis attitude sensor system with low cost and small volume. It has higher Earth-center measurement accuracy than the common IR sensor because the detected ultraviolet radiation zone of the atmosphere has a brightness gradient larger than that of the IR zone. But the ERADS is still a complex system, because it has to overcome many problems such as fabrication of the sapphire sphere lens, the birefringence effect of sapphire, the high-precision image-transfer optical fiber flattener, and ultraviolet intensifier noise. The marginal sphere FOV of the sphere lens of the ERADS is used for star imaging, which may bring some disadvantages: the image energy and attitude measurement accuracy may be reduced due to the tilted image acceptance end of the fiber flattener in the FOV. In addition, Japan, Germany and Russia have developed visible Earth sensors for GEO [4][5]. Is there a way to develop a cheaper, easier and more accurate autonomous navigation system that can be used on all LEO spacecraft, especially LEO small and micro satellites? To address this problem we present a new type of system: CANS (Compact Autonomous Navigation System) [6].
Development of voice navigation system for the visually impaired by using IC tags.
Takatori, Norihiko; Nojima, Kengo; Matsumoto, Masashi; Yanashima, Kenji; Magatani, Kazushige
2006-01-01
There are about 300,000 visually impaired persons in Japan. Most of them are elderly and cannot become skillful in using a white cane, even if they make an effort to learn how. Therefore, guiding systems that support the independent activities of the visually impaired are required. In this paper, we describe a white cane system that supports independent walking of the visually impaired in indoor spaces. The system is composed of colored navigation lines that include IC tags and an intelligent white cane that carries a navigation computer. Colored navigation lines laid on the floor of the target space from the start point to the destination, together with IC tags set at landmark points, indicate the route to the destination. The white cane has a color sensor, an IC tag transceiver and a computer system that includes a voice processor. The cane senses the navigation line of the target color with the color sensor; when the sensor finds the target color, the cane informs the user by vibration that he or she is on the navigation line, so that by simply following this vibration the user can reach the destination. At some landmark points, however, guidance is necessary; there an IC tag is set under the navigation line, and the cane communicates with the tag and informs the user about the landmark point by pre-recorded voice. Ten blindfolded normal subjects tested the developed system. All of them could walk along the navigation line, and the IC tag information system worked well. We therefore conclude that our system will be very valuable in supporting the activities of the visually impaired.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-21
... inputs to semiautomatic self-contained dead reckoning navigation systems which were not continuously... Doppler sensor equipment that provides inputs to dead reckoning navigation systems obsolete. On August 18...
Autonomous Navigation of Small Uavs Based on Vehicle Dynamic Model
NASA Astrophysics Data System (ADS)
Khaghani, M.; Skaloud, J.
2016-03-01
This paper presents a novel approach to autonomous navigation for small UAVs, in which the vehicle dynamic model (VDM) serves as the main process model within the navigation filter. The proposed method significantly increases the accuracy and reliability of autonomous navigation, especially for small UAVs with low-cost IMUs on-board. This is achieved with no extra sensor added to the conventional INS/GNSS setup. This improvement is of special interest in case of GNSS outages, where inertial coasting drifts very quickly. In the proposed architecture, the solution to VDM equations provides the estimate of position, velocity, and attitude, which is updated within the navigation filter based on available observations, such as IMU data or GNSS measurements. The VDM is also fed with the control input to the UAV, which is available within the control/autopilot system. The filter is capable of estimating wind velocity and dynamic model parameters, in addition to navigation states and IMU sensor errors. Monte Carlo simulations reveal major improvements in navigation accuracy compared to conventional INS/GNSS navigation system during the autonomous phase, when satellite signals are not available due to physical obstruction or electromagnetic interference for example. In case of GNSS outages of a few minutes, position and attitude accuracy experiences improvements of orders of magnitude compared to inertial coasting. It means that during such scenario, the position-velocity-attitude (PVA) determination is sufficiently accurate to navigate the UAV to a home position without any signal that depends on vehicle environment.
Context-Aided Sensor Fusion for Enhanced Urban Navigation
Martí, Enrique David; Martín, David; García, Jesús; de la Escalera, Arturo; Molina, José Manuel; Armingol, José María
2012-01-01
The deployment of Intelligent Vehicles in urban environments requires reliable estimation of positioning for urban navigation. The inherent complexity of this kind of environments fosters the development of novel systems which should provide reliable and precise solutions to the vehicle. This article details an advanced GNSS/IMU fusion system based on a context-aided Unscented Kalman filter for navigation in urban conditions. The constrained non-linear filter is here conditioned by a contextual knowledge module which reasons about sensor quality and driving context in order to adapt it to the situation, while at the same time it carries out a continuous estimation and correction of INS drift errors. An exhaustive analysis has been carried out with available data in order to characterize the behavior of available sensors and take it into account in the developed solution. The performance is then analyzed with an extensive dataset containing representative situations. The proposed solution suits the use of fusion algorithms for deploying Intelligent Transport Systems in urban environments. PMID:23223080
Improving Car Navigation with a Vision-Based System
NASA Astrophysics Data System (ADS)
Kim, H.; Choi, K.; Lee, I.
2015-08-01
The real-time acquisition of accurate positions is very important for the proper operation of driver assistance systems and autonomous vehicles. Since current systems mostly depend on GPS and map-matching techniques, they show poor and unreliable performance where GPS signals are blocked or weak. In this study, we propose a vision-oriented car navigation method based on sensor fusion of a GPS with in-vehicle sensors. We employ a single photo resection process to derive the position and attitude of the camera, and thus those of the car. These image georeferencing results are combined with other sensory data in a sensor fusion framework for more accurate position estimation using an extended Kalman filter. The proposed system estimated positions with an accuracy of 15 m even though GPS signals were unavailable during the entire 15-minute test drive. The proposed vision-based system can be effectively utilized for the low-cost yet accurate and reliable navigation systems required for intelligent or autonomous vehicles.
Research on the optimal structure configuration of dither RLG used in skewed redundant INS
NASA Astrophysics Data System (ADS)
Gao, Chunfeng; Wang, Qi; Wei, Guo; Long, Xingwu
2016-05-01
The actual combat effectiveness of weapon equipment is restricted by the performance of the Inertial Navigation System (INS), especially in situations requiring high reliability, such as fighters, satellites and submarines. Through the use of skewed sensor geometries, redundancy techniques have been applied to reduce the cost and improve the reliability of the INS. In this paper, the structure configuration and the inertial sensor characteristics of a Skewed Redundant Strapdown Inertial Navigation System (SRSINS) using dithered Ring Laser Gyroscopes (RLGs) are analyzed. Owing to dither coupling effects, the system measurement errors can be amplified either when the individual gyro dither frequencies are near one another or when the structure of the SRSINS is unreasonable. Based on the characteristics of the RLG, research on the coupled vibration of dithered RLGs in the SRSINS is carried out. On the principles of optimal navigation performance, optimal reliability and optimal cost-effectiveness, a comprehensive evaluation scheme for the inertial sensor configuration of the SRSINS is given.
Inertial navigation sensor integrated motion analysis for autonomous vehicle navigation
NASA Technical Reports Server (NTRS)
Roberts, Barry; Bhanu, Bir
1992-01-01
Recent work on INS integrated motion analysis is described. Results were obtained with a maximally passive system of obstacle detection (OD) for ground-based vehicles and rotorcraft. The OD approach involves motion analysis of imagery acquired by a passive sensor in the course of vehicle travel to generate range measurements to world points within the sensor FOV. INS data and scene analysis results are used to enhance interest point selection, the matching of the interest points, and the subsequent motion-based computations, tracking, and OD. The most important lesson learned from the research described here is that the incorporation of inertial data into the motion analysis program greatly improves the analysis and makes the process more robust.
Navigation system for autonomous mapper robots
NASA Astrophysics Data System (ADS)
Halbach, Marc; Baudoin, Yvan
1993-05-01
This paper describes the conception and realization of a fast, robust, and general navigation system for a mobile (wheeled or legged) robot. A database representing a high-level map of the environment is generated and continuously updated. The first part describes the legged target vehicle and the hexapod robot being developed. The second section deals with spatial and temporal sensor fusion for dynamic environment modeling within an obstacle/free-space probabilistic classification grid. Ultrasonic sensors are used, others are expected to be integrated, and a priori knowledge is treated. The ultrasonic sensors are controlled by the path planning module. The third part concerns path planning, and a simulation of a wheeled robot is also presented.
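The obstacle/free-space probabilistic classification grid mentioned above is commonly maintained with a per-reading Bayesian log-odds update. The following one-ray sketch illustrates the idea under an assumed inverse sensor model; the probabilities and grid size are illustrative and not taken from the paper.

```python
import math

# Log-odds occupancy update for one sonar ray: the cell at the measured
# range becomes more likely occupied, cells in front of it more likely free.

P_OCC, P_FREE = 0.7, 0.3            # inverse sensor model (assumed values)
L_OCC = math.log(P_OCC / (1 - P_OCC))
L_FREE = math.log(P_FREE / (1 - P_FREE))

def update_ray(logodds, hit_cell):
    """Update a 1-D ray of cells given a sonar return at hit_cell."""
    for i in range(hit_cell):
        logodds[i] += L_FREE        # space before the echo is likely free
    logodds[hit_cell] += L_OCC      # the echoing cell is likely occupied
    return logodds

def probability(l):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))

ray = [0.0] * 10                    # log-odds 0 means unknown (p = 0.5)
for _ in range(3):                  # three consistent readings at cell 6
    update_ray(ray, 6)
print(round(probability(ray[6]), 3), round(probability(ray[2]), 3))
```

Because the update is additive in log-odds, repeated consistent readings sharpen the grid toward occupied or free, which is what makes the grid usable for spatial and temporal fusion of several sensors.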
Low-cost lightweight airborne laser-based sensors for pipeline leak detection and reporting
NASA Astrophysics Data System (ADS)
Frish, Michael B.; Wainner, Richard T.; Laderer, Matthew C.; Allen, Mark G.; Rutherford, James; Wehnert, Paul; Dey, Sean; Gilchrist, John; Corbi, Ron; Picciaia, Daniele; Andreussi, Paolo; Furry, David
2013-05-01
Laser sensing enables aerial detection of natural gas pipeline leaks without need to fly through a hazardous gas plume. This paper describes adaptations of commercial laser-based methane sensing technology that provide relatively low-cost lightweight and battery-powered aerial leak sensors. The underlying technology is near-infrared Standoff Tunable Diode Laser Absorption Spectroscopy (sTDLAS). In one configuration, currently in commercial operation for pipeline surveillance, sTDLAS is combined with automated data reduction, alerting, navigation, and video imagery, integrated into a single-engine single-pilot light fixed-wing aircraft or helicopter platform. In a novel configuration for mapping landfill methane emissions, a miniaturized ultra-lightweight sTDLAS sensor flies aboard a small quad-rotor unmanned aerial vehicle (UAV).
Autonomous vehicle navigation utilizing fuzzy controls concepts for a next generation wheelchair.
Hansen, J D; Barrett, S F; Wright, C H G; Wilcox, M
2008-01-01
Three different positioning techniques were investigated to create an autonomous vehicle that could accurately navigate towards a goal: Global Positioning System (GPS), compass dead reckoning, and Ackerman steering. Each technique utilized a fuzzy logic controller that maneuvered a four-wheel car towards a target. The reliability and the accuracy of the navigation methods were investigated by modeling the algorithms in software and implementing them in hardware. To implement the techniques in hardware, positioning sensors were interfaced to a remote control car and a microprocessor. The microprocessor utilized the sensor measurements to orient the car with respect to the target. Next, a fuzzy logic control algorithm adjusted the front wheel steering angle to minimize the difference between the heading and bearing. After minimizing the heading error, the car maintained a straight steering angle along its path to the final destination. The results of this research can be used to develop applications that require precise navigation. The design techniques can also be implemented on alternate platforms such as a wheelchair to assist with autonomous navigation.
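A heading-error fuzzy controller of the kind described can be sketched with triangular membership functions and weighted-average defuzzification. The membership breakpoints and output angles below are assumed for illustration and are not taken from the study.

```python
# Minimal fuzzy steering sketch: the heading error (bearing minus heading,
# in degrees) is fuzzified into Left / Zero / Right sets, and the steering
# angle is the weighted average of the per-rule output angles.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_steer(heading_error_deg):
    # Rule firing strengths: error is Left / Zero / Right.
    mu_left = tri(heading_error_deg, -90.0, -45.0, 0.0)
    mu_zero = tri(heading_error_deg, -45.0, 0.0, 45.0)
    mu_right = tri(heading_error_deg, 0.0, 45.0, 90.0)
    # Rule consequents: steer left / straight / right (degrees).
    outputs = (-30.0, 0.0, 30.0)
    weights = (mu_left, mu_zero, mu_right)
    total = sum(weights)
    if total == 0.0:
        # Error outside all sets: saturate the steering command.
        return 30.0 if heading_error_deg > 0 else -30.0
    return sum(w * o for w, o in zip(weights, outputs)) / total

print(fuzzy_steer(0.0))    # no error: straight ahead
print(fuzzy_steer(22.5))   # moderate error: partial right turn
```

Minimizing the difference between heading and bearing in this way gives the smooth steering behavior the abstract describes, without requiring an explicit vehicle model.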
SGA-WZ: A New Strapdown Airborne Gravimeter
Huang, Yangming; Olesen, Arne Vestergaard; Wu, Meiping; Zhang, Kaidong
2012-01-01
Inertial navigation systems and gravimeters are now routinely used to map regional gravitational quantities from an aircraft with mGal accuracy and a spatial resolution of a few kilometers. However, airborne gravimeters of this kind are limited by the inaccuracy of the inertial sensor performance, the integrated navigation technique and the kinematic acceleration determination. As GPS techniques have developed, the vehicle acceleration determination is no longer the limiting factor in airborne gravimetry, owing to the cancellation of common-mode acceleration in differential mode. A new airborne gravimeter taking full advantage of the inertial navigation system is described, with improved mechanical design, high-precision time synchronization, better thermal control and optimized sensor modeling. Beyond its general use, the differential Global Positioning System (GPS) is integrated with the inertial navigation system, which provides not only more precise altitude information along with navigation aiding, but also an effective way to calculate the vehicle acceleration. Design description and test results on the performance of the gyroscopes and accelerometers are emphasized. Analysis and discussion of the airborne field test results are also given. PMID:23012545
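Deriving the kinematic vehicle acceleration from differentiated GPS positions, as mentioned above, amounts to numerically double-differentiating a sampled trajectory. The sketch below shows a central second difference on an assumed altitude series; the sampling interval and the quadratic test trajectory are illustrative only.

```python
# Central second difference of a sampled altitude series: an illustrative
# stand-in for the GPS-derived kinematic acceleration that is removed from
# the gravimeter signal in airborne gravimetry.

def second_derivative(h, dt):
    """Central second difference of a sampled series h (len >= 3)."""
    return [(h[i - 1] - 2.0 * h[i] + h[i + 1]) / dt**2
            for i in range(1, len(h) - 1)]

dt = 0.5  # assumed sampling interval (s)
# Altitude following h(t) = 1000 + 0.5 * a * t^2 with a = 0.2 m/s^2.
alt = [1000.0 + 0.5 * 0.2 * (k * dt) ** 2 for k in range(8)]
acc = second_derivative(alt, dt)
print(all(abs(a - 0.2) < 1e-9 for a in acc))
```

In practice the raw double difference amplifies GPS noise, so low-pass filtering and the integration with the inertial solution described in the abstract are what make the recovered acceleration usable at the mGal level.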
Fusion of navigational data in River Information Services
NASA Astrophysics Data System (ADS)
Kazimierski, W.
2009-04-01
River Information Services (RIS) is a complex system of solutions and services for inland shipping. It has been the scope of work carried out in most European countries for the last several years, including a few major pan-European projects such as INDRIS and COMPRIS. The main idea of RIS is to harmonize the activities of the various companies, authorities and other users of inland waterways in Europe. Recently, growing activity in this area can also be noticed in Poland; a leading example is the work carried out at the Chair of Geoinformatics of the Maritime University of Szczecin regarding RIS for the needs of the Odra River. Directive 2005/44/EC of the European Parliament and the Council, followed by European Commission regulations, gives precise guidelines on implementing RIS in Europe, stating the services that should be provided. Among them, Traffic Information and Traffic Management services can be found. Per the guidelines, these should be based on tracking and tracing of ships in inland waters; the results of tracking and tracing are the Tactical Traffic Image and the Strategic Traffic Image. The guidelines state that the tracking and tracing system in RIS shall consist of various types of sensors. The most important of them is considered to be the Automatic Identification System (AIS), and particularly its river version, Inland AIS. It is based on determining the position of ships by satellite positioning systems (mainly DGPS) and transmitting it to other users on VHF radio frequencies. This usually guarantees high accuracy of data related to the movement of ships (assuming proper functioning of the system and the ship's sensors), and gives the possibility of transmitting additional information about the ship, such as dimensions, port of destination, cargo, etc. However, the other sensors that can be used for tracking shall not be forgotten. The most important of them are radar (traditionally used for tracking purposes in Vessel Traffic Services) and video cameras.
Their main advantage over AIS is total independence from the tracked target's equipment. For example, erroneous indications of a ship's GPS would affect AIS accuracy but would have no impact on the values estimated by radar. In addition, the update rate of AIS data is in many cases longer than that of radar. Thus an efficient tracking system introduced in RIS should use both AIS receivers (based on satellite-derived positions) and independent radar and camera sensors. This, however, results in at least two different sets of information about the positions and movement parameters of targets. Doubled or multiplied vectors for a single target are unacceptable for the safety of navigation and for traffic management; hence the need for data fusion in RIS is obvious. The main goal is to develop unambiguous, clear and reliable information about ships' positions and movement for all users in the system. Data fusion itself is not a new problem in maritime navigation: integrated bridge systems on sea-going ships already use information coming from different sources. However, the possibilities of integrating navigational information in inland navigation, especially in River Information Services, still need to be thoroughly surveyed. To simplify the discussion, it is useful to introduce two data fusion levels. The first is performed on board the vessel. Its aim is to integrate all information coming from different sensors into the so-called Integrated Navigational System; a further task of this fusion is to estimate reliable information about other objects based on AIS and radar. The second level is the integration of AIS, radar and closed-circuit television (CCTV) carried out in the coastal station in order to determine the Tactical and Strategic Traffic Images. The navigational information in RIS itself can be divided into two main groups.
The first, called static data, contains all basic information related to the ship itself and the voyage, such as dimensions and destination. The second, called dynamic data, contains all the information whose variability is important for creating the Tactical Traffic Image. The two groups require different fusion algorithms, which take into consideration the sources, update rate and method, accuracy and reliability of the data. The article covers various issues related to navigational information fusion in River Information Services. It includes a short description of the structures and sources of navigational information as well as the most popular integration methods. A more detailed analysis is made of the fusion of positions derived from satellite systems (GPS) and from radar. Finally, the concept of a tracking system combining Inland AIS, radar and CCTV for the needs of RIS is introduced.
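The coastal-station fusion of an AIS-reported position with an independent radar track, as described in this abstract, can be illustrated with a standard covariance-weighted least-squares combination. This is a generic sketch, not the algorithm from the article; the accuracy figures (5 m per axis for GPS/AIS, 15 m for radar) are assumptions for illustration only.

```python
import numpy as np

def fuse(p_ais, cov_ais, p_radar, cov_radar):
    """Fuse two position estimates into a single vector with its covariance."""
    w_ais = np.linalg.inv(cov_ais)      # information (inverse covariance) weights
    w_radar = np.linalg.inv(cov_radar)
    cov = np.linalg.inv(w_ais + w_radar)
    p = cov @ (w_ais @ p_ais + w_radar @ p_radar)
    return p, cov

# AIS: accurate while the ship's GPS works (assumed 5 m sigma per axis);
# radar: independent of the ship's equipment (assumed 15 m sigma per axis).
p_ais = np.array([100.0, 200.0])
p_radar = np.array([112.0, 195.0])
p, cov = fuse(p_ais, np.diag([25.0, 25.0]), p_radar, np.diag([225.0, 225.0]))
```

The fused estimate lands much closer to the AIS fix, reflecting its smaller assumed variance, and the fused covariance is smaller than either input's, which is the point of removing the "doubled vectors" for a single target.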
An Environment for Hardware-in-the-Loop Formation Navigation and Control Simulation
NASA Technical Reports Server (NTRS)
Burns, Rich
2004-01-01
Recent interest in formation flying satellite systems has spurred a considerable amount of research in the relative navigation and control of satellites. Development in this area has included new estimation and control algorithms as well as sensor and actuator development specifically geared toward the relative control problem. This paper describes a simulation facility, the Formation Flying Testbed (FFTB) at NASA's Goddard Space Flight Center, which allows engineers to test new algorithms for the formation flying problem with relevant GN&C hardware in a closed loop simulation. The FFTB currently supports the injection of GPS receiver hardware into the simulation loop, and support for satellite crosslink ranging technology is at a prototype stage. This closed-loop, hardware inclusive simulation capability permits testing of navigation and control software in the presence of the actual hardware with which the algorithms must interact. This capability provides the navigation or control developer with a perspective on how the algorithms perform as part of the closed-loop system. In this paper, the overall design and evolution of the FFTB are presented. Each component of the FFTB is then described in detail. Interfaces between the components of the FFTB are shown and the interfaces to and between navigation and control software are described in detail. Finally, an example of closed-loop formation control with GPS receivers in the loop is presented and results are analyzed.
NASA Astrophysics Data System (ADS)
Theil, S.; Ammann, N.; Andert, F.; Franz, T.; Krüger, H.; Lehner, H.; Lingenauber, M.; Lüdtke, D.; Maass, B.; Paproth, C.; Wohlfeil, J.
2018-03-01
Since 2010 the German Aerospace Center has been working on the project Autonomous Terrain-based Optical Navigation (ATON). Its objective is the development of technologies which allow autonomous navigation of spacecraft in orbit around, and during landing on, celestial bodies like the Moon, planets, asteroids and comets. The project developed different image processing techniques and optical navigation methods as well as sensor data fusion. The setup—which is applicable to many exploration missions—consists of an inertial measurement unit, a laser altimeter, a star tracker and one or multiple navigation cameras. In the past years, several milestones have been achieved. It started with the setup of a simulation environment including the detailed simulation of camera images. This was continued by hardware-in-the-loop tests in the Testbed for Robotic Optical Navigation (TRON), where images were generated by real cameras in a simulated, downscaled lunar landing scene. Data were recorded in helicopter flight tests and post-processed in real time to increase the maturity of the algorithms and to optimize the software. Recently, two more milestones have been achieved. In late 2016, the whole navigation system setup was flown on an unmanned helicopter while processing all sensor information onboard in real time. For the latest milestone the navigation system was tested in closed loop on the unmanned helicopter: the ATON navigation system provided the navigation state for the guidance and control of the unmanned helicopter, replacing the GPS-based standard navigation system. The paper will give an introduction to the ATON project and its concept. The methods and algorithms of ATON are briefly described. The flight test results of the latest two milestones are presented and discussed.
Smoothing-Based Relative Navigation and Coded Aperture Imaging
NASA Technical Reports Server (NTRS)
Saenz-Otero, Alvar; Liebe, Carl Christian; Hunter, Roger C.; Baker, Christopher
2017-01-01
This project will develop efficient smoothing software for incremental estimation of the relative poses and velocities between multiple small spacecraft in a formation, and a small, long-range depth sensor based on coded aperture imaging that is capable of identifying other spacecraft in the formation. The smoothing algorithm will obtain the maximum a posteriori estimate of the relative poses between the spacecraft by using all available sensor information in the spacecraft formation. The algorithm will be portable between satellite platforms that possess different sensor suites and computational capabilities, and will be adaptable in case one or more satellites in the formation become inoperable. It will obtain a solution that approaches the exact solution, as opposed to one with the linearization approximation that is typical of filtering algorithms. Thus, the algorithms developed and demonstrated as part of this program will enhance the applicability of small spacecraft to multi-platform operations, such as precisely aligned constellations and fractionated satellite systems.
Multiple nodes transfer alignment for airborne missiles based on inertial sensor network
NASA Astrophysics Data System (ADS)
Si, Fan; Zhao, Yan
2017-09-01
Transfer alignment is an important initialization method for airborne missiles, because the alignment accuracy largely determines the performance of the missile. However, traditional alignment methods are limited by the complicated and unknown flexure angle, and cannot meet the actual requirements when wing flexure deformation occurs. To address this problem, we propose a new method that uses the relative navigation parameters between the weapons and the fighter to achieve transfer alignment. First, in the relative inertial navigation algorithm, the relative attitudes and positions are continuously computed under wing flexure deformation. Second, the alignment results of each weapon are processed using a data fusion algorithm to improve the overall performance. Finally, the feasibility and performance of the proposed method were evaluated under two typical types of deformation, and the simulation results demonstrated that the new transfer alignment method is practical and achieves high precision.
Light Detection and Ranging-Based Terrain Navigation: A Concept Exploration
NASA Technical Reports Server (NTRS)
Campbell, Jacob; UijtdeHaag, Maarten; vanGraas, Frank; Young, Steve
2003-01-01
This paper discusses the use of Airborne Light Detection And Ranging (LiDAR) equipment for terrain navigation. Airborne LiDAR is a relatively new technology used primarily by the geo-spatial mapping community to produce highly accurate and dense terrain elevation maps. In this paper, the term LiDAR refers to a scanning laser ranger rigidly mounted to an aircraft, as opposed to an integrated sensor system that consists of a scanning laser ranger integrated with Global Positioning System (GPS) and Inertial Measurement Unit (IMU) data. Data from the laser range scanner and IMU will be integrated with a terrain database to estimate the aircraft position and data from the laser range scanner will be integrated with GPS to estimate the aircraft attitude. LiDAR data was collected using NASA Dryden's DC-8 flying laboratory in Reno, NV and was used to test the proposed terrain navigation system. The results of LiDAR-based terrain navigation shown in this paper indicate that airborne LiDAR is a viable technology enabler for fully autonomous aircraft navigation. The navigation performance is highly dependent on the quality of the terrain databases used for positioning and therefore high-resolution (2 m post-spacing) data was used as the terrain reference.
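The core of terrain-referenced positioning as described above is comparing sensed ground elevations against a terrain database and choosing the offset that best explains the measurements. The following is a minimal 1-D sketch of that idea under assumed toy data (a sinusoidal elevation database and 0.1 m measurement noise); it is not the paper's integration with IMU and GPS.

```python
import numpy as np

# Toy terrain database: 200 along-track elevation posts (metres).
terrain = np.sin(np.linspace(0.0, 20.0, 200)) * 30.0
# Simulated LiDAR-derived ground profile taken at an unknown offset of 42 posts,
# corrupted with assumed 0.1 m Gaussian noise.
true_offset = 42
measured = terrain[true_offset:true_offset + 50] \
    + np.random.default_rng(0).normal(0.0, 0.1, 50)

def terrain_fix(db, profile):
    """Return the along-track offset minimizing the sum-of-squares
    elevation residual between the measured profile and the database."""
    n = len(profile)
    costs = [np.sum((db[i:i + n] - profile) ** 2) for i in range(len(db) - n)]
    return int(np.argmin(costs))

fix = terrain_fix(terrain, measured)
```

As the abstract notes, performance of this kind of matching is only as good as the terrain database: flat or self-similar terrain yields ambiguous minima, which is why high-resolution (2 m post-spacing) reference data mattered in the flight tests.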
High resolution hybrid optical and acoustic sea floor maps (Invited)
NASA Astrophysics Data System (ADS)
Roman, C.; Inglis, G.
2013-12-01
This abstract presents a method for creating hybrid optical and acoustic sea floor reconstructions at centimeter-scale grid resolutions with robotic vehicles. Multibeam sonar and stereo vision are two common sensing modalities with complementary strengths that are well suited for data fusion. We have recently developed an automated two-stage pipeline to create such maps; the stages are navigation refinement and map construction. During navigation refinement, a graph-based optimization algorithm is used to align 3D point clouds created with both the multibeam sonar and the stereo cameras. The process combats the typical growth in navigation error that has a detrimental effect on map fidelity and typically introduces artifacts at small grid sizes. During this process we are able to automatically register local point clouds created by each sensor to themselves and to each other where they overlap in a survey pattern. The process also estimates the sensor offsets, such as heading, pitch and roll, that describe how each sensor is mounted to the vehicle. The end results of the navigation step are a refined vehicle trajectory that ensures the point clouds from each sensor are consistently aligned, and the individual sensor offsets. In the mapping step, grid cells in the map are selectively populated by choosing data points from each sensor in an automated manner. The selection process is designed to pick points that preserve the best characteristics of each sensor and honor specific map quality criteria to reduce outliers and ghosting. In general, the algorithm selects dense 3D stereo points in areas of high texture and point density. In areas where the stereo vision is poor, such as in a scene with low contrast or texture, multibeam sonar points are inserted in the map. This process is automated and results in a hybrid map populated with data from both sensors. Additional cross-modality checks are made to reject outliers in a robust manner.
The final hybrid map retains the strengths of both sensors and shows improvement over the single modality maps and a naively assembled multi-modal map where all the data points are included and averaged. Results will be presented from marine geological and archaeological applications using a 1350 kHz BlueView multibeam sonar and 1.3 megapixel digital still cameras.
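The per-cell selection rule in the mapping step (stereo where texture is high, sonar elsewhere) can be sketched in a few lines. The texture threshold and the toy depth grids below are assumptions for illustration; the actual pipeline uses richer quality criteria and cross-modality outlier checks.

```python
import numpy as np

TEXTURE_THRESHOLD = 0.5  # assumed normalized texture-score cutoff

def build_hybrid_map(stereo_depth, sonar_depth, texture):
    """Populate each grid cell from the better-suited sensor:
    dense stereo points where image texture is high, sonar otherwise."""
    use_stereo = texture >= TEXTURE_THRESHOLD
    hybrid = np.where(use_stereo, stereo_depth, sonar_depth)
    source = np.where(use_stereo, "stereo", "sonar")  # provenance per cell
    return hybrid, source

stereo = np.array([[10.1, 10.2], [10.3, np.nan]])   # dense only where textured
sonar = np.array([[10.0, 10.0], [10.2, 10.4]])      # available everywhere
tex = np.array([[0.9, 0.8], [0.2, 0.1]])            # low texture in second row
hybrid, source = build_hybrid_map(stereo, sonar, tex)
```

Keeping a per-cell provenance layer, as sketched here, is what allows the cross-modality consistency checks mentioned in the abstract.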
Satellite Imagery Assisted Road-Based Visual Navigation System
NASA Astrophysics Data System (ADS)
Volkova, A.; Gibbens, P. W.
2016-06-01
There is a growing demand for unmanned aerial systems as autonomous surveillance, exploration and remote sensing solutions. Among the key concerns for robust operation of these systems is the need to reliably navigate the environment without reliance on a global navigation satellite system (GNSS). This is of particular concern in Defence circles, but is also a major safety issue for commercial operations. In these circumstances, the aircraft needs to navigate relying only on information from on-board passive sensors such as digital cameras. The autonomous feature-based visual system presented in this work offers a novel integral approach to the modelling and registration of visual features that responds to the specific needs of the navigation system. It detects visual features in Google Earth* imagery to build a feature database. The same algorithm then detects features in the on-board camera's video stream. On one level this serves to localise the vehicle relative to the environment using Simultaneous Localisation and Mapping (SLAM). On a second level it correlates the features with the database to localise the vehicle with respect to the inertial frame. The performance of the presented visual navigation system was compared using satellite imagery from different years. Based on the comparison results, an analysis of the effects of seasonal, structural and qualitative changes of the imagery source on the performance of the navigation algorithm is presented. * The algorithm is independent of the source of satellite imagery, and another provider can be used.
Seamless positioning and navigation by using geo-referenced images and multi-sensor data.
Li, Xun; Wang, Jinling; Li, Tao
2013-07-12
Ubiquitous positioning is considered to be a highly demanding application for today's Location-Based Services (LBS). While satellite-based navigation has achieved great advances in the past few decades, positioning and navigation in indoor scenarios and deep urban areas has remained a challenging topic of substantial research interest. Various strategies have been adopted to fill this gap, among which vision-based methods have attracted growing attention due to the widespread use of cameras on mobile devices. However, current vision-based methods using image processing have yet to reveal their full potential for navigation applications and are insufficient in many respects. Therefore, in this paper we present a hybrid image-based positioning system that is intended to provide a seamless position solution in six degrees of freedom (6DoF) for location-based services in both outdoor and indoor environments. It mainly matches visual sensor input against geo-referenced images for an image-based position solution, and also takes advantage of multiple onboard sensors, including the built-in GPS receiver and digital compass, to assist the visual methods. Experiments demonstrate that such a system can greatly improve the position accuracy in areas where the GPS signal is negatively affected (such as in urban canyons), and it also provides excellent position accuracy in indoor environments.
Seamless Positioning and Navigation by Using Geo-Referenced Images and Multi-Sensor Data
Li, Xun; Wang, Jinling; Li, Tao
2013-01-01
Ubiquitous positioning is considered to be a highly demanding application for today's Location-Based Services (LBS). While satellite-based navigation has achieved great advances in the past few decades, positioning and navigation in indoor scenarios and deep urban areas has remained a challenging topic of substantial research interest. Various strategies have been adopted to fill this gap, among which vision-based methods have attracted growing attention due to the widespread use of cameras on mobile devices. However, current vision-based methods using image processing have yet to reveal their full potential for navigation applications and are insufficient in many respects. Therefore, in this paper we present a hybrid image-based positioning system that is intended to provide a seamless position solution in six degrees of freedom (6DoF) for location-based services in both outdoor and indoor environments. It mainly matches visual sensor input against geo-referenced images for an image-based position solution, and also takes advantage of multiple onboard sensors, including the built-in GPS receiver and digital compass, to assist the visual methods. Experiments demonstrate that such a system can greatly improve the position accuracy in areas where the GPS signal is negatively affected (such as in urban canyons), and it also provides excellent position accuracy in indoor environments. PMID:23857267
Loosely Coupled GPS-Aided Inertial Navigation System for Range Safety
NASA Technical Reports Server (NTRS)
Heatwole, Scott; Lanzi, Raymond J.
2010-01-01
The Autonomous Flight Safety System (AFSS) aims to replace the human element of range safety operations, as well as reduce reliance on expensive, downrange assets for launches of expendable launch vehicles (ELVs). The system consists of multiple navigation sensors and flight computers that provide a highly reliable platform. It is designed to ensure that single-event failures in a flight computer or sensor will not bring down the whole system. The flight computer uses a rules-based structure derived from range safety requirements to make decisions whether or not to destroy the rocket.
NASA Technical Reports Server (NTRS)
Frederick, D. K.; Lashmet, P. K.; Sandor, G. N.; Shen, C. N.; Smith, E. J.; Yerazunis, S. W.
1971-01-01
Investigation of problems related to control of a mobile planetary vehicle according to a systematic plan for the exploration of Mars has been undertaken. Problem areas receiving attention include: (1) overall systems analysis; (2) vehicle configuration and dynamics; (3) toroidal wheel design and evaluation; (4) on-board navigation systems; (5) satellite-vehicle navigation systems; (6) obstacle detection systems; (7) terrain sensing, interpretation and modeling; (8) computer simulation of terrain sensor-path selection systems; and (9) chromatographic systems design concept studies. The specific tasks which have been undertaken are defined and the progress which has been achieved during the period July 1, 1971 to December 31, 1971 is summarized.
Autonomous Vision Navigation for Spacecraft in Lunar Orbit
NASA Astrophysics Data System (ADS)
Bader, Nolan A.
NASA aims to achieve unprecedented navigational reliability for the first manned lunar mission of the Orion spacecraft in 2023. A technique for accomplishing this is to integrate autonomous feature tracking as an added means of improving position and velocity estimation. In this thesis, a template matching algorithm and optical sensor are tested onboard three simulated lunar trajectories using linear covariance techniques under various conditions. A preliminary characterization of the camera gives insight into its ability to determine azimuth and elevation angles to points on the surface of the Moon. A navigation performance analysis shows that an optical camera sensor can aid in decreasing position and velocity errors, particularly in a loss of communication scenario. Furthermore, it is found that camera quality and computational capability are driving factors affecting the performance of such a system.
Tian, Qinglin; Salcic, Zoran; Wang, Kevin I-Kai; Pan, Yun
2015-12-05
Pedestrian dead reckoning is a common technique in indoor inertial navigation systems that is able to provide accurate tracking performance over short distances. Sensor drift is the main bottleneck in extending the system to long-distance and long-term tracking. In this paper, a hybrid system is proposed that integrates traditional pedestrian dead reckoning based on inertial measurement units, short-range radio frequency systems, and particle-filter map matching. The result is a drift-free pedestrian navigation system in which position error and sensor drift are regularly corrected, providing long-term accurate and reliable tracking. Moreover, the whole system is implemented on a commercial off-the-shelf smartphone and achieves real-time positioning and tracking performance with satisfactory accuracy.
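The dead-reckoning core described above can be reduced to a simple recurrence: each detected step advances the position by a step length along the current heading, so error accumulates with distance until an external correction (here, the short-range radio fixes and map matching) resets it. A minimal sketch, with the step length an assumed constant rather than the adaptive estimate a real system would use:

```python
import math

STEP_LENGTH = 0.7  # assumed average step length in metres

def dead_reckon(start, headings_deg):
    """Integrate one position per detected step; returns the full track.
    Heading convention: degrees clockwise from north (0 = north, 90 = east)."""
    x, y = start
    track = [(x, y)]
    for h in headings_deg:
        x += STEP_LENGTH * math.sin(math.radians(h))  # east component
        y += STEP_LENGTH * math.cos(math.radians(h))  # north component
        track.append((x, y))
    return track

# Two steps heading north, then two heading east.
track = dead_reckon((0.0, 0.0), [0.0, 0.0, 90.0, 90.0])
```

Any error in step length or heading compounds over the track, which is exactly the drift the paper's radio and map-matching corrections are there to bound.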
Giraldo, Paula Jimena Ramos; Aguirre, Álvaro Guerrero; Muñoz, Carlos Mario; Prieto, Flavio Augusto; Oliveros, Carlos Eugenio
2017-04-06
Smartphones show potential for controlling and monitoring variables in agriculture. Their processing capacity, instrumentation, connectivity, low cost, and accessibility allow farmers (among other users in rural areas) to operate them easily with applications adjusted to their specific needs. In this investigation, the integration of inertial sensors, a GPS, and a camera is presented for the monitoring of a coffee crop. An Android-based application was developed with two operating modes: (i) Navigation: for georeferencing trees, which can be as close as 0.5 m from each other; and (ii) Acquisition: control of video acquisition, based on the movement of the mobile device over a branch, and measurement of image quality, using clarity indexes to select the most appropriate frames for application in future processes. The integration of inertial sensors in navigation mode shows a mean relative error of ±0.15 m and a total error of ±5.15 m. In acquisition mode, the system correctly identifies the beginning and end of mobile phone movement in 99% of cases, and image quality is determined by means of a sharpness factor which measures blurriness. With the developed system, it will be possible to obtain georeferenced information about coffee trees, such as their production, nutritional state, and presence of plagues or diseases.
Ramos Giraldo, Paula Jimena; Guerrero Aguirre, Álvaro; Muñoz, Carlos Mario; Prieto, Flavio Augusto; Oliveros, Carlos Eugenio
2017-01-01
Smartphones show potential for controlling and monitoring variables in agriculture. Their processing capacity, instrumentation, connectivity, low cost, and accessibility allow farmers (among other users in rural areas) to operate them easily with applications adjusted to their specific needs. In this investigation, the integration of inertial sensors, a GPS, and a camera is presented for the monitoring of a coffee crop. An Android-based application was developed with two operating modes: (i) Navigation: for georeferencing trees, which can be as close as 0.5 m from each other; and (ii) Acquisition: control of video acquisition, based on the movement of the mobile device over a branch, and measurement of image quality, using clarity indexes to select the most appropriate frames for application in future processes. The integration of inertial sensors in navigation mode shows a mean relative error of ±0.15 m and a total error of ±5.15 m. In acquisition mode, the system correctly identifies the beginning and end of mobile phone movement in 99% of cases, and image quality is determined by means of a sharpness factor which measures blurriness. With the developed system, it will be possible to obtain georeferenced information about coffee trees, such as their production, nutritional state, and presence of plagues or diseases. PMID:28383494
A tesselated probabilistic representation for spatial robot perception and navigation
NASA Technical Reports Server (NTRS)
Elfes, Alberto
1989-01-01
The ability to recover robust spatial descriptions from sensory information and to efficiently utilize these descriptions in appropriate planning and problem-solving activities are crucial requirements for the development of more powerful robotic systems. Traditional approaches to sensor interpretation, with their emphasis on geometric models, are of limited use for autonomous mobile robots operating in and exploring unknown and unstructured environments. Here, the researchers present a new approach to robot perception that addresses such scenarios using a probabilistic tesselated representation of spatial information called the Occupancy Grid. The Occupancy Grid is a multi-dimensional random field that maintains stochastic estimates of the occupancy state of each cell in the grid. The cell estimates are obtained by interpreting incoming range readings using probabilistic models that capture the uncertainty in the spatial information provided by the sensor. A Bayesian estimation procedure allows the incremental updating of the map using readings taken from several sensors over multiple points of view. An overview of the Occupancy Grid framework is given, and its application to a number of problems in mobile robot mapping and navigation is illustrated. It is argued that a number of robotic problem-solving activities can be performed directly on the Occupancy Grid representation. Some parallels are drawn between operations on Occupancy Grids and related image processing operations.
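The incremental Bayesian update described above is commonly implemented in log-odds form, so that fusing a new reading into a cell is a single addition. The sketch below uses that standard formulation with a deliberately simplified inverse sensor model (fixed hit/miss probabilities, which are assumptions here), not the full probabilistic range models of the paper.

```python
import numpy as np

P_HIT, P_MISS = 0.7, 0.4  # assumed inverse-sensor-model probabilities

def log_odds(p):
    return np.log(p / (1.0 - p))

class OccupancyGrid:
    def __init__(self, shape):
        self.l = np.zeros(shape)  # log-odds; 0.0 means p(occupied) = 0.5

    def update(self, cells, hit):
        """Incrementally fuse one range reading into the listed cells.
        Bayesian update reduces to adding the sensor model's log-odds."""
        self.l[tuple(zip(*cells))] += log_odds(P_HIT if hit else P_MISS)

    def probability(self):
        """Recover occupancy probabilities from the log-odds field."""
        return 1.0 - 1.0 / (1.0 + np.exp(self.l))

grid = OccupancyGrid((10, 10))
grid.update([(2, 3)], hit=True)   # beam endpoint: evidence of occupancy
grid.update([(2, 2)], hit=False)  # cell along the beam: evidence of free space
p = grid.probability()
```

Because the update is additive, readings from several sensors and viewpoints can be fused in any order, which is what makes the representation well suited to the incremental multi-sensor mapping the abstract describes.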
NASA Astrophysics Data System (ADS)
Lu, Jiazhen; Lei, Chaohua; Yang, Yanqiang; Liu, Ming
2016-12-01
An integrated inertial/celestial navigation system (INS/CNS) has wide applicability in lunar rovers, as it provides accurate and autonomous navigational information. Initialization is particularly vital for an INS. This paper proposes a two-position initialization method based on a standard Kalman filter, in which the difference between the computed star vector and the measured star vector serves as the measurement. With the aid of a star sensor and the two positions, the attitudinal and positional errors can be greatly reduced, and the biases of the three gyros and accelerometers can also be estimated. Semi-physical simulation results show that the attitudinal and positional errors converge to within 0.07″ and 0.1 m, respectively, when the given initial positional error is 1 km and the attitudinal error is 10°. These results show that the proposed method can accomplish alignment, positioning and calibration functions simultaneously; thus the proposed two-position initialization method has potential for application in lunar rover navigation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, N.S.V.; Kareti, S.; Shi, Weimin
A formal framework for navigating a robot in a geometric terrain with an unknown set of obstacles is considered. Here the terrain model is not known a priori, but the robot is equipped with a sensor system (vision or touch) employed for the purpose of navigation. The focus is restricted to non-heuristic algorithms which can be theoretically shown to be correct within a given framework of models for the robot, terrain and sensor system. These formulations, although abstract and simplified compared to real-life scenarios, provide foundations for practical systems by highlighting the underlying critical issues. First, the authors consider the algorithms that are shown to navigate correctly without much consideration given to performance parameters such as distance traversed. Second, they consider non-heuristic algorithms that guarantee bounds on the distance traversed or on the ratio of the distance traversed to the shortest path length (computed as if the terrain model were known). Then they consider the navigation of robots with very limited computational capabilities, such as finite automata.
Fully autonomous navigation for the NASA cargo transfer vehicle
NASA Technical Reports Server (NTRS)
Wertz, James R.; Skulsky, E. David
1991-01-01
A great deal of attention has been paid to navigation during the close approach (less than or equal to 1 km) phase of spacecraft rendezvous. However, most spacecraft also require a navigation system which provides the necessary accuracy for placing both satellites within the range of the docking sensors. The Microcosm Autonomous Navigation System (MANS) is an on-board system which uses Earth-referenced attitude sensing hardware to provide precision orbit and attitude determination. The system is capable of functioning from LEO to GEO and beyond. Performance depends on the number of available sensors as well as mission geometry; however, extensive simulations have shown that MANS will provide 100 m to 400 m (3σ) position accuracy and 0.03 to 0.07 deg (3σ) attitude accuracy in low Earth orbit. The system is independent of any external source, including GPS. MANS is expected to have a significant impact on ground operations costs, mission definition and design, survivability, and the potential development of very low-cost, fully autonomous spacecraft.
Assistive obstacle detection and navigation devices for vision-impaired users.
Ong, S K; Zhang, J; Nee, A Y C
2013-09-01
Quality of life for the visually impaired is an urgent worldwide issue that needs to be addressed. Obstacle detection is one of the most important navigation tasks for the visually impaired. In this research, a novel range-sensor placement scheme is proposed for the development of obstacle detection devices. Based on this scheme, two prototypes have been developed, targeting different user groups. This paper discusses the design issues, functional modules and the evaluation tests carried out for both prototypes. Implications for Rehabilitation: The visual impairment problem is becoming more severe due to the worldwide ageing population. Individuals with visual impairment require assistance from assistive devices in daily navigation tasks. Traditional assistive devices that assist navigation may have certain drawbacks, such as the limited sensing range of a white cane. Obstacle detection devices applying range sensor technology can identify road conditions at a greater sensing range to notify users of potential dangers in advance.
Kikutis, Ramūnas; Stankūnas, Jonas; Rudinskas, Darius; Masiulionis, Tadas
2017-09-28
Current research on Unmanned Aerial Vehicles (UAVs) shows a lot of interest in autonomous UAV navigation. This interest is mainly driven by the necessity to meet the rules and restrictions for small UAV flights that are issued by various international and national legal organizations. In order to lower these restrictions, new levels of automation and flight safety must be reached. In this paper, a new method for ground obstacle avoidance derived by using UAV navigation based on the Dubins paths algorithm is presented. The accuracy of the proposed method has been tested, and research results have been obtained by using Software-in-the-Loop (SITL) simulation and real UAV flights, with the measurements done with a low cost Global Navigation Satellite System (GNSS) sensor. All tests were carried out in a three-dimensional space, but the height accuracy was not assessed. The GNSS navigation data for the ground obstacle avoidance algorithm is evaluated statistically.
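A Dubins path connects two poses (position plus heading) with circular arcs of a minimum turn radius and straight segments, which is why it suits fixed-wing UAV routing around ground obstacles as described above. The sketch below implements only the LSL (left-straight-left) case of the six standard Dubins path types, following the usual construction; it is an illustration, not the paper's avoidance algorithm.

```python
import math

TWO_PI = 2.0 * math.pi

def dubins_lsl_length(start, goal, radius):
    """Length of the LSL Dubins path between (x, y, heading_rad) poses,
    or None if this variant does not exist for the given configuration."""
    dx, dy = goal[0] - start[0], goal[1] - start[1]
    d = math.hypot(dx, dy) / radius          # distance normalized by turn radius
    theta = math.atan2(dy, dx)
    alpha = (start[2] - theta) % TWO_PI      # start heading in the path frame
    beta = (goal[2] - theta) % TWO_PI        # goal heading in the path frame
    tmp = d + math.sin(alpha) - math.sin(beta)
    p_sq = (2.0 + d * d - 2.0 * math.cos(alpha - beta)
            + 2.0 * d * (math.sin(alpha) - math.sin(beta)))
    if p_sq < 0.0:
        return None
    ang = math.atan2(math.cos(beta) - math.cos(alpha), tmp)
    t = (-alpha + ang) % TWO_PI              # first left arc (radians)
    q = (beta - ang) % TWO_PI                # second left arc (radians)
    return (t + math.sqrt(p_sq) + q) * radius

# Degenerate check: aligned poses reduce to a straight segment.
length = dubins_lsl_length((0.0, 0.0, 0.0), (10.0, 0.0, 0.0), 1.0)
```

A full planner evaluates all six path types (LSL, RSR, LSR, RSL, RLR, LRL) and takes the shortest, threading intermediate waypoints around each obstacle.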
Kikutis, Ramūnas; Stankūnas, Jonas; Rudinskas, Darius; Masiulionis, Tadas
2017-01-01
Current research on Unmanned Aerial Vehicles (UAVs) shows a lot of interest in autonomous UAV navigation. This interest is mainly driven by the necessity to meet the rules and restrictions for small UAV flights that are issued by various international and national legal organizations. In order to lower these restrictions, new levels of automation and flight safety must be reached. In this paper, a new method for ground obstacle avoidance derived by using UAV navigation based on the Dubins paths algorithm is presented. The accuracy of the proposed method has been tested, and research results have been obtained by using Software-in-the-Loop (SITL) simulation and real UAV flights, with the measurements done with a low cost Global Navigation Satellite System (GNSS) sensor. All tests were carried out in a three-dimensional space, but the height accuracy was not assessed. The GNSS navigation data for the ground obstacle avoidance algorithm is evaluated statistically. PMID:28956839
Reconnaissance and Autonomy for Small Robots (RASR)
2012-06-29
The Reconnaissance and Autonomy for Small Robots (RASR) team developed a system for the coordination of groups of unmanned ground vehicles (UGVs...development of a system that used 1) a relevant deployable platform; 2) a minimum set of relatively inexpensive navigation and LADAR sensors; 3) an...expandable and modular control system with innovative software algorithms to minimize computing footprint; and that minimized 4) required communications
Deppe, Olaf; Dorner, Georg; König, Stefan; Martin, Tim; Voigt, Sven; Zimmermann, Steffen
2017-01-01
In the following paper, we present an industry perspective of inertial sensors for navigation purposes driven by applications and customer needs. Microelectromechanical system (MEMS) inertial sensors have revolutionized consumer, automotive, and industrial applications and they have started to fulfill the high-end tactical-grade performance requirements of hybrid navigation systems on a series production scale. The Fiber Optic Gyroscope (FOG) technology, on the other hand, is further pushed into the near navigation grade performance region and beyond. Each technology has its special pros and cons making it more or less suitable for specific applications. In our overview paper, we present the latest improvements at NG LITEF in tactical and navigation grade MEMS accelerometers, MEMS gyroscopes, and Fiber Optic Gyroscopes, based on our long-term experience in the field. We demonstrate how accelerometer performance has improved by switching from wet etching to deep reactive ion etching (DRIE) technology. For MEMS gyroscopes, we show that better than 1°/h series production devices are within reach, and for FOGs we present how limitations in noise performance were overcome by signal processing. The paper also provides a comparison of the different technologies, emphasizing suitability for different navigation applications, thus providing guidance to system engineers. PMID:28287483
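Bias stability figures such as the 1°/h quoted above are conventionally characterized with the Allan deviation of the rate signal; a minimal non-overlapping sketch (the function name and cluster-size convention are assumptions, not taken from the paper):

```python
import numpy as np

def allan_deviation(rate, fs, m):
    """Non-overlapping Allan deviation of a rate signal sampled at fs Hz,
    for cluster size m; the corresponding averaging time is tau = m / fs."""
    tau = m / fs
    n = len(rate) // m
    # Average the signal over consecutive clusters of m samples.
    means = np.reshape(np.asarray(rate[: n * m], dtype=float), (n, m)).mean(axis=1)
    # Allan variance is half the mean-squared difference of adjacent clusters.
    adev = np.sqrt(0.5 * np.mean(np.diff(means) ** 2))
    return tau, adev
```

Sweeping m over a logarithmic grid and plotting adev against tau yields the familiar Allan-deviation curve from which bias instability is read off.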
Georgy, Jacques; Noureldin, Aboelmagd
2011-01-01
Satellite navigation systems such as the global positioning system (GPS) are currently the most common technique used for land vehicle positioning. However, in GPS-denied environments, there is an interruption in the positioning information. Low-cost micro-electro mechanical system (MEMS)-based inertial sensors can be integrated with GPS and enhance the performance in denied GPS environments. The traditional technique for this integration problem is Kalman filtering (KF). Due to the inherent errors of low-cost MEMS inertial sensors and their large stochastic drifts, KF, with its linearized models, has limited capabilities in providing accurate positioning. Particle filtering (PF) was recently suggested as a nonlinear filtering technique to accommodate for arbitrary inertial sensor characteristics, motion dynamics and noise distributions. An enhanced version of PF called the Mixture PF is utilized in this study to perform tightly coupled integration of a three dimensional (3D) reduced inertial sensors system (RISS) with GPS. In this work, the RISS consists of one single-axis gyroscope and a two-axis accelerometer used together with the vehicle’s odometer to obtain 3D navigation states. These sensors are then integrated with GPS in a tightly coupled scheme. In loosely-coupled integration, at least four satellites are needed to provide acceptable GPS position and velocity updates for the integration filter. The advantage of the tightly-coupled integration is that it can provide GPS measurement update(s) even when the number of visible satellites is three or lower, thereby improving the operation of the navigation system in environments with partial blockages by providing continuous aiding to the inertial sensors even during limited GPS satellite availability. To effectively exploit the capabilities of PF, advanced modeling for the stochastic drift of the vertically aligned gyroscope is used. 
In order to benefit from measurement updates for such drift, which are loosely-coupled updates, a hybrid loosely/tightly coupled solution is proposed. This solution is suitable for downtown environments because of the long natural outages or degradation of GPS. The performance of the proposed 3D Navigation solution using Mixture PF for 3D RISS/GPS integration is examined by road test trajectories in a land vehicle and compared to the KF counterpart. PMID:22163846
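The Mixture PF described above is a specialized particle filter; as a generic, much-simplified sketch of the underlying bootstrap particle filter idea (1-D state, Gaussian likelihood, multinomial resampling; all names and parameters are illustrative assumptions, not the authors' method):

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_estimate(z_seq, n_particles=500, meas_sigma=0.5, proc_sigma=0.1):
    """Bootstrap particle filter estimating a (nearly) static 1-D state
    from a sequence of noisy measurements z_seq."""
    particles = rng.uniform(-5.0, 5.0, n_particles)            # broad prior
    for z in z_seq:
        particles += rng.normal(0.0, proc_sigma, n_particles)  # propagate
        w = np.exp(-0.5 * ((z - particles) / meas_sigma) ** 2) # likelihood
        w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)        # resample
        particles = particles[idx]
    return particles.mean()                                    # posterior mean
```

A tightly coupled GPS/RISS filter would replace the scalar state with position, velocity, attitude, and sensor-error states, and the Gaussian likelihood with pseudorange residuals.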
NASA Astrophysics Data System (ADS)
Endo, Yoichiro; Balloch, Jonathan C.; Grushin, Alexander; Lee, Mun Wai; Handelman, David
2016-05-01
Control of current tactical unmanned ground vehicles (UGVs) is typically accomplished through two alternative modes of operation, namely, low-level manual control using joysticks and high-level planning-based autonomous control. Each mode has its own merits as well as inherent mission-critical disadvantages. Low-level joystick control is vulnerable to communication delay and degradation, and high-level navigation often depends on uninterrupted GPS signals and/or energy-emissive (non-stealth) range sensors such as LIDAR for localization and mapping. To address these problems, we have developed a mid-level control technique where the operator semi-autonomously drives the robot relative to visible landmarks that are commonly recognizable by both humans and machines, such as closed contours and structured lines. Our novel solution relies solely on optical and non-optical passive sensors and can be operated under GPS-denied, communication-degraded environments. To control the robot using these landmarks, we developed an interactive graphical user interface (GUI) that allows the operator to select landmarks in the robot's view and direct the robot relative to one or more of the landmarks. The integrated UGV control system was evaluated based on its ability to robustly navigate through indoor environments. The system was successfully field tested with QinetiQ North America's TALON UGV and Tactical Robot Controller (TRC), a ruggedized operator control unit (OCU). We found that the proposed system is indeed robust against communication delay and degradation, and provides the operator with steady and reliable control of the UGV in realistic tactical scenarios.
On-orbit calibration for star sensors without priori information.
Zhang, Hao; Niu, Yanxiong; Lu, Jiazhen; Zhang, Chengfen; Yang, Yanqiang
2017-07-24
The star sensor is a key navigation device for spacecraft, and on-orbit calibration is an essential guarantee of its operational performance. However, traditional calibration methods rely on ground information and are invalid without a priori information. The uncertain on-orbit parameters will eventually influence the performance of the guidance, navigation and control system. In this paper, a novel calibration method without a priori information for on-orbit star sensors is proposed. Firstly, a simplified back-propagation neural network is designed for focal length and principal point estimation along with system property evaluation, called coarse calibration. Then the unscented Kalman filter is adopted for the precise calibration of all parameters, including focal length, principal point and distortion. The proposed method benefits from self-initialization: no attitude information or preinstalled sensor parameters are required. Precise star sensor parameter estimation can be achieved without a priori information, which is a significant improvement for on-orbit devices. Simulation and experiment results demonstrate that the calibration is straightforward to perform, with high accuracy and robustness. The proposed method can satisfy the stringent requirements of most star sensors.
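The calibrated quantities (focal length, principal point, distortion) enter through the standard pinhole camera model; a hedged sketch, assuming a single radial distortion coefficient (the function name and parameterization are illustrative, not taken from the paper):

```python
def project_star(v, f, u0, v0, k1=0.0):
    """Project a unit star direction vector v = (X, Y, Z), expressed in the
    sensor frame with Z along the boresight, onto the focal plane with
    focal length f (pixels), principal point (u0, v0), and one radial
    distortion coefficient k1."""
    X, Y, Z = v
    x, y = f * X / Z, f * Y / Z        # ideal pinhole projection
    r2 = x * x + y * y
    d = 1.0 + k1 * r2                  # simple radial distortion factor
    return (u0 + x * d, v0 + y * d)
```

Calibration then amounts to choosing f, (u0, v0), and k1 so that projected catalog stars match measured centroids.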
Passive optical sensing of atmospheric polarization for GPS denied operations
NASA Astrophysics Data System (ADS)
Aycock, Todd; Lompado, Art; Wolz, Troy; Chenault, David
2016-05-01
There is a rapidly growing need for position, navigation, and timing (PNT) capability that remains effective when GPS is degraded or denied. Naturally occurring sky polarization was used for navigation as long ago as the Viking era. With current polarimetric sensors, the additional polarization information measured by these sensors can be used to increase the accuracy and the availability of this technique. The Sky Polarization Azimuth Sensing System (SkyPASS) sensor measures this naturally occurring sky polarization to give absolute heading information to better than 0.1° and offers significant performance enhancement over digital compasses and sun sensors. SkyPASS has been under development for some time for terrestrial applications, but use above the atmosphere may be possible, and the performance specifications and SWaP are attractive for use as an additional pose sensor on a satellite. In this paper, we will describe the phenomenology, the sensor performance, and the latest test results of terrestrial SkyPASS; we will also discuss the potential for use above the atmosphere and the expected benefits and limitations.
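Polarimetric heading sensors of this kind derive the angle of polarization (AoP) from measured Stokes parameters; a minimal sketch of the standard relations (names are illustrative; this is not SkyPASS's implementation):

```python
import math

def polarization_from_stokes(I, Q, U):
    """Degree of linear polarization (DoLP) and angle of polarization (AoP,
    radians) from the first three Stokes parameters of skylight."""
    dolp = math.sqrt(Q * Q + U * U) / I
    aop = 0.5 * math.atan2(U, Q)       # orientation of the E-vector
    return dolp, aop
```

Under the single-scattering Rayleigh model the AoP pattern is symmetric about the solar meridian, which is what lets a polarimeter recover absolute azimuth.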
Flight Test Performance of a High Precision Navigation Doppler Lidar
NASA Technical Reports Server (NTRS)
Pierrottet, Diego; Amzajerdian, Farzin; Petway, Larry; Barnes, Bruce; Lockard, George
2009-01-01
A navigation Doppler Lidar (DL) was developed at NASA Langley Research Center (LaRC) for high precision velocity measurements from a lunar or planetary landing vehicle in support of the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. A unique feature of this DL is that it has the capability to provide a precision velocity vector which can be easily separated into horizontal and vertical velocity components and high accuracy line of sight (LOS) range measurements. This dual mode of operation can provide useful information, such as vehicle orientation relative to the direction of travel, and vehicle attitude relative to the sensor footprint on the ground. System performance was evaluated in a series of helicopter flight tests over the California desert. This paper provides a description of the DL system and presents results obtained from these flight tests.
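Recovering a full velocity vector from several line-of-sight Doppler measurements is a small linear inversion; a hedged sketch under the assumption of known beam unit vectors (the function name and beam geometry are illustrative, not the LaRC design):

```python
import numpy as np

def velocity_from_los(unit_vectors, dopplers):
    """Solve A v = d in the least-squares sense for the platform velocity v,
    where each row of A is a lidar beam unit vector and d holds the measured
    line-of-sight velocities. Needs >= 3 non-coplanar beams for a full 3-D v."""
    A = np.asarray(unit_vectors, dtype=float)
    d = np.asarray(dopplers, dtype=float)
    v, *_ = np.linalg.lstsq(A, d, rcond=None)
    return v
```

With the velocity resolved in the vehicle frame, the horizontal and vertical components separate directly once attitude is known.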
Bioinspired optical sensors for unmanned aerial systems
NASA Astrophysics Data System (ADS)
Chahl, Javaan; Rosser, Kent; Mizutani, Akiko
2011-04-01
Insects are dependent on the spatial, spectral and temporal distributions of light in the environment for flight control and navigation. This paper reports on flight trials of implementations of insect-inspired behaviors on unmanned aerial vehicles. Optical flow methods for maintaining a constant height above ground and a constant course have been demonstrated to provide navigation capabilities that are impossible using conventional avionics sensors. Precision control of height above ground and ground course were achieved over long distances. Other vision-based techniques demonstrated include a biomimetic stabilization sensor that uses the ultraviolet and green bands of the spectrum, and a sky polarization compass. Both of these sensors were tested over long trajectories in different directions, in each case showing performance similar to low-cost inertial heading and attitude systems. The behaviors demonstrate some of the core functionality found in the lower levels of the sensorimotor system of flying insects and show promise for more integrated solutions in the future.
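The optical-flow height-hold behavior rests on the relation that translational ventral flow equals ground speed divided by height; a minimal sketch (the names and the proportional-control form are assumptions, not the authors' controller):

```python
def height_from_ventral_flow(ground_speed, flow_rate):
    """Height above ground implied by ventral optic flow during level
    translation: flow (rad/s) = V / h, so h = V / omega."""
    return ground_speed / flow_rate

def climb_command(flow_rate, flow_setpoint, gain=1.0):
    """Regulate height by holding ventral flow at a setpoint: excess flow
    means the vehicle is too low, so command a climb (positive = up)."""
    return gain * (flow_rate - flow_setpoint)
```

Holding the flow setpoint rather than an explicit altitude is what lets the behavior work without a dedicated altimeter.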
Extraction of user's navigation commands from upper body force interaction in walker assisted gait.
Frizera Neto, Anselmo; Gallego, Juan A; Rocon, Eduardo; Pons, José L; Ceres, Ramón
2010-08-05
Advances in technology make it possible to incorporate sensors and actuators into rollators, building safer robots and extending the use of walkers to a more diverse population. This paper presents a new method for the extraction of navigation-related components from upper-body force interaction data in walker-assisted gait. A filtering architecture is designed to cancel: (i) the high-frequency noise caused by vibrations of the walker's structure due to irregularities in the terrain or the walker's wheels and (ii) the cadence-related force components caused by the user's trunk oscillations during gait. As a result, a third component related to the user's navigation commands is distinguished. For the cancellation of high-frequency noise, a Benedict-Bordner g-h filter was designed, presenting very low values for Kinematic Tracking Error ((2.035 ± 0.358)×10⁻² kgf) and delay ((1.897 ± 0.3697)×10¹ ms). A Fourier Linear Combiner filtering architecture was implemented for the adaptive attenuation of about 80% of the cadence-related components' energy from force data. This was done without compromising the information contained in the frequencies close to such notch filters. The presented methodology offers an effective cancellation of the undesired components from force data, allowing the system to extract the user's voluntary navigation commands in real time. Based on this real-time identification of voluntary user commands, a classical approach to the control architecture of the robotic walker is being developed, in order to obtain stable and safe user-assisted locomotion.
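A g-h filter of the Benedict-Bordner type couples the gain h to g via h = g²/(2 − g); a minimal tracking sketch under that relation (the names and the simple constant-velocity model are illustrative assumptions, not the paper's filter design):

```python
def benedict_bordner_gh(zs, dt, g):
    """Track a noisy measurement sequence zs (sample period dt) with a g-h
    filter whose h follows the Benedict-Bordner relation h = g^2 / (2 - g),
    which minimizes transient error for a given smoothing gain g."""
    h = g * g / (2.0 - g)
    x, dx = zs[0], 0.0                  # initial state and rate
    out = []
    for z in zs[1:]:
        x_pred = x + dt * dx            # constant-velocity prediction
        r = z - x_pred                  # innovation (residual)
        x = x_pred + g * r              # position correction
        dx = dx + h * r / dt            # rate correction
        out.append(x)
    return out
```

Smaller g smooths more aggressively at the cost of lag, which is the trade the abstract's low tracking-error and delay figures quantify.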
Sensor Fusion Based Model for Collision Free Mobile Robot Navigation
Almasri, Marwah; Elleithy, Khaled; Alajlan, Abrar
2015-01-01
Autonomous mobile robots have become a very popular and interesting topic in the last decade. Each of them is equipped with various types of sensors, such as GPS, camera, infrared and ultrasonic sensors. These sensors are used to observe the surrounding environment. However, these sensors sometimes fail or give inaccurate readings. Therefore, the integration of sensor fusion helps to solve this dilemma and enhance the overall performance. This paper presents a collision-free mobile robot navigation system based on a fuzzy logic fusion model. Eight distance sensors and a range finder camera are used for the collision avoidance approach, while three ground sensors are used for the line or path following approach. The fuzzy system is composed of nine inputs (the eight distance sensors and the camera), two outputs (the left and right velocities of the mobile robot's wheels), and 24 fuzzy rules for the robot's movement. The Webots Pro simulator is used for modeling the environment and the robot. The proposed methodology, which includes the collision avoidance based on the fuzzy logic fusion model and the line following robot, has been implemented and tested through simulation and real-time experiments. Various scenarios have been presented with static and dynamic obstacles, using one robot and two robots, while avoiding obstacles of different shapes and sizes. PMID:26712766
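A Mamdani-style fuzzy fusion stage maps sensor memberships through rules to wheel speeds; a deliberately tiny two-rule sketch (the membership shapes, rule set, and gains are invented for illustration and are far simpler than the paper's nine-input, 24-rule system):

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def wheel_speeds(front_dist, max_dist=1.0, v_max=1.0):
    """Two-rule fuzzy avoidance for front_dist >= 0:
    rule 1: obstacle NEAR  -> turn in place (left wheel backward),
    rule 2: obstacle FAR   -> drive straight at v_max,
    blended by weighted-average defuzzification."""
    near = tri(front_dist, -max_dist, 0.0, 0.6 * max_dist)
    far = tri(front_dist, 0.4 * max_dist, max_dist, 2.0 * max_dist)
    s = near + far
    near, far = near / s, far / s
    left = near * (-0.5 * v_max) + far * v_max
    right = near * (0.5 * v_max) + far * v_max
    return left, right
```

The overlapping NEAR/FAR memberships are what give fuzzy control its smooth blending between turning and driving, instead of a hard distance threshold.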
Orbital Express Advanced Video Guidance Sensor: Ground Testing, Flight Results and Comparisons
NASA Technical Reports Server (NTRS)
Pinson, Robin M.; Howard, Richard T.; Heaton, Andrew F.
2008-01-01
Orbital Express (OE) was a successful mission demonstrating automated rendezvous and docking. The 2007 mission consisted of two spacecraft, the Autonomous Space Transport Robotic Operations (ASTRO) and the Next Generation Serviceable Satellite (NEXTSat) that were designed to work together and test a variety of service operations in orbit. The Advanced Video Guidance Sensor, AVGS, was included as one of the primary proximity navigation sensors on board the ASTRO. The AVGS was one of four sensors that provided relative position and attitude between the two vehicles. Marshall Space Flight Center was responsible for the AVGS software and testing (especially the extensive ground testing), flight operations support, and analyzing the flight data. This paper briefly describes the historical mission, the data taken on-orbit, the ground testing that occurred, and finally comparisons between flight data and ground test data for two different flight regimes.
A Trajectory Generation Approach for Payload Directed Flight
NASA Technical Reports Server (NTRS)
Ippolito, Corey A.; Yeh, Yoo-Hsiu
2009-01-01
Presently, flight systems designed to perform payload-centric maneuvers require preconstructed procedures and special hand-tuned guidance modes. To enable intelligent maneuvering via strong coupling between the goals of payload-directed flight and the autopilot functions, there exists a need to rethink traditional autopilot design and function. Research into payload directed flight examines sensor and payload-centric autopilot modes, architectures, and algorithms that provide layers of intelligent guidance, navigation and control for flight vehicles to achieve mission goals related to the payload sensors, taking into account various constraints such as the performance limitations of the aircraft, target tracking and estimation, obstacle avoidance, and constraint satisfaction. Payload directed flight requires a methodology for accurate trajectory planning that lets the system anticipate expected return from a suite of onboard sensors. This paper presents an extension to the existing techniques used in the literature to quickly and accurately plan flight trajectories that predict and optimize the expected return of onboard payload sensors.
A novel redundant INS based on triple rotary inertial measurement units
NASA Astrophysics Data System (ADS)
Chen, Gang; Li, Kui; Wang, Wei; Li, Peng
2016-10-01
Accuracy and reliability are two key performances of an inertial navigation system (INS). Rotation modulation (RM) can attenuate the bias of inertial sensors and make it possible for an INS to achieve higher navigation accuracy with lower-class sensors, easing the conflict between the accuracy and cost of INS. Traditional system redundancy and recently researched sensor redundancy are the two primary means of improving the reliability of INS. However, how to make the best use of the information from redundant sensors has not been studied adequately, especially in rotational INS. This paper proposes a novel triple rotary unit strapdown inertial navigation system (TRUSINS), which combines RM and sensor redundancy design to enhance the accuracy and reliability of rotational INS. Each rotary unit independently rotates to modulate the errors of two gyros and two accelerometers. Three units can provide double sets of measurements along all three axes of the body frame to constitute a pair of INSs, which makes TRUSINS redundant. Experiments and simulations based on a prototype made up of six fiber-optic gyros with drift stability of 0.05° h⁻¹ show that TRUSINS can achieve positioning accuracy of about 0.256 n mile h⁻¹, which is ten times better than that of a normal non-rotational INS with the same level of inertial sensors. The theoretical analysis and the experimental results show that, due to the advantage of the innovative structure, the designed fault detection and isolation (FDI) strategy can tolerate up to six sensor faults, and is proved to be effective and practical. Therefore, TRUSINS is particularly suitable and highly beneficial for applications where high accuracy and high reliability are required.
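The error-attenuating effect of rotation modulation can be seen in one line of arithmetic: a constant sensor bias projected onto a fixed navigation axis through one full uniform revolution integrates to roughly zero, whereas an unrotated bias accumulates as bias × time. A toy demonstration (the names and one-revolution setup are illustrative assumptions):

```python
import numpy as np

def bias_drift_one_rev(bias, n_steps=3600, period=1.0):
    """Attitude error accumulated on a fixed navigation axis from a constant
    gyro bias while the sensor axis rotates uniformly through one full
    revolution in `period` seconds: the projection cos(theta) averages out."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_steps, endpoint=False)
    dt = period / n_steps
    return float(np.sum(bias * np.cos(theta)) * dt)
```

By contrast, without rotation the same bias contributes bias × period of drift over the interval, which is why RM lets lower-grade sensors reach higher navigation accuracy.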
Assessment of modern smartphone sensors performance on vehicle localization in urban environments
NASA Astrophysics Data System (ADS)
Lazarou, Theodoros; Danezis, Chris
2017-09-01
The advent of Global Navigation Satellite Systems (GNSS) initiated a revolution in Positioning, Navigation and Timing (PNT) applications. Besides the enormous impact on geospatial data acquisition and reality capture, satellite navigation has penetrated everyday life, a fact proved by the increasing degree of human reliance on GNSS-enabled smart devices to perform casual activities. Nevertheless, GNSS does not perform well in all cases. Specifically, in GNSS-challenging environments, such as urban canyons or forested areas, navigation performance may be significantly degraded or even nullified. Consequently, positioning is achieved by combining GNSS with additional heterogeneous information or sensors, such as inertial sensors. To date, most smartphones are equipped with at least accelerometers and gyroscopes, besides GNSS chipsets. In the frame of this research, difficult localization scenarios were investigated to assess the performance of these low-cost inertial sensors with respect to higher-grade GNSS and IMU systems. Four state-of-the-art smartphones were mounted, along with reference equipment, on a purpose-built platform installed on top of a vehicle, which was driven along a predefined trajectory that included several GNSS-challenging parts. Positioning and inertial readings acquired by the smartphones were then compared to the information collected by the reference equipment. The results indicated that although the smartphone GNSS receivers have increased sensitivity, they were unable to produce an acceptable solution for more than 30% of the driven course. However, all smartphones managed to identify, up to a satisfactory degree, distinct driving features, such as curves or bumps.
NASA Technical Reports Server (NTRS)
Mitchell, Jason W.; Barbee, Brent W.; Baldwin, Philip J.; Luquette, Richard J.
2007-01-01
The Formation Flying Testbed (FFTB) at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) provides a hardware-in-the-loop test environment for formation navigation and control. The facility continues to evolve as a modular, hybrid, dynamic simulation facility for end-to-end guidance, navigation, and control (GN&C) design and analysis of formation flying spacecraft. The core capabilities of the FFTB, as a platform for testing critical hardware and software algorithms in-the-loop, are reviewed with a focus on recent improvements. With the most recent improvement, in support of Technology Readiness Level (TRL) 6 testing of the Inter-spacecraft Ranging and Alarm System (IRAS) for the Magnetospheric Multiscale (MMS) mission, the FFTB has significantly expanded its ability to perform realistic simulations that require Radio Frequency (RF) ranging sensors for relative navigation with the Path Emulator for RF Signals (PERFS). The PERFS, currently under development at NASA GSFC, modulates RF signals exchanged between spacecraft. The RF signals are modified to accurately reflect the dynamic environment through which they travel, including the effects of medium, moving platforms, and radiated power.
Drift Reduction in Pedestrian Navigation System by Exploiting Motion Constraints and Magnetic Field.
Ilyas, Muhammad; Cho, Kuk; Baeg, Seung-Ho; Park, Sangdeok
2016-09-09
Pedestrian navigation systems (PNS) using foot-mounted MEMS inertial sensors use zero-velocity updates (ZUPTs) to reduce drift in navigation solutions and estimate inertial sensor errors. However, it is well known that ZUPTs cannot reduce all errors; in particular, heading error is not observable. Hence, the position estimates tend to drift even when cyclic ZUPTs are applied in the update steps of the Extended Kalman Filter (EKF). This motivates the use of other motion constraints for pedestrian gait and of any other valuable heading-error reduction information that is available. In this paper, we exploit two more motion-constraint scenarios of pedestrian gait: (1) walking along straight paths; (2) standing still for a long time. It is observed that these motion constraints (called "virtual sensors"), though considerably reducing drift in PNS, still need an absolute heading reference. One common absolute heading estimation sensor is the magnetometer, which senses the Earth's magnetic field so that the true heading angle can be calculated. However, magnetometers are susceptible to magnetic distortions, especially in indoor environments. In this work, an algorithm for magnetic anomaly detection (MAD) and compensation is designed, incorporating only healthy magnetometer data in the EKF update step, to reduce drift in the zero-velocity-updated INS. Experiments are conducted in GPS-denied and magnetically distorted environments to validate the proposed algorithms.
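The magnetic-anomaly gating idea can be sketched as a magnitude test against the expected local Earth-field strength (the threshold, field value, and function name below are illustrative assumptions, not the paper's MAD algorithm):

```python
import numpy as np

def magnetic_anomaly_mask(mags, earth_norm=50.0, tol=5.0):
    """Flag magnetometer samples (rows of 3-axis readings, microtesla) whose
    field magnitude deviates from the local Earth-field magnitude by more
    than `tol`; only the unflagged, 'healthy' samples would be passed to
    the EKF heading update."""
    norms = np.linalg.norm(np.asarray(mags, dtype=float), axis=1)
    return np.abs(norms - earth_norm) > tol
```

A fielded detector would typically also check field inclination and temporal consistency, since some indoor distortions preserve the magnitude.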
Olfaction and Hearing Based Mobile Robot Navigation for Odor/Sound Source Search
Song, Kai; Liu, Qi; Wang, Qi
2011-01-01
Bionic technology provides new inspiration for mobile robot navigation, since it explores ways to imitate biological senses. In the present study, the challenging problem was how to fuse different biological senses and guide distributed robots to cooperate with each other in target searching. This paper integrates smell, hearing and touch to design an odor/sound tracking multi-robot system. The olfactory robot tracks the chemical odor plume step by step through information fusion from gas sensors and airflow sensors, while two hearing robots localize the sound source by time delay estimation (TDE) and the geometrical position of a microphone array. Furthermore, this paper presents a heading-direction-based mobile robot navigation algorithm, by which the robot can automatically and stably adjust its velocity and direction according to the deviation between the current heading direction measured by a magnetoresistive sensor and the expected heading direction acquired through the odor/sound localization strategies. Simultaneously, each robot can communicate with the others via a wireless sensor network (WSN). Experimental results show that the olfactory robot can pinpoint the odor source within a distance of 2 m, while the two hearing robots can localize and track the olfactory robot within 2 min. The devised multi-robot system can achieve target search with a considerable success ratio and high stability. PMID:22319401
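Time delay estimation between two microphones is commonly done by locating the peak of their cross-correlation; a minimal sketch (the function name and sign convention are assumptions; a bearing would then follow from asin(c·Δt/d) for sound speed c and microphone spacing d):

```python
import numpy as np

def time_delay(sig_a, sig_b, fs):
    """Delay of sig_b relative to sig_a in seconds, from the peak of their
    full cross-correlation; positive means sig_b lags sig_a."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    lag = np.argmax(corr) - (len(sig_a) - 1)   # shift of the peak from zero lag
    return lag / fs
```

For noisy real signals, generalized cross-correlation variants (e.g. GCC-PHAT) sharpen the peak before picking it.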
NASA Astrophysics Data System (ADS)
Anders, Niels; Suomalainen, Juha; Seeger, Manuel; Keesstra, Saskia; Bartholomeus, Harm; Paron, Paolo
2014-05-01
The recent increase in the performance and endurance of electronically controlled flying platforms, such as multi-copters and fixed-wing airplanes, together with the decreasing size and weight of sensors and batteries, has led to the increasing popularity of Unmanned Aerial Systems (UAS) for scientific purposes. Modern workflows that implement UAS include guided flight plan generation, 3D GPS navigation for fully automated piloting, and automated processing with new techniques such as "Structure from Motion" photogrammetry. UAS are often equipped with normal RGB cameras, multi- and hyperspectral sensors, radar, or other sensors, and provide a cheap and flexible solution for creating multi-temporal data sets. UAS have revolutionized multi-temporal research, enabling new applications related to change analysis and process monitoring. The EGU General Assembly 2014 is hosting a session on platforms, sensors and applications of UAS in soil science and geomorphology. This presentation briefly summarizes the outcome of this session, addressing the current state and future challenges of small-platform data acquisition in soil science and geomorphology.
A Personal Navigation System Based on Inertial and Magnetic Field Measurements
2010-09-01
Table-of-contents excerpt: MATLAB implementation; a model for pendulum motion sensor data (pendulum model for MATLAB simulation; sensor data generated with the pendulum model); filter performance with real pendulum data.
Security applications of magnetic sensors
NASA Astrophysics Data System (ADS)
Ripka, Pavel
2013-06-01
Magnetic sensors are often used for security and military applications such as detection, discrimination and localization of ferromagnetic and conducting objects, navigation, position tracking and antitheft systems. We give only a general overview, a few remarks and some interesting references on these applications.
Flight test results of the strapdown ring laser gyro tetrad inertial navigation system
NASA Technical Reports Server (NTRS)
Carestia, R. A.; Hruby, R. J.; Bjorkman, W. S.
1983-01-01
A helicopter flight test program undertaken to evaluate the performance of Tetrad (a strapdown, laser-gyro inertial navigation system) is described. The results of 34 flights show a mean final navigational velocity error of 5.06 knots, with a standard deviation of 3.84 knots; a corresponding mean final position error of 2.66 n. mi., with a standard deviation of 1.48 n. mi.; and a modeled mean position error growth rate for the 34 tests of 1.96 knots, with a standard deviation of 1.09 knots. No laser gyro or accelerometer failures were detected during the flight tests. Off-line parity-residual studies applied simulated failures to the prerecorded flight-test and laboratory data. The airborne Tetrad system's failure-detection logic, exercised during the tests, successfully demonstrated the detection of simulated "hard" failures and the system's ability to continue navigating successfully by removing the simulated faulted sensor from the computations. Tetrad's four ring laser gyros provided reliable and accurate angular-rate sensing during the four years of the test program, and no sensor failures were detected during the evaluation of free inertial navigation performance.
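Parity-residual fault detection for a four-axis (tetrad) sensor set projects the measurements onto the left null space of the 4×3 geometry matrix: any measurement consistent with some 3-D angular rate yields a zero residual. A hedged sketch (the geometry matrix and threshold choice below are illustrative, not Tetrad's actual logic):

```python
import numpy as np

def parity_residual(H, m):
    """Scalar parity residual for a redundant 4-axis sensor set with
    geometry matrix H (4x3, rows = sensor input axes) and measurement
    vector m (4,). Zero when all four sensors agree with a common rate."""
    # The left null space of H is spanned by the last column of U in the SVD.
    U, s, Vt = np.linalg.svd(H)
    v = U[:, -1]                      # 4-vector satisfying v @ H == 0
    return float(v @ m)               # compare |residual| against a threshold
```

With only one redundant axis the parity space is one-dimensional, so a fault can be detected but not isolated from the residual alone; Tetrad-style schemes add logic (e.g. sequential exclusion) for isolation.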
Indoor Positioning System Using Magnetic Field Map Navigation and an Encoder System
Kim, Han-Sol; Seo, Woojin; Baek, Kwang-Ryul
2017-01-01
In the indoor environment, variation of the magnetic field is caused by building structures, and magnetic field map navigation is based on this feature. In order to estimate position using this navigation, a three-axis magnetic field must be measured at every point to build a magnetic field map. After the magnetic field map is obtained, the position of the mobile robot can be estimated with a likelihood function whereby the measured magnetic field data and the magnetic field map are used. However, if only magnetic field map navigation is used, the estimated position can have large errors. In order to improve performance, we propose a particle filter system that integrates magnetic field map navigation and an encoder system. In this paper, multiple magnetic sensors and three magnetic field maps (a horizontal intensity map, a vertical intensity map, and a direction information map) are used to update the weights of particles. As a result, the proposed system estimates the position and orientation of a mobile robot more accurately than previous systems. Also, when the number of magnetic sensors increases, this paper shows that system performance improves. Finally, experiment results are shown from the proposed system that was implemented and evaluated.
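The filter cycle this abstract describes — an encoder-driven motion update followed by a magnetic-map likelihood reweighting and resampling — can be sketched in one dimension. The corridor map, cell size, and noise levels below are invented for illustration; the paper itself uses multiple sensors and three separate 2-D maps.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 10 m corridor, magnetic intensity sampled every 0.1 m.
# A random-walk profile stands in for a surveyed magnetic field map (uT).
CELL = 0.1
MAG_MAP = 45.0 + np.cumsum(rng.normal(0.0, 0.5, 100))

def propagate(particles, encoder_step, noise=0.05):
    """Motion update: shift every particle by the encoder-reported displacement."""
    return particles + encoder_step + rng.normal(0.0, noise, particles.shape)

def update_weights(particles, weights, measured, sigma=0.5):
    """Measurement update: reweight by the likelihood of the measured intensity."""
    idx = np.clip(np.rint(particles / CELL).astype(int), 0, len(MAG_MAP) - 1)
    w = weights * np.exp(-0.5 * ((measured - MAG_MAP[idx]) / sigma) ** 2)
    return w / w.sum()

def resample(particles, weights):
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Simulated run: robot starts at 2.0 m and advances 0.1 m per step.
true_pos = 2.0
particles = rng.uniform(0.0, 10.0, 500)
weights = np.full(500, 1.0 / 500)
for _ in range(60):
    true_pos += 0.1
    particles = propagate(particles, 0.1)
    z = MAG_MAP[int(round(true_pos / CELL))] + rng.normal(0.0, 0.3)
    weights = update_weights(particles, weights, z)
    particles, weights = resample(particles, weights)

estimate = particles.mean()
```

Because the map values vary from cell to cell, repeated measurements along the trajectory collapse the initially uniform particle cloud onto the true position, which is the mechanism the paper exploits.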
Perception for mobile robot navigation: A survey of the state of the art
NASA Technical Reports Server (NTRS)
Kortenkamp, David
1994-01-01
In order for mobile robots to navigate safely in unmapped and dynamic environments they must perceive their environment and decide on actions based on those perceptions. There are many different sensing modalities that can be used for mobile robot perception; the two most popular are ultrasonic sonar sensors and vision sensors. This paper examines the state-of-the-art in sensory-based mobile robot navigation. The first issue in mobile robot navigation is safety. This paper summarizes several competing sonar-based obstacle avoidance techniques and compares them. Another issue in mobile robot navigation is determining the robot's position and orientation (sometimes called the robot's pose) in the environment. This paper examines several different classes of vision-based approaches to pose determination. One class of approaches uses detailed a priori models of the robot's environment. Another class of approaches triangulates using fixed, artificial landmarks. A third class of approaches builds maps using natural landmarks. Example implementations from each of these three classes are described and compared. Finally, the paper presents a completely implemented mobile robot system that integrates sonar-based obstacle avoidance with vision-based pose determination to perform a simple task.
2014-01-01
This report treats cruise missile penaids and UAV penaids, sometimes called "self-protection" (see La Franchi, 2004), interchangeably. ... Penaid Export Controls ... 2. Anti-Jam Equipment, MTCR Item 11.A.3.b.3 (Avionics). Current text: "Receiving equipment for Global Navigation Satellite..." ... subsystems beyond those for global navigation satellite systems to all sensor, navigation, and communications systems, and add "including multi-mode
GPS/Optical/Inertial Integration for 3D Navigation Using Multi-Copter Platforms
NASA Technical Reports Server (NTRS)
Dill, Evan T.; Young, Steven D.; Uijt De Haag, Maarten
2017-01-01
In concert with the continued advancement of a UAS traffic management system (UTM), the proposed uses of autonomous unmanned aerial systems (UAS) have become more prevalent in both the public and private sectors. To facilitate this anticipated growth, a reliable three-dimensional (3D) positioning, navigation, and mapping (PNM) capability will be required to enable operation of these platforms in challenging environments where global navigation satellite systems (GNSS) may not be available continuously, especially when the platform's mission requires maneuvering through different and difficult environments such as outdoor open-sky, outdoor under foliage, outdoor urban, and indoor, and may include transitions between these environments. There may not be a single method to solve the PNM problem for all environments. The research presented in this paper is a subset of a broader research effort, described in [1]. The research is focused on combining data from dissimilar sensor technologies to create an integrated navigation and mapping method that can enable reliable operation in both an outdoor and a structured indoor environment. The integrated navigation and mapping design utilizes a Global Positioning System (GPS) receiver, an Inertial Measurement Unit (IMU), a monocular digital camera, and three short-to-medium-range laser scanners. This paper specifically describes the techniques necessary to effectively integrate the monocular camera data within the established mechanization. To evaluate the developed algorithms, a hexacopter was built, equipped with the discussed sensors, and both hand-carried and flown through representative environments. This paper highlights the effect that the monocular camera has on the aforementioned sensor integration scheme's reliability, accuracy and availability.
Survey of computer vision technology for UAV navigation
NASA Astrophysics Data System (ADS)
Xie, Bo; Fan, Xiang; Li, Sijian
2017-11-01
Navigation based on computer vision technology, which is strongly independent, highly precise and not susceptible to electrical interference, has attracted more and more attention in the field of UAV navigation research. Early navigation projects based on computer vision technology were mainly applied to autonomous ground robots. In recent years, visual navigation systems have been widely applied to unmanned aircraft, deep space probes and underwater robots, which has further stimulated research on integrated navigation algorithms based on computer vision technology. In China, with the development of many types of UAV and the start of the third phase of the lunar exploration program, there has been significant progress in the study of visual navigation. The paper reviews the development of computer-vision-based navigation in the field of UAV navigation research and concludes that visual navigation is mainly applied to three aspects. (1) Acquisition of UAV navigation parameters. The parameters, including UAV attitude, position and velocity information, can be obtained from the relationship between sensor images and the carrier's attitude, the relationship between instant matching images and reference images, and the relationship between the carrier's velocity and characteristics of sequential images. (2) Autonomous obstacle avoidance. There are many ways to achieve obstacle avoidance in UAV navigation; the methods based on computer vision technology, including feature matching, template matching and image frames, are mainly introduced. (3) Target tracking and positioning. Using the obtained images, UAV position is calculated by using the optical flow method, the MeanShift algorithm, the CamShift algorithm, Kalman filtering and particle filter algorithms. The paper also describes three kinds of mainstream visual systems. (1) High speed visual systems, which use a parallel structure in which image detection and processing are carried out at high speed; such systems are applied to rapid response systems. (2) Distributed network visual systems, with several discrete image data acquisition sensors in different locations that transmit image data to a node processor to increase the sampling rate. (3) Visual systems combined with observers, which combine image sensors with external observers to compensate for the limitations of the visual equipment. To some degree, these systems overcome the shortcomings of early visual systems, including low frequency, low processing efficiency and strong noise. In the end, the difficulties of computer-vision-based navigation in practical application are briefly discussed. (1) Due to the huge workload of image operations, the real-time performance of the system is poor. (2) Due to the large environmental impact, the anti-interference ability of the system is poor. (3) Because the system can only work in a particular environment, its adaptability is poor.
Real-Time GPS-Alternative Navigation Using Commodity Hardware
2007-06-01
4.1 Test Plan and Setup ... 4.1.1 Component and... Of the improvements planned, the most influential for navigation are additional signals, frequencies, and improved signal strength. These improvements will... planned and implemented to provide maximum extensibility for additional sensors and functionality without disturbing the core GPU-accelerated
Learning for autonomous navigation
NASA Technical Reports Server (NTRS)
Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric
2005-01-01
Autonomous off-road navigation of robotic ground vehicles has important applications on Earth and in space exploration. Progress in this domain has been retarded by the limited lookahead range of 3-D sensors and by the difficulty of preprogramming systems to understand the traversability of the wide variety of terrain they can encounter.
Image-Aided Navigation Using Cooperative Binocular Stereopsis
2014-03-27
Global Positioning System ... IMU Inertial Measurement Unit ... an inertial measurement unit (IMU). This technique capitalizes on an IMU's ability to capture quick motion and the ability of GPS to constrain long... the sensor-aided IMU framework. Visual sensors provide a number of benefits, such as low cost and weight. These sensors are also able to measure
Doppler Navigation System with a Non-Stabilized Antenna as a Sea-Surface Wind Sensor.
Nekrasov, Alexey; Khachaturian, Alena; Veremyev, Vladimir; Bogachev, Mikhail
2017-06-09
We propose a concept of the utilization of an aircraft Doppler Navigation System (DNS) as a sea-surface wind sensor complementary to its normal functionality. The DNS with an antenna, which is non-stabilized physically to the local horizontal with x -configured beams, is considered. We consider the wind measurements by the DNS configured in the multi-beam scatterometer mode for a rectilinear flight scenario. The system feasibility and the efficiency of the proposed wind algorithm retrieval are supported by computer simulations. Finally, the associated limitations of the proposed approach are considered.
Robot navigation research at CESAR (Center for Engineering Systems Advanced Research)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnett, D.L.; de Saussure, G.; Pin, F.G.
1989-01-01
A considerable amount of work has been reported on the problem of robot navigation in known static terrains. Algorithms have been proposed and implemented to search for an optimum path to the goal, taking into account the finite size and shape of the robot. Not as much work has been reported on robot navigation in unknown, unstructured, or dynamic environments. A robot navigating in an unknown environment must explore with its sensors, construct an abstract representation of its global environment to plan a path to the goal, and update or revise its plan based on accumulated data obtained and processed in real-time. The core of the navigation program for the CESAR robots is a production system developed on the expert-system-shell CLIPS which runs on an NCUBE hypercube on board the robot. The production system can call on C-compiled navigation procedures. The production rules can read the sensor data and address the robot's effectors. This architecture was found efficient and flexible for the development and testing of the navigation algorithms; however, in order to process intelligently unexpected emergencies, it was found necessary to be able to control the production system through externally generated asynchronous data. This led to the design of a new asynchronous production system, APS, which is now being developed on the robot. This paper will review some of the navigation algorithms developed and tested at CESAR and will discuss the need for the new APS and how it is being integrated into the robot architecture. 18 refs., 3 figs., 1 tab.
Vanegas, Fernando; Gonzalez, Felipe
2016-01-01
Unmanned Aerial Vehicles (UAV) can navigate with low risk in obstacle-free environments using ground control stations that plan a series of GPS waypoints as a path to follow. This GPS waypoint navigation does however become dangerous in environments where the GPS signal is faulty or is only present in some places and when the airspace is filled with obstacles. UAV navigation then becomes challenging because the UAV uses other sensors, which in turn generate uncertainty about its localisation and motion systems, especially if the UAV is a low cost platform. Additional uncertainty affects the mission when the UAV goal location is only partially known and can only be discovered by exploring and detecting a target. This navigation problem is established in this research as a Partially-Observable Markov Decision Process (POMDP), so as to produce a policy that maps a set of motion commands to belief states and observations. The policy is calculated and updated on-line while flying with a newly-developed system for UAV Uncertainty-Based Navigation (UBNAV), to navigate in cluttered and GPS-denied environments using observations and executing motion commands instead of waypoints. Experimental results in both simulation and real flight tests show that the UAV finds a path on-line to a region where it can explore and detect a target without colliding with obstacles. UBNAV provides a new method and an enabling technology for scientists to implement and test UAV navigation missions with uncertainty where targets must be detected using on-line POMDP in real flight scenarios.
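At the core of any POMDP navigation scheme like UBNAV is a belief state updated after each motion command and observation. The full on-line policy solver is well beyond a sketch, but the underlying discrete Bayes belief update can be shown in a toy 1-D corridor; the state space, motion model and sensor model below are entirely invented:

```python
import numpy as np

N = 10                                   # 1-D corridor of 10 cells
belief = np.full(N, 1.0 / N)             # uniform initial belief over position

def predict(belief, move):
    """Motion update: the commanded step succeeds with p=0.8, stalls with p=0.2."""
    new = np.zeros_like(belief)
    for s in range(N):
        s2 = min(max(s + move, 0), N - 1)
        new[s2] += 0.8 * belief[s]
        new[s] += 0.2 * belief[s]
    return new

def correct(belief, observed_wall, wall_cells=frozenset({0, 9})):
    """Sensor update: a wall detector that reports correctly 90% of the time."""
    like = np.array([0.9 if ((s in wall_cells) == observed_wall) else 0.1
                     for s in range(N)])
    b = belief * like
    return b / b.sum()

# The vehicle is commanded right repeatedly; it sees no wall until the end.
for _ in range(8):
    belief = predict(belief, +1)
    belief = correct(belief, observed_wall=False)
belief = predict(belief, +1)
belief = correct(belief, observed_wall=True)
```

After the final wall observation, the belief concentrates on the right-hand end of the corridor; a POMDP policy maps such beliefs (rather than single state estimates) to the next motion command.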
NASA Astrophysics Data System (ADS)
Li, Qingquan; Fang, Zhixiang; Li, Hanwu; Xiao, Hui
2005-10-01
The global positioning system (GPS) has become the most extensively used positioning and navigation tool in the world. Applications of GPS abound in surveying, mapping, transportation, agriculture, military planning, GIS, and the geosciences. However, the positional and elevation accuracy of any given GPS location is prone to error, due to a number of factors. GPS positioning applications are more and more popular, and intelligent navigation systems that rely on GPS and dead reckoning technology are developing quickly for a future huge market in China. In this paper a practical combined GPS/DR/MM positioning model is put forward, which integrates GPS, a gyro, a vehicle speed sensor (VSS) and digital navigation maps to provide accurate and real-time position for intelligent navigation systems. This model is designed for automotive navigation systems, making use of a Kalman filter to improve position and map matching veracity by filtering raw GPS and DR signals; map-matching technology is then used to provide map coordinates for map display. To illustrate the validity of the model, several experiments on integrated GPS/DR positioning in an intelligent navigation system and their results are presented, leading to the conclusion that the Kalman filter based GPS/DR integrated positioning approach is necessary, feasible and efficient for intelligent navigation applications. Certainly, this combined positioning model, like any other model, cannot resolve all situations. Finally, some suggestions are given for further improving the integrated GPS/DR/MM application.
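The GPS/DR fusion this abstract describes reduces, in its simplest form, to a Kalman filter in which dead reckoning drives the prediction and the GPS fix drives the correction. This scalar 1-D sketch uses invented noise levels and omits the gyro heading and map-matching stages:

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 1.0
q = 0.1       # variance added per step by dead-reckoning drift (m^2), invented
r = 25.0      # GPS position noise variance (~5 m std), invented

x, p = 0.0, 100.0            # initial position estimate and its variance

def kf_step(x, p, v_dr, gps_pos):
    # Predict: dead reckoning (vehicle speed sensor) propagates the state
    x = x + v_dr * dt
    p = p + q
    # Correct: blend in the raw GPS fix
    k = p / (p + r)          # Kalman gain
    x = x + k * (gps_pos - x)
    p = (1.0 - k) * p
    return x, p

true_pos, v = 0.0, 10.0      # vehicle at a steady 10 m/s
for _ in range(100):
    true_pos += v * dt
    v_meas = v + rng.normal(0.0, 0.2)        # noisy DR velocity
    gps = true_pos + rng.normal(0.0, 5.0)    # noisy raw GPS position
    x, p = kf_step(x, p, v_meas, gps)
```

The filtered position is considerably smoother than the raw 5 m GPS fixes, which is what makes the subsequent map-matching step more reliable.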
Results of the Magnetometer Navigation (MAGNAV) Inflight Experiment
NASA Technical Reports Server (NTRS)
Thienel, Julie K.; Harman, Richard R.; Bar-Itzhack, Itzhack Y.; Lambertson, Mike
2004-01-01
The Magnetometer Navigation (MAGNAV) algorithm is currently running as a flight experiment as part of the Wide Field Infrared Explorer (WIRE) Post-Science Engineering Testbed. Initialization of MAGNAV occurred on September 4, 2003. MAGNAV is designed to autonomously estimate the spacecraft orbit, attitude, and rate using magnetometer and sun sensor data. Since the Earth's magnetic field is a function of time and position, and since time is known quite precisely, the differences between the computed magnetic field and measured magnetic field components, as measured by the magnetometer throughout the entire spacecraft orbit, are a function of the spacecraft trajectory and attitude errors. Therefore, these errors are used to estimate both trajectory and attitude. In addition, the time rate of change of the magnetic field vector is used to estimate the spacecraft rotation rate. The estimation of the attitude and trajectory is augmented with the rate estimation into an Extended Kalman filter blended with a pseudo-linear Kalman filter. Sun sensor data is also used to improve the accuracy and observability of the attitude and rate estimates. This test serves to validate MAGNAV as a single low cost navigation system which utilizes reliable, flight qualified sensors. MAGNAV is intended as a backup algorithm, an initialization algorithm, or possibly a prime navigation algorithm for a mission with coarse requirements. Results from the first six months of operation are presented.
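The central idea of the abstract — that the difference between the computed and measured magnetic field is a function of the trajectory error — can be illustrated numerically. MAGNAV uses a full geomagnetic field model; the centered dipole below is a deliberately crude stand-in, with the orbit geometry invented:

```python
import numpy as np

MU0 = 4e-7 * np.pi
M_DIP = np.array([0.0, 0.0, -7.94e22])   # centered-dipole moment of Earth, A*m^2

def dipole_field(r_vec):
    """Magnetic field (tesla) of a centered dipole at geocentric position r_vec (m)."""
    r = np.linalg.norm(r_vec)
    rhat = r_vec / r
    return MU0 / (4.0 * np.pi * r**3) * (3.0 * rhat * (M_DIP @ rhat) - M_DIP)

# Residual between the field computed at the true position and at an estimate
# offset radially: this residual is the signal a MAGNAV-style filter turns
# into an orbit correction.
r_true = np.array([7.0e6, 0.0, 0.0])     # ~630 km altitude, equatorial
residuals = []
for offset in (1e3, 1e4, 1e5):           # 1, 10, 100 km radial position error
    r_est = r_true + np.array([offset, 0.0, 0.0])
    residuals.append(np.linalg.norm(dipole_field(r_true) - dipole_field(r_est)))
```

A 1 km radial error already produces a residual of roughly 10 nT against a ~23 uT background, well within magnetometer resolution, which is why the field differences are observable enough to estimate the trajectory.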
AAS/GSFC 13th International Symposium on Space Flight Dynamics. Volume 1
NASA Technical Reports Server (NTRS)
Stengle, Tom (Editor)
1998-01-01
This conference proceedings preprint includes papers and abstracts presented at the 13th International Symposium on Space Flight Dynamics. Cosponsored by American Astronautical Society and the Guidance, Navigation and Control Center of the Goddard Space Flight Center, this symposium featured technical papers on a wide range of issues related to orbit-attitude prediction, determination, and control; attitude sensor calibration; attitude dynamics; and mission design.
Improved obstacle avoidance and navigation for an autonomous ground vehicle
NASA Astrophysics Data System (ADS)
Giri, Binod; Cho, Hyunsu; Williams, Benjamin C.; Tann, Hokchhay; Shakya, Bicky; Bharam, Vishal; Ahlgren, David J.
2015-01-01
This paper presents improvements made to the intelligence algorithms employed on Q, an autonomous ground vehicle, for the 2014 Intelligent Ground Vehicle Competition (IGVC). In 2012, the IGVC committee combined the formerly separate autonomous and navigation challenges into a single AUT-NAV challenge. In this new challenge, the vehicle is required to navigate through a grassy obstacle course and stay within the course boundaries (a lane of two white painted lines) that guide it toward a given GPS waypoint. Once the vehicle reaches this waypoint, it enters an open course where it is required to navigate to another GPS waypoint while avoiding obstacles. After reaching the final waypoint, the vehicle is required to traverse another obstacle course before completing the run. Q uses a modular parallel software architecture in which image processing, navigation, and sensor control algorithms run concurrently. A tuned navigation algorithm allows Q to smoothly maneuver through obstacle fields. For the 2014 competition, most revisions occurred in the vision system, which detects white lines and informs the navigation component. Barrel obstacles of various colors presented a new challenge for image processing: the previous color plane extraction algorithm would not suffice. To overcome this difficulty, laser range sensor data were overlaid on visual data. Q also participates in the Joint Architecture for Unmanned Systems (JAUS) challenge at IGVC. For 2014, significant updates were implemented: the JAUS component accepted a greater variety of messages and showed better compliance to the JAUS technical standard. With these improvements, Q secured second place in the JAUS competition.
Rendezvous Integration Complexities of NASA Human Flight Vehicles
NASA Technical Reports Server (NTRS)
Brazzel, Jack P.; Goodman, John L.
2009-01-01
Propellant-optimal trajectories, relative sensors and navigation, and docking/capture mechanisms are rendezvous disciplines that receive much attention in the technical literature. However, other areas must be considered. These include absolute navigation, maneuver targeting, attitude control, power generation, software development and verification, redundancy management, thermal control, avionics integration, robotics, communications, lighting, human factors, crew timeline, procedure development, orbital debris risk mitigation, structures, plume impingement, logistics, and in some cases extravehicular activity. While current and future spaceflight programs will introduce new technologies and operations concepts, the complexity of integrating multiple systems on multiple spacecraft will remain. The systems integration task may become more difficult as increasingly complex software is used to meet current and future automation, autonomy, and robotic operation requirements.
NASA Astrophysics Data System (ADS)
Um, Jaeyong
2001-08-01
The Space Integrated GPS/INS (SIGI) sensor is the primary navigation and attitude determination source for the International Space Station (ISS). The SIGI was successfully demonstrated on-orbit for the first time in the SIGI Orbital Attitude Readiness (SOAR) demonstration on the Space Shuttle Atlantis in May 2000. Numerous proximity operations near the ISS have been and will be performed over the lifetime of the Station. The development of an autonomous relative navigation system is needed to improve the safety and efficiency of vehicle operations near the ISS. A hardware simulation study was performed for the GPS-based relative navigation using the state vector difference approach and the interferometric approach in the absence of multipath. The interferometric approach, where the relative states are estimated directly, showed comparable results for a 1 km baseline. One of the most pressing current technical issues is the design of an autonomous relative navigation system in the proximity of the ISS, where GPS signals are blocked and maneuvers happen frequently. An integrated GPS/INS system is investigated for the possibility of a fully autonomous relative navigation system. Another application of GPS measurements is determination of the vehicle's orientation in space. This study used the SOAR experiment data to characterize the SIGI's on-orbit performance for attitude determination. A cold start initialization algorithm was developed for integer ambiguity resolution in any initial orientation. The original algorithm that was used in the SIGI had an operational limitation in the integer ambiguity resolution, which was developed for terrestrial applications, and limited its effectiveness in space. The new algorithm was tested using the SOAR data and has been incorporated in the current SIGI flight software. The attitude estimation performance was examined using two different GPS/INS integration algorithms.
The GPS/INS attitude solution using the SOAR data was as accurate as 0.06 deg (RMS) in 3-axis with multipath mitigation. Other improvements to the attitude determination algorithm were the development of a faster integer ambiguity resolution method and the incorporation of line bias modeling.
NASA Astrophysics Data System (ADS)
Turner, D.; Lucieer, A.; McCabe, M.; Parkes, S.; Clarke, I.
2017-08-01
In this study, we assess two push broom hyperspectral sensors as carried by small (10-15 kg) multi-rotor Unmanned Aircraft Systems (UAS). We used a Headwall Photonics micro-Hyperspec push broom sensor with 324 spectral bands (4-5 nm FWHM) and a Headwall Photonics nano-Hyperspec sensor with 270 spectral bands (6 nm FWHM), both in the VNIR spectral range (400-1000 nm). A gimbal was used to stabilise the sensors in relation to the aircraft flight dynamics, and for the micro-Hyperspec a tightly coupled dual frequency Global Navigation Satellite System (GNSS) receiver, an Inertial Measurement Unit (IMU), and Machine Vision Camera (MVC) were used for attitude and position determination. For the nano-Hyperspec, a navigation grade GNSS system and IMU provided position and attitude data. This study presents the geometric results of one flight over a grass oval on which a dense Ground Control Point (GCP) network was deployed. The aim was to ascertain the geometric accuracy achievable with the system. Using the PARGE software package (ReSe - Remote Sensing Applications) we ortho-rectify the push broom hyperspectral image strips and then quantify the accuracy of the ortho-rectification by using the GCPs as check points. The orientation (roll, pitch, and yaw) of the sensor is measured by the IMU. Alternatively, imagery from a MVC running at 15 Hz, together with accurate camera position data, can be processed with Structure from Motion (SfM) software to obtain an estimated camera orientation. In this study, we look at which of these data sources will yield a flight strip with the highest geometric accuracy.
New Airborne Sensors and Platforms for Solving Specific Tasks in Remote Sensing
NASA Astrophysics Data System (ADS)
Kemper, G.
2012-07-01
A huge number of small and medium sized sensors have entered the market. Today's medium format sensors reach 80 MPix and allow running medium-sized projects, comparable with the first large format digital cameras about 6 years ago. New high quality lenses and new developments in integration prepared the market for photogrammetric work. Companies such as Phase One or Hasselblad and producers or integrators such as Trimble, Optec, and others utilized these cameras for professional image production. In combination with small camera stabilizers they can also be used in small aircraft, making the equipment compact and easily transportable, e.g. for rapid assessment purposes. The combination of different camera sensors enables multi- or hyper-spectral installations, e.g. useful for agricultural or environmental projects. Arrays of oblique viewing cameras are on the market as well; in many cases these are small and medium format sensors combined as rotating or shifting devices, or just as a fixed setup. Besides proper camera installation and integration, the software that controls the hardware and guides the pilot has to solve many more tasks than a normal FMS did in the past. Small and relatively cheap laser scanners (e.g. Riegl) are on the market, and their proper combination with MS cameras and integrated planning and navigation is a challenge that has been solved by different software packages. Turnkey solutions are available, e.g. for monitoring power line corridors, where taking images is just a part of the job. Integration of thermal camera systems with laser scanners and video capturing must be combined with specific information on the objects, stored in a database and linked when approaching the navigation point.
Conceptual Design of a Communication-Based Deep Space Navigation Network
NASA Technical Reports Server (NTRS)
Anzalone, Evan J.; Chuang, C. H.
2012-01-01
As the need grows for increased autonomy and position knowledge accuracy to support missions beyond Earth orbit, engineers must develop more advanced navigation sensors and systems that operate independent of Earth-based analysis and processing. Several spacecraft are approaching this problem using inter-spacecraft radiometric tracking and onboard autonomous optical navigation methods. This paper proposes an alternative implementation to aid in spacecraft position fixing. The proposed Network-Based Navigation technique takes advantage of the communication data being sent between spacecraft and between spacecraft and ground control to embed navigation information. The navigation system uses these packets to provide navigation estimates to an onboard navigation filter to augment traditional ground-based radiometric tracking techniques. As opposed to using digital signal measurements to capture inherent information of the transmitted signal itself, this method relies on the embedded navigation packet headers to calculate a navigation estimate. This method is heavily dependent on clock accuracy, and initial results show promising performance of a notional system.
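The clock-accuracy dependence the abstract highlights is easy to quantify: the basic measurement a network-based navigation filter would ingest is a one-way range formed from transmit and receive timestamps carried in a packet header. The link geometry and clock-offset values below are invented:

```python
# One-way range from timestamped packet headers; any uncorrected receiver
# clock offset maps directly into a range bias at the speed of light.
C = 299_792_458.0                    # speed of light, m/s

def header_range(t_tx, t_rx, clock_offset=0.0):
    """Range estimate from the transmit/receive times embedded in a packet
    header; clock_offset is the receiver clock error in seconds."""
    return C * (t_rx - t_tx - clock_offset)

# Hypothetical 400,000 km link (roughly Earth-Moon distance)
t_tx = 100.0
t_rx = t_tx + 4.0e8 / C
r = header_range(t_tx, t_rx)

# A 1-microsecond uncorrected clock error biases the range by ~300 m
bias = header_range(t_tx, t_rx, clock_offset=-1e-6) - r
```

Since every microsecond of clock error costs ~300 m of range, the filter must either estimate the clock offset as a state or rely on tightly synchronized clocks, which is the sensitivity the paper calls out.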
Iconic memory-based omnidirectional route panorama navigation.
Yagi, Yasushi; Imai, Kousuke; Tsuji, Kentaro; Yachida, Masahiko
2005-01-01
A route navigation method for a mobile robot with an omnidirectional image sensor is described. The route is memorized as a series of consecutive omnidirectional images of the horizon while the robot moves to its goal. While the robot is navigating to the goal point, the input image is matched against the memorized spatio-temporal route pattern using dual active contour models, and the exact robot position and orientation are estimated from the converged shape of the active contour models.
The Rendezvous Monitoring Display Capabilities of the Rendezvous and Proximity Operations Program
NASA Technical Reports Server (NTRS)
Brazzel, Jack; Spehar, Pete; Clark, Fred; Foster, Chris; Eldridge, Erin
2013-01-01
The Rendezvous and Proximity Operations Program (RPOP) is a laptop-computer-based relative navigation tool and piloting aid that was developed during the Space Shuttle program. RPOP displays a graphical representation of the relative motion between the target and chaser vehicles in a rendezvous, proximity operations, and capture scenario. After being used on over 60 Shuttle rendezvous missions, some of the RPOP display concepts have become recognized as a minimum standard for cockpit displays for monitoring the rendezvous task. To support International Space Station (ISS)-based crews in monitoring incoming visiting vehicles, RPOP has been modified to allow crews to compare the Cygnus visiting vehicle's onboard navigated state to processed range measurements from an ISS-based, crew-operated Hand Held Lidar sensor. This paper will discuss the display concepts of RPOP that have proven useful in performing and monitoring rendezvous and proximity operations.
Emergency navigation without an infrastructure.
Gelenbe, Erol; Bi, Huibo
2014-08-18
Emergency navigation systems for buildings and other built environments, such as sport arenas or shopping centres, typically rely on simple sensor networks to detect emergencies and, then, provide automatic signs to direct the evacuees. The major drawbacks of such static wireless sensor network (WSN)-based emergency navigation systems are the very limited computing capacity, which makes adaptivity very difficult, and the restricted battery power, due to the low cost of sensor nodes for unattended operation. If static wireless sensor networks and cloud-computing can be integrated, then intensive computations that are needed to determine optimal evacuation routes in the presence of time-varying hazards can be offloaded to the cloud, but the disadvantages of limited battery life-time at the client side, as well as the high likelihood of system malfunction during an emergency still remain. By making use of the powerful sensing ability of smart phones, which are increasingly ubiquitous, this paper presents a cloud-enabled indoor emergency navigation framework to direct evacuees in a coordinated fashion and to improve the reliability and resilience for both communication and localization. By combining social potential fields (SPF) and a cognitive packet network (CPN)-based algorithm, evacuees are guided to exits in dynamic loose clusters. Rather than relying on a conventional telecommunications infrastructure, we suggest an ad hoc cognitive packet network (AHCPN)-based protocol to adaptively search optimal communication routes between portable devices and the network egress nodes that provide access to cloud servers, in a manner that spares the remaining battery power of smart phones and minimizes the time latency. Experimental results through detailed simulations indicate that smart human motion and smart network management can increase the survival rate of evacuees and reduce the number of drained smart phones in an evacuation process.
Fuzzy Logic Based Control for Autonomous Mobile Robot Navigation
Masmoudi, Mohamed Slim; Masmoudi, Mohamed
2016-01-01
This paper describes the design and implementation of a trajectory-tracking controller using fuzzy logic for a mobile robot navigating indoor environments. Most previous works used two independent controllers for navigation and obstacle avoidance; the main contribution of this paper is the use of a single fuzzy controller for both navigation and obstacle avoidance. The mobile robot is equipped with a DC motor, nine infrared range (IR) sensors to measure distances to obstacles, and two optical encoders to provide the actual position and speed. To evaluate the performance of the intelligent navigation algorithms, different trajectories were simulated using MATLAB and the SIMIAM navigation platform. Simulation results show the performance of the intelligent navigation algorithms in terms of simulation time and travelled path. PMID:27688748
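The single-controller idea, one fuzzy rule base that blends goal seeking with obstacle avoidance, can be illustrated with a minimal sketch. The ramp membership shape, the 1 m range threshold, and the fixed evasive angle are illustrative assumptions, not the authors' design:

```python
def fuzzy_steer(goal_angle, obstacle_dist):
    """Blend goal seeking and obstacle avoidance in one fuzzy rule base.
    goal_angle: bearing to the goal in the robot frame (rad).
    obstacle_dist: nearest IR range reading (m)."""
    # NEAR membership ramps from 1 at 0 m down to 0 at 1 m (assumed shape).
    near = max(0.0, min(1.0, 1.0 - obstacle_dist))
    far = 1.0 - near
    evasive = 0.8  # hypothetical fixed evasive turn (rad)
    # Weighted-average defuzzification over two rules:
    #   IF obstacle NEAR THEN turn by `evasive`
    #   IF obstacle FAR  THEN steer toward the goal
    return near * evasive + far * goal_angle
```

With a distant obstacle the command follows the goal bearing; as the obstacle closes, the command is pulled continuously toward the evasive turn, with no switch between separate controllers.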
Study on polarized optical flow algorithm for imaging bionic polarization navigation micro sensor
NASA Astrophysics Data System (ADS)
Guan, Le; Liu, Sheng; Li, Shi-qi; Lin, Wei; Zhai, Li-yuan; Chu, Jin-kui
2018-05-01
At present, both point-source and imaging polarization navigation devices can only output angle information, which means that the velocity of the carrier cannot be extracted from the polarization field pattern directly. Optical flow is an image-based method for calculating the velocity of pixel movement in an image. For ordinary optical flow, however, differences in pixel values, and hence calculation accuracy, are reduced in weak light. Polarization imaging technology can improve both the detection accuracy and the recognition probability of a target because it acquires extra multi-dimensional polarization information about the target's radiation or reflection. In this paper, combining the polarization imaging technique with the traditional optical flow algorithm, a polarized optical flow algorithm is proposed, and it is verified that the algorithm adapts well to weak light and can extend the application range of polarization navigation sensors. This research lays the foundation for future day-and-night, all-weather polarization navigation applications.
Submillimeter Wave Astronomy Satellite (SWAS) Launch and Early Orbit Support Experiences
NASA Technical Reports Server (NTRS)
Kirschner, S.; Sedlak, J.; Challa, M.; Nicholson, A.; Sande, C.; Rohrbaugh, D.
1999-01-01
The Submillimeter Wave Astronomy Satellite (SWAS) was successfully launched on December 6, 1998 at 00:58 UTC. The two year mission is the fourth in the series of Small Explorer (SMEX) missions. SWAS is dedicated to the study of star formation and interstellar chemistry. SWAS was injected into a 635 km by 650 km orbit with an inclination of nearly 70 deg by an Orbital Sciences Corporation Pegasus XL launch vehicle. The Flight Dynamics attitude and navigation teams supported all phases of the early mission. This support included orbit determination, attitude determination, real-time monitoring, and sensor calibration. This paper reports the main results and lessons learned concerning navigation, support software, star tracker performance, magnetometer and gyroscope calibrations, and anomaly resolution. This includes information on spacecraft tip-off rates, first-day navigation problems, target acquisition anomalies, star tracker anomalies, and significant sensor improvements due to calibration efforts.
Orion Exploration Flight Test-l (EFT -1) Absolute Navigation Design
NASA Technical Reports Server (NTRS)
Sud, Jastesh; Gay, Robert; Holt, Greg; Zanetti, Renato
2014-01-01
Scheduled to launch in September 2014 atop a Delta IV Heavy from the Kennedy Space Center, the Orion Multi-Purpose Crew Vehicle's (MPCV's) maiden flight, dubbed "Exploration Flight Test-1" (EFT-1), intends to stress the system by placing the uncrewed vehicle on a high-energy parabolic trajectory replicating conditions similar to those that would be experienced when returning from an asteroid or a lunar mission. Unique challenges associated with designing the navigation system for EFT-1 are presented, with an emphasis on how redundancy and robustness influenced the architecture. Two Inertial Measurement Units (IMUs), one GPS receiver, and three barometric altimeters (BALTs) comprise the navigation sensor suite. The sensor data are multiplexed using conventional integration techniques, and the state estimate is refined by GPS pseudorange and delta-range measurements in an Extended Kalman Filter (EKF) that employs the UDU^T decomposition approach. The design is substantiated by simulation results showing the expected performance.
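The EKF pseudorange update mentioned above can be sketched in a simplified form. The state layout, the noise value, and the satellite geometry below are illustrative assumptions, not the EFT-1 design, and the flight filter's UDU^T factorized covariance propagation is omitted in favor of the plain covariance form:

```python
import numpy as np

def pseudorange_update(x, P, z, sat_pos, R=25.0):
    """One EKF measurement update with a single GPS pseudorange.
    x: state [px, py, pz, clock_bias], all in meters; P: 4x4 covariance.
    z: measured pseudorange (m); sat_pos: satellite position (m);
    R: measurement variance (m^2), an assumed value."""
    rho_vec = x[:3] - sat_pos
    rho = np.linalg.norm(rho_vec)
    h = rho + x[3]                       # predicted pseudorange
    H = np.hstack([rho_vec / rho, 1.0])  # 1x4 measurement Jacobian
    S = H @ P @ H + R                    # innovation variance (scalar)
    K = P @ H / S                        # Kalman gain
    x_new = x + K * (z - h)
    P_new = P - np.outer(K, H @ P)       # standard (I - KH)P covariance update
    return x_new, P_new
```

Each pseudorange tightens the covariance along the receiver-to-satellite line of sight and the clock-bias axis; repeating the update over several satellites resolves the full position and clock state.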
NASA Technical Reports Server (NTRS)
Larimer, Stanley J.; Lisec, Thomas R.; Spiessbach, Andrew J.
1989-01-01
Under a contract with NASA's Jet Propulsion Laboratory, Martin Marietta has developed several alternative rover concepts for unmanned exploration of the planet Mars. One of those concepts, the 'Walking Beam', is the subject of this paper. The concept was developed with the goal of achieving many of the capabilities of more sophisticated articulated-leg walkers with a much simpler, more robust, less computationally demanding, and more power-efficient design. It consists of two large-base tripods, nested one within the other, which alternately translate with respect to each other along a 5-meter beam to propel the vehicle. The semiautonomous navigation system relies on terrain-geometry sensors and tactile feedback from each foot to autonomously select a path which avoids hazards along a route designated from Earth. Both mobility and navigation features of this concept are discussed, including a top-level description of the vehicle's physical characteristics, deployment strategy, mobility elements, sensor suite, theory of operation, navigation and control processes, and estimated performance.
Localization Algorithm Based on a Spring Model (LASM) for Large Scale Wireless Sensor Networks.
Chen, Wanming; Mei, Tao; Meng, Max Q-H; Liang, Huawei; Liu, Yumei; Li, Yangming; Li, Shuai
2008-03-15
A navigation method for a lunar rover based on large-scale wireless sensor networks is proposed. To obtain high navigation accuracy and a large exploration area, high node-localization accuracy and a large network scale are required. However, computational and communication complexity and time consumption increase greatly with network scale. A localization algorithm based on a spring model (LASM) is proposed to reduce the computational complexity while maintaining localization accuracy in large-scale sensor networks. The algorithm simulates the dynamics of a physical spring system to estimate the positions of nodes: sensor nodes are treated as particles with mass, connected to neighbor nodes by virtual springs. Starting from randomly set positions, the virtual springs force the particles, and correspondingly the node position estimates, toward their true positions. A blind node's position can therefore be determined by the LASM algorithm from the forces exerted by its neighbor nodes. The computational and communication complexity is O(1) for each node, since the number of neighbor nodes does not grow proportionally with the network scale. Three patches are proposed to avoid local optima, eliminate bad nodes, and deal with node variation. Simulation results show that the computational and communication complexity remains almost constant as the network scale increases. The time consumption has also been shown to remain almost constant, since the number of calculation steps is almost unrelated to the network size.
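The spring-relaxation idea for a single blind node can be sketched as follows. The force law, gain, and iteration count are illustrative assumptions, and the paper's three patches (local-optimum avoidance, bad-node rejection, node variation) are omitted:

```python
import math

def lasm_step(pos, anchors, ranges, gain=0.3):
    """One spring-relaxation step for a single blind node.
    pos: current (x, y) estimate; anchors: known neighbor positions;
    ranges: measured distances to those neighbors (spring rest lengths)."""
    fx = fy = 0.0
    for (ax, ay), r in zip(anchors, ranges):
        dx, dy = pos[0] - ax, pos[1] - ay
        d = math.hypot(dx, dy) or 1e-9   # avoid division by zero
        stretch = d - r                  # extension beyond rest length
        fx -= gain * stretch * dx / d    # Hooke-like restoring force
        fy -= gain * stretch * dy / d
    return (pos[0] + fx, pos[1] + fy)

def lasm_localize(pos, anchors, ranges, iters=200):
    """Relax the node from an arbitrary initial guess toward its position."""
    for _ in range(iters):
        pos = lasm_step(pos, anchors, ranges)
    return pos
```

Because each step only touches the node's own neighbors, the per-node cost stays constant as the network grows, which is the property the abstract emphasizes.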
NASA Technical Reports Server (NTRS)
Mitchell, Jason W.; Baldwin, Philip J.; Kurichh, Rishi; Naasz, Bo J.; Luquette, Richard J.
2007-01-01
The Formation Flying Testbed (FFTB) at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) provides a hardware-in-the-loop test environment for formation navigation and control. The facility is evolving as a modular, hybrid, dynamic simulation facility for end-to-end guidance, navigation, and control (GN&C) design and analysis of formation flying spacecraft. The core capabilities of the FFTB, as a platform for testing critical hardware and software algorithms in-the-loop, have expanded to include S-band Radio Frequency (RF) modems for inter-spacecraft communication and ranging. To enable realistic simulations that require RF ranging sensors for relative navigation, a mechanism is needed to buffer the RF signals exchanged between spacecraft that accurately emulates the dynamic environment through which the RF signals travel, including the effects of the medium, moving platforms, and radiated power. The Path Emulator for RF Signals (PERFS), currently under development at NASA GSFC, provides this capability. The function and performance of a prototype device are presented.
Semi autonomous mine detection system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Douglas Few; Roelof Versteeg; Herman Herman
2010-04-01
CMMAD is a risk-reduction effort for the AMDS program. As part of CMMAD, multiple instances of semi-autonomous robotic mine detection systems were created. Each instance consists of a robotic vehicle equipped with sensors required for navigation and marking, a countermine sensor, and a number of integrated software packages that provide real-time processing of the countermine sensor data as well as integrated control of the robotic vehicle, the sensor actuator, and the sensor. These systems were used to investigate critical interest functions (CIF) related to countermine robotic systems. To address the autonomy CIF, the INL-developed RIK was extended to allow interaction with a mine sensor processing code (MSPC). In limited field testing this system performed well in detecting, marking, and avoiding both AT and AP mines. Based on the results of the CMMAD investigation, we conclude that autonomous robotic mine detection is feasible. In addition, CMMAD contributed critical technical advances in sensing, data processing, and sensor manipulation that will improve the performance of future fieldable systems. As a result, no substantial technical barriers exist which preclude, from an autonomous robotic perspective, the rapid development and deployment of fieldable systems.
EGNOS-Based Multi-Sensor Accurate and Reliable Navigation in Search-And-Rescue Missions with UAVs
NASA Astrophysics Data System (ADS)
Molina, P.; Colomina, I.; Vitoria, T.; Silva, P. F.; Stebler, Y.; Skaloud, J.; Kornus, W.; Prades, R.
2011-09-01
This paper will introduce and describe the goals, concept, and overall approach of the European 7th Framework Programme project named CLOSE-SEARCH, which stands for 'Accurate and safe EGNOS-SoL Navigation for UAV-based low-cost SAR operations'. The goal of CLOSE-SEARCH is to integrate into a helicopter-type unmanned aerial vehicle a thermal imaging sensor and a multi-sensor navigation system (based on the use of a Barometric Altimeter (BA), a Magnetometer (MAGN), a Redundant Inertial Navigation System (RINS), and an EGNOS-enabled GNSS receiver) with an Autonomous Integrity Monitoring (AIM) capability, to support the search component of Search-And-Rescue operations in remote, difficult-to-access areas and/or in time-critical situations. The proposed integration will result in a hardware and software prototype that will demonstrate end-to-end functionality, that is, to fly in patterns over a region of interest (possibly inaccessible) during day or night, also under adverse weather conditions, and to locate disaster survivors or lost people there through the detection of body heat. This paper will identify the technical challenges of the proposed approach, from navigating with a BA/MAGN/RINS/GNSS-EGNOS-based integrated system to the interpretation of thermal images for person identification. Moreover, the AIM approach will be described together with the proposed integrity requirements. Finally, this paper will show some results obtained in the project during the first test campaign, performed in November 2010. On that day, a prototype was flown in three different missions to assess its high-level performance and to observe some fundamental mission parameters, such as the optimal flying height and flying speed to enable body recognition. The second test campaign is scheduled for the end of 2011.
Searching Lost People with UAVs: The System and Results of the CLOSE-SEARCH Project
NASA Astrophysics Data System (ADS)
Molina, P.; Colomina, I.; Vitoria, T.; Silva, P. F.; Skaloud, J.; Kornus, W.; Prades, R.; Aguilera, C.
2012-07-01
This paper will introduce the goals, concept, and results of the project named CLOSE-SEARCH, which stands for 'Accurate and safe EGNOS-SoL Navigation for UAV-based low-cost Search-And-Rescue (SAR) operations'. The main goal is to integrate a medium-size, helicopter-type Unmanned Aerial Vehicle (UAV), a thermal imaging sensor, and an EGNOS-based multi-sensor navigation system, including an Autonomous Integrity Monitoring (AIM) capability, to support search operations in difficult-to-access areas and/or night operations. The focus of the paper is three-fold. Firstly, the operational and technical challenges of the proposed approach are discussed, such as the ultra-safe multi-sensor navigation system, the use of combined thermal and optical vision (infrared plus visible) for person recognition, and Beyond-Line-Of-Sight communications, among others. Secondly, the implementation of the integrity concept for UAV platforms is discussed through the AIM approach. Based on the potential of geodetic quality analysis and on the use of the European EGNOS system as a navigation-performance starting point, AIM approaches integrity from the precision standpoint; that is, Horizontal and Vertical Protection Levels (HPLs, VPLs) are derived from a realistic precision estimation of the position parameters and compared to predefined Alert Limits (ALs). Finally, some results from the project test campaigns are described to report on particular project achievements. Together with actual Search-And-Rescue teams, the system was operated in realistic, user-chosen test scenarios. In this context, and especially focusing on the EGNOS-based UAV navigation, the AIM capability, and the RGB/thermal imaging subsystem, a summary of the results is presented.
JPRS Report, Science & Technology, Japan, 27th Aircraft Symposium
1990-10-29
screen; the relative attitude is then determined. 2) Video Sensor System: specific patterns (grapple target, etc.) drawn on the target spacecraft, or the ... entire target spacecraft, is imaged by camera. Navigation information is obtained by on-board image processing, such as extraction of contours and ... standard figure called "grapple target" located in the vicinity of the grapple fixture on the target spacecraft is imaged by camera. Contour lines and
Evaluation of novel technologies for the miniaturization of flash imaging lidar
NASA Astrophysics Data System (ADS)
Mitev, V.; Pollini, A.; Haesler, J.; Perenzoni, D.; Stoppa, D.; Kolleck, Christian; Chapuy, M.; Kervendal, E.; Pereira do Carmo, João.
2017-11-01
Planetary exploration constitutes one of the main components of European space activities. Missions to Mars, the Moon, and asteroids are foreseen, where it is assumed that human missions will be preceded by robotic exploration flights. 3D vision is recognised as a key enabling technology for relative proximity navigation of spacecraft, and imaging LiDAR is one of the best candidates for such a 3D vision sensor.
AAS/GSFC 13th International Symposium on Space Flight Dynamics. Volume 2
NASA Technical Reports Server (NTRS)
Stengle, Tom (Editor)
1998-01-01
This conference proceedings preprint includes papers and abstracts presented at the 13th International Symposium on Space Flight Dynamics, May 11-15, 1998. Co-sponsored by American Astronautical Society and the Guidance, Navigation and Control Center of the Goddard Space Flight Center, this symposium featured technical papers on a wide range of issues related to orbit-attitude prediction, determination, and control; attitude sensor calibration; attitude dynamics; and mission design.
Design and validation of a GNC system for missions to asteroids: the AIM scenario
NASA Astrophysics Data System (ADS)
Pellacani, A.; Kicman, P.; Suatoni, M.; Casasco, M.; Gil, J.; Carnelli, I.
2017-12-01
Deep space missions, and in particular missions to asteroids, impose a certain level of autonomy that depends on the mission objectives. If the mission requires the spacecraft to perform close approaches to the target body (the extreme case being a landing scenario), the autonomy level must be increased to guarantee the fast, reactive response required in both nominal and contingency operations. The GNC system must be designed in accordance with the required level of autonomy. The GNC system designed and tested in the frame of ESA's Asteroid Impact Mission (AIM) system studies (Phase A/B1 and Consolidation Phase) is an example of an autonomous GNC system that meets the challenging objectives of AIM. The paper reports the design of this GNC system and its validation through a DDVV plan that includes Model-in-the-Loop and Hardware-in-the-Loop testing. The main focus is translational navigation, which provides online relative state estimation with respect to the target body using exclusively cameras as relative navigation sensors. The relative navigation outputs are meant to be used for nominal spacecraft trajectory corrections as well as to estimate the collision risk with the asteroid and, if needed, to command the execution of a collision avoidance manoeuvre to guarantee spacecraft safety.
Mars Rover Navigation Results Using Sun Sensor Heading Determination
NASA Technical Reports Server (NTRS)
Volpe, Richard
1998-01-01
Upcoming missions to the surface of Mars will use mobile robots to traverse long distances from the landing site. To prepare for these missions, the prototype rover, Rocky 7, has been tested in desert field trials conducted with a team of planetary scientists. While several new capabilities have been demonstrated, foremost among these was sun-sensor based traversal of natural terrain totaling a distance of one kilometer. This paper describes navigation results obtained in the field tests, where cross-track error was only 6% of distance traveled. Comparison with previous results of other planetary rover systems shows this to be a significant improvement.
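The core of sun-sensor heading determination is differencing the sun azimuth measured in the rover's body frame against the azimuth predicted from ephemeris and time. A minimal sketch, in which the 2-D frame convention and function names are illustrative assumptions rather than the Rocky 7 implementation:

```python
import math

def sun_azimuth(sun_vec):
    """Azimuth of the sun direction in a 2-D frame (x forward, y right),
    measured from the x axis."""
    return math.atan2(sun_vec[1], sun_vec[0])

def heading_from_sun(sun_vec_body, sun_az_world):
    """Absolute heading = world-frame sun azimuth (from ephemeris and time)
    minus the azimuth measured in the rover body frame, wrapped to [0, 2*pi).
    All angles in radians."""
    return (sun_az_world - sun_azimuth(sun_vec_body)) % (2 * math.pi)
```

Unlike odometry or gyro integration, the heading error from this measurement does not accumulate with distance, which is what kept cross-track error to a small fraction of the distance traveled.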
Autonomous Navigation Apparatus With Neural Network for a Mobile Vehicle
NASA Technical Reports Server (NTRS)
Quraishi, Naveed (Inventor)
1996-01-01
An autonomous navigation system for a mobile vehicle arranged to move within an environment includes a plurality of sensors arranged on the vehicle and at least one neural network including an input layer coupled to the sensors, a hidden layer coupled to the input layer, and an output layer coupled to the hidden layer. The neural network produces output signals representing respective positions of the vehicle, such as the X coordinate, the Y coordinate, and the angular orientation of the vehicle. A plurality of patch locations within the environment are used to train the neural networks to produce the correct outputs in response to the distances sensed.
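The described architecture, an input layer fed by the sensors, a hidden layer, and an output layer producing X, Y, and orientation, amounts to a one-hidden-layer feedforward pass. A minimal sketch; the layer sizes and weights below are placeholders for values that would come from training on the patch locations:

```python
import numpy as np

def mlp_pose(sensors, W1, b1, W2, b2):
    """Forward pass of a one-hidden-layer network mapping sensor readings
    to a pose estimate (X, Y, orientation)."""
    hidden = np.tanh(W1 @ sensors + b1)  # hidden layer activations
    return W2 @ hidden + b2              # linear output layer: X, Y, angle
```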
Feng, Guohu; Wu, Wenqi; Wang, Jinling
2012-01-01
A matrix Kalman filter (MKF) has been implemented for an integrated navigation system using visual/inertial/magnetic sensors. The MKF rearranges the original nonlinear process model into a pseudo-linear process model. We employ the observability rank criterion based on Lie derivatives to verify the conditions under which the nonlinear system is observable. It has been proved that the observability conditions are: (a) at least one degree of rotational freedom is excited, and (b) at least two linearly independent horizontal lines and one vertical line are observed. Experimental results have validated the correctness of these observability conditions. PMID:23012523
Embedded mobile farm robot for identification of diseased plants
NASA Astrophysics Data System (ADS)
Sadistap, S. S.; Botre, B. A.; Pandit, Harshavardhan; Chandrasekhar; Rao, Adesh
2013-07-01
This paper presents the development of a mobile robot used on farms for the identification of diseased plants. It addresses two major aspects of robotics, namely automated navigation and image processing. The robot navigates on the basis of GPS (Global Positioning System) location and data obtained from IR (infrared) sensors to avoid obstacles in its path. It uses an image-processing algorithm to differentiate between diseased and non-diseased plants. A robotic platform consisting of an ARM9 processor, motor drivers, the robot's mechanical assembly, a camera, and infrared sensors has been used. A Mini2440 microcontroller board running an embedded Linux OS (operating system) has been used.
Software Would Largely Automate Design of Kalman Filter
NASA Technical Reports Server (NTRS)
Chuang, Jason C. H.; Negast, William J.
2005-01-01
Embedded Navigation Filter Automatic Designer (ENFAD) is a computer program being developed to automate the most difficult tasks in designing embedded software to implement a Kalman filter in a navigation system. The most difficult tasks are selection of the error states of the filter and tuning of the filter parameters, which are time-consuming trial-and-error tasks that require expertise and rarely yield optimum results. An optimum selection of error states and filter parameters depends on navigation-sensor and vehicle characteristics, and on filter processing time. ENFAD would include a simulation module that would incorporate all possible error states with respect to a given set of vehicle and sensor characteristics. The first of two iterative optimization loops would vary the selection of error states until the best filter performance was achieved in Monte Carlo simulations. For a fixed selection of error states, the second loop would vary the filter parameter values until an optimal performance value was obtained. Design constraints would be satisfied in the optimization loops. Users would supply vehicle and sensor test data that would be used to refine digital models in ENFAD. Filter processing time and filter accuracy would be computed by ENFAD.
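The two-loop structure, an outer loop over error-state selections and an inner loop over filter tuning values, can be sketched as follows. The exhaustive search and the stubbed scoring callback are illustrative assumptions; ENFAD's actual search strategy and Monte Carlo machinery are not described in this level of detail:

```python
from itertools import combinations

def design_filter(candidate_states, tuning_values, monte_carlo_score):
    """Two nested search loops in the spirit of ENFAD: the outer loop varies
    the error-state selection, the inner loop varies a filter tuning value.
    `monte_carlo_score` stands in for the Monte Carlo performance metric
    (lower is better)."""
    best_score, best_states, best_tuning = float("inf"), None, None
    for k in range(1, len(candidate_states) + 1):
        for states in combinations(candidate_states, k):   # outer loop
            for q in tuning_values:                        # inner loop
                score = monte_carlo_score(states, q)
                if score < best_score:
                    best_score, best_states, best_tuning = score, states, q
    return best_states, best_tuning, best_score
```

In practice the scoring callback would run the candidate filter through Monte Carlo simulations against the user-supplied vehicle and sensor test data, with design constraints enforced inside the loops.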
Improving geolocation and spatial accuracies with the modular integrated avionics group (MIAG)
NASA Astrophysics Data System (ADS)
Johnson, Einar; Souter, Keith
1996-05-01
The modular integrated avionics group (MIAG) is a single unit approach to combining position, inertial and baro-altitude/air data sensors to provide optimized navigation, guidance and control performance. Lear Astronics Corporation is currently working within the navigation community to upgrade existing MIAG performance with precise GPS positioning mechanization tightly integrated with inertial, baro and other sensors. Among the immediate benefits are the following: (1) accurate target location in dynamic conditions; (2) autonomous launch and recovery using airborne avionics only; (3) precise flight path guidance; and (4) improved aircraft and payload stability information. This paper will focus on the impact of using the MIAG with its multimode navigation accuracies on the UAV targeting mission. Gimbaled electro-optical sensors mounted on a UAV can be used to determine ground coordinates of a target at the center of the field of view by a series of vector rotation and scaling computations. The accuracy of the computed target coordinates is dependent on knowing the UAV position and the UAV-to-target offset computation. Astronics performed a series of simulations to evaluate the effects that the improved angular and position data available from the MIAG have on target coordinate accuracy.
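The final scaling step of the target-coordinate computation described above can be sketched for the simplest case, flat terrain at zero altitude in an NED frame. The vector-rotation chain from gimbal angles and vehicle attitude into NED is assumed to have been done upstream, and the frame conventions are illustrative, not the MIAG implementation:

```python
import numpy as np

def target_coords(uav_ned, los_ned):
    """Scale a camera line-of-sight vector to the ground to get target
    coordinates. Assumes flat terrain at zero altitude in an NED frame:
    down is +z, so a UAV 100 m above ground has d = -100, and the line of
    sight must point downward (los_ned[2] > 0)."""
    uav_ned = np.asarray(uav_ned, dtype=float)
    los_ned = np.asarray(los_ned, dtype=float)
    t = -uav_ned[2] / los_ned[2]    # scale factor to reach the ground plane
    return uav_ned + t * los_ned
```

The sketch makes the error sensitivity visible: an error in the UAV position shifts the target estimate directly, while an angular error in the line of sight is amplified by the scale factor t, which is why improved MIAG position and attitude data improve target coordinate accuracy.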
IPS - a vision aided navigation system
NASA Astrophysics Data System (ADS)
Börner, Anko; Baumbach, Dirk; Buder, Maximilian; Choinowski, Andre; Ernst, Ines; Funk, Eugen; Grießbach, Denis; Schischmanow, Adrian; Wohlfeil, Jürgen; Zuev, Sergey
2017-04-01
Ego localization is an important prerequisite for several scientific, commercial, and statutory tasks. Only by knowing one's own position can guidance be provided, inspections be executed, and autonomous vehicles be operated. Localization becomes challenging if satellite-based navigation systems are not available or data quality is not sufficient. To overcome this problem, a team at the German Aerospace Center (DLR) developed a multi-sensor system based on the human head and its navigation sensors, the eyes and the vestibular system. This system is called the integrated positioning system (IPS) and contains a stereo camera and an inertial measurement unit for determining an ego pose in six degrees of freedom in a local coordinate system. IPS is able to operate in real time and can be applied to indoor and outdoor scenarios without any external reference or prior knowledge. In this paper, the system and its key hardware and software components are introduced. The main issues encountered during the development of such complex multi-sensor measurement systems are identified and discussed, and the performance of the technology is demonstrated. The developer team started from scratch and is now transferring this technology into a commercial product. The paper finishes with an outlook.
A compliant mechanism for inspecting extremely confined spaces
NASA Astrophysics Data System (ADS)
Mascareñas, David; Moreu, Fernando; Cantu, Precious; Shields, Daniel; Wadden, Jack; El Hadedy, Mohamed; Farrar, Charles
2017-11-01
We present a novel, compliant mechanism that provides the capability to navigate extremely confined spaces for the purpose of infrastructure inspection. Extremely confined spaces are commonly encountered during infrastructure inspection. Examples of such spaces include pipes, conduits, and ventilation ducts. Often these infrastructure features go uninspected simply because there is no viable way to access their interior. In addition, it is not uncommon for extremely confined spaces to possess a maze-like architecture that must be selectively navigated in order to properly perform an inspection. Efforts by the imaging sensor community have resulted in the development of imaging sensors on the millimeter length scale. Due to their compact size, they are able to inspect many extremely confined spaces of interest; however, the means to deliver these sensors to the proper location to obtain the desired images are lacking. To address this problem, we draw inspiration from the field of endoscopic surgery. Specifically, we consider the work that has already been done to create long flexible needles that are capable of being steered through the human body. These devices are typically referred to as 'steerable needles.' Steerable needle technology is not directly applicable to the problem of navigating maze-like arrangements of extremely confined spaces, but it does provide guidance on how this problem should be approached. Specifically, the super-elastic nitinol tubing material that allows steerable needles to operate is also appropriate for the problem of navigating maze-like arrangements of extremely confined spaces. Furthermore, the portion of the mechanism that enters the extremely confined space is completely mechanical in nature. The mechanical nature of the device is an advantage when the extremely confined space features environmental hazards such as radiation that could degrade an electromechanically operated mechanism.
Here, we present a compliant mechanism developed to navigate maze-like arrangements of extremely confined spaces. The mechanism is shown to be able to selectively navigate past three 90° bends. The ability to selectively navigate extremely confined spaces opens up new possibilities to use emerging miniature imaging technology for infrastructure inspection.
Sensors integration for smartphone navigation: performances and future challenges
NASA Astrophysics Data System (ADS)
Aicardi, I.; Dabove, P.; Lingua, A.; Piras, M.
2014-08-01
Modern smartphones include several sensors commonly used in geomatics applications, such as a digital camera, GNSS (Global Navigation Satellite System) receiver, inertial platform, and RFID and Wi-Fi systems. In this paper the authors test the positioning performance of the internal sensors (Inertial Measurement Unit, IMU) of three modern smartphones (Samsung Galaxy S4, Samsung Galaxy S5 and iPhone 4) against an external mass-market IMU platform in order to verify their accuracy levels. Moreover, the Image Based Navigation (IBN) approach is investigated: this approach can be very useful in harsh urban environments or for indoor positioning, as an alternative to GNSS positioning. IBN can provide sub-metre accuracy, but a database of georeferenced images (Image DataBase, IDB) is needed; dedicated algorithms are also required to resize the images collected by the smartphone so that they can be shared with the server where the IDB is stored, and the smartphone camera lens must be characterized in terms of focal length and lens distortion. The authors have developed an innovative method with respect to those available today, which has been tested in a covered area using a special support on which all sensors under test were installed. Geomatic instruments were used to define the reference trajectory against which the IBN solution was compared. First results show horizontal and vertical accuracies better than 60 cm with respect to the reference trajectories. The IBN method, sensors, tests and results are described in the paper.
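The comparison of a computed path against a reference trajectory reduces to per-epoch horizontal offsets and their statistics. A minimal sketch follows; the function and variable names are hypothetical, not from the paper:

```python
import math

def horizontal_errors(est, ref):
    """Per-epoch horizontal offsets between an estimated track and a
    time-aligned reference trajectory (lists of (E, N) tuples, metres)."""
    return [math.hypot(e[0] - r[0], e[1] - r[1]) for e, r in zip(est, ref)]

ref = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
est = [(0.0, 0.3), (1.0, -0.4), (2.3, 0.0)]
errs = horizontal_errors(est, ref)
rmse = math.sqrt(sum(d * d for d in errs) / len(errs))
worst = max(errs)   # the paper reports accuracies better than ~60 cm
```

The same statistic can be computed separately on the vertical component to reproduce the paper's horizontal/vertical accuracy split.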
NASA Technical Reports Server (NTRS)
Moore, Andrew J.; Schubert, Matthew; Rymer, Nicholas; Balachandran, Swee; Consiglio, Maria; Munoz, Cesar; Smith, Joshua; Lewis, Dexter; Schneider, Paul
2017-01-01
Flights at low altitudes in close proximity to electrical transmission infrastructure present serious navigational challenges: GPS and radio communication quality is variable and yet tight position control is needed to measure defects while avoiding collisions with ground structures. To advance unmanned aerial vehicle (UAV) navigation technology while accomplishing a task with economic and societal benefit, a high voltage electrical infrastructure inspection reference mission was designed. An integrated air-ground platform was developed for this mission and tested in two days of experimental flights to determine whether navigational augmentation was needed to successfully conduct a controlled inspection experiment. The airborne component of the platform was a multirotor UAV built from commercial off-the-shelf hardware and software, and the ground component was a commercial laptop running open source software. A compact ultraviolet sensor mounted on the UAV can locate 'hot spots' (potential failure points in the electric grid), so long as the UAV flight path adequately samples the airspace near the power grid structures. To improve navigation, the platform was supplemented with two navigation technologies: lidar-to-polyhedron preflight processing for obstacle demarcation and inspection distance planning, and trajectory management software to enforce inspection standoff distance. Both navigation technologies were essential to obtaining useful results from the hot spot sensor in this obstacle-rich, low-altitude airspace. Because the electrical grid extends into crowded airspaces, the UAV position was tracked with NASA unmanned aerial system traffic management (UTM) technology. The following results were obtained: (1) Inspection of high-voltage electrical transmission infrastructure to locate 'hot spots' of ultraviolet emission requires navigation methods that are not broadly available and are not needed at higher altitude flights above ground structures. 
(2) The sensing capability of a novel airborne UV detector was verified with a standard ground-based instrument. Flights with this sensor showed that UAV measurement operations and recording methods are viable. With improved sensor range, UAVs equipped with compact UV sensors could serve as the detection elements in a self-diagnosing power grid. (3) Simplification of rich lidar maps to polyhedral obstacle maps reduces data volume by orders of magnitude, so that computation with the resultant maps in real time is possible. This enables real-time obstacle avoidance autonomy. Stable navigation may be feasible in the GPS-deprived environment near transmission lines by a UAV that senses ground structures and compares them to these simplified maps. (4) A new, formally verified path conformance software system that runs onboard a UAV was demonstrated in flight for the first time. It successfully maneuvered the aircraft after a sudden lateral perturbation that models a gust of wind, and processed lidar-derived polyhedral obstacle maps in real time. (5) Tracking of the UAV in the national airspace using the NASA UTM technology was a key safety component of this reference mission, since the flights were conducted beneath the landing approach to a heavily used runway. Comparison to autopilot tracking showed that UTM tracking accurately records the UAV position throughout the flight path.
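Result (3) above, real-time computation with polyhedral obstacle maps, can be illustrated with a half-space representation of a convex obstacle; this is a generic sketch under assumed conventions, not NASA's software:

```python
def inside(point, halfspaces, margin=0.0):
    """A convex polyhedral obstacle stored as half-spaces (n, d) with
    n . x <= d. Returns True if the point lies inside the obstacle
    inflated by `margin` (a standoff buffer)."""
    return all(sum(ni * xi for ni, xi in zip(n, point)) <= d + margin
               for n, d in halfspaces)

# Unit cube [0,1]^3 as six half-spaces
cube = [((1, 0, 0), 1), ((-1, 0, 0), 0), ((0, 1, 0), 1),
        ((0, -1, 0), 0), ((0, 0, 1), 1), ((0, 0, -1), 0)]
ok = inside((0.5, 0.5, 0.5), cube)                  # inside the obstacle
clear = not inside((2.0, 0.5, 0.5), cube)           # well clear of it
buffered = inside((1.2, 0.5, 0.5), cube, margin=0.5)  # inside the standoff
```

A few dot products per face is why a polyhedral map, unlike a raw lidar point cloud, can be checked onboard in real time.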
A novel navigation method used in a ballistic missile
NASA Astrophysics Data System (ADS)
Qian, Hua-ming; Sun, Long; Cai, Jia-nan; Peng, Yu
2013-10-01
The traditional strapdown inertial/celestial integrated navigation method used in a ballistic missile cannot accurately estimate the accelerometer bias, which can cause navigation errors to diverge. To solve this problem, a new navigation method, strapdown inertial/starlight-refraction celestial integrated navigation, is proposed. To apply this method to a ballistic missile, a novel measurement equation based on stellar refraction is developed, and a method to calculate the number of refraction stars observed by the star sensor is given. To verify the feasibility of the proposed method, a simulation program for a ballistic missile is presented. The simulation results indicate that, when multiple refraction stars are used, the proposed method can accurately estimate the accelerometer bias and completely suppress the divergence of navigation errors. Finally, the relationship between the number of refraction stars used and the navigation accuracy is analysed.
COBALT: A GN&C Payload for Testing ALHAT Capabilities in Closed-Loop Terrestrial Rocket Flights
NASA Technical Reports Server (NTRS)
Carson, John M., III; Amzajerdian, Farzin; Hines, Glenn D.; O'Neal, Travis V.; Robertson, Edward A.; Seubert, Carl; Trawny, Nikolas
2016-01-01
The COBALT (CoOperative Blending of Autonomous Landing Technology) payload is being developed within NASA as a risk-reduction activity to mature, integrate and test ALHAT (Autonomous precision Landing and Hazard Avoidance Technology) systems targeted for infusion into near-term robotic and future human space flight missions. The initial COBALT payload instantiation integrates the third-generation ALHAT Navigation Doppler Lidar (NDL) sensor, for ultra-high-precision velocity plus range measurements, with the passive-optical Lander Vision System (LVS), which provides Terrain Relative Navigation (TRN) global-position estimates. The COBALT payload will be integrated onboard a rocket-propulsive terrestrial testbed and will provide precise navigation estimates and guidance planning during two flight test campaigns in 2017 (one open-loop and one closed-loop). The NDL is targeting performance capabilities desired for future Mars and Moon Entry, Descent and Landing (EDL). The LVS is already baselined for TRN on the Mars 2020 robotic lander mission. The COBALT platform will provide NASA with a new risk-reduction capability to test integrated EDL Guidance, Navigation and Control (GN&C) components in closed-loop flight demonstrations prior to the actual mission EDL.
Landmark-aided localization for air vehicles using learned object detectors
NASA Astrophysics Data System (ADS)
DeAngelo, Mark Patrick
This research presents two methods to localize an aircraft without GPS using fixed landmarks observed from an optical sensor. Onboard absolute localization is useful for vehicle navigation free from an external network. The objective is to achieve practical navigation performance using available autopilot hardware and a downward-pointing camera. The first method uses computer vision cascade object detectors, which are trained to detect predetermined, distinct landmarks prior to a flight. The first method also concurrently explores aircraft localization using roads between landmark updates. During a flight, the aircraft navigates with attitude, heading, airspeed, and altitude measurements and obtains measurement updates when landmarks are detected. The sensor measurements and landmark coordinates extracted from the aircraft's camera images are combined in an unscented Kalman filter to obtain an estimate of the aircraft's position and wind velocities. The second method uses computer vision object detectors to detect abundant generic landmarks, referred to as buildings, fields, trees, and road intersections, from aerial perspectives. Various landmark attributes and spatial relationships to other landmarks are used to help associate observed landmarks with reference landmarks. The computer vision algorithms automatically extract reference landmarks from maps, which are processed offline before a flight. During a flight, the aircraft navigates with attitude, heading, airspeed, and altitude measurements and obtains measurement corrections by processing aerial photos with similar generic landmark detection techniques. The method also combines sensor measurements and landmark coordinates in an unscented Kalman filter to obtain an estimate of the aircraft's position and wind velocities.
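Extracting a position fix from a detected landmark can be sketched under strong simplifying assumptions (level flight, nadir camera, flat terrain); the function and frame conventions below are hypothetical and far simpler than the dissertation's unscented-Kalman-filter formulation:

```python
import math

def position_from_landmark(lm_en, px, py, alt_agl, f_px, heading):
    """Invert a nadir-camera landmark observation: the pixel offset
    (px, py) of a known landmark, the focal length in pixels, the
    altitude above ground, and the heading give the aircraft's
    horizontal (East, North) position."""
    # Ground-plane offset of the landmark from the nadir point (body frame)
    bx = px * alt_agl / f_px
    by = py * alt_agl / f_px
    # Rotate the body-frame offset into East/North by the heading
    de = bx * math.cos(heading) - by * math.sin(heading)
    dn = bx * math.sin(heading) + by * math.cos(heading)
    return (lm_en[0] - de, lm_en[1] - dn)

# Landmark at E=100 m, N=200 m, seen offset (50, -30) px at 100 m AGL
pos = position_from_landmark((100.0, 200.0), 50.0, -30.0, 100.0, 1000.0, 0.0)
```

In the paper's filters such a fix enters as a measurement update rather than replacing the state outright, which is how attitude and altitude errors are absorbed.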
Low Altitude Navigation Augmentation System.
1981-12-01
… the crew … can use a SLAR as the primary navigation sensor … with the map-generated view … the locations and an indication of the type of measurement employed (e.g. star-tracker, landmark, bearing measurement, etc.).
Development and Flight Test of a Robust Optical-Inertial Navigation System Using Low-Cost Sensors
2008-03-01
… for this test. Though marketed as a GPS/INS, it was in fact used simply as an IMU for this test. The raw inertial measurement data (from the … Performance Evaluation of Low Cost MEMS-Based IMU Integrated With GPS for Land Vehicle Navigation Application. MS Thesis, UCGE Reports Number …
Doppler Lidar Sensor for Precision Landing on the Moon and Mars
NASA Technical Reports Server (NTRS)
Amzajerdian, Farzin; Petway, Larry; Hines, Glenn; Barnes, Bruce; Pierrottet, Diego; Lockhard, George
2012-01-01
Landing mission concepts that are being developed for exploration of planetary bodies are increasingly ambitious in their implementations and objectives. Most of these missions require accurate position and velocity data during their descent phase in order to ensure a safe soft landing at the pre-designated site. To address this need, a Doppler lidar is being developed by NASA under the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. This lidar sensor is a versatile instrument capable of providing precision velocity vectors, vehicle ground-relative altitude, and attitude. The capabilities of this advanced technology have been demonstrated through two helicopter flight test campaigns conducted over vegetation-free terrain in 2008 and 2010. Presently, a prototype version of this sensor is being assembled for integration into a rocket-powered terrestrial free-flyer vehicle. Operating in a closed loop with the vehicle's guidance and navigation system, the viability of this advanced sensor for future landing missions will be demonstrated through a series of flight tests in 2012.
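Per beam, a coherent Doppler lidar measures line-of-sight speed via the standard relation v = λ·f_D/2, and several non-coplanar beams resolve the full velocity vector. The wavelength and beam geometry below are illustrative assumptions, not the ALHAT sensor's actual parameters:

```python
import numpy as np

WAVELENGTH = 1.55e-6  # metres; an assumed value, not the ALHAT figure

def los_velocity(doppler_hz, wavelength=WAVELENGTH):
    """Line-of-sight speed from a coherent-lidar Doppler shift:
    v = lambda * f_D / 2 (the round trip doubles the shift)."""
    return wavelength * doppler_hz / 2.0

# Three non-coplanar beams turn scalar LOS speeds into a velocity vector:
# each beam i measures m_i = u_i . v, so stack the beams and solve U v = m.
U = np.array([[0.10,  0.000, 0.995],
              [-0.05,  0.087, 0.995],
              [-0.05, -0.087, 0.995]])   # illustrative beam directions
v_true = np.array([1.0, -0.5, 2.0])      # metres per second
m = U @ v_true                           # simulated per-beam LOS speeds
v_est = np.linalg.solve(U, m)            # recovers the velocity vector
```

Altitude follows from the same beams' range returns, which is how one instrument supplies velocity, altitude and attitude information.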
NASA Astrophysics Data System (ADS)
Lebedev, M. A.; Stepaniants, D. G.; Komarov, D. V.; Vygolov, O. V.; Vizilter, Yu. V.; Zheltov, S. Yu.
2014-08-01
The paper addresses a promising visualization concept based on the combination of sensor and synthetic images to enhance the situation awareness of a pilot during aircraft landing. A real-time algorithm is proposed for fusing a sensor image, acquired by an onboard camera, with a synthetic 3D image of the external view generated in an onboard computer. The pixel correspondence between the sensor and synthetic images is obtained by exterior orientation of a "virtual" camera using runway points as a geospatial reference. The runway points are detected by the Projective Hough Transform, the idea of which is to project the edge map onto a horizontal plane in object space (the runway plane) and then to compute intensity projections of edge pixels along different directions of the intensity gradient. Experiments on simulated images show that on the base glide path the algorithm provides image fusion with pixel accuracy, even in the presence of significant navigation errors.
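At its core, projecting an edge map onto the runway plane is a back-projection of each pixel ray to a horizontal plane. The sketch below assumes a pinhole camera with known pose and is not the authors' implementation:

```python
import numpy as np

def pixel_to_ground(K, R, t, uv, plane_z=0.0):
    """Back-project pixel (u, v) onto the horizontal plane z = plane_z.
    K: 3x3 intrinsics; R: camera-to-world rotation; t: camera centre."""
    ray = R @ np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])
    s = (plane_z - t[2]) / ray[2]          # scale the ray to the plane
    return t + s * ray

K = np.array([[100.0, 0, 0], [0, 100.0, 0], [0, 0, 1.0]])
R = np.array([[1.0, 0, 0], [0, -1.0, 0], [0, 0, -1.0]])  # nadir view
t = np.array([0.0, 0.0, 10.0])                           # 10 m above plane
p = pixel_to_ground(K, R, t, (10.0, 0.0))   # lands 1 m east of nadir
```

Applying this to every edge pixel flattens the runway edges into straight lines on the ground plane, where the subsequent gradient-direction projections can be accumulated.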
Advanced integrated enhanced vision systems
NASA Astrophysics Data System (ADS)
Kerr, J. R.; Luk, Chiu H.; Hammerstrom, Dan; Pavel, Misha
2003-09-01
In anticipation of its ultimate role in transport, business and rotary wing aircraft, we clarify the role of Enhanced Vision Systems (EVS): how the output data will be utilized, appropriate architecture for total avionics integration, pilot and control interfaces, and operational utilization. Ground-map (database) correlation is critical, and we suggest that "synthetic vision" is simply a subset of the monitor/guidance interface issue. The core of integrated EVS is its sensor processor. In order to approximate optimal Bayesian multi-sensor fusion and ground correlation functionality in real time, we are developing a neural net approach utilizing human visual pathway and self-organizing, associative-engine processing. In addition to EVS/SVS imagery, outputs will include sensor-based navigation and attitude signals as well as hazard detection. A system architecture is described, encompassing an all-weather sensor suite; advanced processing technology; inertial, GPS and other avionics inputs; and pilot and machine interfaces. Issues of total-system accuracy and integrity are addressed, as well as flight operational aspects relating to both civil certification and military applications in IMC.
NASA Astrophysics Data System (ADS)
Katake, Anup; Choi, Heeyoul
2010-01-01
To enable autonomous air-to-air refueling of manned and unmanned vehicles, a robust, high-speed relative navigation sensor capable of providing high-accuracy 3DOF information in diverse operating conditions is required. To help address this problem, StarVision Technologies Inc. has been developing a compact, high-update-rate (100 Hz), wide field-of-view (90 deg) direction and range estimation imaging sensor called VisNAV 100. The sensor is fully autonomous, requiring no communication from the tanker aircraft, and contains high-reliability embedded avionics to provide range, azimuth, elevation (a 3-degrees-of-freedom solution, 3DOF) and closing speed relative to the tanker aircraft. The sensor provides a 3DOF solution with errors of 1% in range and 0.1 deg in azimuth/elevation up to a range of 30 m, and 1 deg error in direction for ranges up to 200 m, at 100 Hz update rates. In this paper we discuss the algorithms that were developed in-house to enable robust beacon-pattern detection, outlier rejection and 3DOF estimation in adverse conditions, and present the results of several outdoor tests. Results from the long-range single-beacon detection tests are also discussed.
GPS Navigation for the Magnetospheric Multi-Scale Mission
NASA Technical Reports Server (NTRS)
Bamford, William; Mitchell, Jason; Southward, Michael; Baldwin, Philip; Winternitz, Luke; Heckler, Gregory; Kurichh, Rishi; Sirotzky, Steve
2009-01-01
In 2014, NASA is scheduled to launch the Magnetospheric Multiscale Mission (MMS), a four-satellite formation designed to monitor fluctuations in the Earth's magnetosphere. This mission has two planned phases with different orbits (1.2 x 12 Re and 1.2 x 25 Re) to allow for varying science regions of interest. To minimize ground resources and to mitigate the probability of collisions between formation members, an on-board orbit determination system consisting of a Global Positioning System (GPS) receiver and crosslink transceiver was desired. Candidate sensors would be required to acquire GPS signals both below and above the constellation while spinning at three revolutions per minute (RPM) and exchanging state and science information among the constellation. The Intersatellite Ranging and Alarm System (IRAS), developed by Goddard Space Flight Center (GSFC), was selected to meet this challenge. IRAS leverages the eight years of development GSFC has invested in the Navigator GPS receiver and its spacecraft communication expertise, culminating in a sensor capable of absolute and relative navigation as well as intersatellite communication. The Navigator is a state-of-the-art receiver designed to acquire and track weak GPS signals down to -147 dBm. This capability allows the receiver to track both the main lobe and the much weaker side lobe signals. The Navigator's four antenna inputs and 24 tracking channels, together with customized hardware and software, allow it to seamlessly maintain visibility while rotating. Additionally, an extended Kalman filter provides autonomous, near real-time, absolute state and time estimates. The Navigator made its maiden voyage on the Space Shuttle during the Hubble Servicing Mission, and is scheduled to fly on MMS as well as the Global Precipitation Measurement Mission (GPM). Additionally, Navigator's acquisition engine will be featured in the receiver being developed for the Orion vehicle.
The crosslink transceiver is a 1/4-Watt transmitter utilizing a TDMA schedule to distribute a science-quality message to all constellation members every ten seconds. Additionally, the system generates one-way range measurements between formation members, which are used as input to the Kalman filter. In preparation for the MMS Preliminary Design Review (PDR), the Navigator was required to pass a series of Technology Readiness Level (TRL) tests to earn the necessary TRL-6 classification. The TRL-6 level is achieved by demonstrating a prototype unit in a relevant end-to-end environment. The IRAS unit met all requirements during the testing phase and has thus been TRL-6 qualified.
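The one-way range measurement generated by the crosslink reduces to a timestamp difference scaled by the speed of light. The sketch below assumes synchronized clocks (in practice the relative clock offset must itself be estimated by the filter) and is not the IRAS flight code:

```python
C = 299_792_458.0  # speed of light, m/s

def one_way_range(t_tx, t_rx, clock_bias_s=0.0):
    """One-way crosslink range from transmit/receive timestamps. Any
    unestimated clock offset between the spacecraft maps directly into
    range error, so the filter must estimate or absorb it."""
    return C * (t_rx - t_tx - clock_bias_s)

# A 10 km separation gives roughly 33 microseconds of flight time
tof = 10_000.0 / C
r = one_way_range(0.0, tof)
```

Feeding such ranges to each member's Kalman filter is what ties the absolute GPS solutions into a consistent relative navigation state.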
Impact Assessment of GNSS Spoofing Attacks on INS/GNSS Integrated Navigation System.
Liu, Yang; Li, Sihai; Fu, Qiangwen; Liu, Zhenbo
2018-05-04
In the face of emerging Global Navigation Satellite System (GNSS) spoofing attacks, there is a need to give a comprehensive analysis on how the inertial navigation system (INS)/GNSS integrated navigation system responds to different kinds of spoofing attacks. A better understanding of the integrated navigation system’s behavior with spoofed GNSS measurements gives us valuable clues to develop effective spoofing defenses. This paper focuses on an impact assessment of GNSS spoofing attacks on the integrated navigation system Kalman filter’s error covariance, innovation sequence and inertial sensor bias estimation. A simple and straightforward measurement-level trajectory spoofing simulation framework is presented, serving as the basis for an impact assessment of both unsynchronized and synchronized spoofing attacks. Recommendations are given for spoofing detection and mitigation based on our findings in the impact assessment process.
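A common innovation-based consistency check of the kind the paper's recommendations point toward is the normalized innovation squared (NIS) test; the covariance, innovation vectors and threshold below are made-up illustrative numbers, not results from the paper:

```python
import numpy as np

def innovation_test(innov, S, threshold):
    """Normalized innovation squared (NIS): d = y^T S^-1 y. Values far
    above a chi-square threshold for dim(y) degrees of freedom flag
    measurements inconsistent with the filter's prediction."""
    d = float(innov @ np.linalg.solve(S, innov))
    return d, d > threshold

S = np.diag([4.0, 4.0, 9.0])             # innovation covariance (m^2)
nominal = np.array([1.0, -2.0, 3.0])     # plausible GNSS innovation
spoofed = np.array([40.0, -35.0, 30.0])  # trajectory being pulled off
# chi-square 99% point for 3 degrees of freedom is about 11.34
d_ok, flag_ok = innovation_test(nominal, S, 11.34)
d_bad, flag_bad = innovation_test(spoofed, S, 11.34)
```

As the paper notes, a well-synchronized spoofer can keep innovations small, which is why the assessment also examines error covariance and inertial sensor bias estimates.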
Laser Range and Bearing Finder for Autonomous Missions
NASA Technical Reports Server (NTRS)
Granade, Stephen R.
2004-01-01
NASA has recently re-confirmed its interest in autonomous systems as an enabling technology for future missions. In order for autonomous missions to be possible, highly capable relative sensor systems are needed to determine an object's distance, direction, and orientation. This is true whether the mission is autonomous in-space assembly, rendezvous and docking, or rover surface navigation. Advanced Optical Systems, Inc. has developed a wide-angle laser range and bearing finder (RBF) for autonomous space missions. The laser RBF has a number of features that make it well suited for autonomous missions. It has an operating range of 10 m to 5 km, with a 5 deg field of view. Its wide field of view removes the need for scanning systems such as gimbals, eliminating moving parts and making the sensor simpler and space qualification easier. Its range accuracy is 1% or better. It is designed to operate either as a stand-alone sensor or in tandem with a sensor that returns range, bearing, and orientation at close ranges, such as NASA's Advanced Video Guidance Sensor. We have assembled the initial prototype and are currently testing it. We will discuss the laser RBF's design and specifications. Keywords: laser range and bearing finder, autonomous rendezvous and docking, space sensors, on-orbit sensors, advanced video guidance sensor
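A pulsed laser range finder's basic measurement is round-trip time of flight, R = c·t/2. The sketch below is generic, not AOS's design:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_range(round_trip_s):
    """Pulsed-laser range from round-trip time of flight: R = c * t / 2."""
    return C * round_trip_s / 2.0

# A ~6.67 microsecond round trip corresponds to a 1 km target
r = tof_range(2 * 1000.0 / C)
```

Bearing then comes from the target's position on the sensor's wide-angle focal plane, which is what lets this design avoid a gimbaled scanner.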
Acquisition and cruise sensing for attitude control
NASA Technical Reports Server (NTRS)
Pace, G. D., Jr.; Schmidt, L. F.
1977-01-01
A modified wide-angle analog cruise sun sensor, coupled with changes in optical attitude-correction capabilities, eliminates the need for acquisition and sun-gate sensors, making on-course navigation of spacecraft flying interplanetary missions less risky and costly. Its operational characteristics potentially make the system applicable to guidance and control of solar-energy collection systems.
NASA Astrophysics Data System (ADS)
Moafipoor, Shahram
Personal navigators (PN) have been studied for about a decade in different fields and applications, such as safety and rescue operations, security and emergency services, and police and military applications. The common goal of all these applications is to provide precise and reliable position, velocity, and heading information for each individual in various environments. In the PN system developed in this dissertation, the underlying assumption is that the system does not require pre-existing infrastructure to enable pedestrian navigation. To facilitate this capability, a multisensor system concept, based on the Global Positioning System (GPS), inertial navigation, barometer, magnetometer, and a human pedometry model has been developed. An important aspect of this design is to use the human body as a navigation sensor to facilitate Dead Reckoning (DR) navigation in GPS-challenged environments. The system is designed predominantly for outdoor environments, where occasional loss of GPS lock may happen; however, testing and performance demonstration have been extended to indoor environments. DR navigation is based on a relative-measurement approach, with the key idea of integrating the incremental motion information in the form of step direction (SD) and step length (SL) over time. The foundation of the intelligent navigation system concept proposed here rests in exploiting the human locomotion pattern, as well as change of locomotion in varying environments. In this context, the term intelligent navigation represents the transition from the conventional point-to-point DR to dynamic navigation using the knowledge about the mechanism of the moving person. This approach increasingly relies on integrating knowledge-based systems (KBS) and artificial intelligence (AI) methodologies, including artificial neural networks (ANN) and fuzzy logic (FL).
In addition, a general framework of the quality control for the real-time validation of the DR processing is proposed, based on a two-stage Kalman Filter approach. The performance comparison of the algorithm based on different field and simulated datasets, with varying levels of sensor errors, showed that a 90 per cent success rate was achieved in the detection of outliers for SL and 80 per cent for SD. The SL is predicted for both KBS-based ANN and FL approaches with an average accumulated error of 2 per cent, observed for the total distance traveled, which is generally an improvement over most of the existing pedometry systems. The target accuracy of the system is +/-(3-5) m CEP50 (circular error probable, 50%). This dissertation provides a performance analysis in outdoor and indoor environments for different operators. Another objective of this dissertation is to test the system's navigation limitation in DR mode in terms of time and trajectory length in order to determine the upper limit of indoor operations. It was determined that for more than four indoor loops, where the user walked 261 m in about 6.5 minutes, the DR performance met the required accuracy specifications. However, these results are only relevant to the existing data. Future studies should consider more comprehensive performance analysis for longer trajectories in challenging environments and possible extension to image-based navigation to expand the indoor capability of the system.
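The step-length/step-direction dead reckoning described above integrates incremental motion over time; a minimal sketch follows (the closed square walk is an illustrative test case, not data from the dissertation):

```python
import math

def dead_reckon(start_en, steps):
    """Accumulate (step_length_m, step_direction_rad) pairs into an
    (East, North) track; directions are headings measured from north."""
    e, n = start_en
    track = [(e, n)]
    for sl, sd in steps:
        e += sl * math.sin(sd)
        n += sl * math.cos(sd)
        track.append((e, n))
    return track

# Four 1 m steps, turning 90 degrees each time, return to the start
square = [(1.0, 0.0), (1.0, math.pi / 2),
          (1.0, math.pi), (1.0, 3 * math.pi / 2)]
track = dead_reckon((0.0, 0.0), square)
```

Because errors in SL and SD accumulate with every step, the dissertation's ANN/FL prediction of SL and its quality-control outlier detection directly bound the drift of this integration.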
NASA Astrophysics Data System (ADS)
Ferrini, V.; Fornari, D. J.; Shank, T.; Tivey, M.; Kelley, D. S.; Glickson, D.; Carbotte, S. M.; Howland, J.; Whitcomb, L. L.; Yoerger, D.
2004-12-01
Recent field programs at the East Pacific Rise and Juan de Fuca Ridge have resulted in the refinement of data processing protocols that enable the rapid creation of high-resolution (meter-scale) bathymetric maps from pencil-beam altimetric sonar data that are routinely collected during DSV Alvin dives. With the development of the appropriate processing tools, the Imagenex sonar, a permanent sensor on Alvin, can be used by a broad range of scientists permitting the analysis of various data sets within the context of high-quality bathymetric maps. The data processing protocol integrates depth data recorded with Alvin's Paroscientific pressure sensor with bathymetric soundings collected with an Imagenex 675 kHz articulating (scanning) sonar system, and high-resolution navigational data acquired with DVLNAV, which includes bottom lock Doppler sonar and long baseline (LBL) navigation. Together these data allow us, for the first time, to visualize portions of Ridge 2000 Integrated Study Sites (ISS) at 1-m vertical and horizontal resolution. These maps resolve morphological details of structures within the summit trough at scales that are relevant to biological communities (e.g. hydrothermal vents, lava pillars, trough walls), thus providing the important geologic context necessary to better understand spatial patterns associated with integrated biological-hydrothermal-geological processes. The Imagenex sonar is also a permanent sensor on the Jason2 ROV, which is also equipped with an SM2000 (200 kHz) near-bottom multibeam sonar. In the future, it is envisioned that near-bottom multibeam sonars will be standard sensors on all National Deep Submergence Facility (NDSF) vehicles. Streamlining data processing protocols makes these datasets more accessible to NDSF users and ensures broad compatibility between data formats among NDSF vehicle systems and allied vehicles (e.g. ABE). Establishing data processing protocols and software suites, routinely calibrating sensors (e.g. 
Paroscientific depth sensors), and ensuring good navigational benchmarks between various cruises to the Ridge 2000 ISS improve the capability and quality of rapidly produced high-resolution bathymetric maps, enabling users to optimize their diving programs. This is especially important within the context of augmenting high-resolution bathymetric data collection in ISS areas (several cruises to the same area over multiple years) and investigating possible changes in seafloor topography, hydrothermal vent features and/or biological communities that are related to tectonic or volcanic events.
Bioinspired polarization navigation sensor for autonomous munitions systems
NASA Astrophysics Data System (ADS)
Giakos, G. C.; Quang, T.; Farrahi, T.; Deshpande, A.; Narayan, C.; Shrestha, S.; Li, Y.; Agarwal, M.
2013-05-01
Small unmanned aerial vehicles (SUAVs), micro air vehicles (MAVs), automated target recognition (ATR), and munitions guidance require extreme operational agility and robustness, which can be partially provided by efficient bioinspired imaging sensor designs capable of delivering enhanced guidance, navigation and control (GNC) capabilities. Bioinspired imaging technology can prove useful either for long-distance surveillance of targets in a cluttered environment or at close distances limited by space surroundings and obstructions. The purpose of this study is to explore the phenomenology of image formation by different insect-eye architectures, which would directly benefit the areas of defense and security, in the following four distinct areas: (a) fabrication of the bioinspired sensor, (b) optical architecture, (c) topology, and (d) artificial intelligence. The outcome of this study indicates that bioinspired imaging can significantly impact defense and security through dedicated designs fitted to different combat scenarios and applications.
Adaptive UAV Attitude Estimation Employing Unscented Kalman Filter, FOAM and Low-Cost MEMS Sensors
de Marina, Héctor García; Espinosa, Felipe; Santos, Carlos
2012-01-01
Navigation employing low-cost MicroElectroMechanical Systems (MEMS) sensors in Unmanned Aerial Vehicles (UAVs) is an emerging challenge. One important part of this navigation is the correct estimation of the attitude angles. Most existing algorithms handle the sensor readings in a fixed way, leading to large errors in certain mission stages such as take-off and aerobatic maneuvers. This paper presents an adaptive method to estimate these angles using off-the-shelf components. It introduces an Attitude Heading Reference System (AHRS) based on the Unscented Kalman Filter (UKF), using the Fast Optimal Attitude Matrix (FOAM) algorithm as the observation model. The performance of the method is assessed through simulations. Moreover, field experiments are presented using a real fixed-wing UAV. The proposed low-cost solution, implemented in a microcontroller, shows satisfactory real-time performance. PMID:23012559
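The FOAM observation model referenced above is one of several closed-form solutions to Wahba's problem: finding the attitude matrix that best maps reference-frame vectors into body-frame observations. As a rough illustration of the underlying problem, not of the FOAM algorithm itself, here is a minimal SVD-based solver; the function name and API are illustrative assumptions, not from the paper.

```python
import numpy as np

def wahba_svd(body_vecs, ref_vecs, weights=None):
    """Rotation A minimizing sum w_i * |b_i - A r_i|^2 (Wahba's problem, SVD method)."""
    body_vecs = np.asarray(body_vecs, dtype=float)
    ref_vecs = np.asarray(ref_vecs, dtype=float)
    if weights is None:
        weights = np.ones(len(body_vecs))
    # Attitude profile matrix B = sum_i w_i * b_i * r_i^T
    B = sum(w * np.outer(b, r) for w, b, r in zip(weights, body_vecs, ref_vecs))
    U, _, Vt = np.linalg.svd(B)
    # Force a proper rotation (det = +1) rather than a reflection
    M = np.diag([1.0, 1.0, np.linalg.det(U) * np.linalg.det(Vt)])
    return U @ M @ Vt
```

With two or more non-parallel vector pairs (e.g. gravity and the magnetic field), the attitude is fully determined; FOAM reaches the same optimum by a different, SVD-free route.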
Absolute marine gravimetry with matter-wave interferometry.
Bidel, Y; Zahzam, N; Blanchard, C; Bonnin, A; Cadoret, M; Bresson, A; Rouxel, D; Lequentrec-Lalancette, M F
2018-02-12
Measuring gravity from an aircraft or a ship is essential in geodesy, geophysics, mineral and hydrocarbon exploration, and navigation. Today, only relative sensors are available for onboard gravimetry. This is a major drawback because the required calibration and drift estimation procedures impose significant operational constraints. Atom interferometry is a promising technology for obtaining an onboard absolute gravimeter, but despite the high performance obtained under static conditions, no precise measurements had been reported under dynamic conditions. Here, we present absolute gravity measurements from a ship with a sensor based on atom interferometry. Despite rough sea conditions, we obtained precision below 10⁻⁵ m s⁻². The atom gravimeter was also compared with a commercial spring gravimeter and showed better performance. This demonstration opens the way to the next generation of inertial sensors (accelerometers, gyroscopes) based on atom interferometry, which should provide high-precision absolute measurements from a moving platform.
ARCADE-R2 experiment on board BEXUS 17 stratospheric balloon
NASA Astrophysics Data System (ADS)
Barbetta, Marco; Boesso, Alessandro; Branz, Francesco; Carron, Andrea; Olivieri, Lorenzo; Prendin, Jacopo; Rodeghiero, Gabriele; Sansone, Francesco; Savioli, Livia; Spinello, Fabio; Francesconi, Alessandro
2015-09-01
This paper provides an overview of the ARCADE-R2 experiment, a technology demonstrator that aimed to prove the feasibility of small-scale satellite and/or aircraft systems with automatic (a) attitude determination, (b) control and (c) docking capabilities. The experiment embodies a simplified scenario in which an unmanned vehicle mock-up performs rendezvous and docking operations with a fixed complementary unit. The experiment is composed of a supporting structure, which holds a small vehicle with one translational and one rotational degree of freedom, and its fixed target. The dual system features three main custom subsystems: a relative infrared navigation sensor, an attitude control system based on a reaction wheel, and a small-scale docking mechanism. The experiment bus is equipped with pressure and temperature sensors, and wind probes to monitor the external environmental conditions. The experiment flew on board the BEXUS 17 stratospheric balloon on October 10, 2013, where several navigation-control-docking sequences were executed and data on the external pressure, temperature, wind speed and direction were collected, characterizing the atmospheric loads applied to the vehicle. This paper describes the critical components of ARCADE-R2 as well as the main results obtained from the balloon flight.
1999 Flight Mechanics Symposium
NASA Technical Reports Server (NTRS)
Lynch, John P. (Editor)
1999-01-01
This conference publication includes papers and abstracts presented at the Flight Mechanics Symposium held on May 18-20, 1999. Sponsored by the Guidance, Navigation and Control Center of Goddard Space Flight Center, this symposium featured technical papers on a wide range of issues related to orbit-attitude prediction, determination, and control; attitude sensor calibration; attitude determination error analysis; attitude dynamics; and orbit decay and maneuver strategy. Government, industry, and the academic community participated in the preparation and presentation of these papers.
University of Pennsylvania MAGIC 2010 Final Report
2011-01-10
and mapping (SLAM) techniques are employed to build a local map of the environment surrounding the robot. Readings from the two complementary LIDAR sen... [diagram labels: IMU, LIDAR, cameras; localization; disrupter UGV; local navigation sensors: GPS, IMU, LIDAR, cameras; laser control; localization; task planner; strategy/plan] ...various components shown in Figure 2. This comprises the following subsystems: Sensor UGV: mobile UGVs with LIDAR and camera sensors, GPS, and
Allegany Ballistics Lab: sensor test target system
NASA Astrophysics Data System (ADS)
Eaton, Deran S.
2011-06-01
Leveraging the Naval Surface Warfare Center, Indian Head Division's historical experience in weapon simulation, Naval Sea Systems Command commissioned development of a remote-controlled, digitally programmable Sensor Test Target as part of a modern, outdoor hardware-in-the-loop test system for ordnance-related guidance, navigation and control systems. The overall Target system design invokes a sciences-based, "design of automated experiments" approach meant to close the logistical distance between sensor engineering and developmental T&E in outdoor conditions over useful real-world distances. This enables operating modes that employ broad-spectrum electromagnetic energy in many desired combinations, variably generated using a Jet Engine Simulator, a multispectral infrared emitter array, optically enhanced incandescent Flare Simulators, Emitter/Detector mounts, and an RF corner reflector kit. As assembled, the recently tested Sensor Test Target prototype presented here can provide a full array of useful RF and infrared target source simulations for RDT&E use with developmental and existing sensors. Certain Target technologies are patent pending, with potential spinoffs in aviation, metallurgy and biofuels processing, while others are variations on well-established technology. The Sensor Test Target System is planned for extended installation at Allegany Ballistics Laboratory (Rocket Center, WV).
Accuracy of continuous glucose monitoring during exercise in type 1 diabetes pregnancy.
Kumareswaran, Kavita; Elleri, Daniela; Allen, Janet M; Caldwell, Karen; Nodale, Marianna; Wilinska, Malgorzata E; Amiel, Stephanie A; Hovorka, Roman; Murphy, Helen R
2013-03-01
Performance of continuous glucose monitors (CGMs) may be lower when glucose levels are changing rapidly, such as occurs during physical activity. Our aim was to evaluate the accuracy of a current-generation CGM during moderate-intensity exercise in type 1 diabetes (T1D) pregnancy. As part of a study of 24-h closed-loop insulin delivery in 12 women with T1D (disease duration, 17.6 years; glycosylated hemoglobin, 6.4%) during pregnancy (gestation, 21 weeks), we evaluated the Freestyle Navigator(®) sensor (Abbott Diabetes Care, Alameda, CA) during afternoon (15:00-18:00 h) and morning (09:30-12:30 h) exercise (55 min of brisk walking on a treadmill followed by a 2-h recovery), compared with sedentary conditions (18:00-09:00 h). Plasma (reference) glucose, measured at regular 15-30-min intervals with the YSI Ltd. (Fleet, United Kingdom) model YSI 2300 analyzer, was used to assess CGM performance. Sensor accuracy, as indicated by a larger relative absolute difference (RAD) between paired sensor and reference glucose values, was lower during exercise compared with rest (median RAD, 18.4% vs. 11.8%; P<0.001). These differences remained significant when correcting for the relative rate of change of plasma glucose (P<0.001). Analysis by glucose range showed lower accuracy during hypoglycemia for both sedentary (median RAD, 24.4%) and exercise (median RAD, 32.1%) conditions. Using Clarke error grid analysis, 96% of CGM values were clinically safe under resting conditions compared with only 87% during exercise. Compared with sedentary conditions, the accuracy of the Freestyle Navigator CGM was lower during moderate-intensity exercise in pregnant women with T1D. This difference was particularly marked in hypoglycemia and could not be solely explained by the glucose rate of change associated with physical activity.
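The relative absolute difference (RAD) statistic reported above is simple to compute from paired readings; a minimal sketch, with the function name and percentage convention assumed rather than taken from the paper:

```python
import numpy as np

def median_rad(sensor, reference):
    """Median relative absolute difference (%) between paired sensor
    and reference glucose values. Reference values must be nonzero."""
    sensor = np.asarray(sensor, dtype=float)
    reference = np.asarray(reference, dtype=float)
    rad = np.abs(sensor - reference) / reference * 100.0
    return float(np.median(rad))
```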
Statistical Sensor Fusion of a 9-DOF Mems Imu for Indoor Navigation
NASA Astrophysics Data System (ADS)
Chow, J. C. K.
2017-09-01
Sensor fusion of a MEMS IMU with a magnetometer is a popular system design, because such 9-DoF (degrees of freedom) systems are capable of achieving drift-free 3D orientation tracking. However, these systems are often vulnerable to ambient magnetic distortions and lack useful position information; in the absence of external position aiding (e.g. satellite/ultra-wideband positioning systems) the dead-reckoned position accuracy from a 9-DoF MEMS IMU deteriorates rapidly due to unmodelled errors. Positioning information is valuable in many satellite-denied geomatics applications (e.g. indoor navigation, location-based services, etc.). This paper proposes an improved 9-DoF IMU indoor pose tracking method using batch optimization. By adopting a robust in-situ user self-calibration approach to model the systematic errors of the accelerometer, gyroscope, and magnetometer simultaneously in a tightly-coupled post-processed least-squares framework, the accuracy of the estimated trajectory from a 9-DoF MEMS IMU can be improved. Through a combination of relative magnetic measurement updates and a robust weight function, the method is able to tolerate a high level of magnetic distortions. The proposed auto-calibration method was tested in-use under various heterogeneous magnetic field conditions to mimic a person walking with the sensor in their pocket, a person checking their phone, and a person walking with a smartwatch. In these experiments, the presented algorithm improved the in-situ dead-reckoning orientation accuracy by 79.8-89.5 % and the dead-reckoned positioning accuracy by 72.9-92.8 %, thus reducing the relative positioning error from metre-level to decimetre-level after ten seconds of integration, without making assumptions about the user's dynamics.
Automated baseline change detection -- Phases 1 and 2. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Byler, E.
1997-10-31
The primary objective of this project is to apply robotic and optical sensor technology to the operational inspection of mixed toxic and radioactive waste stored in barrels, using Automated Baseline Change Detection (ABCD) based on image subtraction. Absolute change detection is based on detecting any visible physical changes, regardless of cause, between a current inspection image of a barrel and an archived baseline image of the same barrel. Thus, in addition to rust, the ABCD system can also detect corrosion, leaks, dents, and bulges. The ABCD approach and method rely on precise camera positioning and repositioning relative to the barrel and on feature recognition in images. The ABCD image processing software was installed on a robotic vehicle developed under a related DOE/FETC contract, DE-AC21-92MC29112 Intelligent Mobile Sensor System (IMSS), and integrated with the electronics and software. This vehicle was designed especially to navigate in DOE waste storage facilities. Initial system testing was performed at Fernald in June 1996. After some further development and more extensive integration, the prototype integrated system was installed and tested at the Radioactive Waste Management Complex (RWMC) at INEEL from April 1997 through the present (November 1997). The integrated system, composed of the ABCD imaging software and the IMSS mobility base, is called MISS EVE (Mobile Intelligent Sensor System--Environmental Validation Expert). Evaluation of the integrated system in RWMC Building 628, containing approximately 10,000 drums, demonstrated an easy-to-use system with the ability to properly navigate through the facility, image all the defined drums, and process the results into a report delivered to the operator on a GUI and in hard copy. Further work is needed to make the brassboard system more operationally robust.
Sensor Data Quality and Angular Rate Down-Selection Algorithms on SLS EM-1
NASA Technical Reports Server (NTRS)
Park, Thomas; Oliver, Emerson; Smith, Austin
2018-01-01
The NASA Space Launch System Block 1 launch vehicle is equipped with an Inertial Navigation System (INS) and multiple Rate Gyro Assemblies (RGA) that are used in the Guidance, Navigation, and Control (GN&C) algorithms. The INS provides the inertial position, velocity, and attitude of the vehicle along with both angular rate and specific force measurements. Additionally, multiple sets of co-located rate gyros supply angular rate data. The collection of angular rate data, taken along the launch vehicle, is used to separate vehicle motion from flexible-body dynamics. Since the system architecture uses redundant sensors, the capability was developed to evaluate the health (or validity) of the independent measurements. A suite of Sensor Data Quality (SDQ) algorithms is responsible for assessing the angular rate data from the redundant sensors. When failures are detected, SDQ takes the appropriate action and disqualifies or removes faulted sensors from forward processing. Additionally, the SDQ algorithms contain logic for down-selecting the angular rate data used by the GN&C software from the set of healthy measurements. This paper provides an overview of the algorithms used for both fault detection and measurement down-selection.
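The abstract does not specify the down-selection logic itself; a common scheme for redundant rate channels is mid-value selection among the sensors the quality checks have judged healthy. The sketch below is an illustrative assumption, not the SLS flight software:

```python
def down_select(measurements, healthy):
    """Pick one angular-rate value from redundant channels:
    mid-value select for 3+ healthy channels (rejects a single outlier),
    average for 2, pass-through for 1."""
    vals = sorted(m for m, ok in zip(measurements, healthy) if ok)
    if not vals:
        raise ValueError("no healthy channels")
    n = len(vals)
    if n >= 3:
        return vals[n // 2]  # middle of the sorted healthy values
    return sum(vals) / n
```

With three healthy channels, a single hard-over failure that escapes fault detection still cannot drive the selected value, which is the usual motivation for mid-value selection.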
Yuan, Xuebing; Yu, Shuai; Zhang, Shengzhi; Wang, Guoping; Liu, Sheng
2015-01-01
Inertial navigation based on micro-electromechanical system (MEMS) inertial measurement units (IMUs) has attracted numerous researchers due to its high reliability and independence. Heading estimation, as one of the most important parts of inertial navigation, has been a research focus in this field. Heading estimation using magnetometers is perturbed by magnetic disturbances, such as those from indoor concrete structures and electronic equipment. The MEMS gyroscope is also used for heading estimation; however, gyroscope accuracy degrades over time. In this paper, a wearable multi-sensor system has been designed to obtain high-accuracy indoor heading estimation, based on a quaternion-based unscented Kalman filter (UKF) algorithm. The proposed multi-sensor system, comprising one three-axis accelerometer, three single-axis gyroscopes, one three-axis magnetometer and one microprocessor, minimizes size and cost. The wearable multi-sensor system was fixed on the waist of a pedestrian and on a quadrotor unmanned aerial vehicle (UAV) for heading estimation experiments in our college building. The results show that the mean heading estimation errors are less than 10° and 5° for the system fixed on the pedestrian's waist and on the quadrotor UAV, respectively, compared to the reference path. PMID:25961384
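A full quaternion UKF is beyond a short sketch, but the complementary roles of the accelerometer (gravity) and magnetometer in heading estimation can be illustrated with a standard tilt-compensated compass computation. Conventions here are assumptions (NED-style axes, no declination correction); this is not the paper's algorithm:

```python
import math

def tilt_compensated_heading(ax, ay, az, mx, my, mz):
    """Heading in rad (0..2*pi) from accelerometer (gravity direction)
    and magnetometer readings; valid only when not accelerating."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    # Rotate the magnetic field vector into the horizontal plane
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(-yh, xh) % (2.0 * math.pi)
```

Indoors, nearby steel and electronics distort mx/my/mz, which is exactly the failure mode the paper's UKF fusion is designed to mitigate.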
Accuracy Enhancement of Inertial Sensors Utilizing High Resolution Spectral Analysis
Noureldin, Aboelmagd; Armstrong, Justin; El-Shafie, Ahmed; Karamat, Tashfeen; McGaughey, Don; Korenberg, Michael; Hussain, Aini
2012-01-01
In both military and civilian applications, the inertial navigation system (INS) and the global positioning system (GPS) are two complementary technologies that can be integrated to provide reliable positioning and navigation information for land vehicles. The accuracy enhancement of INS sensors and the integration of INS with GPS are the subjects of widespread research. Wavelet de-noising of INS sensors has had limited success in removing the long-term (low-frequency) inertial sensor errors. The primary objective of this research is to develop a novel inertial sensor accuracy enhancement technique that can remove both short-term and long-term error components from inertial sensor measurements prior to INS mechanization and INS/GPS integration. A high resolution spectral analysis technique called the fast orthogonal search (FOS) algorithm is used to accurately model the low frequency range of the spectrum, which includes the vehicle motion dynamics and inertial sensor errors. FOS models the spectral components with the most energy first and uses an adaptive threshold to stop adding frequency terms when fitting a term does not reduce the mean squared error more than fitting white noise. The proposed method was developed, tested and validated through road test experiments involving both low-end tactical grade and low cost MEMS-based inertial systems. The results demonstrate that in most cases the position accuracy during GPS outages using FOS de-noised data is superior to the position accuracy using wavelet de-noising.
An Application of UAV Attitude Estimation Using a Low-Cost Inertial Navigation System
NASA Technical Reports Server (NTRS)
Eure, Kenneth W.; Quach, Cuong Chi; Vazquez, Sixto L.; Hogge, Edward F.; Hill, Boyd L.
2013-01-01
Unmanned Aerial Vehicles (UAV) are playing an increasing role in aviation. Various methods exist for the computation of UAV attitude based on low cost microelectromechanical systems (MEMS) and Global Positioning System (GPS) receivers. There has been a recent increase in UAV autonomy as sensors are becoming more compact and onboard processing power has increased significantly. Correct UAV attitude estimation will play a critical role in navigation and separation assurance as UAVs share airspace with civil air traffic. This paper describes attitude estimation derived by post-processing data from a small low cost Inertial Navigation System (INS) recorded during the flight of a subscale commercial off the shelf (COTS) UAV. Two discrete time attitude estimation schemes are presented here in detail. The first is an adaptation of the Kalman Filter to accommodate nonlinear systems, the Extended Kalman Filter (EKF). The EKF returns quaternion estimates of the UAV attitude based on MEMS gyro, magnetometer, accelerometer, and pitot tube inputs. The second scheme is the complementary filter which is a simpler algorithm that splits the sensor frequency spectrum based on noise characteristics. The necessity to correct both filters for gravity measurement errors during turning maneuvers is demonstrated. It is shown that the proposed algorithms may be used to estimate UAV attitude. The effects of vibration on sensor measurements are discussed. Heuristic tuning comments pertaining to sensor filtering and gain selection to achieve acceptable performance during flight are given. Comparisons of attitude estimation performance are made between the EKF and the complementary filter.
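The complementary filter mentioned above splits the sensor frequency spectrum: integrated gyro rates are trusted at high frequency, the accelerometer-derived angle at low frequency. A minimal first-order sketch for the pitch axis; the 0.98 gain and axis conventions are illustrative assumptions, not values from the paper:

```python
import math

def accel_pitch(ax, ay, az):
    # Pitch implied by the measured gravity vector
    # (only valid when the vehicle is not accelerating, e.g. level flight)
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def complementary_pitch(pitch_prev, gyro_rate, ax, ay, az, dt, alpha=0.98):
    # High-pass the integrated gyro, low-pass the accelerometer estimate
    return (alpha * (pitch_prev + gyro_rate * dt)
            + (1.0 - alpha) * accel_pitch(ax, ay, az))
```

During turns, the measured specific force is no longer pure gravity, so accel_pitch is biased; this is the gravity-measurement error during maneuvers that the abstract notes both filters must correct for.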
NASA Technical Reports Server (NTRS)
Rush, John; Israel, David; Harlacher, Marc; Haas, Lin
2003-01-01
The Low Power Transceiver (LPT) is an advanced signal processing platform that offers a configurable and reprogrammable capability for supporting communications, navigation and sensor functions for mission applications ranging from spacecraft TT&C and autonomous orbit determination to sophisticated networks that use crosslinks to support communications and real-time relative navigation for formation flying. The LPT is the result of extensive collaborative research under NASA/GSFC's Advanced Technology Program and ITT Industries' internal research and development efforts. Its modular, multi-channel design currently enables transmitting and receiving communication signals on L- or S-band frequencies and processing GPS L-band signals for precision navigation. The LPT flew as a part of the GSFC Hitchhiker payload named Fast Reaction Experiments Enabling Science, Technology And Research (FREESTAR) on board Space Shuttle Columbia's final mission. The experiment demonstrated functionality in GPS-based navigation and orbit determination, NASA STDN Ground Network communications, space relay communications via the NASA TDRSS, on-orbit reconfiguration of the software radio, the use of the Internet Protocol (IP) for TT&C, and communication concepts for space-based range safety. All data from the experiment were recovered and, as a result, all primary and secondary objectives of the experiment were achieved. This paper presents the results of the LPT's maiden space flight as part of STS-107.
AUV SLAM and Experiments Using a Mechanical Scanning Forward-Looking Sonar
He, Bo; Liang, Yan; Feng, Xiao; Nian, Rui; Yan, Tianhong; Li, Minghui; Zhang, Shujing
2012-01-01
Navigation technology is one of the most important challenges in the applications of autonomous underwater vehicles (AUVs), which navigate in the complex undersea environment. The ability to localize a robot and accurately map its surroundings simultaneously, namely the simultaneous localization and mapping (SLAM) problem, is a key prerequisite for truly autonomous robots. In this paper, a modified FastSLAM algorithm is proposed and used for navigation on our C-Ranger research platform, an open-frame AUV. A mechanical scanning imaging sonar is chosen as the active sensor for the AUV. The modified FastSLAM implements the measurement update relying on the on-board sensors of the C-Ranger. In addition, the algorithm employs a data association method that combines the single-particle maximum likelihood method with a modified negative evidence method, and uses rank-based resampling to overcome the particle depletion problem. In order to verify the feasibility of the proposed methods, both simulation experiments and sea trials of the C-Ranger were conducted. The experimental results show that the modified FastSLAM employed for the navigation of the C-Ranger AUV is much more effective and accurate compared with traditional methods. PMID:23012549
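The rank-based resampling cited above is one of several schemes for fighting particle depletion in FastSLAM-style filters. For illustration only, here is the widely used systematic (low-variance) resampling scheme, which addresses the same problem; it is not the paper's rank-based method:

```python
import random

def systematic_resample(weights, rand=random.random):
    """Return n particle indices drawn with probability proportional to
    weights, using a single random offset (low sampling variance)."""
    n = len(weights)
    total = sum(weights)
    step = total / n
    u = rand() * step          # one random draw for the whole sweep
    out, idx, cum = [], 0, weights[0]
    for _ in range(n):
        while cum < u:         # advance to the particle covering u
            idx += 1
            cum += weights[idx]
        out.append(idx)
        u += step
    return out
```

Because consecutive sample points are exactly one step apart, every particle with weight above total/n is guaranteed at least one copy, unlike independent multinomial draws.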
Diver-based integrated navigation/sonar sensor
NASA Astrophysics Data System (ADS)
Lent, Keith H.
1999-07-01
Two diver-based systems, the Small Object Locating Sonar (SOLS) and the Integrated Navigation and Sonar Sensor (INSS), have been developed at Applied Research Laboratories, The University of Texas at Austin (ARL:UT). They are small, easy-to-use systems that allow a diver to detect, classify, and identify underwater objects; render large-sector visual images; and track, map and reacquire diver location, diver path, and target locations. The INSS hardware consists of a unique, simple, single-beam high-resolution sonar, an acoustic navigation system, an electronic depth gauge, a compass, and GPS and RF interfaces, all integrated with a standard 486-based PC. These diver sonars have been evaluated by the very shallow water mine countermeasures detachment since spring 1997. Results are very positive, showing significantly greater capabilities than current diver-held systems. For example, detection ranges are increased over existing systems, and the system allows divers to classify mines at a significant stand-off range. As a result, the INSS design has been chosen for acquisition as the next-generation diver navigation and sonar system. The EDMs for this system will be designed and built by ARL:UT during 1998 and 1999, with production planned for 2000.
Long Range Navigation for Mars Rovers Using Sensor-Based Path Planning and Visual Localisation
NASA Technical Reports Server (NTRS)
Laubach, Sharon L.; Olson, Clark F.; Burdick, Joel W.; Hayati, Samad
1999-01-01
The Mars Pathfinder mission illustrated the benefits of including a mobile robotic explorer on a planetary mission. However, for future Mars rover missions, significantly increased autonomy in navigation is required in order to meet demanding mission criteria. To address these requirements, we have developed new path planning and localisation capabilities that allow a rover to navigate robustly to a distant landmark. These algorithms have been implemented on the JPL Rocky 7 prototype microrover and have been tested extensively in the JPL MarsYard, as well as in natural terrain.
Data Analysis Techniques for a Lunar Surface Navigation System Testbed
NASA Technical Reports Server (NTRS)
Chelmins, David; Sands, O. Scott; Swank, Aaron
2011-01-01
NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.
NASA Astrophysics Data System (ADS)
Chu, Chien-Hsun; Chiang, Kai-Wei
2016-06-01
The early development of mobile mapping systems (MMS) was restricted to applications that permitted the determination of the elements of exterior orientation from existing ground control. Mobile mapping refers to a means of collecting geospatial data using mapping sensors mounted on a mobile platform. Research on mobile mapping dates back to the late 1980s, driven mainly by the need for highway infrastructure mapping and transportation corridor inventories. In the early nineties, advances in satellite and inertial technology made it possible to think about mobile mapping in a different way. Instead of using ground control points as references for orienting the images in space, the trajectory and attitude of the imager platform could now be determined directly. Cameras, along with navigation and positioning sensors, are integrated and mounted on a land vehicle for mapping purposes. Objects of interest can be directly measured and mapped from images that have been georeferenced using navigation and positioning sensors. Direct georeferencing (DG) is the determination of the time-variable position and orientation parameters of a mobile digital imager. The most common technologies used for this purpose today are satellite positioning using the Global Navigation Satellite System (GNSS) and inertial navigation using an Inertial Measuring Unit (IMU). Although either technology used alone could in principle determine both position and orientation, they are usually integrated in such a way that the IMU is the main orientation sensor, while the GNSS receiver is the main position sensor. However, in GNSS-denied environments such as urban canyons, foliage, tunnels and indoors, GNSS signals are obstructed: the limited number of visible satellites causes GNSS gaps, and reflected signals cause abnormal measurement residuals, both of which deteriorate the positioning accuracy.
This study aims at developing a novel method that uses ground control points to maintain the positioning accuracy of the MMS in GNSS-denied environments. Finally, this study analyses the performance of the proposed method using about 20 check-points through the DG process.
Colonoscope navigation system using colonoscope tracking method based on line registration
NASA Astrophysics Data System (ADS)
Oda, Masahiro; Kondo, Hiroaki; Kitasaka, Takayuki; Furukawa, Kazuhiro; Miyahara, Ryoji; Hirooka, Yoshiki; Goto, Hidemi; Navab, Nassir; Mori, Kensaku
2014-03-01
This paper presents a new colonoscope navigation system. CT colonography is utilized for colon diagnosis based on CT images. If polyps are found during CT colonography, colonoscopic polypectomy can be performed to remove them. While performing a colonoscopic examination, a physician controls the colonoscope based on his/her experience. Inexperienced physicians may cause complications, such as colon perforation, during colonoscopic examinations. To reduce complications, a system for navigating the colonoscope during these examinations is necessary. We propose such a colonoscope navigation system, which includes a new colonoscope tracking method. This method obtains a colon centerline from a CT volume of a patient. A curved line (colonoscope line) representing the shape of the colonoscope inserted into the colon is obtained by using electromagnetic sensors. A coordinate system registration process that employs the ICP algorithm is performed to register the CT and sensor coordinate systems. The colon centerline and colonoscope line are then registered using a line registration method. The position of the colonoscope tip in the colon is obtained from the line registration result. Our colonoscope navigation system displays virtual colonoscopic views generated from the CT volumes; the viewpoint of the virtual colonoscopic view is the point on the centerline that corresponds to the colonoscope tip. Experimental results using a colon phantom showed that the proposed colonoscope tracking method can track the colonoscope tip with small tracking errors.
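The ICP-based registration step mentioned above repeatedly alternates between matching points and solving a closed-form rigid alignment. That inner alignment, given known point correspondences, is the Kabsch/SVD solution; a minimal sketch (the correspondence search that makes it full ICP is omitted):

```python
import numpy as np

def rigid_align(P, Q):
    """Least-squares rotation R and translation t with R @ P[i] + t ~= Q[i]
    for corresponding 3-D point sets P and Q (Kabsch algorithm)."""
    P, Q = np.asarray(P, dtype=float), np.asarray(Q, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

In an ICP loop, P would be points sampled along the sensed colonoscope line and Q their current nearest neighbors in the CT frame, re-matched after each alignment until convergence.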
Chiang, Kai-Wei; Liao, Jhen-Kai; Tsai, Guang-Je; Chang, Hsiu-Wen
2015-01-01
Hardware sensors embedded in a smartphone allow the device to become an excellent mobile navigator. A smartphone is ideal for this task because its great international popularity has driven increases in processing power, and most of the necessary infrastructure is already in place. However, using a smartphone for indoor pedestrian navigation can be problematic due to the low accuracy of its sensors, the imprecise predictability of pedestrian motion, and the inaccessibility of the Global Navigation Satellite System (GNSS) in some indoor environments. Pedestrian Dead Reckoning (PDR) is one of the most common technologies used for pedestrian navigation, but in its present form various errors tend to accumulate. This study introduces a fuzzy decision tree (FDT) aided by map information to improve the accuracy and stability of PDR with less dependency on infrastructure. First, the map is quickly surveyed by the Indoor Mobile Mapping System (IMMS). Next, Bluetooth beacons are implemented to enable initialization from any position. Finally, the map-aided FDT can estimate navigation solutions in real time. The experiments were conducted in different fields using a variety of smartphones and users in order to verify stability. The PDR system used for comparison demonstrates low stability in each case without pre-calibration and post-processing, but the proposed low-complexity FDT algorithm shows good stability and accuracy under the same conditions. PMID:26729114
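At its core, PDR accumulates one displacement vector per detected step from an estimated step length and heading; every heading bias therefore grows into position error, which is what the map-aided correction targets. A minimal dead-reckoning sketch, assuming a flat floor and east/north axes (step detection and step-length estimation are taken as given):

```python
import math

def pdr_track(start, steps):
    """Dead-reckoned 2-D positions from a start point and a sequence of
    (step_length, heading) pairs; heading in rad, 0 = north, pi/2 = east."""
    x, y = start
    track = []
    for length, heading in steps:
        x += length * math.sin(heading)  # east component
        y += length * math.cos(heading)  # north component
        track.append((x, y))
    return track
```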
Environment exploration and SLAM experiment research based on ROS
NASA Astrophysics Data System (ADS)
Li, Zhize; Zheng, Wei
2017-11-01
Robots need to acquire information about the surrounding environment by means of map learning. SLAM and navigation based on mobile robots are developing rapidly. ROS (Robot Operating System) is widely used in the field of robotics because of its convenient code reuse and open-source nature. Numerous excellent SLAM and navigation algorithms have been ported to ROS packages. hector_slam is one of them; it can build occupancy grid maps online and quickly while requiring few computational resources. These characteristics make an embedded handheld mapping system possible. Similarly, hector_navigation also performs well in the navigation field: it can carry out path planning and environment exploration by itself using only an environmental sensor. Combining hector_navigation with hector_slam realizes low-cost environment exploration, path planning, and SLAM at the same time.
Beacons for supporting lunar landing navigation
NASA Astrophysics Data System (ADS)
Theil, Stephan; Bora, Leonardo
2017-03-01
Current and future planetary exploration missions involve a landing on the target celestial body. Almost all of these landing missions are currently relying on a combination of inertial and optical sensor measurements to determine the current flight state with respect to the target body and the desired landing site. As soon as an infrastructure at the landing site exists, the requirements as well as conditions change for vehicles landing close to this existing infrastructure. This paper investigates the options for ground-based infrastructure supporting the onboard navigation system and analyzes the impact on the achievable navigation accuracy. For that purpose, the paper starts with an existing navigation architecture based on optical navigation and extends it with measurements to support navigation with ground infrastructure. A scenario of lunar landing is simulated and the provided functions of the ground infrastructure as well as the location with respect to the landing site are evaluated. The results are analyzed and discussed.
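One concrete way ground-based beacons can support an onboard navigation filter is by providing range measurements for a position fix. The following is a hypothetical sketch (not taken from the paper) of a Gauss-Newton least-squares position fix from noiseless ranges to beacons at known landing-site coordinates:

```python
import numpy as np

def trilaterate(beacons, ranges, x0, iters=10):
    """Gauss-Newton least-squares position fix from ranges to known beacons."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        diff = x - beacons                     # (n, 3) vectors from beacons
        pred = np.linalg.norm(diff, axis=1)    # predicted ranges
        J = diff / pred[:, None]               # Jacobian of range w.r.t. x
        dx, *_ = np.linalg.lstsq(J, ranges - pred, rcond=None)
        x = x + dx
    return x
```

Four or more well-placed beacons give an overdetermined, well-conditioned fix; in a real lander this solution would be fused with inertial and optical measurements rather than used alone.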
Exploitation of Semantic Building Model in Indoor Navigation Systems
NASA Astrophysics Data System (ADS)
Anjomshoaa, A.; Shayeganfar, F.; Tjoa, A. Min
2009-04-01
There are many types of indoor and outdoor navigation tools and methodologies available. A majority of these solutions are based on Global Positioning Systems (GPS) and instant video and image processing. These approaches are ideal for open-world environments where very little information about the target location is available, but for large-scale building environments such as hospitals, governmental offices, etc., the end-user needs more detailed information about the surrounding context, which is especially important in the case of people with special needs. This paper presents a smart indoor navigation solution that is based on Semantic Web technologies and the Building Information Model (BIM). The proposed solution is also aligned with Google Android's concepts to illustrate the realization of the results. Keywords: IAI IFCXML, Building Information Model, Indoor Navigation, Semantic Web, Google Android, People with Special Needs 1 Introduction The built environment is a central factor in our daily life, and a big portion of human life is spent inside buildings. Traditionally, buildings are documented using maps and plans produced with IT tools such as computer-aided design (CAD) applications. Documenting maps electronically is already pervasive, but CAD drawings do not satisfy the requirements for effective building models that can be shared with other building-related applications such as indoor navigation systems. Navigation in the built environment is not a new issue; however, with advances in emerging technologies like GPS, mobile and networked environments, and the Semantic Web, new solutions have been suggested to enrich traditional building maps and convert them into smart information resources that can be reused in other applications and improve interaction with building inhabitants and visitors. Other important issues that should be addressed in building navigation scenarios are location tagging and end-user communication.
The available solutions for location tagging are mostly based on proximity sensors, and the information is bound to sensor references. In the solution proposed in this paper, the sensors play a role similar to annotations in the Semantic Web world. Hence, the sensor data, in the ontology sense, bridges the gap between sensed information and the building model. By combining the two and applying the proper inference rules, building visitors will be able to reach their destinations with instant support from their communication devices such as handhelds, wearable computers, mobiles, etc. In a typical scenario of this kind, the user's profile is delivered to the smart building (via building ad-hoc services), and the appropriate route is calculated and delivered to the user's end-device. The route is calculated by considering all constraints and requirements of the end user; for example, if the user is in a wheelchair, the calculated route should not contain stairs or narrow corridors through which the wheelchair cannot pass. The user then navigates through the building by following the instructions of the end-device, which are in turn generated from the calculated route. During the navigation process, the end-device should also interact with the smart building to sense locations by reading the surrounding tags. For example, when a visually impaired person arrives at an unknown space, the tags are sensed and the relevant information is delivered to the user in the proper mode of communication: the building model can be used to generate a voice message for a blind person about a space, telling him/her that "the space has 3 doors, and the door on the left should be chosen, which needs to be pushed to open". In this paper we mainly focus on the automatic generation of semantic building information models (Semantic BIM) and the delivery of results to the end user.
Combining the building information model with the environment and user constraints using Semantic Web technologies will make many scenarios conceivable. The generated IFC ontology, which is based on the commonly accepted IFC (Industry Foundation Classes) standard, can be used as the basis of information sharing between buildings, people, and applications. The proposed solution aims to facilitate building navigation in an intuitive and extendable way that is easy to use for end-users and, at the same time, easy to maintain and manage for building administrators.
Detailed Test Plan Redundant Sensor Strapdown IMU Evaluation Program
NASA Technical Reports Server (NTRS)
Hartwell, T.; Miyatake, Y.; Wedekind, D. E.
1971-01-01
The test plan for a redundant sensor strapdown inertial measuring unit evaluation program is presented. The subjects discussed are: (1) test philosophy and limitations, (2) test sequence, (3) equipment specifications, (4) general operating procedures, (5) calibration procedures, (6) alignment test phase, and (7) navigation test phase. The data and analysis requirements are analyzed.
Enhancing Autonomy of Aerial Systems Via Integration of Visual Sensors into Their Avionics Suite
2016-09-01
[Report front matter only. Subject terms: autonomous system, quadrotors, direct method, inverse dynamics in the virtual domain (IDVD), integer linear program (ILP), inertial navigation system (INS), Global Positioning System (GPS), ground control station (GCS).]
Vision Sensor-Based Road Detection for Field Robot Navigation
Lu, Keyu; Li, Jian; An, Xiangjing; He, Hangen
2015-01-01
Road detection is an essential component of field robot navigation systems. Vision sensors play an important role in road detection for their great potential in environmental perception. In this paper, we propose a hierarchical vision sensor-based method for robust road detection in challenging road scenes. More specifically, for a given road image captured by an on-board vision sensor, we introduce a multiple population genetic algorithm (MPGA)-based approach for efficient road vanishing point detection. Superpixel-level seeds are then selected in an unsupervised way using a clustering strategy. Then, according to the GrowCut framework, the seeds proliferate and iteratively try to occupy their neighbors. After convergence, the initial road segment is obtained. Finally, in order to achieve a globally-consistent road segment, the initial road segment is refined using the conditional random field (CRF) framework, which integrates high-level information into road detection. We perform several experiments to evaluate the overall performance, scale sensitivity and noise sensitivity of the proposed method. The experimental results demonstrate that the proposed method exhibits high robustness compared to the state of the art. PMID:26610514
NASA Astrophysics Data System (ADS)
Nafis, Christopher; Jensen, Vern; von Jako, Ron
2008-03-01
Electromagnetic (EM) tracking systems have been successfully used for Surgical Navigation in ENT, cranial, and spine applications for several years. Catheter sized micro EM sensors have also been used in tightly controlled cardiac mapping and pulmonary applications. EM systems have the benefit over optical navigation systems of not requiring a line-of-sight between devices. Ferrous metals or conductive materials that are transient within the EM working volume may impact tracking performance. Effective methods for detecting and reporting EM field distortions are generally well known. Distortion compensation can be achieved for objects that have a static spatial relationship to a tracking sensor. New commercially available micro EM tracking systems offer opportunities for expanded image-guided navigation procedures. It is important to know and understand how well these systems perform with different surgical tables and ancillary equipment. By their design and intended use, micro EM sensors will be located at the distal tip of tracked devices and therefore be in closer proximity to the tables. Our goal was to define a simple and portable process that could be used to estimate the EM tracker accuracy, and to vet a large number of popular general surgery and imaging tables that are used in the United States and abroad.
Cooper, Andrew James; Redman, Chelsea Anne; Stoneham, David Mark; Gonzalez, Luis Felipe; Etse, Victor Kwesi
2015-08-28
This paper presents an unmanned aircraft system (UAS) that uses a probabilistic model for autonomous front-on environmental sensing or photography of a target. The system is based on low-cost and readily-available sensor systems in dynamic environments and with the general intent of improving the capabilities of dynamic waypoint-based navigation systems for a low-cost UAS. The behavioural dynamics of target movement for the design of a Kalman filter and Markov model-based prediction algorithm are included. Geometrical concepts and the Haversine formula are applied to the maximum likelihood case in order to make a prediction regarding a future state of a target, thus delivering a new waypoint for autonomous navigation. The results of the application to aerial filming with low-cost UAS are presented, achieving the desired goal of maintained front-on perspective without significant constraint to the route or pace of target movement.
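The Haversine formula mentioned above gives the great-circle distance between two latitude/longitude points, from which a predicted target state can be turned into a new waypoint. A standard implementation (the function name and the 6371 km mean Earth radius are conventional choices, not values from the paper):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r_earth=6371.0):
    """Great-circle distance in km between two points given in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)        # latitude difference
    dlmb = math.radians(lon2 - lon1)        # longitude difference
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2.0 * r_earth * math.asin(math.sqrt(a))
```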
Outer planet mission guidance and navigation for spinning spacecraft
NASA Technical Reports Server (NTRS)
Paul, C. K.; Russell, R. K.; Ellis, J.
1974-01-01
The orbit determination accuracies, maneuver results, and navigation system specification for spinning Pioneer planetary probe missions are analyzed to aid in determining the feasibility of deploying probes into the atmospheres of the outer planets. Radio-only navigation suffices for a direct Saturn mission and the Jupiter flyby of a Jupiter/Uranus mission. Saturn ephemeris errors (1000 km) plus rigid entry constraints at Uranus result in very high velocity requirements (140 m/sec) on the final legs of the Saturn/Uranus and Jupiter/Uranus missions if only Earth-based tracking is employed. The capabilities of a conceptual V-slit sensor are assessed to supplement radio tracking by star/satellite observations. By processing the optical measurements with a batch filter, entry conditions at Uranus can be controlled to acceptable mission-defined levels (+ or - 3 deg) and the Saturn-Uranus leg velocity requirements can be reduced by a factor of 6 (from 139 to 23 m/sec) if nominal specified accuracies of the sensor can be realized.
Swarm Optimization-Based Magnetometer Calibration for Personal Handheld Devices
Ali, Abdelrahman; Siddharth, Siddharth; Syed, Zainab; El-Sheimy, Naser
2012-01-01
Inertial Navigation Systems (INS) consist of accelerometers, gyroscopes and a processor that generates position and orientation solutions by integrating the specific forces and rotation rates. In addition to the accelerometers and gyroscopes, magnetometers can be used to derive the user heading based on Earth's magnetic field. Unfortunately, the measurements of the magnetic field obtained with low cost sensors are usually corrupted by several errors, including manufacturing defects and external electro-magnetic fields. Consequently, proper calibration of the magnetometer is required to achieve high accuracy heading measurements. In this paper, a Particle Swarm Optimization (PSO)-based calibration algorithm is presented to estimate the values of the bias and scale factor of low cost magnetometers. The main advantage of this technique is the use of the artificial intelligence which does not need any error modeling or awareness of the nonlinearity. Furthermore, the proposed algorithm can help in the development of Pedestrian Navigation Devices (PNDs) when combined with inertial sensors and GPS/Wi-Fi for indoor navigation and Location Based Services (LBS) applications.
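To illustrate the idea of PSO-based bias and scale-factor estimation, here is a minimal self-contained sketch (all parameter values — particle count, inertia and acceleration weights — are illustrative assumptions, not the authors' settings). It searches for a per-axis bias and scale that make the corrected magnetometer magnitudes match a constant reference field:

```python
import numpy as np

def pso_calibrate(meas, field=1.0, n_particles=40, iters=200, seed=1):
    """Estimate per-axis bias (3) and scale (3) so that the corrected
    magnitude ||scale * (meas - bias)|| matches the reference field."""
    rng = np.random.default_rng(seed)

    def cost(p):
        bias, scale = p[:, :3], p[:, 3:]
        corr = scale[:, None, :] * (meas[None, :, :] - bias[:, None, :])
        mag = np.linalg.norm(corr, axis=2)          # per-particle magnitudes
        return ((mag - field) ** 2).mean(axis=1)

    pos = rng.uniform(-1.0, 1.0, (n_particles, 6))
    pos[:, 3:] = rng.uniform(0.5, 1.5, (n_particles, 3))   # scales near 1
    vel = np.zeros_like(pos)
    pbest, pbest_c = pos.copy(), cost(pos)
    g = pbest[pbest_c.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
        pos = pos + vel
        c = cost(pos)
        better = c < pbest_c
        pbest[better], pbest_c[better] = pos[better], c[better]
        g = pbest[pbest_c.argmin()].copy()
    return g[:3], g[3:]
```

As the abstract notes, the attraction of this approach is that no explicit error model is needed: the swarm simply minimizes the spread of corrected field magnitudes.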
Distributed Underwater Sensing: A Paradigm Change for the Future
NASA Astrophysics Data System (ADS)
Yang, T. C.
Distributed netted underwater sensors (DNUS) present a paradigm change that has generated high interest all over the world. It utilizes many small, spatially distributed, inexpensive sensors, and a certain number of mobile nodes, such as autonomous underwater vehicles (AUVs), forming a wireless acoustic network to relay data and provide real-time monitoring of the ocean. Distributed underwater sensors can be used for oceanographic data collection, pollution monitoring, offshore exploration, disaster prevention, assisted navigation and tactical surveillance applications over wide areas. These functions were traditionally accomplished by a cabled system, such as an array of sensors deployed from a platform, or a large number of sensors moored on the ocean bottom, connected by a cable. The cabled systems are not only expensive but often require heavy ocean engineering (e.g., equipment to deploy heavy armored cables). In the future, as fabrication technology advances make low-cost sensors a reality, DNUS is expected to be affordable and will become the undersea "OceanNet" for the marine industry, like the current "internet" on land. This paper gives a layman's view of the system concept, the state of the art, and future challenges. One of the challenges, of particular interest to this conference, is to develop technologies for miniature-size sensors that are energy efficient, allowing long-term deployment in the ocean.
A Motion Tracking and Sensor Fusion Module for Medical Simulation.
Shen, Yunhe; Wu, Fan; Tseng, Kuo-Shih; Ye, Ding; Raymond, John; Konety, Badrinath; Sweet, Robert
2016-01-01
Here we introduce a motion tracking or navigation module for medical simulation systems. Our main contribution is a sensor fusion method for proximity or distance sensors integrated with inertial measurement unit (IMU). Since IMU rotation tracking has been widely studied, we focus on the position or trajectory tracking of the instrument moving freely within a given boundary. In our experiments, we have found that this module reliably tracks instrument motion.
Driving a car with custom-designed fuzzy inferencing VLSI chips and boards
NASA Technical Reports Server (NTRS)
Pin, Francois G.; Watanabe, Yutaka
1993-01-01
Vehicle control in a-priori unknown, unpredictable, and dynamic environments requires many computational and reasoning schemes to operate on the basis of very imprecise, incomplete, or unreliable data. For such systems, in which all the uncertainties cannot be engineered away, approximate reasoning may provide an alternative to the complexity and computational requirements of conventional uncertainty analysis and propagation techniques. Two types of computer boards including custom-designed VLSI chips were developed to add a fuzzy inferencing capability to real-time control systems. All inferencing rules on a chip are processed in parallel, allowing execution of the entire rule base in about 30 microseconds and therefore making control of 'reflex-type' motions feasible. The use of these boards and the approach using superposition of elemental sensor-based behaviors for the development of qualitative reasoning schemes emulating human-like navigation in a-priori unknown environments are first discussed. The paper then describes how the human-like navigation scheme implemented on one of the qualitative inferencing boards was installed on a test-bed platform to investigate two control modes for driving a car in a-priori unknown environments on the basis of sparse and imprecise sensor data. In the first mode, the car navigates fully autonomously, while in the second mode, the system acts as a driver's aid, providing the driver with linguistic (fuzzy) commands to turn left or right and speed up or slow down depending on the obstacles perceived by the sensors. Experiments with both modes of control are described in which the system uses only three acoustic range (sonar) sensor channels to perceive the environment.
Simulation results as well as indoors and outdoors experiments are presented and discussed to illustrate the feasibility and robustness of autonomous navigation and/or safety enhancing driver's aid using the new fuzzy inferencing hardware system and some human-like reasoning schemes which may include as little as six elemental behaviors embodied in fourteen qualitative rules.
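The flavour of such qualitative rules can be illustrated with a toy Mamdani-style inference step (a hypothetical two-rule example, not the chip's actual fourteen-rule base): a sonar range is fuzzified into NEAR/FAR memberships, each rule clips its output fuzzy set, and the aggregated set is defuzzified by centroid:

```python
import numpy as np

def steer_command(range_m):
    """Toy Mamdani inference mapping a sonar range (m) to a turn rate (deg/s).
    Rule 1: IF obstacle NEAR THEN turn HARD.
    Rule 2: IF obstacle FAR  THEN go STRAIGHT."""
    u = np.linspace(0.0, 30.0, 301)                      # output universe
    mu_near = np.clip((2.0 - range_m) / 2.0, 0.0, 1.0)   # NEAR membership
    mu_far = np.clip((range_m - 1.0) / 2.0, 0.0, 1.0)    # FAR membership
    hard = u / 30.0                                      # "turn hard" set
    straight = 1.0 - u / 30.0                            # "go straight" set
    agg = np.maximum(np.minimum(mu_near, hard),          # rule 1, clipped
                     np.minimum(mu_far, straight))       # rule 2, clipped
    return float((u * agg).sum() / agg.sum())            # centroid defuzzify
```

On the fuzzy VLSI hardware all such rules fire in parallel; this sequential sketch only shows the logic, not the performance characteristics.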
Meta-image navigation augmenters for unmanned aircraft systems (MINA for UAS)
NASA Astrophysics Data System (ADS)
Çelik, Koray; Somani, Arun K.; Schnaufer, Bernard; Hwang, Patrick Y.; McGraw, Gary A.; Nadke, Jeremy
2013-05-01
GPS is a critical sensor for Unmanned Aircraft Systems (UASs) due to its accuracy, global coverage and small hardware footprint, but is subject to denial due to signal blockage or RF interference. When GPS is unavailable, position, velocity and attitude (PVA) performance from other inertial and air data sensors is not sufficient, especially for small UASs. Recently, image-based navigation algorithms have been developed to address GPS outages for UASs, since most of these platforms already include a camera as standard equipage. Performing absolute navigation with real-time aerial images requires georeferenced data, either images or landmarks, as a reference. Georeferenced imagery is readily available today, but requires a large amount of storage, whereas collections of discrete landmarks are compact but must be generated by pre-processing. An alternative, compact source of georeferenced data having large coverage area is open source vector maps from which meta-objects can be extracted for matching against real-time acquired imagery. We have developed a novel, automated approach called MINA (Meta Image Navigation Augmenters), which is a synergy of machine-vision and machine-learning algorithms for map aided navigation. As opposed to existing image map matching algorithms, MINA utilizes publicly available open-source geo-referenced vector map data, such as OpenStreetMap, in conjunction with real-time optical imagery from an on-board, monocular camera to augment the UAS navigation computer when GPS is not available. The MINA approach has been experimentally validated with both actual flight data and flight simulation data and results are presented in the paper.
A Long Distance Laser Altimeter for Terrain Relative Navigation and Spacecraft Landing
NASA Technical Reports Server (NTRS)
Pierrottet, Diego F.; Amzajerdian, Farzin; Barnes, Bruce W.
2014-01-01
A high precision laser altimeter was developed under the Autonomous Landing and Hazard Avoidance (ALHAT) project at NASA Langley Research Center. The laser altimeter provides slant-path range measurements from operational ranges exceeding 30 km that will be used to support surface-relative state estimation and navigation during planetary descent and precision landing. The altimeter uses an advanced time-of-arrival receiver, which produces multiple signal-return range measurements from tens of kilometers with 5 cm precision. The transmitter is eye-safe, simplifying operations and testing on earth. The prototype is fully autonomous, and able to withstand the thermal and mechanical stresses experienced during test flights conducted aboard helicopters, fixed-wing aircraft, and Morpheus, a terrestrial rocket-powered vehicle developed by NASA Johnson Space Center. This paper provides an overview of the sensor and presents results obtained during recent field experiments including a helicopter flight test conducted in December 2012 and Morpheus flight tests conducted during March of 2014.
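The basic time-of-flight relation behind such an altimeter is range = c·Δt/2, which also fixes the round-trip timing resolution needed for a stated range precision (5 cm in this case). A small sketch of the arithmetic:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def slant_range_m(round_trip_s):
    """Time-of-flight range: half the round-trip path length."""
    return C * round_trip_s / 2.0

def timing_resolution_s(range_precision_m):
    """Round-trip timing resolution required for a given range precision."""
    return 2.0 * range_precision_m / C
```

A 30 km slant range corresponds to a 200 microsecond round trip, and 5 cm precision implies resolving arrival times to a few hundred picoseconds, which is why the abstract highlights the advanced time-of-arrival receiver.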
NASA Mars rover: a testbed for evaluating applications of covariance intersection
NASA Astrophysics Data System (ADS)
Uhlmann, Jeffrey K.; Julier, Simon J.; Kamgar-Parsi, Behzad; Lanzagorta, Marco O.; Shyu, Haw-Jye S.
1999-07-01
The Naval Research Laboratory (NRL) has spearheaded the development and application of Covariance Intersection (CI) for a variety of decentralized data fusion problems. Such problems include distributed control, onboard sensor fusion, and dynamic map building and localization. In this paper we describe NRL's development of a CI-based navigation system for the NASA Mars rover that stresses almost all aspects of decentralized data fusion. We also describe how this project relates to NRL's augmented reality, advanced visualization, and REBOT projects.
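Covariance Intersection fuses two estimates whose cross-correlation is unknown via C^-1 = w*A^-1 + (1-w)*B^-1, with w in [0, 1] chosen to optimize some criterion. A minimal sketch (trace minimization by a simple grid scan is an illustrative choice; NRL's implementation details are not given in the abstract):

```python
import numpy as np

def covariance_intersection(a, A, b, B, n_grid=101):
    """Fuse estimates (a, A) and (b, B) with unknown cross-correlation:
    C^-1 = w*A^-1 + (1-w)*B^-1, w chosen by a grid scan minimizing trace(C)."""
    Ai, Bi = np.linalg.inv(A), np.linalg.inv(B)
    best = None
    for w in np.linspace(1e-3, 1.0 - 1e-3, n_grid):
        C = np.linalg.inv(w * Ai + (1.0 - w) * Bi)
        if best is None or np.trace(C) < best[0]:
            best = (np.trace(C), w, C)
    _, w, C = best
    c = C @ (w * Ai @ a + (1.0 - w) * Bi @ b)
    return c, C
```

Unlike a naive Kalman update, the fused covariance remains consistent (never optimistic) for any actual correlation between the two sources, which is what makes CI attractive for decentralized fusion.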
Method and System for Determining Relative Displacement and Heading for Navigation
NASA Technical Reports Server (NTRS)
Sheikh, Suneel Ismail (Inventor); Pines, Darryll J. (Inventor); Conroy, Joseph Kim (Inventor); Spiridonov, Timofey N. (Inventor)
2015-01-01
A system and method for determining a location of a mobile object is provided. The system determines the location of the mobile object by determining distances between a plurality of sensors provided on first and second movable parts of the mobile object. A stride length, heading, and separation distance between the first and second movable parts are computed based on the determined distances, and the location of the mobile object is determined based on the computed stride length, heading, and separation distance.
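The dead-reckoning update implied by the computed stride length and heading advances the position by one stride per step. A minimal sketch (the axis convention, heading clockwise from north, is an assumption for illustration):

```python
import math

def pdr_step(x, y, stride_m, heading_rad):
    """Advance a 2-D position by one stride along the heading
    (heading measured clockwise from north: north = 0, east = pi/2)."""
    return (x + stride_m * math.sin(heading_rad),
            y + stride_m * math.cos(heading_rad))
```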
Integration of Cold Atom Interferometry INS with Other Sensors
2012-03-22
Kalman filtering is used to estimate the errors in the navigation-grade measurement. Whenever an outage occurs, the mechanization must be done using the inertial navigation solution, with periodic GPS measurements being brought into a Kalman filter to estimate the errors in the INS solution.
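The loosely coupled scheme described above — inertial mechanization coasting between periodic GPS fixes fed to a Kalman filter — rests on the standard predict/update equations. A generic sketch (the constant-velocity model and noise values in the test are illustrative assumptions, not the thesis's settings):

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """Time update: propagate state and covariance through the dynamics."""
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    """Measurement update: blend a measurement (e.g. a GPS fix) into the state."""
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

During a GPS outage only `kf_predict` runs and the covariance grows, quantifying the drift of the pure inertial solution until the next fix arrives.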
Small Magnetic Sensors for Space Applications
Díaz-Michelena, Marina
2009-01-01
Small magnetic sensors are widely used, integrated in vehicles, mobile phones, medical devices, etc., for navigation, speed, position and angular sensing. These magnetic sensors are potential candidates for space-sector applications in which mass, volume and power savings are important issues. This work covers the magnetic technologies available in the marketplace and the steps towards their implementation in space applications, the current trend of miniaturization in front-end technologies, and the convergence of mature, miniaturized magnetic sensors to the space sector through the small satellite concept. PMID:22574012
Unifying Terrain Awareness for the Visually Impaired through Real-Time Semantic Segmentation
Yang, Kailun; Wang, Kaiwei; Romera, Eduardo; Hu, Weijian; Sun, Dongming; Sun, Junwei; Cheng, Ruiqi; Chen, Tianxue; López, Elena
2018-01-01
Navigational assistance aims to help visually-impaired people move about the environment safely and independently. This topic becomes challenging as it requires detecting a wide variety of scenes to provide higher-level assistive awareness. Vision-based technologies with monocular detectors or depth sensors have sprung up within several years of research. These separate approaches have achieved remarkable results with relatively low processing time and have improved the mobility of impaired people to a large extent. However, running all detectors jointly increases the latency and burdens the computational resources. In this paper, we put forward seizing pixel-wise semantic segmentation to cover navigation-related perception needs in a unified way. This is critical not only for the terrain awareness regarding traversable areas, sidewalks, stairs and water hazards, but also for the avoidance of short-range obstacles, fast-approaching pedestrians and vehicles. The core of our unification proposal is a deep architecture, aimed at attaining efficient semantic understanding. We have integrated the approach in a wearable navigation system by incorporating robust depth segmentation. A comprehensive set of experiments demonstrates qualified accuracy over state-of-the-art methods while maintaining real-time speed. We also present a closed-loop field test involving real visually-impaired users, demonstrating the effectiveness and versatility of the assistive framework. PMID:29748508
Benefits of combined GPS/GLONASS with low-cost MEMS IMUs for vehicular urban navigation.
Angrisano, Antonio; Petovello, Mark; Pugliano, Giovanni
2012-01-01
The integration of Global Navigation Satellite Systems (GNSS) with Inertial Navigation Systems (INS) has been very actively researched for many years due to the complementary nature of the two systems. In particular, during the last few years the integration with micro-electromechanical system (MEMS) inertial measurement units (IMUs) has been investigated. In fact, recent advances in MEMS technology have made possible the development of a new generation of low cost inertial sensors characterized by small size and light weight, which represents an attractive option for mass-market applications such as vehicular and pedestrian navigation. However, whereas there has been much interest in the integration of GPS with a MEMS-based INS, few research studies have been conducted on expanding this application to the revitalized GLONASS system. This paper looks at the benefits of adding GLONASS to existing GPS/INS(MEMS) systems using loose and tight integration strategies. The relative benefits of various constraints are also assessed. Results show that when satellite visibility is poor (approximately 50% solution availability) the benefits of GLONASS are only seen with tight integration algorithms. For more benign environments, a loosely coupled GPS/GLONASS/INS system offers performance comparable to that of a tightly coupled GPS/INS system, but with reduced complexity and development time.
Diaz-Estrella, Antonio; Reyes-Lecuona, Arcadio; Langley, Alyson; Brown, Michael; Sharples, Sarah
2018-01-01
Inertial sensors offer the potential for integration into wireless virtual reality systems that allow the users to walk freely through virtual environments. However, owing to drift errors, inertial sensors cannot accurately estimate head and body orientations in the long run, and when walking indoors, this error cannot be corrected by magnetometers, due to the magnetic field distortion created by ferromagnetic materials present in buildings. This paper proposes a technique, called EHBD (Equalization of Head and Body Directions), to address this problem using two head- and shoulder-located magnetometers. Due to their proximity, their distortions are assumed to be similar, and the magnetometer measurements are used to detect when the user is looking straight forward. Then, the system corrects the discrepancies between the estimated directions of the head and the shoulder, which are provided by gyroscopes and consequently are affected by drift errors. An experiment is conducted to evaluate the performance of this technique in two tasks (navigation and navigation plus exploration) and using two different locomotion techniques: (1) gaze-directed mode (GD), in which the walking direction is forced to be the same as the head direction, and (2) decoupled direction mode (DD), in which the walking direction can be different from the viewing direction. The results show that both locomotion modes match the target path similarly during the navigation task, while DD's path matches the target path more closely than GD's in the navigation plus exploration task. These results validate the EHBD technique, especially when allowing different walking and viewing directions in the navigation plus exploration task, as expected. While the proposed method does not reach the accuracy of optical tracking (the ideal case), it is an acceptable and satisfactory solution for users and is much more compact, portable and economical. PMID:29621298
Target Trailing With Safe Navigation With Colregs for Maritime Autonomous Surface Vehicles
NASA Technical Reports Server (NTRS)
Kuwata, Yoshiaki (Inventor); Aghazarian, Hrand (Inventor); Huntsberger, Terrance L. (Inventor); Howard, Andrew B. (Inventor); Wolf, Michael T. (Inventor); Zarzhitsky, Dimitri V. (Inventor)
2014-01-01
Systems and methods for operating autonomous waterborne vessels in a safe manner. The systems include hardware for identifying the locations and motions of other vessels, as well as the locations of stationary objects that represent navigation hazards. By applying a computational method that uses a maritime navigation algorithm for avoiding hazards and obeying COLREGS using Velocity Obstacles to the data obtained, the autonomous vessel computes a safe and effective path to be followed in order to accomplish a desired navigational end result, while operating in a manner so as to avoid hazards and to maintain compliance with standard navigational procedures defined by international agreement. The systems and methods have been successfully demonstrated on water with radar and stereo cameras as the perception sensors, and integrated with a higher level planner for trailing a maneuvering target.
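A Velocity Obstacle test asks whether the current relative velocity lies inside the collision cone subtended by the obstacle (inflated by the combined vessel radii). A minimal 2-D sketch of that geometric test (an illustration of the general technique, not the patented implementation):

```python
import math

def in_velocity_obstacle(p_own, v_own, p_obs, v_obs, r_combined):
    """True if holding the current velocities leads to a collision: the
    relative velocity points inside the collision cone toward the obstacle."""
    rx, ry = p_obs[0] - p_own[0], p_obs[1] - p_own[1]   # line of sight
    vx, vy = v_own[0] - v_obs[0], v_own[1] - v_obs[1]   # relative velocity
    dist = math.hypot(rx, ry)
    if dist <= r_combined:
        return True                                     # already in collision
    speed = math.hypot(vx, vy)
    if speed == 0.0:
        return False                                    # holding station
    cos_los = (rx * vx + ry * vy) / (dist * speed)
    angle = math.acos(max(-1.0, min(1.0, cos_los)))
    return angle < math.asin(r_combined / dist)         # inside the cone?
```

A planner built on this test would evaluate candidate own-ship velocities, discard those inside any vessel's velocity obstacle (and those violating COLREGS), and pick the best remaining one.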
Intelligent single switch wheelchair navigation.
Ka, Hyun W; Simpson, Richard; Chung, Younghyun
2012-11-01
We have developed an intelligent single switch scanning interface and wheelchair navigation assistance system, called intelligent single switch wheelchair navigation (ISSWN), to improve driving safety, comfort and efficiency for individuals who rely on single switch scanning as a control method. ISSWN combines a standard powered wheelchair with a laser rangefinder, a single switch scanning interface and a computer. It provides the user with context sensitive and task specific scanning options that reduce driving effort based on an interpretation of sensor data together with user input. Trials performed by 9 able-bodied participants showed that the system significantly improved driving safety and efficiency in a navigation task by significantly reducing the number of switch presses to 43.5% of traditional single switch wheelchair navigation (p < 0.001). All participants made a significant improvement (39.1%; p < 0.001) in completion time after only two trials.
HH-65A Dolphin digital integrated avionics
NASA Technical Reports Server (NTRS)
Huntoon, R. B.
1984-01-01
Communication, navigation, flight control, and search sensor management are the avionics functions that constitute every Search and Rescue (SAR) operation. Routine cockpit duties monopolize crew attention during SAR operations and thus impair crew effectiveness. The United States Coast Guard challenged industry to build an avionics system that automates routine tasks and frees the crew to focus on the mission tasks. The HH-65A SAR avionics subsystems of communication, navigation, search sensors, and flight control have existed independently. On the SRR helicopter, the flight management system (FMS) was introduced; it coordinates and integrates these functions. The pilot interacts with the FMS rather than the individual subsystems, using simple, straightforward procedures to address distinct mission tasks, and the flight management system, in turn, orchestrates the integrated system response.
2010-03-01
Characterization solutions enabled by Laser Doppler Vibrometer measurements (Proc. SPIE, Fifth International Conference on Vibration Measurements by Laser Techniques) are cited alongside commercial inertial capabilities: Ring Laser Gyros (RLGs), Fiber Optic Gyros (FOGs), and Micro-Electro-Mechanical Systems (MEMS) gyros and accelerometers. Augmentation sensors such as GPS, velocity meters, seekers, star trackers, magnetometers, and lidar have been tied into the inertial systems.
Linear Covariance Analysis for a Lunar Lander
NASA Technical Reports Server (NTRS)
Jang, Jiann-Woei; Bhatt, Sagar; Fritz, Matthew; Woffinden, David; May, Darryl; Braden, Ellen; Hannan, Michael
2017-01-01
A next-generation lunar lander Guidance, Navigation, and Control (GNC) system, which includes a state-of-the-art optical sensor suite, is proposed in a concept design cycle. The design goal is to allow the lander to softly land within the prescribed landing precision. The achievement of this precision landing requirement depends on proper selection of the sensor suite. In this paper, a robust sensor selection procedure is demonstrated using a Linear Covariance (LinCov) analysis tool developed by Draper.
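Linear covariance analysis of the kind described propagates the state dispersion covariance through the dynamics and sensor updates directly, without Monte Carlo trajectory runs. A minimal sketch follows, with a two-state (position, velocity) model and an assumed altimeter-like position sensor; the noise values are hypothetical and this is not Draper's LinCov tool.

```python
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity dynamics
Q = np.diag([0.0, 1e-4])                # process noise on velocity (assumed)
H = np.array([[1.0, 0.0]])              # position-only sensor (altimeter-like)
R = np.array([[4.0]])                   # sensor variance, m^2 (assumed)

P = np.diag([100.0, 1.0])               # initial dispersion covariance
for _ in range(50):
    P = F @ P @ F.T + Q                 # propagate dispersion
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    P = (np.eye(2) - K @ H) @ P         # measurement update shrinks P

# Steady-state 1-sigma position dispersion (m), well below the initial 10 m.
print(np.sqrt(P[0, 0]))
```

Repeating this loop for each candidate sensor suite, and comparing the terminal position covariance against the landing-precision requirement, is the essence of a sensor-selection trade done with linear covariance methods.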
A novel platform for electromagnetic navigated ultrasound bronchoscopy (EBUS).
Sorger, Hanne; Hofstad, Erlend Fagertun; Amundsen, Tore; Langø, Thomas; Leira, Håkon Olav
2016-08-01
Endobronchial ultrasound transbronchial needle aspiration (EBUS-TBNA) of mediastinal lymph nodes is essential for lung cancer staging and distinction between curative and palliative treatment. Precise sampling is crucial. Navigation and multimodal imaging may improve the efficiency of EBUS-TBNA. We demonstrate a novel EBUS-TBNA navigation system in a dedicated airway phantom. Using a convex probe EBUS bronchoscope (CP-EBUS) with an integrated sensor for electromagnetic (EM) position tracking, we performed navigated CP-EBUS in a phantom. Preoperative computed tomography (CT) and real-time ultrasound (US) images were integrated into a navigation platform for EM navigated bronchoscopy. The coordinates of targets in CT and US volumes were registered in the navigation system, and the position deviation was calculated. The system visualized all tumor models and displayed their fused CT and US images in correct positions in the navigation system. Navigating the EBUS bronchoscope was fast and easy. Mean error observed between US and CT positions for 11 target lesions (37 measurements) was [Formula: see text] mm, maximum error was 5.9 mm. The feasibility of our novel navigated CP-EBUS system was successfully demonstrated. An EBUS navigation system is needed to meet future requirements of precise mediastinal lymph node mapping, and provides new opportunities for procedure documentation in EBUS-TBNA.
Path planning in GPS-denied environments via collective intelligence of distributed sensor networks
NASA Astrophysics Data System (ADS)
Jha, Devesh K.; Chattopadhyay, Pritthi; Sarkar, Soumik; Ray, Asok
2016-05-01
This paper proposes a framework for reactive goal-directed navigation without global positioning facilities in unknown dynamic environments. A mobile sensor network is used for localising regions of interest for path planning of an autonomous mobile robot. The underlying theory is an extension of a generalised gossip algorithm that has been recently developed in a language-measure-theoretic setting. The algorithm has been used to propagate local decisions of target detection over a mobile sensor network and thus, it generates a belief map for the detected target over the network. In this setting, an autonomous mobile robot may communicate only with a few mobile sensing nodes in its own neighbourhood and localise itself relative to the communicating nodes with bounded uncertainties. The robot makes use of the knowledge based on the belief of the mobile sensors to generate a sequence of way-points, leading to a possible goal. The estimated way-points are used by a sampling-based motion planning algorithm to generate feasible trajectories for the robot. The proposed concept has been validated by numerical simulation on a mobile sensor network test-bed and a Dubin's car-like robot.
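The idea of propagating local target detections into a network-wide belief can be sketched with plain pairwise-averaging gossip; note this simple average is only a stand-in for the paper's language-measure-theoretic generalised gossip algorithm, and the edge list, round count, and belief values are illustrative.

```python
import random

def gossip_consensus(beliefs, edges, rounds=2000, seed=0):
    """Pairwise-averaging gossip: each round, a random edge (i, j) is chosen
    and both nodes replace their belief with the pair's average, so local
    detections diffuse into a network-wide belief value."""
    rng = random.Random(seed)
    b = list(beliefs)
    for _ in range(rounds):
        i, j = rng.choice(edges)
        avg = 0.5 * (b[i] + b[j])
        b[i] = b[j] = avg
    return b

# Ring of 5 sensor nodes; only node 0 has detected the target (belief 1.0).
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
belief = gossip_consensus([1.0, 0.0, 0.0, 0.0, 0.0], edges)
print(belief)  # every node's belief approaches the network average, 0.2
```

A robot that can talk only to nearby nodes then reads the local belief values and climbs their gradient to generate way-points toward the detected target region.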
A review of wearable technology in medicine.
Iqbal, Mohammed H; Aydin, Abdullatif; Brunckhorst, Oliver; Dasgupta, Prokar; Ahmed, Kamran
2016-10-01
With rapid advances in technology, wearable devices have evolved and been adopted for various uses, ranging from simple devices used in aiding fitness to more complex devices used in assisting surgery. Wearable technology is broadly divided into head-mounted displays and body sensors. A broad search of the current literature revealed a total of 13 different body sensors and 11 head-mounted display devices. The latter have been reported for use in surgery (n = 7), imaging (n = 3), simulation and education (n = 2) and as navigation tools (n = 1). Body sensors have been used as vital signs monitors (n = 9) and in posture- and fitness-related devices (n = 4). Body sensors were found to have excellent functionality in aiding patient posture and rehabilitation, while head-mounted displays can provide information to surgeons while maintaining sterility during operative procedures. There is a potential role for head-mounted wearable technology and body sensors in medicine and patient care. However, there is little scientific evidence available proving that the application of such technologies improves patient satisfaction or care. Further studies need to be conducted prior to a clear conclusion. © The Royal Society of Medicine.
Advancing Lidar Sensors Technologies for Next Generation Landing Missions
NASA Technical Reports Server (NTRS)
Amzajerdian, Farzin; Hines, Glenn D.; Roback, Vincent E.; Petway, Larry B.; Barnes, Bruce W.; Brewster, Paul F.; Pierrottet, Diego F.; Bulyshev, Alexander
2015-01-01
Missions to solar system bodies must meet increasingly ambitious objectives requiring highly reliable "precision landing" and "hazard avoidance" capabilities. Robotic missions to the Moon and Mars demand landing at pre-designated sites of high scientific value near hazardous terrain features, such as escarpments, craters, slopes, and rocks. Missions aimed at paving the path for colonization of the Moon and human landing on Mars need to execute onboard hazard detection and precision maneuvering to ensure safe landing near previously deployed assets. Asteroid missions require precision rendezvous, identification of the landing or sampling site location, and navigation to the highly dynamic object that may be tumbling at a fast rate. To meet these needs, NASA Langley Research Center (LaRC) has developed a set of advanced lidar sensors under the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. These lidar sensors can provide precision measurement of vehicle relative proximity, velocity, and orientation, and high resolution elevation maps of the surface during the descent to the targeted body. Recent flights onboard the Morpheus free-flyer vehicle have demonstrated the viability of ALHAT lidar sensors for future landing missions to solar system bodies.
Behavioral Mapless Navigation Using Rings
NASA Technical Reports Server (NTRS)
Monroe, Randall P.; Miller, Samuel A.; Bradley, Arthur T.
2012-01-01
This paper presents work on the development and implementation of a novel approach to robotic navigation. In this system, map-building and localization for obstacle avoidance are discarded in favor of moment-by-moment behavioral processing of the sonar sensor data. To accomplish this, we developed a network of behaviors that communicate through the passing of rings, data structures that are similar in form to the sonar data itself and express the decisions of each behavior. Through the use of these rings, behaviors can moderate each other, conflicting impulses can be mediated, and designers can easily connect modules to create complex emergent navigational techniques. We discuss the development of a number of these modules and their successful use as a navigation system in the Trinity omnidirectional robot.
NASA Astrophysics Data System (ADS)
Tom, Michael; Trujillo, Edward
1994-06-01
Integrated infrared (IR) sensors which exploit modular avionics concepts can provide features such as operational flexibility, enhanced stealthiness, and ease of maintenance to meet the demands of tactical, airborne sensor systems. On-board tactical airborne sensor systems perform target acquisition, tracking, identification, threat warning, missile launch detection, and ground mapping in support of situation awareness, self-defense, navigation, target attack, weapon support, and reconnaissance activities. The use of sensor suites for future tactical aircraft such as the US Air Force's multirole fighter requires a blend of sensor inputs and outputs that may vary over time. It is expected that special-role units of these tactical aircraft will be formed to conduct tasks and missions such as anti-shipping, reconnaissance, or suppression of enemy air defenses.
Integrated and Multi-Function Navigation (Les Systemes de Navigation Integres Multifunctions)
1992-11-01
AGARD provides assistance, as requested, to other NATO bodies and to member nations in connection with research and development problems in the aerospace field. The SARMCS (Synthetic Aperture Radar Motion Compensation System) project is aimed at motion compensation of radar returns to achieve high-resolution imaging; it draws on experience in the development and application of integrated navigation systems, uses essentially the same technology, and utilizes similar sensors for the mission.
Inertial Measurements for Aero-assisted Navigation (IMAN)
NASA Technical Reports Server (NTRS)
Jah, Moriba; Lisano, Michael; Hockney, George
2007-01-01
IMAN is a Python tool that provides inertial sensor-based estimates of spacecraft trajectories within an atmospheric influence. It provides Kalman filter-derived spacecraft state estimates based upon data collected onboard, and is shown to perform at a level comparable to the conventional methods of spacecraft navigation in terms of accuracy and at a higher level with regard to the availability of results immediately after completion of an atmospheric drag pass.
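The inertial core of such a drag-pass trajectory reconstruction is dead-reckoning from onboard accelerometer samples; a minimal one-dimensional sketch with trapezoidal integration and hypothetical sample values follows (IMAN itself wraps this kind of propagation in a Kalman filter).

```python
def propagate(pos, vel, accels, dt):
    """Dead-reckon position and velocity from a series of accelerometer
    samples using trapezoidal integration over each sample interval."""
    for a_prev, a_next in zip(accels, accels[1:]):
        a_mid = 0.5 * (a_prev + a_next)        # mean acceleration over the step
        pos += vel * dt + 0.5 * a_mid * dt * dt
        vel += a_mid * dt
    return pos, vel

# Hypothetical drag pass: constant 1 m/s^2 deceleration for 10 s from 100 m/s.
pos, vel = propagate(0.0, 100.0, [-1.0] * 11, 1.0)
print(pos, vel)  # 950.0 m travelled, 90.0 m/s remaining
```

In a filter formulation, this propagation step would supply the state prediction, with the covariance update quantifying how accelerometer noise and bias grow the trajectory uncertainty through the atmospheric pass.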
HyMoTrack: A Mobile AR Navigation System for Complex Indoor Environments.
Gerstweiler, Georg; Vonach, Emanuel; Kaufmann, Hannes
2015-12-24
Navigating in large, unfamiliar indoor environments with static 2D maps is a challenge, especially when time is a critical factor. In order to provide a mobile assistant capable of supporting people while navigating in indoor locations, an accurate and reliable localization system is required in almost every corner of the building. We present a solution to this problem through a hybrid tracking system specifically designed for complex indoor spaces, which runs on mobile devices like smartphones or tablets. The developed algorithm only uses the available sensors built into standard mobile devices, especially the inertial sensors and the RGB camera. The combination of multiple optical tracking technologies, such as 2D natural features and features of more complex three-dimensional structures, guarantees the robustness of the system. All processing is done locally and no network connection is needed. State-of-the-art indoor tracking approaches use mainly radio-frequency signals like Wi-Fi or Bluetooth for localizing a user. In contrast to these approaches, the main advantage of the developed system is the capability of delivering a continuous 3D position and orientation of the mobile device with centimeter accuracy. This makes it usable for localization and 3D augmentation purposes, e.g. navigation tasks or location-based information visualization.
NASA Technical Reports Server (NTRS)
Eberlein, A. J.; Lahm, T. G.
1976-01-01
This study investigates the degree to which flight-critical failures in a strapdown laser gyro tetrad sensor assembly can be isolated in short-haul aircraft after a failure has been detected by the skewed-sensor failure-detection voting logic, along with the degree to which a failure in the tetrad computer can be detected and isolated at the computer level, assuming a dual-redundant computer configuration. The tetrad system was mechanized with two two-axis inertial navigation channels (INCs), each containing two gyro/accelerometer axes, computer, control circuitry, and input/output circuitry. Gyro/accelerometer data is cross-fed between the two INCs to enable each computer to independently perform the navigation task. Computer calculations are synchronized between the computers so that calculated quantities are identical and may be compared. Fail-safe performance (identification of the first failure) is accomplished with a probability approaching 100 percent, while fail-operational performance (identification and isolation of the first failure) is achieved 93 to 96 percent of the time.
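The skewed-sensor detection logic rests on a parity relation: with four sense axes measuring a three-axis rate, one linear combination of the outputs is always (near) zero unless a sensor has failed. A minimal sketch follows; the axis geometry, threshold, and failure magnitude are illustrative assumptions, not the tetrad's actual mechanization.

```python
import numpy as np

# Tetrad sense axes: three orthogonal axes plus one skewed axis (assumed geometry).
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
H[3] /= np.sqrt(3.0)

# Parity vector: spans the left null space of H, so v @ H = 0 and the
# residual v @ z vanishes for any failure-free measurement set z = H @ omega.
v = np.array([1.0, 1.0, 1.0, -np.sqrt(3.0)])
v /= np.linalg.norm(v)

def failure_detected(z, threshold=0.05):
    """Flag a sensor failure when the parity residual exceeds the threshold."""
    return abs(v @ z) > threshold

omega = np.array([0.1, -0.2, 0.3])                 # true body rate, rad/s
z_good = H @ omega                                  # all four sensors healthy
z_bad = z_good + np.array([0.0, 0.5, 0.0, 0.0])     # gyro 2 develops a bias
print(failure_detected(z_good), failure_detected(z_bad))  # False True
```

Isolation (deciding which of the four sensors failed) requires more than this single residual, which is why the paper pairs the voting logic with cross-channel comparison between the dual-redundant computers.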
POSE Algorithms for Automated Docking
NASA Technical Reports Server (NTRS)
Heaton, Andrew F.; Howard, Richard T.
2011-01-01
POSE (relative position and attitude) can be computed in many different ways. Given a sensor that measures bearing to a finite number of spots corresponding to known features (such as a target) of a spacecraft, a number of different algorithms can be used to compute the POSE. NASA has sponsored the development of a flash LIDAR proximity sensor called the Vision Navigation Sensor (VNS) for use by the Orion capsule in future docking missions. This sensor generates data that can be used by a variety of algorithms to compute POSE solutions inside of 15 meters, including at the critical docking range of approximately 1-2 meters. Previously NASA participated in a DARPA program called Orbital Express that achieved the first automated docking for the American space program. During this mission a large set of high quality mated sensor data was obtained at what is essentially the docking distance. This data set is perhaps the most accurate truth data in existence for docking proximity sensors in orbit. In this paper, the flight data from Orbital Express is used to test POSE algorithms at 1.22 meters range. Two different POSE algorithms are tested for two different Fields-of-View (FOVs) and two different pixel noise levels. The results of the analysis are used to predict future performance of the POSE algorithms with VNS data.
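Given 3-D feature positions derived from such bearing measurements, one standard way to compute POSE from known target features is an SVD-based (Kabsch) least-squares rigid fit. This generic sketch is not necessarily either of the algorithms tested in the paper; the feature coordinates and the 1.22 m offset are illustrative.

```python
import numpy as np

def solve_pose(model_pts, meas_pts):
    """Least-squares rigid POSE (rotation R, translation t) mapping known
    target feature points onto measured points, via the SVD (Kabsch) method."""
    mc = model_pts.mean(axis=0)
    sc = meas_pts.mean(axis=0)
    H = (model_pts - mc).T @ (meas_pts - sc)    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = sc - R @ mc
    return R, t

# Four hypothetical target features, rotated 90 degrees about the docking
# axis and offset 1.22 m along the approach direction.
model = np.array([[0.1, 0.0, 0.0], [0.0, 0.1, 0.0],
                  [-0.1, 0.0, 0.0], [0.0, 0.0, 0.1]])
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([0.0, 0.0, 1.22])
meas = model @ R_true.T + t_true                # simulated sensor measurements
R, t = solve_pose(model, meas)
print(np.allclose(R, R_true), np.allclose(t, t_true))
```

With real flash-LIDAR spot measurements the fit is no longer exact, and the residual after the fit gives a direct figure of merit for comparing algorithms, FOVs, and pixel noise levels as done in the paper.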
Sanchez, Richard D.
2004-01-01
High-resolution airborne digital cameras with onboard data collection based on Global Positioning System (GPS) and inertial navigation system (INS) technology may offer a real-time means to gather accurate topographic map information by reducing ground control and eliminating aerial triangulation. Past evaluations of this integrated system over relatively flat terrain have proven successful. The author uses the Emerge Digital Sensor System (DSS) combined with Applanix Corporation's Position and Orientation Solutions for Direct Georeferencing to examine positional mapping accuracy in rough terrain. The positional accuracy documented in this study did not meet large-scale mapping requirements owing to an apparent system mechanical failure. Nonetheless, the findings yield important information on a new approach for mapping in Antarctica and other remote or inaccessible areas of the world.
Hoss, Udo; Jeddi, Iman; Schulz, Mark; Budiman, Erwin; Bhogal, Claire; McGarraugh, Geoffrey
2010-08-01
Commercial continuous subcutaneous glucose monitors require in vivo calibration using capillary blood glucose tests. Feasibility of factory calibration, i.e., sensor batch characterization in vitro with no further need for in vivo calibration, requires a predictable and stable in vivo sensor sensitivity and limited inter- and intra-subject variation of the ratio of interstitial to blood glucose concentration. Twelve volunteers wore two FreeStyle Navigator (Abbott Diabetes Care, Alameda, CA) continuous glucose monitoring systems for 5 days in parallel for two consecutive sensor wears (four sensors per subject, 48 sensors total). Sensors from a prototype sensor lot with a low variability in glucose sensitivity were used for the study. Median sensor sensitivity values based on capillary blood glucose were calculated per sensor and compared for inter- and intra-subject variation. Mean absolute relative difference (MARD) calculation and error grid analysis were performed using a single calibration factor for all sensors to simulate factory calibration and compared to standard fingerstick calibration. Sensor sensitivity variation in vitro was 4.6%, which increased to 8.3% in vivo (P < 0.0001). Analysis of variance revealed no significant inter-subject differences in sensor sensitivity (P = 0.134). Applying a single universal calibration factor retrospectively to all sensors resulted in a MARD of 10.4% and 88.1% of values in Clarke Error Grid Zone A, compared to a MARD of 10.9% and 86% of values in Error Grid Zone A for fingerstick calibration. Factory calibration of sensors for continuous subcutaneous glucose monitoring is feasible with similar accuracy to standard fingerstick calibration. Additional data are required to confirm this result in subjects with diabetes.
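The MARD figure reported above is simply the mean of per-sample absolute relative differences between sensor and reference glucose readings; a minimal sketch with hypothetical paired readings:

```python
def mard(sensor_vals, reference_vals):
    """Mean Absolute Relative Difference (%) between paired sensor readings
    and reference (e.g. capillary blood) glucose values."""
    diffs = [abs(s - r) / r for s, r in zip(sensor_vals, reference_vals)]
    return 100.0 * sum(diffs) / len(diffs)

# Hypothetical paired readings in mg/dL (sensor vs. fingerstick reference).
print(mard([110, 95, 160], [100, 100, 150]))  # about 7.22 %
```

Comparing the MARD obtained with a single factory calibration factor against the MARD from per-sensor fingerstick calibration, as the study does, quantifies whether batch characterization alone is accurate enough.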
NASA Technical Reports Server (NTRS)
Roback, V. Eric; Pierrottet, Diego F.; Amzajerdian, Farzin; Barnes, Bruce W.; Bulyshev, Alexander E.; Hines, Glenn D.; Petway, Larry B.; Brewster, Paul F.; Kempton, Kevin S.
2015-01-01
For the first time, a suite of three lidar sensors has been used in flight to scan a lunar-like hazard field, identify a safe landing site, and, in concert with an experimental Guidance, Navigation, and Control (GN&C) system, help to guide the Morpheus autonomous, rocket-propelled, free-flying lander to that safe site on the hazard field. The lidar sensors and GN&C system are part of the Autonomous Precision Landing and Hazard Detection and Avoidance Technology (ALHAT) project, which has been seeking to develop a system capable of enabling safe, precise crewed or robotic landings in challenging terrain on planetary bodies under any ambient lighting conditions. The 3-D imaging Flash Lidar is a second-generation, compact, real-time, air-cooled instrument developed from a number of components from industry and NASA and is used as part of the ALHAT Hazard Detection System (HDS) to scan the hazard field and build a 3-D Digital Elevation Map (DEM) in near-real time for identifying safe sites. The Flash Lidar is capable of identifying a 30 cm hazard from a slant range of 1 km with its 8 cm range precision (1-sigma). The Flash Lidar is also used in Hazard Relative Navigation (HRN) to provide position updates down to a 250 m slant range to the ALHAT navigation filter as it guides Morpheus to the safe site. The Navigation Doppler Lidar (NDL) system has been developed within NASA to provide velocity measurements with an accuracy of 0.2 cm/sec and range measurements with an accuracy of 17 cm, both from a maximum range of 2,200 m to a minimum range of several meters above the ground. The NDL's measurements are fed into the ALHAT navigation filter to provide lander guidance to the safe site. The Laser Altimeter (LA), also developed within NASA, provides range measurements with an accuracy of 5 cm from a maximum operational range of 30 km down to 1 m and, being a separate sensor from the Flash Lidar, can provide range along a separate vector.
The LA measurements are also fed into the ALHAT navigation filter to provide lander guidance to the safe site. The flight tests served as the culmination of the TRL 6 journey for the ALHAT system and included launch from a pad situated at the NASA-Kennedy Space Center Shuttle Landing Facility (SLF) runway, a lunar-like descent trajectory from an altitude of 250 m, and landing on a lunar-like hazard field of rocks, craters, hazardous slopes, and safe sites 400 m down-range, just off the north end of the runway. The tests confirmed the expected performance and also revealed several challenges present in the flight-like environment which will feed into future TRL advancement of the sensors. Guidance provided by the ALHAT system was impeded in portions of the trajectory and intermittent near the end of the trajectory due to optical effects arising from air heated by the rocket engine. The Flash Lidar identified hazards as small as 30 cm from the maximum slant range of 450 m which Morpheus could provide; however, it was occasionally susceptible to an increase in range noise due to scintillation arising from air heated by the Morpheus rocket engine which entered its Field-of-View (FOV). The Flash Lidar was also susceptible to pre-triggering, during the HRN phase, on a dust cloud created during launch and transported down-range by the wind. The NDL provided velocity and range measurements to the expected accuracy levels, yet it was also susceptible to signal degradation due to air heated by the rocket engine. The LA, operating with a degraded transmitter laser, also showed signal attenuation over a few seconds at a specific phase of the flight due to the heat plume generated by the rocket engine.
Application of parallelized software architecture to an autonomous ground vehicle
NASA Astrophysics Data System (ADS)
Shakya, Rahul; Wright, Adam; Shin, Young Ho; Momin, Orko; Petkovsek, Steven; Wortman, Paul; Gautam, Prasanna; Norton, Adam
2011-01-01
This paper presents improvements made to Q, an autonomous ground vehicle designed to participate in the Intelligent Ground Vehicle Competition (IGVC). For the 2010 IGVC, Q was upgraded with a new parallelized software architecture and a new vision processor. Improvements were made to the power system reducing the number of batteries required for operation from six to one. In previous years, a single state machine was used to execute the bulk of processing activities including sensor interfacing, data processing, path planning, navigation algorithms and motor control. This inefficient approach led to poor software performance and made it difficult to maintain or modify. For IGVC 2010, the team implemented a modular parallel architecture using the National Instruments (NI) LabVIEW programming language. The new architecture divides all the necessary tasks - motor control, navigation, sensor data collection, etc. into well-organized components that execute in parallel, providing considerable flexibility and facilitating efficient use of processing power. Computer vision is used to detect white lines on the ground and determine their location relative to the robot. With the new vision processor and some optimization of the image processing algorithm used last year, two frames can be acquired and processed in 70ms. With all these improvements, Q placed 2nd in the autonomous challenge.
NASA Technical Reports Server (NTRS)
Winternitz, Luke
2017-01-01
This talk will describe two first-of-their-kind technology demonstrations attached to ongoing NASA science missions, both of which aim to extend the range of autonomous spacecraft navigation far from the Earth. First, we will describe the onboard GPS navigation system for the Magnetospheric Multiscale (MMS) mission which is currently operating in elliptic orbits reaching nearly halfway to the Moon. The MMS navigation system is a key outgrowth of a larger effort at NASA Goddard Space Flight Center to advance high-altitude Global Navigation Satellite System (GNSS) navigation on multiple fronts, including developing Global Positioning System receivers and onboard navigation software, running simulation studies, and leading efforts to characterize and protect signals at high-altitude in the so-called GNSS Space-Service Volume (SSV). In the second part of the talk, we will describe the Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) mission that aims to make the first in-space demonstration of X-ray pulsar navigation (XNAV). SEXTANT is attached to the NASA astrophysics mission Neutron-star Interior Composition ExploreR (NICER) whose International Space Station mounted X-ray telescope is investigating the fundamental physics of extremes in gravity, material density, and electromagnetic fields found in neutron stars, and whose instrument provides a nearly ideal navigation sensor for XNAV.
Drift Reduction in Pedestrian Navigation System by Exploiting Motion Constraints and Magnetic Field
Ilyas, Muhammad; Cho, Kuk; Baeg, Seung-Ho; Park, Sangdeok
2016-01-01
Pedestrian navigation systems (PNS) using foot-mounted MEMS inertial sensors use zero-velocity updates (ZUPTs) to reduce drift in navigation solutions and estimate inertial sensor errors. However, it is well known that ZUPTs cannot reduce all errors, especially as heading error is not observable. Hence, the position estimates tend to drift and even cyclic ZUPTs are applied in updated steps of the Extended Kalman Filter (EKF). This urges the use of other motion constraints for pedestrian gait and any other valuable heading reduction information that is available. In this paper, we exploit two more motion constraints scenarios of pedestrian gait: (1) walking along straight paths; (2) standing still for a long time. It is observed that these motion constraints (called “virtual sensor”), though considerably reducing drift in PNS, still need an absolute heading reference. One common absolute heading estimation sensor is the magnetometer, which senses the Earth’s magnetic field and, hence, the true heading angle can be calculated. However, magnetometers are susceptible to magnetic distortions, especially in indoor environments. In this work, an algorithm, called magnetic anomaly detection (MAD) and compensation is designed by incorporating only healthy magnetometer data in the EKF updating step, to reduce drift in zero-velocity updated INS. Experiments are conducted in GPS-denied and magnetically distorted environments to validate the proposed algorithms. PMID:27618056
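Two building blocks of such a foot-mounted PNS can be sketched directly: a stance-phase (zero-velocity) detector on accelerometer magnitude, and a MAD-style magnetometer health check on field magnitude. The thresholds and nominal field value below are illustrative assumptions, not the paper's tuned parameters.

```python
import math
import statistics

GRAVITY = 9.81
NOMINAL_FIELD_UT = 50.0   # assumed nominal local field magnitude, microtesla

def is_zupt(accel_window, var_thresh=0.01, mean_thresh=0.3):
    """Stance-phase detector: the foot is considered still when the
    accelerometer magnitude stays near 1 g with low variance."""
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in accel_window]
    return (statistics.pvariance(mags) < var_thresh
            and abs(statistics.mean(mags) - GRAVITY) < mean_thresh)

def is_magnetic_anomaly(mag_sample, tol_ut=5.0):
    """MAD-style health check: reject magnetometer samples whose field
    magnitude deviates too far from the nominal Earth field."""
    magnitude = math.sqrt(sum(m * m for m in mag_sample))
    return abs(magnitude - NOMINAL_FIELD_UT) > tol_ut

still = [(0.0, 0.0, 9.81)] * 20                # foot at rest: triggers a ZUPT
print(is_zupt(still))                          # True
print(is_magnetic_anomaly((0.0, 0.0, 70.0)))   # True: distorted sample rejected
```

In the EKF, windows flagged by `is_zupt` supply zero-velocity pseudo-measurements, while only magnetometer samples passing the anomaly check are allowed to correct heading, which is the gist of incorporating "healthy" magnetometer data in the update step.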
Tang, Rui; Ma, Longfei; Li, Ang; Yu, Lihan; Rong, Zhixia; Zhang, Xinjing; Xiang, Canhong; Liao, Hongen; Dong, Jiahong
2018-06-01
We applied augmented reality (AR) techniques to flexible choledochoscopy examinations. Enhanced computed tomography data of a patient with intrahepatic and extrahepatic biliary duct dilatation were collected to generate a hollow, 3-dimensional (3D) model of the biliary tree by 3D printing. The 3D printed model was placed in an opaque box. An electromagnetic (EM) sensor was internally installed in the choledochoscope instrument channel for tracking its movements through the passages of the 3D printed model, and an AR navigation platform was built using image overlay display. The porta hepatis was used as the reference marker with rigid image registration. The trajectories of the choledochoscope and the EM sensor were observed and recorded using the operator interface of the choledochoscope. Training choledochoscopy was performed on the 3D printed model. The choledochoscope was guided into the left and right hepatic ducts, the right anterior hepatic duct, the bile ducts of segment 8, the hepatic duct in subsegment 8, the right posterior hepatic duct, and the left and the right bile ducts of the caudate lobe. Although stability in tracking was less than ideal, the virtual choledochoscope images and EM sensor tracking were effective for navigation. AR techniques can be used to assist navigation in choledochoscopy examinations in bile duct models. Further research is needed to determine its benefits in clinical settings.
Jiao, Jialong; Ren, Huilong; Adenya, Christiaan Adika; Chen, Chaohe
2017-01-01
Wave-induced motion and load responses are important criteria for ship performance evaluation. Physical experiments have long been an indispensable tool in the predictions of ship’s navigation state, speed, motions, accelerations, sectional loads and wave impact pressure. Currently, majority of the experiments are conducted in laboratory tank environment, where the wave environments are different from the realistic sea waves. In this paper, a laboratory tank testing system for ship motions and loads measurement is reviewed and reported first. Then, a novel large-scale model measurement technique is developed based on the laboratory testing foundations to obtain accurate motion and load responses of ships in realistic sea conditions. For this purpose, a suite of advanced remote control and telemetry experimental system was developed in-house to allow for the implementation of large-scale model seakeeping measurement at sea. The experimental system includes a series of technique sensors, e.g., the Global Position System/Inertial Navigation System (GPS/INS) module, course top, optical fiber sensors, strain gauges, pressure sensors and accelerometers. The developed measurement system was tested by field experiments in coastal seas, which indicates that the proposed large-scale model testing scheme is capable and feasible. Meaningful data including ocean environment parameters, ship navigation state, motions and loads were obtained through the sea trial campaign. PMID:29109379
Luo, Xiongbiao; Wan, Ying; He, Xiangjian; Mori, Kensaku
2015-02-01
Registration of pre-clinical images to physical space is indispensable for computer-assisted endoscopic interventions in operating rooms. Electromagnetically navigated endoscopic interventions are increasingly performed in current diagnosis and treatment. Such interventions use an electromagnetic tracker with a miniature sensor, usually attached at the endoscope's distal tip, to track endoscope movements in a pre-clinical image space in real time. Spatial alignment between the electromagnetic tracker (or sensor) and the pre-clinical images must be performed to navigate the endoscope to target regions. This paper proposes an adaptive marker-free registration method that uses a multiple-point selection strategy. The method seeks to relax the assumption that the endoscope is operated along the centerline of an intraluminal organ, which is easily violated during interventions. We introduce an adaptive strategy that generates multiple points from sensor measurements and endoscope tip center calibration. From these generated points, we adaptively choose the optimal point, i.e., the one closest to its assigned centerline of the hollow organ, to perform registration. The experimental results demonstrate that our proposed adaptive strategy significantly reduced the target registration error from 5.32 to 2.59 mm in static phantom validation, and from at least 7.58 mm to 4.71 mm in dynamic phantom validation, compared to currently available methods. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
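The adaptive point-selection step described above can be sketched as follows. This is a minimal illustration, assuming the organ centerline is given as a 3D polyline and the candidate points come from sensor measurements plus tip-center calibration; the function names are hypothetical, not from the paper:

```python
import math

def point_to_segment_distance(p, a, b):
    """Distance from 3D point p to the line segment a-b."""
    ab = tuple(bi - ai for ai, bi in zip(a, b))
    ap = tuple(pi - ai for ai, pi in zip(a, p))
    denom = sum(c * c for c in ab)
    # Clamp the projection parameter to stay on the segment.
    t = 0.0 if denom == 0 else max(0.0, min(1.0, sum(u * v for u, v in zip(ap, ab)) / denom))
    closest = tuple(ai + t * di for ai, di in zip(a, ab))
    return math.dist(p, closest)

def distance_to_centerline(p, centerline):
    """Minimum distance from p to a polyline given as a list of points."""
    return min(point_to_segment_distance(p, a, b)
               for a, b in zip(centerline, centerline[1:]))

def select_registration_point(candidates, centerline):
    """Adaptively pick the candidate point closest to the duct centerline."""
    return min(candidates, key=lambda p: distance_to_centerline(p, centerline))

# Straight centerline along x; the candidate (1, 0.1, 0) lies nearest to it.
centerline = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
candidates = [(1.0, 0.1, 0.0), (1.0, 1.0, 0.0), (0.5, -0.8, 0.2)]
print(select_registration_point(candidates, centerline))  # -> (1.0, 0.1, 0.0)
```

The selected points would then feed a standard rigid point-based registration; that downstream step is unchanged from conventional methods.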
The GNC Measurement System for the Automated Transfer Vehicle
NASA Astrophysics Data System (ADS)
Roux, Y.; da Cunha, P.
The Automated Transfer Vehicle (ATV) is a European Space Agency (ESA) funded spacecraft developed by EADS Space Transportation, as prime contractor for the space segment, together with major European industrial partners, in the frame of the International Space Station (ISS). Its mission objective is threefold: to supply the station with cargo and propellant, to reboost the ISS to a higher orbit, and to dispose of waste from the station. The first ATV flight, called Jules Verne and planned for 2005, will be the first European vehicle to perform an orbital rendezvous. The GNC Measurement System (GMS) is the ATV on-board function in charge of collecting and preconditioning measurement data for the guidance, navigation and control (GNC) algorithms. The GMS is made up of hardware, namely the navigation sensors (each with a certain level of hardware redundancy), and of on-board software that manages and monitors the sensors and performs consistency checks to detect and isolate potential sensor failures. The GMS relies on six kinds of navigation sensors, used during various phases of the mission: the gyrometer assembly (GYRA), the accelerometer assembly (ACCA), the star trackers (STR), the GPS receivers, the telegoniometers (TGM) and the videometers (VDM), the last two being used for the final rendezvous phase. The GMS function is developed by EADS Space Transportation together with other industrial partners: EADS Astrium, EADS Sodern, Laben and Dasa Jena Optronik.
Inertial Sensor Characterization for Inertial Navigation and Human Motion Tracking Applications
2012-06-01
The inertial sensors studied provide three-dimensional (3D) orientation, acceleration, rate of turn, and magnetic field information.
NASA Astrophysics Data System (ADS)
Uijt de Haag, Maarten; Venable, Kyle; Bezawada, Rajesh; Adami, Tony; Vadlamani, Ananth K.
2009-05-01
This paper discusses a sensor simulator/synthesizer framework that can be used to test and evaluate various sensor integration strategies for the implementation of an External Hazard Monitor (EHM) and Integrated Alerting and Notification (IAN) function as part of NASA's Integrated Intelligent Flight Deck (IIFD) project. The IIFD project under the NASA's Aviation Safety program "pursues technologies related to the flight deck that ensure crew workload and situational awareness are both safely optimized and adapted to the future operational environment as envisioned by NextGen." Within the simulation framework, various inputs to the IIFD and its subsystems, the EHM and IAN, are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. Sensors and avionics included in this framework are TCAS, ADS-B, Forward-Looking Infrared, Vision cameras, GPS, Inertial navigators, EGPWS, Laser Detection and Ranging sensors, altimeters, communication links with ATC, and weather radar. The framework is implemented in Simulink, a modeling language developed by The Mathworks. This modeling language allows for test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft. Specifically, this paper addresses the architecture of the simulator, the sensor model interfaces, the timing and database (environment) aspects of the sensor models, the user interface of the modeling environment, and the various avionics implementations.
A navigation system for the visually impaired using colored navigation lines and RFID tags.
Seto, Tatsuya
2009-01-01
In this paper, we describe a navigation system developed to support independent walking by the visually impaired in indoor spaces. The developed instrument consists of a navigation system and a map information system, both installed on a white cane. The navigation system can follow a colored navigation line set on the floor: a color sensor installed on the tip of the white cane senses the line, and the system informs the user by vibration that he/she is walking along it. The color recognition system is controlled by a one-chip microprocessor and can discriminate 6 colored navigation lines. RFID tags and a receiver for these tags are used in the map information system; the tags and the RFID tag receiver are also installed on the white cane. The receiver receives tag information and announces map information to the user by mp3-formatted pre-recorded voice. Three normal subjects, blindfolded with an eye mask, tested the system. All of them were able to walk along the navigation line, and the performance of the map information system was good. Therefore, our system should be valuable in supporting the activities of the visually impaired.
Applying FastSLAM to Articulated Rovers
NASA Astrophysics Data System (ADS)
Hewitt, Robert Alexander
This thesis presents the navigation algorithms designed for use on Kapvik, a 30 kg planetary micro-rover built for the Canadian Space Agency; the simulations used to test the algorithm; and novel techniques for terrain classification using Kapvik's LIDAR (Light Detection And Ranging) sensor. Kapvik implements a six-wheeled, skid-steered, rocker-bogie mobility system. This warrants a more complicated kinematic model for navigation than a typical 4-wheel differential drive system. The design of a 3D navigation algorithm is presented that includes nonlinear Kalman filtering and Simultaneous Localization and Mapping (SLAM). A neural network for terrain classification is used to improve navigation performance. Simulation is used to train the neural network and validate the navigation algorithms. Real world tests of the terrain classification algorithm validate the use of simulation for training and the improvement to SLAM through the reduction of extraneous LIDAR measurements in each scan.
Augmenting the Global Positioning System with Foreign Navigation Systems and Alternative Sensors
2012-03-01
A simulation was set up for an autonomous aerial vehicle flight through the model, using a Kalman filter to combine the various sensors with GPS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pin, F.G.; de Saussure, G.; Spelt, P.F.
1988-01-01
This paper describes recent research activities at the Center for Engineering Systems Advanced Research (CESAR) in the area of sensor-based reasoning, with emphasis given to their application and implementation on our HERMIES-IIB autonomous mobile vehicle. These activities, including navigation and exploration in a priori unknown and dynamic environments, goal recognition, vision-guided manipulation and sensor-driven machine learning, are discussed within the framework of a scenario in which an autonomous robot is asked to navigate through an unknown dynamic environment, explore, find and dock at a control panel, read and understand the status of the panel's meters and dials, learn the functioning of the process control panel, and successfully manipulate the panel's control devices to solve a maintenance emergency problem. A demonstration of the successful implementation of the algorithms on our HERMIES-IIB autonomous robot for resolution of this scenario is presented. Conclusions are drawn concerning the applicability of the methodologies to more general classes of problems, and implications for future work on sensor-driven reasoning for autonomous robots are discussed. 8 refs., 3 figs.
Multi Sensor Fusion Framework for Indoor-Outdoor Localization of Limited Resource Mobile Robots
Marín, Leonardo; Vallés, Marina; Soriano, Ángel; Valera, Ángel; Albertos, Pedro
2013-01-01
This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event based schedule, using fewer resources (execution time and bandwidth) but with similar performance when compared to the traditional methods. The event is defined to reflect the necessity of the global information, when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorm NXT, and consist of a differential wheel mobile robot navigating indoors with a zenithal camera as global sensor, and an Ackermann steering mobile robot navigating outdoors with a SBG Systems GPS accessed through an IGEP board that also serves as datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass and two accelerometers from Hitecnic, placed according to a particle based dynamic model of the robots. The tests performed reflect the correct performance and low execution time of the proposed framework. The robustness and stability is observed during a long walk test in both indoors and outdoors environments. PMID:24152933
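The event condition described above — run the (expensive) global-sensor correction only when the estimation error covariance exceeds a predefined limit — can be illustrated with a minimal one-dimensional sketch. The scalar filter, noise values, and threshold below are illustrative assumptions, not the paper's implementation:

```python
def event_based_kf(odom_increments, global_meas, q=0.04, r=0.01, p_limit=0.1):
    """1D constant-position Kalman filter with an event-triggered update:
    local (IMU/odometry) data always propagates the state, but the global
    sensor is consulted only when the error variance p exceeds p_limit."""
    x, p = 0.0, 0.0
    used_global = 0
    for u, z in zip(odom_increments, global_meas):
        x += u           # predict using the local increment
        p += q           # uncertainty grows every step
        if p > p_limit:  # event: global information is now necessary
            k = p / (p + r)
            x += k * (z - x)
            p *= (1.0 - k)
            used_global += 1
    return x, p, used_global

# Six unit steps east with a global fix available every step; the filter
# only consumes two of the six global measurements.
x, p, n = event_based_kf([1.0] * 6, [1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
print(x, n)  # -> 6.0 2
```

The saving in bandwidth and execution time comes precisely from the skipped corrections, at the cost of letting the covariance drift up to the chosen limit between events.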
NASA Astrophysics Data System (ADS)
Lu, Jiazhen; Lei, Chaohua; Yang, Yanqiang; Liu, Ming
2017-06-01
Many countries are paying great attention to space exploration, especially of the Moon and Mars. Autonomous, high-accuracy navigation systems are needed for probes and rovers to accomplish their missions. Integrated inertial navigation system (INS)/celestial navigation system (CNS) navigation has been widely used on lunar rovers, and initialization is a particularly important step for navigation. This paper presents an in-motion alignment and positioning method for lunar rovers based on INS/CNS/odometer integrated navigation. The method can estimate not only the position and attitude errors, but also the biases of the accelerometers and gyros, using a standard Kalman filter. The differences between the platform star azimuth and elevation angles and the computed star azimuth and elevation angles, and the difference between the velocity measured by the odometer and the velocity measured by the inertial sensors, are taken as measurements. Semi-physical experiments demonstrate that the position error can be reduced to 10 m and the attitude error stays within 2″ over 5 min. The experimental results show that this is an effective and attractive initialization approach for lunar rovers.
A Hardware-in-the-Loop Testbed for Spacecraft Formation Flying Applications
NASA Technical Reports Server (NTRS)
Leitner, Jesse; Bauer, Frank H. (Technical Monitor)
2001-01-01
The Formation Flying Test Bed (FFTB) at NASA Goddard Space Flight Center (GSFC) is being developed as a modular, hybrid dynamic simulation facility employed for end-to-end guidance, navigation, and control (GN&C) analysis and design for formation flying clusters and constellations of satellites. The FFTB will support critical hardware and software technology development to enable current and future missions for NASA, other government agencies, and external customers for a wide range of missions, particularly those involving distributed spacecraft operations. The initial capabilities of the FFTB are based upon an integration of high-fidelity hardware and software simulation, emulation, and test platforms developed at GSFC in recent years, including a high-fidelity GPS simulator which has been a fundamental component of the Guidance, Navigation, and Control Center's GPS Test Facility. The FFTB will be continuously evolving over the next several years from a tool with initial capabilities in GPS navigation hardware/software-in-the-loop analysis and closed-loop GPS-based orbit control algorithm assessment to one with cross-link communications and relative navigation analysis and simulation capability. Eventually the FFTB will provide full capability to support all aspects of multi-sensor, absolute and relative position determination and control, in all (attitude and orbit) degrees of freedom, as well as information management for satellite clusters and constellations. In this paper we focus on the architecture of the FFTB as a general GN&C analysis environment for the spacecraft formation flying community inside and outside of NASA GSFC, and we briefly reference some current and future activities that will drive the requirements and development.
Remote Marker-Based Tracking for UAV Landing Using Visible-Light Camera Sensor.
Nguyen, Phong Ha; Kim, Ki Wan; Lee, Young Won; Park, Kang Ryoung
2017-08-30
Unmanned aerial vehicles (UAVs), commonly known as drones, have proved useful not only on battlefields where manned flight is considered too risky or difficult, but also in everyday purposes such as surveillance, monitoring, rescue, unmanned cargo, aerial video, and photography. More advanced drones use global positioning system (GPS) receivers in the navigation and control loop, which enables smart GPS features of drone navigation. However, problems arise when drones operate in areas with no GPS signal, so it is important to research the development of UAVs with autonomous navigation and landing guidance using computer vision. In this research, we determined how to safely land a drone in the absence of GPS signals using our remote marker-based tracking algorithm based on a visible-light camera sensor. The proposed method uses a unique marker designed as a tracking target during landing procedures. Experimental results show that our method significantly outperforms state-of-the-art object trackers in terms of both accuracy and processing time, and we performed tests on an embedded system in various environments.
A Fully Sensorized Cooperative Robotic System for Surgical Interventions
Tovar-Arriaga, Saúl; Vargas, José Emilio; Ramos, Juan M.; Aceves, Marco A.; Gorrostieta, Efren; Kalender, Willi A.
2012-01-01
In this research, a fully sensorized cooperative robot system for the manipulation of needles is presented. The setup consists of a DLR/KUKA Light Weight Robot III especially designed for safe human/robot interaction, an FD-CT robot-driven angiographic C-arm system, and a navigation camera. New control strategies for robot manipulation in the clinical environment are also introduced. A method for fast calibration of the involved components and preliminary accuracy tests of the whole possible error chain are presented. Calibration of the robot with the navigation system has a residual error of 0.81 mm (rms) with a standard deviation of ±0.41 mm. The accuracy of the robotic system while targeting fixed points at different positions within the workspace is 1.2 mm (rms) with a standard deviation of ±0.4 mm. After calibration, and due to closed-loop control, the absolute positioning error was reduced to that of the navigation camera, which is 0.35 mm (rms). The implemented control allows the robot to compensate for small patient movements. PMID:23012551
A Robust Method to Detect Zero Velocity for Improved 3D Personal Navigation Using Inertial Sensors
Xu, Zhengyi; Wei, Jianming; Zhang, Bo; Yang, Weijun
2015-01-01
This paper proposes a robust zero velocity (ZV) detector algorithm to accurately identify the stationary periods in a gait cycle. The proposed algorithm adopts an effective gait cycle segmentation method and introduces a Bayesian network (BN) model, based on inertial sensor measurements and kinesiology knowledge, to infer the ZV period. During the detected ZV period, an Extended Kalman Filter (EKF) is used to estimate the error states and correct the position error. The experiments reveal that the removal rate of ZV false detections by the proposed method increases by 80% compared with the traditional method at high walking speeds. Furthermore, based on the detected ZV, the Personal Inertial Navigation System (PINS) algorithm aided by the EKF performs better, especially in the altitude dimension. PMID:25831086
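A much simpler stationary-period detector than the paper's Bayesian-network model can illustrate the underlying idea: flag samples whose local window of accelerometer magnitude is close to gravity and nearly constant. The window size and thresholds below are illustrative assumptions:

```python
import statistics

def detect_zero_velocity(accel_mag, window=5, g=9.81,
                         var_thresh=0.01, mean_tol=0.3):
    """Flag samples whose local window looks stationary: mean magnitude
    near gravity and variance near zero. A basic stance-phase detector;
    the paper's BN-based inference is considerably more elaborate."""
    flags = []
    half = window // 2
    for i in range(len(accel_mag)):
        w = accel_mag[max(0, i - half): i + half + 1]
        stationary = (abs(statistics.fmean(w) - g) < mean_tol
                      and statistics.pvariance(w) < var_thresh)
        flags.append(stationary)
    return flags

# Stance samples hover near 9.81 m/s^2; swing samples vary wildly.
mags = [9.80, 9.81, 9.82, 9.81, 9.80, 12.0, 7.5, 11.0, 9.81, 9.80]
print(detect_zero_velocity(mags)[:5])  # stance flags at the start
```

During each flagged interval, a ZUPT-style EKF would then treat velocity as a zero pseudo-measurement to bound the inertial drift, as the abstract describes.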
A Short Tutorial on Inertial Navigation System and Global Positioning System Integration
NASA Technical Reports Server (NTRS)
Smalling, Kyle M.; Eure, Kenneth W.
2015-01-01
The purpose of this document is to describe a simple method of integrating Inertial Navigation System (INS) information with Global Positioning System (GPS) information for an improved estimate of vehicle attitude and position. A simple two-dimensional (2D) case is considered. The attitude estimates are derived from sensor data and used in the estimation of vehicle position and velocity through dead reckoning within the INS. The INS estimates are updated with GPS estimates using a Kalman filter. This tutorial is intended for the novice user, with a focus on bringing the reader from raw sensor measurements to an integrated position and attitude estimate. An application is given using a remotely controlled ground vehicle operating in an assumed 2D environment. The theory is developed first, followed by an illustrative example.
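The dead-reckoning-plus-GPS-update scheme in the tutorial can be sketched in 2D. Here a constant-gain position blend stands in for the tutorial's Kalman update, and all names and values are illustrative assumptions:

```python
import math

def dead_reckon_with_gps(steps, gps_fixes, gain=0.5):
    """2D dead reckoning (heading in radians, distance per step), with a
    simple constant-gain position blend whenever a GPS fix is available.
    The constant gain is a scalar stand-in for a Kalman gain."""
    x, y = 0.0, 0.0
    for i, (heading, dist) in enumerate(steps):
        x += dist * math.cos(heading)  # INS-style propagation
        y += dist * math.sin(heading)
        fix = gps_fixes.get(i)         # sparse GPS fixes keyed by step index
        if fix is not None:
            gx, gy = fix
            x += gain * (gx - x)       # pull the estimate toward the fix
            y += gain * (gy - y)
    return x, y

# Four 1 m steps due east; one GPS fix at step 3 corrects a lateral drift.
print(dead_reckon_with_gps([(0.0, 1.0)] * 4, {3: (4.0, 0.4)}))  # -> (4.0, 0.2)
```

A true Kalman filter would replace the fixed gain with one computed from the propagated error covariance and the GPS measurement noise, which is exactly the refinement the tutorial develops.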
Pushbroom Stereo for High-Speed Navigation in Cluttered Environments
2014-09-01
With inertial measurement sensors, implementations such as Achtelik et al.'s PTAM (parallel tracking and mapping) with a barometric altimeter have achieved stable flights in indoor and outdoor environments, and Li et al. have shown remarkable results with a full vision-aided inertial navigation system (VINS). Stereo systems suffer from a speed issue for obstacle avoidance on small UAVs, with most modern systems running at or below 30 Hz.
Fusion of Imaging and Inertial Sensors for Navigation
2006-09-01
The Global Positioning System (GPS) was fielded in the 1980s and first used for precision navigation and targeting in combat operations. The underlying estimation theory considers homogeneous nonlinear differential equations of the form ẋ(t) = f[x(t), u(t), t], x(t₀) = x₀, for a given input function u₀(t); the propagated state uncertainty is a time-varying probability density function, and the Kalman filter derivation assumes Gaussian distributions for all random variables.
Fusion of Low-Cost Imaging and Inertial Sensors for Navigation
2007-01-01
The work builds on an integrated GPS/MEMS inertial navigation package (ION GNSS 2004) and classical Kalman filtering references such as Brown and Hwang. An online (Extended Kalman Filter-based) method calculates a trajectory by tracking image features with no a priori knowledge; a known transformation effectively constrains the resulting correspondence search space, and the algorithm is incorporated into an extended Kalman filter.
A GPS Receiver for Lunar Missions
NASA Technical Reports Server (NTRS)
Bamford, William A.; Heckler, Gregory W.; Holt, Greg N.; Moreau, Michael C.
2008-01-01
Beginning with the launch of the Lunar Reconnaissance Orbiter (LRO) in October of 2008, NASA will once again begin its quest to land humans on the Moon. This effort will require the development of new spacecraft which will safely transport people from the Earth to the Moon and back again, as well as robotic probes tasked with science, re-supply, and communication duties. In addition to the next-generation spacecraft currently under construction, including the Orion capsule, NASA is also investigating and developing cutting-edge navigation sensors which will allow for autonomous state estimation in low Earth orbit (LEO) and cislunar space. Such instruments could provide an extra layer of redundancy in avionics systems and reduce reliance on ground support and on the Deep Space Network (DSN). One such sensor is the weak-signal Global Positioning System (GPS) receiver "Navigator" being developed at NASA's Goddard Space Flight Center (GSFC). At the heart of the Navigator is a Field Programmable Gate Array (FPGA) based acquisition engine. This engine allows for the rapid acquisition/reacquisition of strong GPS signals, enabling the receiver to quickly recover from outages due to blocked satellites or atmospheric entry. Additionally, the acquisition algorithm provides significantly lower sensitivities than a conventional space-based GPS receiver, permitting it to acquire satellites well above the GPS constellation. This paper assesses the performance of the Navigator receiver based upon three of the major flight regimes of a manned lunar mission: Earth ascent, cislunar navigation, and entry. Representative trajectories for each of these segments were provided by NASA. The Navigator receiver was connected to a Spirent GPS signal generator, to allow for the collection of real-time, hardware-in-the-loop results for each phase of the flight.
For each of the flight segments, the Navigator was tested on its ability to acquire and track GPS satellites under the dynamical environment unique to that trajectory.
Can low-cost VOR and Omega receivers suffice for RNAV - A new computer-based navigation technique
NASA Technical Reports Server (NTRS)
Hollaar, L. A.
1978-01-01
It is shown that although RNAV is particularly valuable for the personal transportation segment of general aviation, it has not gained complete acceptance, due in part to its high cost and the special handling it requires from air traffic control. VOR/DME RNAV calculations are ideally suited to analog computers, and the use of microprocessor technology has been suggested for reducing RNAV costs. Three navigation systems, VOR, Omega, and DR, are compared with respect to common navigational difficulties, such as station geometry, siting errors, ground disturbances, and terminal area coverage. The Kalman filtering technique is described with reference to its disadvantages in a system built on standard microprocessors. An integrated navigation system, using input data from various low-cost sensor systems, is presented, and current simulation studies are noted.
Navigation of robotic system using cricket motes
NASA Astrophysics Data System (ADS)
Patil, Yogendra J.; Baine, Nicholas A.; Rattan, Kuldip S.
2011-06-01
This paper presents a novel algorithm for self-mapping of cricket motes that can be used for indoor navigation of autonomous robotic systems. The cricket system is a wireless sensor network that provides an indoor localization service to its users via acoustic ranging techniques. The behavior of the ultrasonic transducer on the cricket mote is studied, and the regions where satisfactory distance measurements can be obtained are recorded; placing the motes in these regions results in fine-grained mapping of the cricket motes. Trilateration is used to obtain a rigid coordinate system, but is insufficient if the network is to be used for navigation. A modified SLAM algorithm is applied to overcome the shortcomings of trilateration. Finally, the self-mapped cricket motes can be used for navigation of autonomous robotic systems in an indoor location.
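Trilateration from three ranging beacons, as used above to seed the mote coordinate system, reduces to a small linear system once the first range equation is subtracted from the others. A minimal 2D sketch (an illustration of the standard technique, not the paper's code):

```python
def trilaterate(anchors, ranges):
    """2D trilateration from three beacons (e.g., ultrasonic ranges to
    cricket motes). Subtracting the first range equation from the other
    two cancels the quadratic terms, leaving a 2x2 linear system that is
    solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-12:
        raise ValueError("anchors are collinear; position is not unique")
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Beacons at three corners; exact ranges to the point (1, 2).
pos = trilaterate([(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)],
                  (5 ** 0.5, 13 ** 0.5, 5 ** 0.5))
print(pos)  # -> approximately (1.0, 2.0)
```

With noisy ranges or more than three beacons, the same linearization is typically solved in a least-squares sense, which is where the paper's SLAM refinement takes over.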
OM300 Direction Drilling Module
MacGugan, Doug
2013-08-22
OM300 Geothermal Directional Drilling Navigation Tool: design and produce a prototype directional drilling navigation tool capable of high-temperature operation in geothermal drilling, with accuracies of 0.1° in inclination and tool face and 0.5° in azimuth, and environmental ruggedness typical of existing oil/gas drilling. The tool provides multiple selectable sensor ranges (high accuracy with low bandwidth for navigation; high g-range and bandwidth for stick-slip and chirp detection) and selectable serial data communications, with the goal of reducing the cost of drilling in high-temperature geothermal reservoirs. Innovative aspects of the project include Honeywell MEMS Vibrating Beam Accelerometers (VBA), APS flux-gate magnetometers, Honeywell Silicon-On-Insulator (SOI) high-temperature electronics, and a rugged, high-temperature-capable package and assembly process.
Conference on Space and Military Applications of Automation and Robotics
NASA Technical Reports Server (NTRS)
1988-01-01
Topics addressed include: robotics; deployment strategies; artificial intelligence; expert systems; sensors and image processing; robotic systems; guidance, navigation, and control; aerospace and missile system manufacturing; and telerobotics.
NASA Astrophysics Data System (ADS)
Morris, Phillip A.
The prevalence of low-cost side-scanning sonar systems mounted on small recreational vessels has created improved opportunities to identify and map submerged navigational hazards in freshwater impoundments. However, these economical sensors also present unique challenges for automated techniques. This research explores related literature in automated sonar imagery processing and mapping technology, proposes and implements a framework derived from these sources, and evaluates the approach with video collected from a recreational-grade sonar system. Image analysis techniques, including optical character recognition and an unsupervised computer automated detection (CAD) algorithm, are employed to extract the transducer GPS coordinates and the slant-range distance of objects protruding from the lake bottom. The retrieved information is formatted for inclusion in a spatial mapping model. Specific attributes of the sonar sensors are modeled such that probability profiles may be projected onto a three-dimensional gridded map. These profiles are computed from multiple points of view as sonar traces crisscross or come near each other, and as lake levels fluctuate over time, so do the elevation points of view. With each sonar record, the probability of a hazard existing at certain elevations at the respective grid points is updated with Bayesian mechanics; as reinforcing data are collected, the confidence of the map improves. Given a lake's current elevation and a vessel draft, the final generated map can identify areas of the lake that have a high probability of containing hazards that threaten navigation. The approach is implemented in C/C++ using the OpenCV, Tesseract OCR, and QGIS open-source software and evaluated in a designated test area at Lake Lavon, Collin County, Texas.
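The Bayesian map update described above is commonly carried out in log-odds form, where each sonar observation simply adds evidence. A minimal single-cell sketch with assumed sensor hit/miss probabilities (the thesis's actual model is elevation-dependent and per-grid-point):

```python
import math

def logit(p):
    """Log-odds of a probability."""
    return math.log(p / (1.0 - p))

def update_cell(prior_p, observations, p_hit=0.8, p_miss=0.3):
    """Bayesian update of one grid cell's hazard probability from repeated
    sonar looks. In log-odds space each observation adds a constant term,
    which is the standard occupancy-grid recursion; p_hit and p_miss are
    assumed sensor characteristics, not values from the thesis."""
    l = logit(prior_p)
    for hit in observations:
        l += logit(p_hit if hit else p_miss)
    return 1.0 / (1.0 + math.exp(-l))  # back to probability

# Three reinforcing detections push a 50% prior toward certainty;
# three misses push it toward clear water.
print(update_cell(0.5, [True, True, True]))    # high hazard probability
print(update_cell(0.5, [False, False, False])) # low hazard probability
```

Because evidence adds, crisscrossing sonar traces from different headings and lake elevations reinforce (or retract) the same cell naturally, which is exactly the map-confidence behavior the abstract describes.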
Evaluation on the impact of IMU grades on BDS + GPS PPP/INS tightly coupled integration
NASA Astrophysics Data System (ADS)
Gao, Zhouzheng; Ge, Maorong; Shen, Wenbin; Li, You; Chen, Qijin; Zhang, Hongping; Niu, Xiaoji
2017-09-01
Unexpected observing environments in dynamic applications frequently lead to partial and/or complete satellite signal outages, which degrade the positioning performance of Precise Point Positioning (PPP) by decreasing the number of available satellites, breaking the continuity of observations, and reducing positioning accuracy. Generally, both the Inertial Navigation System (INS) and the multi-constellation Global Navigation Satellite System (GNSS) can be used to enhance the performance of PPP. This paper introduces the mathematical models of multi-GNSS PPP/INS Tightly Coupled Integration (TCI) and investigates its performance from several aspects. Specifically, it covers (1) the use of BDS/GPS PPP, PPP/INS, and their combination; (2) three positioning modes: PPP, PPP/INS TCI, and PPP/INS Loosely Coupled Integration (LCI); (3) four grades of inertial sensors: navigation grade, tactical grade, automotive grade, and Micro-Electro-Mechanical-Sensors (MEMS) grade; and (4) three PPP observation scenarios: fully available, partially available, and complete outage. According to the statistical results, (1) the positioning performance of the PPP/INS mode (either TCI or LCI) depends only weakly on the grade of the inertial sensor when there are enough available satellites; (2) after complete GNSS outages, the TCI mode achieves both faster convergence and more accurate positioning solutions than the LCI mode, and in the TCI mode a higher-grade inertial sensor is beneficial for PPP convergence; (3) under partial GNSS outages, the position divergence of the PPP/INS TCI mode is also significantly restrained; and (4) the attitude determination accuracy of the PPP/INS integration is highly correlated with the grade of the inertial sensor.
A comparison between different error modeling of MEMS applied to GPS/INS integrated systems.
Quinchia, Alex G; Falco, Gianluca; Falletti, Emanuela; Dovis, Fabio; Ferrer, Carles
2013-07-24
Advances in the development of micro-electromechanical systems (MEMS) have made possible the fabrication of cheap and small accelerometers and gyroscopes, which are used in many applications where global positioning system (GPS) and inertial navigation system (INS) integration is carried out, e.g., identifying track defects, terrestrial and pedestrian navigation, unmanned aerial vehicles (UAVs), and the stabilization of many platforms. Although these MEMS sensors are low-cost, they present various errors which degrade the accuracy of the navigation system over a short period of time, so suitable modeling of these errors is necessary in order to minimize them and improve system performance. In this work, the techniques currently most used to analyze the stochastic errors affecting these sensors are presented and compared: we examine in detail the autocorrelation, Allan variance (AV) and power spectral density (PSD) techniques. Subsequently, an analysis and modeling of the inertial sensors combining autoregressive (AR) filters and wavelet de-noising is carried out. Since a low-cost (MEMS-grade) INS presents error sources with short-term (high-frequency) and long-term (low-frequency) components, we introduce a method that compensates for these error terms through a complete analysis of Allan variance, wavelet de-noising and the selection of the decomposition level for a suitable combination of these techniques. Finally, to assess the stochastic models obtained with these techniques, the Extended Kalman Filter (EKF) of a loosely-coupled GPS/INS integration strategy is augmented with different states. Results show a comparison between the proposed method and the traditional sensor error models under GPS signal blockages, using real data collected on urban roadways.
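The Allan variance analysis mentioned above is computed from cluster averages of the raw sensor stream. A minimal non-overlapping sketch at a single cluster size (the paper also considers overlapping estimators alongside PSD and autocorrelation analyses):

```python
def allan_variance(samples, m):
    """Non-overlapping Allan variance at cluster size m for equally spaced
    sensor samples (e.g., gyro rates): split the stream into clusters of m
    samples, average each cluster, and take half the mean squared
    difference of successive cluster averages."""
    n = len(samples) // m
    if n < 2:
        raise ValueError("need at least two clusters of size m")
    means = [sum(samples[i * m:(i + 1) * m]) / m for i in range(n)]
    diffs = [(b - a) ** 2 for a, b in zip(means, means[1:])]
    return 0.5 * sum(diffs) / len(diffs)

# A constant signal has zero Allan variance; an alternating block signal
# whose cluster averages flip between 0 and 2 yields 0.5 * 2^2 = 2.
print(allan_variance([1.0] * 8, 2))                          # -> 0.0
print(allan_variance([0.0, 0.0, 2.0, 2.0, 0.0, 0.0, 2.0, 2.0], 2))  # -> 2.0
```

Sweeping m over powers of two and plotting the Allan deviation (the square root) on a log-log axis gives the familiar curve whose slopes identify noise terms such as angle random walk and bias instability.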
NASA Technical Reports Server (NTRS)
Teles, Jerome (Editor); Samii, Mina V. (Editor)
1993-01-01
A conference on spaceflight dynamics produced papers in the areas of orbit determination, spacecraft tracking, autonomous navigation, the Deep Space Program Science Experiment Mission (DSPSE), the Global Positioning System, attitude control, geostationary satellites, interplanetary missions and trajectories, applications of estimation theory, flight dynamics systems, low-Earth orbit missions, orbital mechanics, mission experience in attitude dynamics, mission experience in sensor studies, attitude dynamics theory and simulations, and orbit-related experience. These papers covered NASA, European, Russian, Japanese, Chinese, and Brazilian space programs and hardware.
Development of Flight-Test Performance Estimation Techniques for Small Unmanned Aerial Systems
NASA Astrophysics Data System (ADS)
McCrink, Matthew Henry
This dissertation provides a flight-testing framework for assessing the performance of fixed-wing, small-scale unmanned aerial systems (sUAS) by leveraging sub-system models of components unique to these vehicles. The development of the sub-system models, and their links to broader impacts on sUAS performance, is the key contribution of this work. The sub-system modeling and analysis focuses on the vehicle's propulsion, navigation and guidance, and airframe components. Quantification of the uncertainty in the vehicle's power available and control states is essential for assessing the validity of both the methods and results obtained from flight-tests. Therefore, detailed propulsion and navigation system analyses are presented to validate the flight testing methodology. Propulsion system analysis required the development of an analytic model of the propeller in order to predict the power available over a range of flight conditions. The model is based on the blade element momentum (BEM) method. Additional corrections are added to the basic model in order to capture the Reynolds-dependent scale effects unique to sUAS. The model was experimentally validated using a ground based testing apparatus. The BEM predictions and experimental analysis allow for a parameterized model relating the electrical power, measurable during flight, to the power available required for vehicle performance analysis. Navigation system details are presented with a specific focus on the sensors used for state estimation, and the resulting uncertainty in vehicle state. Uncertainty quantification is provided by detailed calibration techniques validated using quasi-static and hardware-in-the-loop (HIL) ground based testing. The HIL methods introduced use a soft real-time flight simulator to provide inertial quality data for assessing overall system performance. 
Using this tool, the uncertainty in vehicle state estimation based on a range of sensors and vehicle operational environments is presented. The propulsion and navigation system models are then used to assess flight-test methods for evaluating fixed-wing sUAS performance. A brief airframe analysis is presented to provide a foundation for assessing the efficacy of the flight-test methods. The flight-testing presented in this work is focused on validating the aircraft drag polar, zero-lift drag coefficient, and span efficiency factor. Three methods are detailed and evaluated for estimating these design parameters. Specific focus is placed on the influence of propulsion and navigation system uncertainty on the resulting performance data. Performance estimates are used in conjunction with the propulsion model to estimate the impact of sensor and measurement uncertainty on the endurance and range of a fixed-wing sUAS. Endurance and range results for a simplistic power available model are compared to the Reynolds-dependent model presented in this work. Additional parameter-sensitivity analyses related to state-estimation uncertainties encountered in flight-testing are presented. Results from these analyses indicate that the sub-system models introduced in this work are of first-order importance, on the order of a 5-10% change in range and endurance, in assessing the performance of a fixed-wing sUAS.
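The way power-model uncertainty propagates into endurance can be illustrated with a first-order sketch for a constant-power electric cruise. This is not the dissertation's model (which is Reynolds-dependent); all names and numbers here are illustrative assumptions:

```python
def endurance_hours(battery_Wh, eta_total, p_required_W):
    """First-order endurance for constant power draw: usable battery energy
    times total drivetrain efficiency, divided by power required."""
    return battery_Wh * eta_total / p_required_W

def endurance_sensitivity(battery_Wh, eta_total, p_required_W, p_error_frac):
    """Fractional endurance change caused by a fractional error in the
    power-required estimate (e.g. from propulsion-model uncertainty)."""
    nominal = endurance_hours(battery_Wh, eta_total, p_required_W)
    perturbed = endurance_hours(battery_Wh, eta_total,
                                p_required_W * (1.0 + p_error_frac))
    return (perturbed - nominal) / nominal
```

Under this simple model, a +10% error in power required produces roughly a -9% endurance change (1/1.1 - 1), the same order as the 5-10% figures quoted in the abstract.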
NASA Astrophysics Data System (ADS)
Gottwald, Martin; Mayekar, Kavita; Reiswich, Vladislav; Bousack, Herbert; Damalla, Deepak; Biswas, Shubham; Metzen, Michael G.; von der Emde, Gerhard
2011-04-01
During their nocturnal activity period, weakly electric fish employ a process called "active electrolocation" for navigation and object detection. They discharge an electric organ in their tail, which emits electrical current pulses called electric organ discharges (EOD). Local EODs are sensed by arrays of electroreceptors in the fish's skin, which respond to modulations of the signal caused by nearby objects. Fish thus gain information about the size, shape, complex impedance, and distance of objects. Inspired by these remarkable capabilities, we have designed technical sensor systems which employ active electrolocation to detect and analyse the walls of small, fluid-filled pipes. Our sensor systems emit pulsed electrical signals into the conducting medium and simultaneously sense local current densities with an array of electrodes. Sensors can be designed which (i) analyse the tube wall, (ii) detect and localize material faults, (iii) identify wall inclusions or objects blocking the tube, and (iv) find leakages. Here, we present first experiments and FEM simulations on the optimal sensor arrangement for different types of sensor systems and different types of tubes. In addition, different methods for sensor read-out and signal processing are compared. Our biomimetic sensor systems promise to be relatively insensitive to environmental disturbances such as heat, pressure, turbidity or muddiness. They could be used in a wide range of tubes and pipes including water pipes, hydraulic systems, and biological systems. Medical applications include catheter-based sensors which inspect blood vessels, urethras and similar ducts in the human body.
Image-guided navigation surgery for pelvic malignancies using electromagnetic tracking
NASA Astrophysics Data System (ADS)
Nijkamp, Jasper; Kuhlmann, Koert; Sonke, Jan-Jakob; Ruers, Theo
2016-03-01
The purpose of this study was to implement and evaluate a surgical navigation system for pelvic malignancies. For tracking, an NDI Aurora tabletop field generator and in-house developed navigation software were used. For patient tracking, three EM-sensor stickers were used, one on the back and two on the superior iliac spines. During surgery a trackable pointer was used. One day before surgery a CT scan was acquired with the stickers in place and marked. From the CT scan the EM-sensors, tumor and normal structures were segmented. During surgery, accuracy was independently checked by pointing at the aorta bifurcation and the common iliac artery bifurcations. Subsequently, the system was used to localize the ureters and the tumor. Seven patients were included: three rectal tumors with lymph node involvement, three lymph node recurrences, and one rectal recurrence. The average external marker registration accuracy was 0.75 cm RMSE (range 0.31-1.58 cm). The average distance between the pointer and the arterial bifurcations was 1.55 cm (1SD=0.63 cm). We were able to localize and confirm the location of all ureters. Twelve out of thirteen lymph nodes were localized and removed. All tumors were removed radically. In all cases the surgeons indicated that the system aided in better anatomical insight and faster localization of malignant tissue and ureters. In 2/7 cases surgeons indicated that radical resection was only possible with navigation. The navigation accuracy was limited due to the use of skin markers. Nevertheless, preliminary results indicated potential clinical benefit due to better utilization of pre-treatment 3D imaging information.
A Concept for Optimizing Behavioural Effectiveness & Efficiency
NASA Astrophysics Data System (ADS)
Barca, Jan Carlo; Rumantir, Grace; Li, Raymond
Both humans and machines exhibit strengths and weaknesses that can be enhanced by merging the two entities. This research aims to provide a broader understanding of how closer interactions between these two entities can facilitate more optimal goal-directed performance through the use of artificial extensions of the human body. Such extensions may assist us in adapting to and manipulating our environments in a more effective way than any system known today. To demonstrate this concept, we have developed a simulation in which a semi-interactive virtual spider can be navigated through an environment consisting of several obstacles and a virtual predator capable of killing the spider. The virtual spider can be navigated through three different control systems that can be used to assist in optimising overall goal-directed performance. The first two control systems use an onscreen button interface and a touch sensor, respectively, to facilitate human navigation of the spider. The third is an autonomous navigation system that uses machine intelligence embedded in the spider, enabling it to navigate and react to changes in its local environment. The results of this study indicate that machines should be allowed to override human control in order to maximise the benefits of collaboration between man and machine. This research further indicates that the development of strong machine intelligence, sensor systems that engage all human senses, extra-sensory input systems, physical remote manipulators, multiple intelligent extensions of the human body, as well as a tighter symbiosis between man and machine, can support an upgrade of the human form.
Zabaleta, Haritz; Valencia, David; Perry, Joel; Veneman, Jan; Keller, Thierry
2011-01-01
ArmAssist is a wireless robot for post-stroke upper limb rehabilitation. Knowing the position of the arm is essential for any rehabilitation device. In this paper, we describe a method based on an artificial-landmark navigation system. The navigation system uses three optical mouse sensors, which enables the building of a cheap but reliable position sensor. Two of the sensors are the data source for odometry calculations, and the third optical mouse sensor takes very low resolution pictures of a custom-designed mat. These pictures are processed by an optical symbol recognition algorithm which estimates the orientation of the robot and recognizes the landmarks placed on the mat. A data-fusion strategy is described that detects misclassifications of the landmarks so that only reliable information is fused. The orientation given by the optical symbol recognition (OSR) algorithm is used to significantly improve the odometry, and the recognition of the landmarks is used to reference the odometry to an absolute coordinate system. The system was tested using a 3D motion capture system. With the current mat configuration, in a field of motion of 710 × 450 mm, the maximum error in position estimation was 49.61 mm with an average error of 36.70 ± 22.50 mm. The average test duration was 36.5 seconds and the average path length was 4173 mm.
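The two odometry mice can be fused into a planar motion estimate along the following lines. This is a small-angle sketch under an assumed sensor geometry (both sensors mounted along the body x-axis); it is not taken from the paper:

```python
import math

def two_mouse_motion(d_left, d_right, baseline):
    """Planar rigid-body increment from two optical-mouse readings.

    d_left, d_right: per-frame (dx, dy) displacements of two sensors
    mounted `baseline` apart along the body x-axis.
    Small-rotation approximation: each sensor sees the common translation
    plus a rotation-induced term proportional to its offset.
    """
    dtheta = (d_right[1] - d_left[1]) / baseline   # differential y-motion => rotation
    tx = 0.5 * (d_left[0] + d_right[0])            # translation = mean displacement
    ty = 0.5 * (d_left[1] + d_right[1])
    return tx, ty, dtheta

def integrate_pose(pose, motion):
    """Dead-reckon a world-frame pose (x, y, theta) with a body-frame increment."""
    x, y, th = pose
    tx, ty, dth = motion
    return (x + tx * math.cos(th) - ty * math.sin(th),
            y + tx * math.sin(th) + ty * math.cos(th),
            th + dth)
```

The OSR heading described in the abstract would then overwrite the accumulated theta whenever a landmark is read reliably, bounding the odometric drift.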
Optimal rotation sequences for active perception
NASA Astrophysics Data System (ADS)
Nakath, David; Rachuy, Carsten; Clemens, Joachim; Schill, Kerstin
2016-05-01
One major objective of autonomous systems navigating in dynamic environments is gathering information needed for self-localization, decision making, and path planning. To account for this, such systems are usually equipped with multiple types of sensors. As these sensors often have a limited field of view and a fixed orientation, the task of active perception breaks down to the problem of calculating alignment sequences which maximize the information gain regarding expected measurements. Action sequences that rotate the system according to the calculated optimal patterns then have to be generated. In this paper we present an approach for calculating these sequences for an autonomous system equipped with multiple sensors. We use a particle filter for multi-sensor fusion and state estimation. The planning task is modeled as a Markov decision process (MDP), where the system decides in each step which actions to perform next. The optimal control policy, which provides the best action depending on the current estimated state, maximizes the expected cumulative reward. The latter is computed from the expected information gain of all sensors over time using value iteration. The algorithm is applied to a manifold representation of the joint space of rotation and time. We show the performance of the approach in a spacecraft navigation scenario where the information gain is changing over time, caused by the dynamic environment and the continuous movement of the spacecraft.
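The value-iteration step underlying this kind of planning can be sketched in tabular form. The paper's manifold state space and information-gain reward are abstracted here into plain arrays; this is a generic sketch, not the authors' implementation:

```python
import numpy as np

def value_iteration(P, R, gamma=0.95, tol=1e-6):
    """Tabular value iteration for an MDP.

    P: transition probabilities, shape (n_actions, n_states, n_states), P[a, s, s'].
    R: expected reward per (action, state) -- e.g. expected information gain --
       shape (n_actions, n_states).
    Returns the optimal value function and the greedy policy.
    """
    n_actions, n_states, _ = P.shape
    V = np.zeros(n_states)
    while True:
        Q = R + gamma * (P @ V)        # Q[a, s] = R[a, s] + gamma * E[V(s') | a, s]
        V_new = Q.max(axis=0)          # Bellman optimality backup
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)
        V = V_new
```

The returned policy maps each estimated state to the rotation action with the highest expected cumulative reward, which is exactly the role the control policy plays in the abstract.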
NASA Technical Reports Server (NTRS)
Barbee, Brent William; Carpenter, J. Russell; Heatwole, Scott; Markley, F. Landis; Moreau, Michael; Naasz, Bo J.; VanEepoel, John
2010-01-01
The feasibility and benefits of various spacecraft servicing concepts are currently being assessed, and all require that the servicer spacecraft perform rendezvous, proximity, and capture operations with the target spacecraft to be serviced. Many high-value spacecraft, which would be logical targets for servicing from an economic point of view, are located in geosynchronous orbit, a regime in which autonomous rendezvous and capture operations are not commonplace. Furthermore, existing GEO spacecraft were not designed to be serviced. Most do not have cooperative relative navigation sensors or docking features, and some servicing applications, such as de-orbiting of a non-functional spacecraft, entail rendezvous and capture with a spacecraft that may be non-functional or un-controlled. Several of these challenges have been explored via the design of a notional mission in which a nonfunctional satellite in geosynchronous orbit is captured by a servicer spacecraft and boosted into super-synchronous orbit for safe disposal. A strategy for autonomous rendezvous, proximity operations, and capture is developed, and the Orbit Determination Toolbox (ODTBX) is used to perform a relative navigation simulation to assess the feasibility of performing the rendezvous using a combination of angles-only and range measurements. Additionally, a method for designing efficient orbital rendezvous sequences for multiple target spacecraft is utilized to examine the capabilities of a servicer spacecraft to service multiple targets during the course of a single mission.
Transmission of linearly polarized light in seawater: implications for polarization signaling.
Shashar, Nadav; Sabbah, Shai; Cronin, Thomas W
2004-09-01
Partially linearly polarized light is abundant in the oceans. The natural light field is partially polarized throughout the photic range, and some objects and animals produce a polarization pattern of their own. Many polarization-sensitive marine animals take advantage of the polarization information, using it for tasks ranging from navigation and finding food to communication. In such tasks, the distance to which the polarization information propagates is of great importance. Using newly designed polarization sensors, we measured the changes in linear polarization underwater as a function of distance from a standard target. In the relatively clear waters surrounding coral reefs, partial (%) polarization decreased exponentially as a function of distance from the target, resulting in a 50% reduction of partial polarization at a distance of 1.25-3 m, depending on water quality. Based on these measurements, we predict that polarization sensitivity will be most useful for short-range (in the order of meters) visual tasks in water and less so for detecting objects, signals, or structures from far away. Navigation and body orientation based on the celestial polarization pattern are predicted to be limited to shallow waters as well, while navigation based on the solar position is possible through a deeper range.
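The exponential fall-off reported above implies a simple distance constant. A sketch under the stated model p(d) = p0·exp(−d/λ), with the 50%-reduction distance taken from the measurements (function names are illustrative):

```python
import math

def distance_constant(d_half):
    """Distance constant lambda of p(d) = p0 * exp(-d / lam),
    given the distance at which partial polarization halves."""
    return d_half / math.log(2)

def partial_polarization(p0, d, lam):
    """Partial (%) polarization remaining at distance d from the target."""
    return p0 * math.exp(-d / lam)
```

With the measured half-distances of 1.25-3 m, this model puts the polarization signal at 10 m below roughly 10% of its value at the target even in the clearest case, consistent with the paper's prediction that polarization vision is most useful over meter-scale ranges.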
Semimajor Axis Estimation Strategies
NASA Technical Reports Server (NTRS)
How, Jonathan P.; Alfriend, Kyle T.; Breger, Louis; Mitchell, Megan
2004-01-01
This paper extends previous analysis on the impact of sensing noise for the navigation and control aspects of formation flying spacecraft. We analyze the use of Carrier-phase Differential GPS (CDGPS) in relative navigation filters, with a particular focus on the filter correlation coefficient. This work was motivated by previous publications which suggested that a "good" navigation filter would have a strong correlation (i.e., coefficient near -1) to reduce the semimajor axis (SMA) error, and therefore, the overall fuel use. However, practical experience with CDGPS-based filters has shown this strong correlation seldom occurs (typical correlations approx. -0.1), even when the estimation accuracies are very good. We derive an analytic estimate of the filter correlation coefficient and demonstrate that, for the process and sensor noises levels expected with CDGPS, the expected value will be very low. It is also demonstrated that this correlation can be improved by increasing the time step of the discrete Kalman filter, but since the balance condition is not satisfied, the SMA error also increases. These observations are verified with several linear simulations. The combination of these simulations and analysis provide new insights on the crucial role of the process noise in determining the semimajor axis knowledge.
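The coupling between radial position error, velocity error, their correlation, and semimajor-axis error can be written down directly from the vis-viva equation. This is a hedged, linearized near-circular sketch consistent with the qualitative claim above, not the paper's exact derivation:

```python
import math

MU_EARTH = 3.986004418e14  # Earth gravitational parameter, m^3 / s^2

def sma_error_std(a, r, v, sigma_r, sigma_v, rho):
    """1-sigma semimajor-axis error, linearized from vis-viva:
        a = (2/r - v^2/mu)^(-1)  =>  da = 2 a^2 (dr / r^2 + v dv / mu).
    rho is the radial position/velocity error correlation; rho near -1
    cancels error in the cross term, while rho near 0 leaves it uncancelled.
    """
    var = 4.0 * a**4 * (sigma_r**2 / r**4
                        + (v * sigma_v / MU_EARTH)**2
                        + 2.0 * rho * v * sigma_r * sigma_v / (r**2 * MU_EARTH))
    return math.sqrt(var)
```

Evaluating this for representative LEO numbers shows why a correlation near -1 was considered desirable: the negative cross term partially cancels the position and velocity contributions to SMA error.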
Non-destructive inspection in industrial equipment using robotic mobile manipulation
NASA Astrophysics Data System (ADS)
Maurtua, Iñaki; Susperregi, Loreto; Ansuategui, Ander; Fernández, Ane; Ibarguren, Aitor; Molina, Jorge; Tubio, Carlos; Villasante, Cristobal; Felsch, Torsten; Pérez, Carmen; Rodriguez, Jorge R.; Ghrissi, Meftah
2016-05-01
The MAINBOT project has developed service-robot-based applications to autonomously execute inspection tasks in extensive industrial plants, on equipment that is arranged horizontally (using ground robots) or vertically (using climbing robots). The industrial objective has been to provide a means to help measure several physical parameters at multiple points by autonomous robots, able to navigate and climb structures while handling non-destructive testing sensors. MAINBOT has validated the solutions in two solar thermal plants (cylindrical-parabolic collectors and central tower), which are very demanding from a mobile-manipulation point of view, mainly due to their extension (e.g. a 50 MW thermal solar plant with 400 hectares, 400,000 mirrors, 180 km of absorber tubes, and a 140 m tall tower), the variability of conditions (outdoor, day-night), safety requirements, etc. Once the technology was validated in simulation, the system was deployed in real setups and different validation tests were carried out. In this paper, two of the achievements related to the ground mobile inspection system are presented: (1) autonomous navigation, with localization and planning algorithms to manage navigation over huge extensions, and (2) non-destructive inspection operations, with thermography-based detection algorithms that provide automatic inspection abilities to the robots.
Earth orbit navigation study. Volume 2: System evaluation
NASA Technical Reports Server (NTRS)
1972-01-01
An overall systems evaluation was made of five candidate navigation systems in support of earth orbit missions. The five systems were the horizon sensor system, unknown landmark tracking system, ground transponder system, manned space flight network, and tracking and data relay satellite system. Two reference missions were chosen: a low earth orbit mission and a transfer trajectory mission from low earth orbit to geosynchronous orbit. The specific areas addressed in the evaluation were performance, multifunction utilization, system mechanization, and cost.
On-Board Perception System For Planetary Aerobot Balloon Navigation
NASA Technical Reports Server (NTRS)
Balaram, J.; Scheid, Robert E.; Salomon, Phil T.
1996-01-01
NASA's Jet Propulsion Laboratory is implementing the Planetary Aerobot Testbed to develop the technology needed to operate a robotic balloon aero-vehicle (Aerobot). This earth-based system would be the precursor for aerobots designed to explore Venus, Mars, Titan and other gaseous planetary bodies. The on-board perception system allows the aerobot to localize itself and navigate on a planet using information derived from a variety of celestial, inertial, ground-imaging, ranging, and radiometric sensors.
Precision Positioning and Inertial Guidance Sensors. Technology and Operational Aspects
1981-03-01
Proceedings, Ueberlingen, Germany. Includes "Evaluation of a European Hybrid Laser-Gyro Navigation System for Helicopters: SEXTAN" by D. Regnault, Centre d'Essais en Vol. Theme: a new class of precision positioning systems, including GPS (Global Positioning System).
Systematic methods for knowledge acquisition and expert system development
NASA Technical Reports Server (NTRS)
Belkin, Brenda L.; Stengel, Robert F.
1991-01-01
Nine cooperating rule-based systems, collectively called AUTOCREW, which were designed to automate functions and decisions associated with a combat aircraft's subsystems, are discussed. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base and to assess the cooperation between the rule bases. Simulation and comparative workload results for two mission scenarios are given: an inbound surface-to-air missile attack on the aircraft and pilot incapacitation. The methodology used to develop the AUTOCREW knowledge bases is summarized. Issues involved in designing the navigation sensor selection expert in AUTOCREW's NAVIGATOR knowledge base are discussed in detail. The performance of seven navigation systems aiding a medium-accuracy INS was investigated using Kalman filter covariance analyses. A navigation sensor management (NSM) expert system was formulated from covariance simulation data using the analysis of variance (ANOVA) method and the ID3 algorithm. ANOVA results show that statistically different position accuracies are obtained when different navaids are used, the number of navaids aiding the INS is varied, the aircraft's trajectory is varied, and the performance history is varied. The ID3 algorithm determines the NSM expert's classification rules in the form of decision trees. The performance of these decision trees was assessed on two arbitrary trajectories, and the results demonstrate that the NSM expert adapts to new situations and provides reasonable estimates of the expected hybrid performance.
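The ID3 step mentioned above ranks candidate split attributes by information gain. A minimal sketch of that criterion follows; the data layout (rows of attribute values plus a parallel label list) is an assumption for illustration, not the paper's format:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """ID3 gain of splitting the examples on attribute index `attr`:
    H(labels) minus the size-weighted entropy of each resulting subset."""
    subsets = {}
    for row, label in zip(rows, labels):
        subsets.setdefault(row[attr], []).append(label)
    n = len(labels)
    remainder = sum(len(s) / n * entropy(s) for s in subsets.values())
    return entropy(labels) - remainder
```

ID3 builds the decision tree by picking, at each node, the attribute with the largest information gain and recursing on the resulting subsets, which is how the NSM expert's classification rules take the form of decision trees.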
Research and technology: Fiscal year 1984 report
NASA Technical Reports Server (NTRS)
1985-01-01
Topics covered include extraterrestrial physics, high energy astrophysics, astronomy, solar physics, atmospheres, oceans, terrestrial physics, space technology, sensors, techniques, user space data systems, space communications and navigation, and system and software engineering.