Monday, November 18, 2024, 2:00 p.m.

Ph.D. Defense

 

Sébastien Henry

(Advisor: Prof. John A. Christian)

 

"Absolute and Autonomous Spacecraft Navigation Using Line-of-Sight Measurements" 
 


Monday, November 18 

2:00 p.m. 
CODA C1015 Vinings

Microsoft Teams

 

Abstract
Increasing activity in the space domain drives the need for better navigation solutions. Optical navigation (OPNAV) is particularly attractive because it allows a spacecraft to navigate autonomously using beacons such as stars, planets, moons, asteroids, and other spacecraft. However, technical hurdles remain for autonomous OPNAV. Ever more stringent mission requirements call for efficient OPNAV solutions that provide the best possible state estimation. The need for efficient algorithms is further motivated by increasing camera resolutions and by new image processing pipelines that extract ever more measurements. Finally, autonomous navigation pipelines need recovery procedures in case of filter failure, or simply as a redundancy check. This work specifically treats the case of line-of-sight (LOS) measurements and focuses on four contributions.

One of the most fundamental algorithms in computer vision is triangulation. Triangulation serves a dual purpose: reconstructing shapes from multiple pictures and estimating the position of a camera observing known features. The first contribution reviews the state of the art in triangulation and develops a complete non-iterative framework for an uncertainty-aware, statistically optimal solution: linear optimal sine triangulation (LOST). LOST gives results similar to those of iterative schemes, but at a fraction of the computational cost.
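For illustration, the sketch below shows a generic, unweighted linear triangulation from line-of-sight unit vectors based on the same cross-product (sine) constraint; the LOST solution additionally applies the statistically optimal weighting described above. The function names and the NumPy implementation are illustrative, not the code developed in the thesis.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def triangulate_linear(observer_positions, los_unit_vectors):
    """Unweighted linear triangulation from line-of-sight (LOS) measurements.

    Each LOS unit vector a_i seen from position p_i gives the constraint
    [a_i]_x (r - p_i) = 0. Stacking all constraints yields an overdetermined
    linear system A r = b, solved here in the least-squares sense.
    """
    A_blocks, b_blocks = [], []
    for p, a in zip(observer_positions, los_unit_vectors):
        S = skew(a / np.linalg.norm(a))
        A_blocks.append(S)
        b_blocks.append(S @ p)
    A = np.vstack(A_blocks)
    b = np.concatenate(b_blocks)
    r, *_ = np.linalg.lstsq(A, b, rcond=None)
    return r

# Quick check with two observers and a target at [2, 1, 5]
target = np.array([2.0, 1.0, 5.0])
positions = [np.zeros(3), np.array([4.0, 0.0, 0.0])]
los = [(target - p) / np.linalg.norm(target - p) for p in positions]
print(triangulate_linear(positions, los))  # ~[2. 1. 5.]
```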

The second contribution applies LOST to space exploration. More specifically, effects such as relativity, planetary uncertainty, and light time-of-flight are important in celestial navigation; LOST is used to analyze the impact of these effects and seamlessly integrates them into the navigation solution. LOST is also used for planet identification in the image in the case of a full lost-in-space problem. Finally, LOST is adapted to the pushbroom camera model.
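As one example of such a correction, light time-of-flight is commonly handled by iterating on the emission epoch of the observed beacon. The sketch below is a generic illustration of that iteration, not the treatment developed in the thesis; beacon_ephemeris is a hypothetical callable, and the observer position is taken as the current navigation estimate.

```python
import numpy as np

C_KM_S = 299_792.458  # speed of light [km/s]

def light_time_corrected_position(t_obs, observer_pos_km, beacon_ephemeris,
                                  tol=1e-9, max_iter=10):
    """Iterate the light time-of-flight correction.

    beacon_ephemeris(t) is a hypothetical callable returning the beacon's
    inertial position [km] at time t [s]. The observed direction corresponds
    to the beacon's position at the earlier emission epoch t_obs - tau.
    """
    tau = 0.0
    for _ in range(max_iter):
        beacon_pos = beacon_ephemeris(t_obs - tau)
        new_tau = np.linalg.norm(beacon_pos - observer_pos_km) / C_KM_S
        if abs(new_tau - tau) < tol:
            break
        tau = new_tau
    return beacon_ephemeris(t_obs - tau), tau
```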
 

The third contribution considers a problem adjacent to triangulation: the perspective-n-point (PnP) problem. Unlike triangulation, the PnP problem also requires estimating the camera rotation in addition to its position, and it is highly relevant to robot navigation. The linear optimal sine framework is extended to solve this harder problem and demonstrates faster performance than other optimal methods.
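As a point of reference, the classical Direct Linear Transform (DLT) gives a simple non-iterative PnP solution from n >= 6 correspondences. The sketch below is that generic baseline, assuming normalized image coordinates; it is not the optimal linear-sine formulation developed in the thesis.

```python
import numpy as np

def pnp_dlt(points_world, points_image):
    """Generic DLT solution of the perspective-n-point (PnP) problem.

    points_world: (n, 3) known 3D feature positions, n >= 6.
    points_image: (n, 2) matching normalized image coordinates
                  (pixel coordinates premultiplied by the inverse camera matrix).
    Returns R, t such that each feature projects as x ~ R X + t.
    """
    A = []
    for (X, Y, Z), (u, v) in zip(points_world, points_image):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(A)

    # The projection matrix (up to scale) is the right singular vector
    # associated with the smallest singular value.
    P = np.linalg.svd(A)[2][-1].reshape(3, 4)

    # Fix the overall sign so that the first feature has positive depth.
    if (P @ np.append(points_world[0], 1.0))[2] < 0:
        P = -P

    # Project the left 3x3 block onto the rotation group and recover t.
    U, S, Vt = np.linalg.svd(P[:, :3])
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    t = P[:, 3] / S.mean()
    return R, t
```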

Finally, the fourth contribution tackles the use of LOS measurements for formation flying initial orbit determination (IOD). Here, the LOS measurements are augmented with range knowledge obtained through inter-satellite communication. The developed solution recovers the inertial orbits of multiple spacecraft, which can then be used to initialize navigation filters.
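A minimal sketch of the measurement combination, under assumed values: the crosslink range and the inertially resolved LOS unit vector together fix the relative position vector between two spacecraft, which is the quantity an orbit determination scheme or navigation filter can ingest. The numbers below are illustrative only.

```python
import numpy as np

# Assumed measurements: LOS unit vector to the other spacecraft, already
# rotated into the inertial frame using the known camera attitude, plus
# the range obtained from the inter-satellite communication link.
los_inertial = np.array([0.6, 0.48, 0.64])   # illustrative direction
los_inertial /= np.linalg.norm(los_inertial)
crosslink_range_km = 12.5                     # illustrative range

# Together they fix the relative position vector between the spacecraft,
# which can then feed the IOD solution or initialize a navigation filter.
relative_position_km = crosslink_range_km * los_inertial
print(relative_position_km)
```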
 

Committee

•    Prof. John A. Christian – School of Aerospace Engineering (advisor)
•    Dr. Adnan Ansar – Aerial and Orbital Image Analysis Group (Jet Propulsion Laboratory)
•    Prof. Frank Dellaert – School of Interactive Computing (Georgia Tech)
•    Prof. Brian C. Gunter – School of Aerospace Engineering (Georgia Tech)
•    Prof. Edgar G. Lightsey – School of Aerospace Engineering (Georgia Tech)