AAU Student Projects
A master's thesis from Aalborg University


Mobile Robot Localization Under Intermittent GNSS Service Using Visual Support

Author

Term

4th semester

Education

Publication year

2025

Submitted on

Abstract

Outdoor mobile robots, such as sports field‑marking robots, rely on GNSS and cameras for accurate navigation and line detection, yet intermittent GNSS and degraded imagery due to weather or lighting can cause significant pose errors. This thesis investigates how to maintain reliable localization under intermittent GNSS by leveraging visual support and complementary sensors. The approach employs a perception pipeline that uses traditional methods (notably the Hough transform) to detect lines behind the robot and reference them against a known map, alongside wheel odometry and IMU measurements. These data are fused with GNSS (when available) using an Extended Kalman Filter. The method is implemented in a ROS2/Gazebo simulation and evaluated under nominal and degraded conditions with and without sensor faults. The results indicate that sensor fusion with visual support improves system robustness and stabilizes pose estimation when GNSS is unstable or partially unavailable, which is particularly relevant for mobile field‑ and line‑marking robots.
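The fusion step described above can be illustrated with a minimal Extended Kalman Filter sketch for a planar pose [x, y, θ]: odometry drives the prediction, and a position measurement (GNSS when available, or a map-referenced line fix otherwise) drives the update. This is a simplified illustration under assumed unicycle kinematics and illustrative noise values, not the thesis's actual implementation.

```python
import numpy as np

def ekf_predict(x, P, u, dt, Q):
    """Propagate pose [x, y, theta] with unicycle odometry u = (v, omega)."""
    v, w = u
    theta = x[2]
    # Nonlinear motion model
    x_pred = x + np.array([v * np.cos(theta) * dt,
                           v * np.sin(theta) * dt,
                           w * dt])
    # Jacobian of the motion model w.r.t. the state
    F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                  [0.0, 1.0,  v * np.cos(theta) * dt],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update(x, P, z, H, R):
    """Measurement update; z may be a GNSS fix or a line-derived position."""
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_upd = x + K @ y
    P_upd = (np.eye(len(x)) - K @ H) @ P
    return x_upd, P_upd

if __name__ == "__main__":
    x, P = np.zeros(3), np.eye(3) * 0.1
    Q = np.eye(3) * 1e-3
    # Predict with odometry: 1 m/s forward for 0.1 s
    x, P = ekf_predict(x, P, (1.0, 0.0), 0.1, Q)
    # GNSS available: correct (x, y) with a noisy position fix
    H_gnss = np.array([[1.0, 0.0, 0.0],
                       [0.0, 1.0, 0.0]])
    x, P = ekf_update(x, P, np.array([0.12, 0.01]), H_gnss, np.eye(2) * 0.05)
    print(x)
```

During a GNSS outage, the same `ekf_update` could be called with a different H and R reflecting a lateral position fix inferred from a detected field line matched against the known map, which is the role the visual pipeline plays in the thesis.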

[This abstract was generated with the help of AI]