Visual Navigation System for Triceratops Robot

Developing a visual navigation system for autonomous indoor navigation of a Triceratops robot.

Overview

This project demonstrates how the Triceratops robot can navigate indoor environments using a visual navigation system that combines Visual SLAM with AprilTag localization for accurate pose estimation and map building.

Robot Setup

We use the Triceratops robot platform equipped with an RGB-D camera for visual navigation.

The robot was built by the City Science Lab @ Taipei Tech robotics team. The low-level control (gait control) is not my work; I developed the visual navigation system and integrated it with the robot for autonomous navigation.

Triceratops robot platform equipped with a RGB-D camera for visual navigation

Triceratops robot platform setup

System Architecture

The visual navigation system consists of two key components: Visual SLAM for map building and localization, and AprilTag markers for accurate pose estimation and map switching.

Visual navigation system architecture diagram

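The map-switching component can be sketched as a lookup from detected tag IDs to map names. This is a minimal illustration only; the tag IDs, map names, and re-initialization signal below are hypothetical, not the actual configuration of the system:

```python
# Hypothetical tag-ID-to-map assignments for illustration.
TAG_TO_MAP = {
    0: "floor1_lobby",
    7: "floor1_corridor",
}

class MapSwitcher:
    """Switches the active map when a known boundary AprilTag is seen."""

    def __init__(self, tag_to_map, initial_map):
        self.tag_to_map = tag_to_map
        self.current_map = initial_map

    def on_tag_detected(self, tag_id):
        new_map = self.tag_to_map.get(tag_id)
        if new_map is not None and new_map != self.current_map:
            self.current_map = new_map
            return True  # signal that localization should re-initialize
        return False
```

In a design like this, a repeated detection of the same tag is a no-op, so the switch (and any localization reset it triggers) fires only once per map boundary crossing.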

Visual SLAM

The Visual SLAM module we use is NVIDIA Isaac ROS Visual SLAM, and I incorporate its visual odometry output when integrating the visual navigation system with the Triceratops robot for autonomous navigation in indoor environments.

Visual SLAM demonstration for map building and localization
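To show how visual odometry feeds a pose estimate, here is a simplified sketch that chains odometry increments by planar SE(2) composition. Isaac ROS Visual SLAM actually publishes full 6-DoF odometry, so restricting to (x, y, theta) is an assumption made for clarity:

```python
import math

def compose_se2(pose, delta):
    """Compose a planar pose (x, y, theta) with a visual-odometry
    increment (dx, dy, dtheta) expressed in the robot frame."""
    x, y, th = pose
    dx, dy, dth = delta
    return (
        x + dx * math.cos(th) - dy * math.sin(th),
        y + dx * math.sin(th) + dy * math.cos(th),
        (th + dth + math.pi) % (2 * math.pi) - math.pi,  # wrap to (-pi, pi]
    )

# Accumulate increments into a map-frame pose estimate:
# drive 1 m forward while turning 90 deg left, then 1 m forward again.
pose = (0.0, 0.0, 0.0)
for delta in [(1.0, 0.0, math.pi / 2), (1.0, 0.0, 0.0)]:
    pose = compose_se2(pose, delta)
```

Because each increment is expressed in the robot frame, the second 1 m step moves the robot along +y in the map frame, ending near (1, 1) with heading pi/2.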

AprilTag-assisted Localization

Because a purely visual navigation system can be noisy and unreliable in complex environments, we use AprilTag markers for accurate localization during indoor navigation.

AprilTag marker detection for localization and map switching

AprilTag marker detection

AprilTag-assisted localization demo
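The core idea behind tag-assisted localization is a transform chain: given a tag's known pose in the map and its measured pose relative to the robot, the robot's map pose follows as T_map_robot = T_map_tag · inv(T_robot_tag). The sketch below works in 2D with homogeneous transforms and assumes the camera and robot frames coincide; the real system works with full 3D detections, so this is a simplified assumption:

```python
import math

def mat(x, y, th):
    """2D homogeneous rigid transform as a 3x3 row-major matrix."""
    c, s = math.cos(th), math.sin(th)
    return [[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def inv(t):
    """Invert a rigid 2D transform: R -> R^T, t -> -R^T t."""
    c, s, x, y = t[0][0], t[1][0], t[0][2], t[1][2]
    return [[c, s, -(c * x + s * y)],
            [-s, c, s * x - c * y],
            [0.0, 0.0, 1.0]]

# Example: a tag mapped at (5, 0) facing back toward the robot,
# observed 2 m straight ahead of the camera.
T_map_tag = mat(5.0, 0.0, math.pi)
T_robot_tag = mat(2.0, 0.0, math.pi)
T_map_robot = matmul(T_map_tag, inv(T_robot_tag))
```

The recovered pose places the robot at (3, 0) facing +x, which matches the geometry: a robot 2 m short of a wall-mounted tag at x = 5. This absolute fix can then correct the drifting Visual SLAM estimate.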

Links