Developing a visual navigation system for autonomous indoor navigation of a Triceratops robot.
This project demonstrates how the Triceratops robot can use a visual navigation system to navigate indoor environments, combining Visual SLAM and AprilTag localization for accurate pose estimation and map building.
We use the Triceratops robot platform equipped with an RGB-D camera for visual navigation.
The robot was built by the City Science Lab @ Taipei Tech robotics team. The low-level control (gait control) was not developed by me; I developed the visual navigation system and integrated it with the robot for autonomous navigation.

Triceratops robot platform setup
The visual navigation system consists of several key components: Visual SLAM for map building and localization, and AprilTag markers for accurate pose estimation and map switching.
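One way the map-switching component could work is to associate known AprilTag IDs with the map they belong to and switch maps when a tag from a different map is detected. The sketch below illustrates this idea only; the tag IDs, map names, and `select_map` helper are hypothetical, not the project's actual implementation.

```python
# Hypothetical tag-ID -> map assignment; the real IDs and map names would
# come from the deployment's tag layout.
TAG_TO_MAP = {
    0: "floor1_lobby",
    1: "floor1_lobby",
    10: "floor2_lab",
    11: "floor2_lab",
}

def select_map(detected_tag_ids, current_map):
    """Switch to the map associated with the first recognized tag, if any."""
    for tag_id in detected_tag_ids:
        target = TAG_TO_MAP.get(tag_id)
        if target is not None and target != current_map:
            return target
    return current_map

# Unknown tag 7 is ignored; tag 10 triggers a switch to the floor-2 map.
new_map = select_map([7, 10], "floor1_lobby")
```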

Visual navigation system architecture diagram
The Visual SLAM component comes from NVIDIA Isaac ROS VSLAM, and I take its visual odometry into account when integrating the visual navigation system with the Triceratops robot for autonomous navigation in indoor environments.
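To illustrate what "taking visual odometry into account" means, the sketch below accumulates incremental odometry deltas into a running robot pose, using planar SE(2) homogeneous transforms. This is a minimal, self-contained illustration with made-up motion values, not the Isaac ROS VSLAM API.

```python
import numpy as np

def se2(x, y, theta):
    """Homogeneous transform for a planar pose (x, y, heading theta)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

# Accumulate incremental visual-odometry deltas into a running pose.
pose = se2(0.0, 0.0, 0.0)            # start at the map origin
deltas = [se2(1.0, 0.0, 0.0),        # move 1 m forward
          se2(0.0, 0.0, np.pi / 2),  # turn 90 degrees left
          se2(1.0, 0.0, 0.0)]        # move 1 m forward again
for d in deltas:
    pose = pose @ d                  # compose each delta in the body frame

x, y = pose[0, 2], pose[1, 2]
theta = np.arctan2(pose[1, 0], pose[0, 0])
# After the three deltas the robot ends near (1.0, 1.0) heading pi/2.
```

Because each delta is composed in the robot's body frame, small per-step errors accumulate over time, which is exactly the drift that the AprilTag correction described below addresses.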
Visual SLAM demonstration for map building and localization
Because a purely visual navigation system can be noisy and lacks robustness in complex environments, we use AprilTag markers for accurate localization during indoor navigation.
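The core of AprilTag-assisted localization is a frame chain: if the tag's pose in the map is known in advance and the detector reports the tag's pose in the camera frame, the camera (and hence robot) pose in the map follows by composing transforms. The sketch below shows this with 4x4 rigid-body transforms and made-up numbers; the helper names and example poses are assumptions for illustration.

```python
import numpy as np

def transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def invert(T):
    """Invert a rigid-body transform without a general matrix inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

# Surveyed pose of the tag in the map frame (assumed known from a tag map).
T_map_tag = transform(np.eye(3), np.array([2.0, 1.0, 0.5]))

# Pose of the tag in the camera frame, as an AprilTag detector would report.
T_cam_tag = transform(np.eye(3), np.array([0.0, 0.0, 1.5]))  # tag 1.5 m ahead

# Camera pose in the map frame: map->tag composed with tag->camera.
T_map_cam = T_map_tag @ invert(T_cam_tag)
cam_xyz = T_map_cam[:3, 3]
```

This absolute fix can then be fused with (or used to reset) the drifting visual-odometry estimate whenever a tag is in view.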

AprilTag marker detection
AprilTag-assisted localization demo