SentiPro: time-sync of multiple stereo cameras for visual SLAM
Background
For agile, autonomous systems there is little point in having expensive and accurate sensors if their measurements are not synchronized in time. This is particularly important when using visual SLAM (simultaneous localization and mapping), where multiple cameras are used to estimate the position of a robot. If the images from the cameras are not synchronized, the vehicle will have moved between the moment an image was actually captured and the time it is timestamped with, making it difficult for the SLAM algorithm to estimate the position of the robot accurately.
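As a rough, first-order illustration, the translation error introduced by a timestamp offset grows linearly with vehicle speed. The sketch below uses illustrative speeds and offsets, not measured values:

```python
# Back-of-envelope: position error induced by a timestamp offset.
# At (roughly) constant speed v, an image timestamped dt seconds late is
# attributed to a pose about v * dt metres away from where it was taken.

def timestamp_position_error(speed_m_s: float, offset_s: float) -> float:
    """Approximate translation error caused by a timestamp offset."""
    return speed_m_s * offset_s

if __name__ == "__main__":
    for speed in (0.5, 1.0, 2.0):             # plausible ground-robot speeds [m/s]
        for offset in (0.001, 0.010, 0.033):  # 1 ms, 10 ms, ~one 30 fps frame
            err_mm = timestamp_position_error(speed, offset) * 1e3
            print(f"v={speed} m/s, dt={offset*1e3:.0f} ms -> ~{err_mm:.1f} mm")
```

Even a one-frame offset at walking speed thus shifts the apparent camera position by several centimetres, well above the accuracy modern VSLAM systems aim for.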
One solution to this problem is hardware time synchronization, where the cameras are triggered to capture images at the same instant by a common signal, e.g. from a “master” camera. However, this does not scale well to many cameras, and it does not synchronize the cameras with the other sensors (e.g. LiDAR, IMU, GPS) on the robot.
The Trondheim-based startup SentiSystems addresses this with their SenTiBoard, which provides hardware time synchronization for multiple sensors. They are collaborating with NVIDIA to showcase the importance of good time synchronization for visual SLAM with multiple cameras, using NVIDIA's Jetson AI compute platform and NVIDIA's SLAM pipeline.
Scope
This project will set up a VSLAM pipeline and compare various methods for time synchronization of cameras and other sensors. It will use data from three Intel RealSense D455 stereo cameras mounted on a Nova Carter robot from Segway Robotics.
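For reference, bringing up several RealSense cameras in ROS 2 could look roughly like the launch sketch below. The serial numbers and namespaces are placeholders, and the parameter name follows the realsense2_camera wrapper but may differ between wrapper versions:

```python
# Hypothetical ROS 2 launch file: one realsense2_camera node per D455,
# selected by serial number. Serials below are placeholders; list the real
# ones with `rs-enumerate-devices`.
from launch import LaunchDescription
from launch_ros.actions import Node

CAMERA_SERIALS = {
    "cam_front": "000000000000",
    "cam_left":  "111111111111",
    "cam_right": "222222222222",
}

def generate_launch_description() -> LaunchDescription:
    nodes = [
        Node(
            package="realsense2_camera",
            executable="realsense2_camera_node",
            namespace=name,
            parameters=[{"serial_no": serial}],
        )
        for name, serial in CAMERA_SERIALS.items()
    ]
    return LaunchDescription(nodes)
```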
Proposed tasks
- Survey cutting-edge research in visual SLAM, including recent neural network-based approaches
- Familiarize with the existing hardware and software used in the project
- Set up a VSLAM pipeline in ROS, using images from multiple stereo cameras taken at different times (see the first sketch after this list)
- Compare the performance of the VSLAM pipeline when using the timestamps provided by the SenTiBoard versus software-based timestamping (see the second sketch after this list)
- Discuss the results with a critical eye, and conclude the work in a written report
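For the multi-camera pipeline, one way to inspect how well the camera streams line up is to group images by timestamp with an approximate-time synchronizer. The sketch below assumes placeholder topic names; only the message_filters usage is standard:

```python
# Minimal rclpy sketch: subscribe to one image stream per camera and group
# messages by timestamp. The `slop` tolerance directly exposes how tightly
# the cameras are synchronized.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from message_filters import ApproximateTimeSynchronizer, Subscriber

TOPICS = ["/cam_front/infra1/image_rect_raw",   # placeholder topic names
          "/cam_left/infra1/image_rect_raw",
          "/cam_right/infra1/image_rect_raw"]

class MultiCamSync(Node):
    def __init__(self):
        super().__init__("multi_cam_sync")
        subs = [Subscriber(self, Image, topic) for topic in TOPICS]
        # With hardware triggering, a slop of a few ms should suffice;
        # software timestamps typically need a much larger tolerance.
        self.sync = ApproximateTimeSynchronizer(subs, queue_size=10, slop=0.005)
        self.sync.registerCallback(self.on_images)

    def on_images(self, front: Image, left: Image, right: Image):
        stamps = [m.header.stamp.sec + m.header.stamp.nanosec * 1e-9
                  for m in (front, left, right)]
        self.get_logger().info(f"spread: {(max(stamps) - min(stamps)) * 1e3:.2f} ms")

def main():
    rclpy.init()
    rclpy.spin(MultiCamSync())

if __name__ == "__main__":
    main()
```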
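For the comparison, a common metric is the absolute trajectory error (ATE) of the estimated trajectory against ground truth. The helper below is a minimal, hypothetical sketch assuming (N, 4) arrays of [t, x, y, z] that have already been spatially aligned; in practice an existing tool such as evo automates this:

```python
# Hypothetical evaluation helper: ATE (RMSE) of an estimated trajectory
# against ground truth, using nearest-timestamp matching.
import numpy as np

def ate_rmse(estimate: np.ndarray, ground_truth: np.ndarray,
             max_dt: float = 0.02) -> float:
    """RMSE of position error over time-matched samples."""
    errors = []
    for t, x, y, z in estimate:
        i = np.argmin(np.abs(ground_truth[:, 0] - t))
        if abs(ground_truth[i, 0] - t) > max_dt:
            continue  # no ground-truth sample close enough in time
        errors.append(np.linalg.norm([x, y, z] - ground_truth[i, 1:]))
    return float(np.sqrt(np.mean(np.square(errors))))
```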
Prerequisites
The project lies at the intersection of estimation/navigation and embedded computing. No candidate is expected to be an expert in all of these domains; the background and interests of the candidate will help determine the focus of the project. Relevant topics include:
- robotic/computer vision
- estimation/navigation/sensor fusion
- ROS/ROS2
Contact
Contact supervisor Kristoffer Gryte. Other people involved in the project:
- Torleiv H. Bryne (Assoc. Prof. NTNU ITK)