Abstract

Towards autonomous 3D modelling of moving targets, we present a system in which multiple ground-based robots cooperate to localize, follow, and scan a moving target from all sides. Each robot has a single camera as its only sensor, and together the robots perform collaborative visual SLAM (CoSLAM). We present a simple robot controller that maintains the visual constraints of CoSLAM while orbiting a moving target so as to observe it from all sides. Real-world experiments demonstrate that multiple ground robots can successfully track and scan a moving target.

Video Demo:

Publication

[1] Jacob Perron*, Rui Huang*, Jack Thomas, Lingkang Zhang, Ping Tan, and Richard Vaughan, Orbiting a Moving Target with Multi-Robot Collaborative Visual SLAM, Workshop on Multi-View Geometry in Robotics (MVIGRO) at the 2015 Robotics: Science and Systems Conference (RSS'15 workshop), Rome, Italy, 2015. PDF (* denotes equal contribution)

Code and How to use

1. CoSLAM for Target Following: The code is available here.

The improved collaborative visual SLAM system is designed for dynamic target following using multiple ground robots equipped with off-the-shelf color cameras. It is based on the original CoSLAM system [2] by D. Zou and P. Tan (Project Page). This code introduces several new features to the original CoSLAM system: it is integrated with ROS and the iRobot Create, it receives multiple video streams and performs collaborative SLAM in real time, and it implements a more robust feature tracker. More details can be found in our paper [1]. A minimal sketch of the multi-stream input side is given below.
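
For illustration only, the following is a minimal sketch (not the project's actual code) of how two robots' video streams can be received and synchronized in ROS before being handed to a SLAM front end. The topic names and the coslamProcessFrames() call are assumptions; the real system's interfaces may differ.

    // Sketch: subscribe to two camera streams, synchronize them, and pass
    // the image pair to a (hypothetical) CoSLAM entry point.
    #include <ros/ros.h>
    #include <sensor_msgs/Image.h>
    #include <message_filters/subscriber.h>
    #include <message_filters/synchronizer.h>
    #include <message_filters/sync_policies/approximate_time.h>
    #include <cv_bridge/cv_bridge.h>
    #include <opencv2/core/core.hpp>
    #include <boost/bind.hpp>

    typedef message_filters::sync_policies::ApproximateTime<
        sensor_msgs::Image, sensor_msgs::Image> SyncPolicy;

    void imageCallback(const sensor_msgs::ImageConstPtr& img0,
                       const sensor_msgs::ImageConstPtr& img1)
    {
      // Convert ROS images to OpenCV matrices.
      cv::Mat frame0 = cv_bridge::toCvCopy(img0, "bgr8")->image;
      cv::Mat frame1 = cv_bridge::toCvCopy(img1, "bgr8")->image;

      // coslamProcessFrames(frame0, frame1);  // hypothetical CoSLAM call
      (void)frame0; (void)frame1;
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "coslam_frontend_sketch");
      ros::NodeHandle nh;

      // Camera topics of the two robots (names assumed for illustration).
      message_filters::Subscriber<sensor_msgs::Image>
          sub0(nh, "/robot0/camera/image_raw", 1),
          sub1(nh, "/robot1/camera/image_raw", 1);

      // Approximate-time sync, since the cameras are not hardware-triggered.
      message_filters::Synchronizer<SyncPolicy> sync(SyncPolicy(10), sub0, sub1);
      sync.registerCallback(boost::bind(&imageCallback, _1, _2));

      ros::spin();
      return 0;
    }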

You may refer to the GitHub page of the original CoSLAM for instructions on code installation. Detailed instructions for the new system are under development.

Please cite both [1] and [2] if you use this code in your research.

[2] Danping Zou and Ping Tan, CoSLAM: Collaborative Visual SLAM in Dynamic Environments, IEEE Transactions on Pattern Analysis and Machine Intelligence, 35(2): 354-366, 2013.

2. Robot controller: The code is available here.

The robot controller receives the estimated poses of the target and the follower robots from the improved CoSLAM system, and uses them to drive the robot team to follow the target while orbiting it.
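
As a rough illustration of the idea (not the project's actual controller), the sketch below computes a velocity command that keeps a follower on a circle of fixed radius around the target, given positions estimated by CoSLAM. The topic name, gains, desired radius, and placeholder poses are assumed values.

    // Sketch: orbit a target at radius R by heading along the circle's
    // tangent, nudged inward or outward in proportion to the radius error.
    #include <ros/ros.h>
    #include <geometry_msgs/Twist.h>
    #include <cmath>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "orbit_controller_sketch");
      ros::NodeHandle nh;
      ros::Publisher cmd_pub =
          nh.advertise<geometry_msgs::Twist>("/robot0/cmd_vel", 1);

      const double R = 2.0;        // desired orbit radius [m] (assumed)
      const double k_r = 0.5;      // radial correction gain (assumed)
      const double v_orbit = 0.3;  // nominal tangential speed [m/s] (assumed)

      ros::Rate rate(10);
      while (ros::ok())
      {
        // In the real system these poses come from the CoSLAM node;
        // here they are placeholders.
        double tx = 0.0, ty = 0.0;   // target position
        double rx = 2.5, ry = 0.0;   // follower position
        double yaw = M_PI / 2.0;     // follower heading

        double dx = rx - tx, dy = ry - ty;
        double dist = std::sqrt(dx * dx + dy * dy);

        // Tangent of a counter-clockwise orbit, corrected by the radius error.
        double tangent = std::atan2(dy, dx) + M_PI / 2.0;
        double desired = tangent + k_r * (dist - R);

        // Drive forward and turn toward the desired (wrapped) heading.
        geometry_msgs::Twist cmd;
        cmd.linear.x = v_orbit;
        cmd.angular.z = std::atan2(std::sin(desired - yaw),
                                   std::cos(desired - yaw));
        cmd_pub.publish(cmd);

        ros::spinOnce();
        rate.sleep();
      }
      return 0;
    }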

Note: This code is intended for the development of research applications and can be tricky to set up. Detailed instructions are under development; in the meantime, please contact me if you are interested in the work.

Contact

Please feel free to contact me if you are interested in this work or encounter problems using the software.

Rui Huang, Simon Fraser University

Email: rha55@sfu.ca

Personal Webpage: http://www.sfu.ca/~rha55/