Demonstration video, IEEE TCSVT 2015

"HW/SW co-design and FPGA acceleration of visual odometry algorithms for rover navigation on Mars",
George Lentaris, Ioannis Stamoulias, Dimitrios Soudris, and Manolis Lourakis

Description: the following video records, at runtime, the co-processing of the FPGA (Xilinx xc6vlx240t-2 at 172 MHz, on an HTG board) and the CPU (Intel Core 2 Duo E8400). The visual odometry algorithms successively process the stereo images acquired by a hypothetical rover moving on the Martian surface to estimate the rover's pose (position and orientation) at each time instant. The 512x384 px images are input to the CPU and sent to the FPGA, which performs feature detection with Harris, feature description with SIFT, and feature matching with the chi-square distance, while the CPU performs filtering and egomotion estimation via absolute orientation. The rover's path in this test is 100 m long and the positional error of the HW/SW system remains around 1% (we run pipe1no on a synthetic dataset). All intermediate results and timings are stored on disk and analyzed with MATLAB to scale and estimate the cost on space-grade devices.
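To make the matching step concrete, below is a minimal CPU-side sketch of feature matching with the chi-square distance, assuming 128-D non-negative SIFT-like descriptors and a Lowe-style ratio test; the function names, the 0.5 scaling of the distance, and the ratio threshold are illustrative assumptions and are not taken from the paper's FPGA implementation.

import numpy as np

def chi_square_dist(a, b, eps=1e-12):
    # Chi-square distance between two non-negative descriptor vectors
    # (eps guards against division by zero in empty bins).
    return 0.5 * np.sum((a - b) ** 2 / (a + b + eps))

def match_descriptors(desc_left, desc_right, ratio=0.8):
    # Nearest-neighbour matching with a ratio test (assumed here, not
    # necessarily the scheme used on the FPGA); desc_right needs >= 2 rows.
    matches = []
    for i, d in enumerate(desc_left):
        dists = np.array([chi_square_dist(d, q) for q in desc_right])
        j, k = np.argsort(dists)[:2]          # best and second-best candidates
        if dists[j] < ratio * dists[k]:
            matches.append((i, j))
    return matches

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    left = rng.random((50, 128))              # stand-in descriptors, left image
    right = rng.random((60, 128))             # stand-in descriptors, right image
    print(len(match_descriptors(left, right)), "tentative matches")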

• hit play below, or download the entire video here (76 MB, .mp4)

extra tools used for the demo: Linux Mint (with feh and tcpdump), gnuplot, MATLAB, SimpleScreenRecorder, OpenShot video editor, Avidemux

George Lentaris, March 2015.
Microlab, ECE, NTUA, GR.