CASIA OpenIR > National Laboratory of Pattern Recognition > Robot Vision
Optical Flow Assisted Monocular Visual Odometry
Wan, Yiming; Gao, Wei; Wu, Yihong
Conference Name: Asian Conference on Pattern Recognition
Conference Date: 2019.11.26-2019.11.29
Conference Place: Auckland, New Zealand

This paper proposes a novel deep-learning-based approach to monocular visual odometry (VO) called FlowVO-Net. Our approach uses a CNN to extract motion information between two consecutive frames and a bidirectional convolutional LSTM (Bi-ConvLSTM) for temporal modelling. The ConvLSTM encodes not only temporal information but also spatial correlation, and the bidirectional architecture enables it to learn geometric relationships from both preceding and succeeding frames in an image sequence. In addition, our approach jointly predicts optical flow as an auxiliary task in a self-supervised way by measuring photometric consistency. Experimental results indicate that the proposed FlowVO-Net performs competitively with state-of-the-art methods.
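The self-supervised photometric-consistency objective mentioned in the abstract can be illustrated with a minimal sketch: warp the next frame back to the current one using the predicted flow, then penalize the pixel-wise difference. The function names and the use of plain NumPy with an L1 penalty are illustrative assumptions, not the paper's actual implementation (which operates inside a trained network).

```python
import numpy as np

def warp_bilinear(img, flow):
    """Backward-warp a grayscale image (H, W) by a per-pixel flow field
    (H, W, 2): each output pixel samples img at (x + u, y + v) with
    bilinear interpolation, clipped at the image border."""
    H, W = img.shape
    ys, xs = np.mgrid[0:H, 0:W].astype(np.float64)
    xq = np.clip(xs + flow[..., 0], 0, W - 1)
    yq = np.clip(ys + flow[..., 1], 0, H - 1)
    x0 = np.floor(xq).astype(int)
    y0 = np.floor(yq).astype(int)
    x1 = np.clip(x0 + 1, 0, W - 1)
    y1 = np.clip(y0 + 1, 0, H - 1)
    wx, wy = xq - x0, yq - y0
    top = img[y0, x0] * (1 - wx) + img[y0, x1] * wx
    bot = img[y1, x0] * (1 - wx) + img[y1, x1] * wx
    return top * (1 - wy) + bot * wy

def photometric_loss(frame_t, frame_t1, flow):
    """Self-supervision signal: mean L1 difference between the current
    frame and the next frame warped back by the predicted flow.
    A correct flow makes the warped frame match frame_t, so the loss
    can train the flow branch without ground-truth flow labels."""
    return np.mean(np.abs(frame_t - warp_bilinear(frame_t1, flow)))
```

With a perfect flow the warped frame reproduces the current frame and the loss goes to zero, which is what lets the flow branch train without labels.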

Indexed By: EI
Document Type: Conference Paper
First Author Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Wan, Yiming, Gao, Wei, Wu, Yihong. Optical Flow Assisted Monocular Visual Odometry[C], 2020.
Files in This Item:
File Name/Size: Optical Flow Assisted Monocular Visual Odometry (2741 KB) | DocType: Conference Paper | Access: Open Access | License: CC BY-NC-SA
File name: Optical Flow Assisted Monocular Visual Odometry.pdf
Format: Adobe PDF

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.