|Coupled Multi-Vehicle Detection and Classification with Prior Objectness Measure|
|Author||Yao YJ(姚彦洁); Tian B(田滨); Wang FY(王飞跃)|
|Source Publication||IEEE Transactions on Vehicular Technology|
|Abstract||Vehicle recognition plays an important role in traffic surveillance systems, advanced driver assistance systems, and autonomous vehicles. This paper presents a novel approach to multi-vehicle recognition that treats vehicle spatial location and classification as a coupled optimization problem. It speeds up detection by producing more accurate vehicle region proposals, and it recognizes multiple vehicle types with a single model. The proposed detector operates in three stages: 1) obtaining candidate vehicle locations with a prior objectness measure; 2) classifying vehicle region proposals to distinguish three common vehicle types (i.e., car, taxi, and bus) with a single convolutional neural network; and 3) coupling the classification results back into the detection process, which leads to fewer false positives. In experiments on high-resolution traffic images, our method shows the following characteristics: 1) it matches state-of-the-art detection accuracy; 2) it generates a smaller set of high-quality vehicle windows more efficiently; 3) its search time is about 30 times shorter than that of two other popular detection schemes; and 4) it recognizes the different vehicle types in each image with a single eight-layer CNN model.|
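As a rough illustration of the three-stage structure described in the abstract, the Python sketch below mirrors the proposal-classify-couple flow. Every name, signature, and threshold here (Proposal, stage1_proposals, stage3_couple, min_confidence, the stub classifier) is a hypothetical placeholder, not the authors' published implementation; in particular, stage2_classify stands in for the paper's single eight-layer CNN.

    # Minimal, self-contained sketch of the coupled detect-then-classify
    # pipeline. All names and numbers are illustrative placeholders.
    from dataclasses import dataclass
    from typing import List, Tuple

    VEHICLE_CLASSES = ("car", "taxi", "bus")

    @dataclass
    class Proposal:
        box: Tuple[int, int, int, int]   # (x, y, w, h) window in the image
        objectness: float                # stage-1 prior objectness score

    def stage1_proposals(windows: List[Tuple[int, int, int, int]],
                         prior_scores: List[float],
                         top_k: int = 100) -> List[Proposal]:
        """Stage 1: rank candidate windows by a prior objectness measure
        and keep only the top-k as vehicle region proposals."""
        ranked = sorted(zip(windows, prior_scores),
                        key=lambda p: p[1], reverse=True)
        return [Proposal(box=w, objectness=s) for w, s in ranked[:top_k]]

    def stage2_classify(proposal: Proposal) -> Tuple[str, float]:
        """Stage 2: a single CNN would score each proposal over the three
        vehicle classes; this stub returns a fixed (label, confidence)."""
        return "car", 0.9  # placeholder for the eight-layer CNN's output

    def stage3_couple(proposals: List[Proposal],
                      min_confidence: float = 0.5) -> List[Tuple[Proposal, str]]:
        """Stage 3: couple classification back into detection by dropping
        proposals the classifier rejects, suppressing false positives."""
        detections = []
        for p in proposals:
            label, conf = stage2_classify(p)
            if conf >= min_confidence:
                detections.append((p, label))
        return detections

    if __name__ == "__main__":
        # Toy input: three candidate windows with made-up objectness scores.
        windows = [(0, 0, 64, 64), (32, 32, 64, 64), (100, 10, 80, 80)]
        scores = [0.7, 0.2, 0.9]
        proposals = stage1_proposals(windows, scores, top_k=2)
        print(stage3_couple(proposals))

The coupling in stage 3 is what distinguishes this scheme from a plain detect-then-classify cascade: classifier confidence feeds back to prune the detection output rather than merely labeling it.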
|Keyword||Convolutional Neural Network (CNN)|
|Corresponding Author||Bin Tian|
Yao Y J, Tian B, Wang F Y, et al. Coupled Multi-Vehicle Detection and Classification with Prior Objectness Measure[J]. IEEE Transactions on Vehicular Technology, 2016(99): 1-1.
Yao, Y. J., Tian, B., & Wang, F. Y. (2016). Coupled Multi-Vehicle Detection and Classification with Prior Objectness Measure. IEEE Transactions on Vehicular Technology, (99), 1-1.
Yao, Y. J., et al. "Coupled Multi-Vehicle Detection and Classification with Prior Objectness Measure." IEEE Transactions on Vehicular Technology 99 (2016): 1-1.