CASIA OpenIR
UA-DETRAC: A new benchmark and protocol for multi-object detection and tracking
Wen, Longyin (1); Du, Dawei (2); Cai, Zhaowei (3); Lei, Zhen (4); Chang, Ming-Ching (2); Qi, Honggang (5); Lim, Jongwoo (6); Yang, Ming-Hsuan (7); Lyu, Siwei (2)
Source Publication: COMPUTER VISION AND IMAGE UNDERSTANDING
ISSN: 1077-3142
Date Issued: 2020-04-01
Volume: 193  Pages: 20
Corresponding Author: Lyu, Siwei (slyu@albany.edu)
Abstract: Effective multi-object tracking (MOT) methods have been developed in recent years for a wide range of applications, including visual surveillance and behavior understanding. Existing performance evaluations of MOT methods usually separate the tracking step from the detection step by comparing trackers under a single predefined object-detection setting. In this work, we propose the University at Albany DEtection and TRACking (UA-DETRAC) dataset for comprehensive performance evaluation of MOT systems, with particular attention to the role of the detector. The UA-DETRAC benchmark consists of 100 challenging videos captured from real-world traffic scenes (over 140,000 frames with rich annotations, including illumination, vehicle type, occlusion, truncation ratio, and vehicle bounding boxes) for multi-object detection and tracking. We evaluate complete MOT systems constructed from combinations of state-of-the-art object detection and tracking methods. Our analysis shows the complex effects of detection accuracy on MOT system performance. Based on these observations, we propose effective and informative evaluation metrics for MOT systems that account for the effect of object detection in a comprehensive performance analysis.
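The proposed protocol ties tracker evaluation to the detector's operating point rather than to a single fixed detection setting. The Python sketch below illustrates that general idea; it is not the paper's reference code. It assumes a hypothetical evaluate_tracking(score_threshold=...) helper that filters detections by confidence, runs the tracker, and returns detection precision, recall, and MOTA at that threshold, and it collapses the paper's integration over the precision-recall curve into a simple one-dimensional sweep over recall.

import numpy as np

def pr_mota_like(evaluate_tracking, thresholds=None):
    """Sketch of a detection-aware MOT metric (illustrative only).

    Sweep the detection confidence threshold, run the tracker at each
    operating point, and integrate the resulting MOTA values over the
    detector's recall axis instead of reporting MOTA at one setting.
    """
    if thresholds is None:
        thresholds = np.linspace(0.0, 1.0, 11)  # detector confidence cutoffs

    recalls, motas = [], []
    for t in thresholds:
        # evaluate_tracking is assumed to filter detections by score >= t,
        # run the tracker, and return detection precision, recall, and MOTA.
        precision, recall, mota = evaluate_tracking(score_threshold=t)
        recalls.append(recall)
        motas.append(mota)

    # Sort by recall and take the area under the MOTA-vs-recall curve,
    # a simplified stand-in for integrating MOTA along the PR curve.
    order = np.argsort(recalls)
    recalls = np.asarray(recalls)[order]
    motas = np.asarray(motas)[order]
    return np.trapz(motas, recalls)

In this simplified form, a detector whose accuracy collapses at high recall drags the score down even if the tracker looks strong at one hand-picked threshold, which is the kind of detector-dependent behavior the benchmark's analysis highlights.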
Keywords: Object detection; Object tracking; Benchmark; Evaluation protocol
DOI: 10.1016/j.cviu.2020.102907
WOS Keywords: MULTITARGET TRACKING; ROBUST; APPEARANCE; HISTOGRAMS
Indexed By: SCI
Language: English
Funding Projects: US National Science Foundation [IIS-1816227]; National Natural Science Foundation of China [61472388]; National Natural Science Foundation of China [61771341]
Funding Organizations: US National Science Foundation; National Natural Science Foundation of China
WOS Research Areas: Computer Science; Engineering
WOS Subjects: Computer Science, Artificial Intelligence; Engineering, Electrical & Electronic
WOS ID: WOS:000518876100004
Publisher: ACADEMIC PRESS INC ELSEVIER SCIENCE
Citation Statistics
Cited Times (WOS): 1
Document Type: Journal article
Identifier: http://ir.ia.ac.cn/handle/173211/38604
Collection: Institute of Automation, Chinese Academy of Sciences
Affiliations:
1. JD Finance Amer Corp, Mountain View, CA, USA
2. SUNY Albany, Dept Comp Sci, Albany, NY 12222, USA
3. Univ Calif San Diego, Dept Elect & Comp Engn, San Diego, CA 92103, USA
4. Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing, Peoples R China
5. Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing, Peoples R China
6. Hanyang Univ, Div Comp Sci & Engn, Seoul, South Korea
7. Univ Calif Merced, Sch Engn, Merced, CA, USA
Recommended Citation
GB/T 7714: Wen, Longyin, Du, Dawei, Cai, Zhaowei, et al. UA-DETRAC: A new benchmark and protocol for multi-object detection and tracking[J]. COMPUTER VISION AND IMAGE UNDERSTANDING, 2020, 193: 20.
APA: Wen, Longyin, Du, Dawei, Cai, Zhaowei, Lei, Zhen, Chang, Ming-Ching, ... & Lyu, Siwei. (2020). UA-DETRAC: A new benchmark and protocol for multi-object detection and tracking. COMPUTER VISION AND IMAGE UNDERSTANDING, 193, 20.
MLA: Wen, Longyin, et al. "UA-DETRAC: A new benchmark and protocol for multi-object detection and tracking". COMPUTER VISION AND IMAGE UNDERSTANDING 193 (2020): 20.
Files in This Item:
There are no files associated with this item.
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.