Double Domain Guided Real-Time Low-Light Image Enhancement for Ultra-High-Definition Transportation Surveillance
Qu, Jingxiang1,2; Liu, Ryan Wen1,2; Gao, Yuan1,2; Guo, Yu2; Zhu, Fenghua3; Wang, Fei-Yue3
Journal: IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS
ISSN: 1524-9050
Date: 2024-02-07
Pages: 13
Corresponding Author: Liu, Ryan Wen (wenliu@whut.edu.cn)
Abstract: Real-time transportation surveillance is an essential part of the intelligent transportation system (ITS). However, images captured under low-light conditions often suffer from poor visibility and various types of degradation, such as noise interference and blurred edge features. With the development of imaging devices, the quality of visual surveillance data keeps increasing, e.g., to 2K and 4K resolution, which places stricter requirements on the efficiency of image processing. To satisfy the requirements of both enhancement quality and computational speed, this paper proposes a double domain guided real-time low-light image enhancement network (DDNet) for ultra-high-definition (UHD) transportation surveillance. Specifically, we design an encoder-decoder structure as the main architecture of the learning network. The enhancement processing is divided into two subtasks (i.e., color enhancement and gradient enhancement) via the proposed coarse enhancement module (CEM) and the LoG-based gradient enhancement module (GEM), which are embedded in the encoder-decoder structure. This enables the network to enhance color and edge features simultaneously. Through decomposition and reconstruction in both the color and gradient domains, our DDNet can restore the detailed feature information concealed by darkness with better visual quality and efficiency. Evaluation experiments on standard and transportation-related datasets demonstrate that our DDNet provides superior enhancement quality and efficiency compared with state-of-the-art methods. In addition, object detection and scene segmentation experiments indicate practical benefits for higher-level image analysis under low-light conditions in ITS. The source code is available at https://github.com/QuJX/DDNet.
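As a rough illustration of the double domain idea named in the abstract, the minimal PyTorch sketch below builds a Laplacian-of-Gaussian (LoG) kernel as a gradient-domain operator and combines a color-domain L1 term with a gradient-domain L1 term into one loss. It is not the authors' released implementation (see https://github.com/QuJX/DDNet for that); the kernel size, sigma, and loss weight are illustrative assumptions.

import torch
import torch.nn.functional as F

def log_kernel(size: int = 5, sigma: float = 1.0) -> torch.Tensor:
    # Build a zero-mean 2-D Laplacian-of-Gaussian kernel of shape (1, 1, size, size).
    ax = torch.arange(size, dtype=torch.float32) - (size - 1) / 2.0
    xx, yy = torch.meshgrid(ax, ax, indexing="ij")
    r2 = xx ** 2 + yy ** 2
    gauss = torch.exp(-r2 / (2.0 * sigma ** 2))
    log = (r2 - 2.0 * sigma ** 2) / (sigma ** 4) * gauss
    log = log - log.mean()  # zero response on flat regions
    return log.view(1, 1, size, size)

def log_gradient(img: torch.Tensor, kernel: torch.Tensor) -> torch.Tensor:
    # Apply the LoG filter depthwise to an (N, C, H, W) image to obtain its gradient map.
    c = img.shape[1]
    weight = kernel.repeat(c, 1, 1, 1)
    return F.conv2d(img, weight, padding=kernel.shape[-1] // 2, groups=c)

def double_domain_loss(pred: torch.Tensor, target: torch.Tensor,
                       kernel: torch.Tensor, w_grad: float = 0.5) -> torch.Tensor:
    # L1 loss in the color domain plus a weighted L1 loss between LoG gradient maps.
    color_term = F.l1_loss(pred, target)
    grad_term = F.l1_loss(log_gradient(pred, kernel), log_gradient(target, kernel))
    return color_term + w_grad * grad_term

if __name__ == "__main__":
    k = log_kernel()
    low = torch.rand(1, 3, 256, 256)     # stand-in low-light input
    normal = torch.rand(1, 3, 256, 256)  # stand-in normal-light reference
    enhanced = low                       # placeholder for the network output
    print(float(double_domain_loss(enhanced, normal, k)))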
Keywords: Intelligent transportation system (ITS); transportation surveillance; low-light image enhancement; ultra-high-definition (UHD); double domain guidance
DOI: 10.1109/TITS.2024.3359755
WOS Keywords: SIGNAL FIDELITY; DEEP NETWORK
Indexed By: SCI
Language: English
Funding Project: National Key Research and Development Program of China
Funder: National Key Research and Development Program of China
WOS Research Area: Engineering; Transportation
WOS Subject Categories: Engineering, Civil; Engineering, Electrical & Electronic; Transportation Science & Technology
WOS ID: WOS:001161083400001
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Document Type: Journal article
Identifier: http://ir.ia.ac.cn/handle/173211/57764
Collection: State Key Laboratory of Multimodal Artificial Intelligence Systems, Parallel Intelligence Technology and Systems Team
Author Affiliations:
1.Wuhan Univ Technol, Sch Nav, Wuhan 430063, Peoples R China
2.Wuhan Univ Technol, State Key Lab Maritime Technol & Safety, Wuhan, Peoples R China
3.Chinese Acad Sci, Inst Automat, State Key Lab Management & Control Complex Syst, Beijing 100190, Peoples R China
Recommended Citation:
GB/T 7714: Qu, Jingxiang, Liu, Ryan Wen, Gao, Yuan, et al. Double Domain Guided Real-Time Low-Light Image Enhancement for Ultra-High-Definition Transportation Surveillance[J]. IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2024: 13.
APA: Qu, Jingxiang, Liu, Ryan Wen, Gao, Yuan, Guo, Yu, Zhu, Fenghua, & Wang, Fei-Yue. (2024). Double Domain Guided Real-Time Low-Light Image Enhancement for Ultra-High-Definition Transportation Surveillance. IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 13.
MLA: Qu, Jingxiang, et al. "Double Domain Guided Real-Time Low-Light Image Enhancement for Ultra-High-Definition Transportation Surveillance". IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS (2024): 13.
Files in This Item: None.