Multi-granularity Distillation Scheme Towards Lightweight Semi-supervised Semantic Segmentation
Jie Qin1,2,3; Jie Wu2; Ming Li2; Xuefeng Xiao2; Min Zheng2; Xingang Wang3
2022
Conference: European Conference on Computer Vision
Conference dates: October 23-27
Conference location: Tel Aviv, Israel
Abstract

Despite varying degrees of progress in the field of Semi-Supervised Semantic Segmentation (SSSS), most of its recent successes involve unwieldy models, and lightweight solutions remain largely unexplored. We find that existing knowledge distillation techniques pay more attention to pixel-level concepts from labeled data and fail to take the more informative cues within unlabeled data into account. Consequently, we offer the first attempt to provide lightweight SSSS models via a novel multi-granularity distillation (MGD) scheme, where multi-granularity is captured from three aspects: i) complementary teacher structure; ii) labeled-unlabeled data cooperative distillation; iii) hierarchical and multi-level loss setting. Specifically, MGD is formulated as a labeled-unlabeled data cooperative distillation scheme, which helps to take full advantage of the diverse data characteristics that are essential in the semi-supervised setting. Image-level semantic-sensitive loss, region-level content-aware loss, and pixel-level consistency loss are set up to enrich hierarchical distillation abstraction via structurally complementary teachers. Experimental results on PASCAL VOC 2012 and Cityscapes reveal that MGD outperforms competitive approaches by a large margin under diverse partition protocols. For example, the performance of the ResNet-18 and MobileNet-v2 backbones is boosted by 11.5% and 4.6%, respectively, under the 1/16 partition protocol on Cityscapes. Although the FLOPs of the model backbone are compressed by 3.4-5.3× (ResNet-18) and 38.7-59.6× (MobileNet-v2), the models still achieve satisfactory segmentation results.
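The abstract's three distillation terms can be made concrete with a small sketch. The snippet below is a hypothetical illustration, not the authors' released code: the loss weights, the pooling window used for the region-level term, and the use of cosine similarity between global embeddings for the image-level term are all assumptions; only the general idea of combining image-level, region-level, and pixel-level distillation losses follows the abstract.

```python
import torch
import torch.nn.functional as F

def mgd_distillation_loss(student_logits, teacher_logits,
                          student_embed, teacher_embed,
                          region_size=4,
                          w_img=1.0, w_region=1.0, w_pixel=1.0):
    """Hypothetical sketch of a multi-granularity distillation objective.

    student_logits / teacher_logits: (B, C, H, W) segmentation logits.
    student_embed / teacher_embed:   (B, D) image-level embeddings.
    The weights and the pooling window are illustrative, not the paper's values.
    """
    # Pixel-level consistency: KL divergence between per-pixel class distributions.
    pixel_loss = F.kl_div(
        F.log_softmax(student_logits, dim=1),
        F.softmax(teacher_logits, dim=1),
        reduction="batchmean",
    )

    # Region-level content-aware term: compare class distributions pooled over local windows.
    s_region = F.avg_pool2d(F.softmax(student_logits, dim=1), region_size)
    t_region = F.avg_pool2d(F.softmax(teacher_logits, dim=1), region_size)
    region_loss = F.mse_loss(s_region, t_region)

    # Image-level semantic-sensitive term: align global image embeddings.
    img_loss = 1.0 - F.cosine_similarity(student_embed, teacher_embed, dim=1).mean()

    return w_img * img_loss + w_region * region_loss + w_pixel * pixel_loss
```

In a semi-supervised setting, such a loss would be evaluated on both labeled and unlabeled images (teacher predictions serve as targets in either case), which is what the abstract refers to as labeled-unlabeled data cooperative distillation.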

Subdirection classification (seven major directions): Image and Video Processing and Analysis
State Key Laboratory planned research direction: Visual Information Processing
Associated dataset requiring deposit: not specified
Document type: Conference paper
Item identifier: http://ir.ia.ac.cn/handle/173211/57169
Collection: CAS Engineering Laboratory for Industrial Vision and Intelligent Equipment_Precision Perception and Control
Author affiliations:
1. School of Artificial Intelligence, University of Chinese Academy of Sciences
2. ByteDance Inc
3. Institute of Automation, Chinese Academy of Sciences
First author affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended citation (GB/T 7714):
Jie Qin, Jie Wu, Ming Li, et al. Multi-granularity Distillation Scheme Towards Lightweight Semi-supervised Semantic Segmentation[C], 2022.
Files in this item:
Multi-granularity Distillation Scheme Towards Lightweight Semi-supervised Semantic Segmentation.pdf (4167 KB), conference paper, open access, CC BY-NC-SA