Distilling Inter-class Distance for Semantic Segmentation
Citation: DENG Wen-Ge, WANG Ya-Jun, SUI Li-Lin, SUN Guo-Dong, ZHANG Zheng-Bo. Distilling Inter-class Distance for Semantic Segmentation[J]. Computer Systems & Applications, 2023, 32(10): 235-241.
Authors: DENG Wen-Ge, WANG Ya-Jun, SUI Li-Lin, SUN Guo-Dong, ZHANG Zheng-Bo
Affiliation: CHN Energy Digital Intelligence Technology Development (Beijing) Co. Ltd., Beijing 100011, China; State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China
Abstract: Knowledge distillation is widely adopted in semantic segmentation to reduce computation cost. Previous knowledge distillation methods for semantic segmentation focus on pixel-wise feature alignment and intra-class feature variation distillation, neglecting to transfer the inter-class distance knowledge that is important for semantic segmentation. To address this issue, this study proposes an inter-class distance distillation (IDD) method that transfers the inter-class distances in the feature space from the teacher network to the student network. Furthermore, since semantic segmentation is a position-dependent task, this study also develops a position information distillation module to help the student network encode more position information. Extensive experiments on three popular semantic segmentation datasets (Cityscapes, Pascal VOC, and ADE20K) show that the proposed method improves the accuracy of semantic segmentation models and achieves competitive performance.
Keywords: knowledge distillation | semantic segmentation | model compression
Received: 2023-03-27
Revised: 2023-04-27

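The paper's exact loss formulation is not given on this page; the following is a minimal PyTorch-style sketch of what an inter-class distance distillation loss could look like, assuming per-class centroids are obtained by masked average pooling over ground-truth pixels and that teacher/student feature channels have already been aligned (e.g., by a 1x1 projection). The names `class_centroids` and `idd_loss` are hypothetical, not from the paper.

```python
# Hypothetical sketch of inter-class distance distillation (IDD):
# match the pairwise distances between class centroids of the student
# and teacher feature maps. Labels are assumed to be downsampled to
# the feature resolution, with ignore pixels mapped outside [0, K).
import torch
import torch.nn.functional as F

def class_centroids(feat, label, num_classes):
    """feat: (B, C, H, W) features; label: (B, H, W) class ids.
    Returns (num_classes, C) per-class mean features."""
    B, C, H, W = feat.shape
    feat = feat.permute(0, 2, 3, 1).reshape(-1, C)   # (B*H*W, C)
    label = label.reshape(-1)                        # (B*H*W,)
    centroids = feat.new_zeros(num_classes, C)
    for k in range(num_classes):
        mask = label == k
        if mask.any():                               # skip absent classes
            centroids[k] = feat[mask].mean(dim=0)
    return centroids

def idd_loss(feat_s, feat_t, label, num_classes):
    """Penalize mismatch between student and teacher inter-class distances."""
    cs = class_centroids(feat_s, label, num_classes)
    ct = class_centroids(feat_t, label, num_classes)
    ds = torch.cdist(cs, cs)   # (K, K) student centroid distances
    dt = torch.cdist(ct, ct)   # (K, K) teacher centroid distances
    return F.mse_loss(ds, dt)

# Example usage: K=19 classes (Cityscapes), 1/8-resolution features.
feat_s = torch.randn(2, 256, 64, 128)
feat_t = torch.randn(2, 256, 64, 128)
label = torch.randint(0, 19, (2, 64, 128))
loss = idd_loss(feat_s, feat_t, label, num_classes=19)
```

In a full training loop this term would be added, with a weighting coefficient, to the standard cross-entropy segmentation loss and any other distillation terms; the teacher network is kept frozen so gradients flow only into the student.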