Multi-level attention distillation learning for image semantic segmentation

MAD: Multi-attention distillation learning for semantic segmentation


Authors: LIU Jiaqi [1,2]; YANG Lu [1,2]; WANG Longzhi (Tianjin Key Laboratory for Advanced Mechatronic System Design and Intelligent Control, School of Mechanical Engineering, Tianjin University of Technology, Tianjin 300384, China; National Demonstration Center for Experimental Mechanical and Electrical Engineering Education (Tianjin University of Technology), Tianjin 300384, China; Autobrain (Tianjin) Technology, Ltd., Tianjin 300300, China)

Affiliations: [1] Tianjin Key Laboratory for Advanced Mechatronic System Design and Intelligent Control, Tianjin University of Technology, Tianjin 300384; [2] National Demonstration Center for Experimental Mechanical and Electrical Engineering Education, Tianjin University of Technology, Tianjin 300384; [3] Autobrain (Tianjin) Technology, Ltd., Tianjin 300300

Source: Intelligent Computer and Applications, 2021, No. 5, pp. 13-18, 25 (7 pages)

Funding: Tianjin Natural Science Foundation (16JCQNJC04100).

Abstract: Traditional distillation learning performs only one-way distillation from a large network to a lightweight network. This makes it difficult to obtain feedback from the lightweight network's learning state and to adjust the training process accordingly, and it also limits the lightweight network's feature expression ability. This paper proposes MAD (Multi Attention Distillation), a self-learning optimization method that exploits the network's own multi-level attention context information. In a self-supervised manner, the mature parts of the network constrain the immature parts: shallow layers extract useful context information from deep layers and learn to express high-level features, improving the overall representational ability of the network. The method is validated with the lightweight networks ERFNet and DeepLabV3 on datasets for two different tasks, CULane and VOC. Experimental results show that MAD improves the network's feature extraction ability without increasing inference time, raising ERFNet's F1-measure on the CULane task by 2.13 and DeepLabV3's mIoU on the VOC task by 1.5.
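The core idea described in the abstract, letting shallow stages imitate the attention maps of deeper, more "mature" stages, can be sketched in a few lines. This is a minimal NumPy illustration, not the paper's implementation: the attention mapping (channel-wise sum of squared activations, L2-normalized) and the stage-pairing scheme are common choices in attention transfer and are assumptions here; `attention_map` and `mad_loss` are hypothetical names.

```python
import numpy as np

def attention_map(features):
    # Collapse a (C, H, W) feature tensor into a spatial attention map
    # by summing squared activations over channels, then L2-normalize.
    att = np.sum(features ** 2, axis=0)          # (H, W)
    return att / (np.linalg.norm(att) + 1e-8)

def mad_loss(feature_pyramid):
    # Pull each shallow stage toward the attention map of the next
    # deeper stage. In a real training loop the deep maps would be
    # detached so gradients flow only into the shallow layers.
    loss = 0.0
    for shallow, deep in zip(feature_pyramid[:-1], feature_pyramid[1:]):
        loss += np.mean((attention_map(shallow) - attention_map(deep)) ** 2)
    return loss / (len(feature_pyramid) - 1)

# Toy pyramid: three stages resized to the same spatial resolution.
rng = np.random.default_rng(0)
stages = [rng.standard_normal((16, 8, 8)) for _ in range(3)]
print(mad_loss(stages))
```

Because the targets come from the network itself, this term adds no extra branches at inference time, which matches the abstract's claim that MAD leaves inference cost unchanged.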

Keywords: distillation learning; semantic segmentation; attention; convolutional neural network

Classification: TP391.41 (Automation and Computer Technology — Computer Application Technology)

 
