Lightweight SAR Ship Recognition Based on a Collaborative Compression Method

Authors: LI Xuanchao (李炫潮), HE Yonghua (何永华), ZHU Weigang (朱卫纲), LI Yonggang (李永刚), QIU Linlin (邱琳琳) (Space Engineering University, Beijing 101000, China)

Affiliation: [1] Space Engineering University, Beijing 101000, China

Source: Electronics Optics & Control (《电光与控制》), 2025, No. 2, pp. 103-110 (8 pages)

Abstract: Deep learning provides new methods for SAR ship recognition, but most current deep learning models have so many parameters that they are difficult to run in resource-constrained environments, so model compression is a prerequisite for deployment. Pruning is a commonly used model compression method, but when a deep neural network is pruned too heavily, its accuracy drops significantly and traditional fine-tuning cannot restore it to a high level. To address this, taking SAR ships as the research object, a network compression method that combines model pruning with knowledge distillation is proposed. First, the overall architecture of the collaborative compression method is defined: the fine-tuning step in model pruning is replaced with knowledge distillation to improve the accuracy of the pruned network. Then, a student self-reflection mechanism is introduced into the traditional knowledge distillation method to further improve the performance of the target network. Experimental results show that the pruned network restored by the proposed knowledge distillation method performs better, and the collaboratively compressed model reaches the performance level of mainstream lightweight networks.
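The abstract's central step, replacing the fine-tuning stage of pruning with knowledge distillation so that the original unpruned network teaches the pruned one, can be illustrated with standard temperature-scaled distillation. The sketch below is a minimal PyTorch illustration under that assumption; the function and parameter names (distillation_loss, distill_step, T, alpha) are hypothetical, and the paper's student self-reflection mechanism is not reproduced, since its exact formulation is not given in this record.

```python
# Minimal sketch: restoring a pruned network with knowledge distillation
# instead of plain fine-tuning. Assumes standard temperature-scaled
# distillation (soft KL term plus hard cross-entropy term); this is not
# the authors' implementation, and their student self-reflection
# mechanism is omitted.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Combine softened teacher guidance with the hard-label loss."""
    # Soft targets: KL divergence between temperature-scaled distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so the soft term's gradients match the hard term
    # Hard targets: ordinary cross-entropy on the SAR ship class labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def distill_step(pruned_student, teacher, batch, optimizer):
    """One training step: the unpruned teacher guides the pruned student."""
    images, labels = batch
    with torch.no_grad():                    # teacher is frozen
        teacher_logits = teacher(images)
    student_logits = pruned_student(images)  # pruned network being restored
    loss = distillation_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Running distill_step over the training set in place of the usual fine-tuning loop is the substitution the abstract describes: the pruned network recovers accuracy from the teacher's softened outputs rather than from hard labels alone.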

Keywords: SAR; ship recognition; collaborative compression; model pruning; knowledge distillation; student self-reflection

Classification: TP273 [Automation and Computer Technology / Detection Technology and Automatic Equipment]

 
