Authors: FAN Shenwei, LI Guoping, WANG Guozhong (School of Electronic and Electrical Engineering, Shanghai University of Engineering Science, Shanghai 201620, China)
Affiliation: [1] School of Electronic and Electrical Engineering, Shanghai University of Engineering Science, Shanghai 201620, China
Source: Journal of Shanghai University (Natural Science Edition), 2024, No. 3, pp. 466-475 (10 pages)
Funding: National Key R&D Program of China (2019YFB1802700).
Abstract: The transformation modules of deep learning-based image compression algorithms suffer from complex architectures and heavy computation. To speed up encoding and decoding, a method is proposed that uses knowledge distillation to reduce the number of parameters and multiply-accumulate operations (MACs) of the original network while preserving the quality of the compressed images as much as possible. The original network and a lightweight network are trained simultaneously, and the performance of the lightweight network is improved by transferring feature information from the original network to it. In the design of the lightweight network, group convolution is introduced alongside a reduced channel count, so that more feature information is retained while parameters and MACs are cut as far as possible. Experiments on the Kodak and DIV2K test datasets show that, compared with the original network, the lightweight network obtained through knowledge distillation reduces parameters and MACs to roughly one sixteenth while still maintaining good image quality.
CLC Number: TP37 [Automation and Computer Technology - Computer System Architecture]
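
The code below is a minimal sketch of the two techniques the abstract names: a grouped-convolution analysis transform with a reduced channel count, and a feature-matching distillation loss between the original (teacher) and lightweight (student) networks. The names LightweightAnalysis and feature_distillation_loss, all layer sizes, and the L2 form of the loss are illustrative assumptions, not the paper's actual architecture or objective.

```python
# A minimal PyTorch sketch; names, layer sizes, and the loss form are
# illustrative assumptions, not the paper's actual design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LightweightAnalysis(nn.Module):
    """Lightweight analysis transform: fewer channels plus grouped
    convolutions, cutting parameters and MACs as the abstract describes."""
    def __init__(self, in_ch=3, mid_ch=48, out_ch=192, groups=4):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Conv2d(in_ch, mid_ch, 5, stride=2, padding=2),
            nn.ReLU(inplace=True),
            # Grouped convolution: each group convolves mid_ch // groups
            # channels, shrinking the weight count by roughly `groups`.
            nn.Conv2d(mid_ch, mid_ch, 5, stride=2, padding=2, groups=groups),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid_ch, out_ch, 5, stride=2, padding=2, groups=groups),
        )

    def forward(self, x):
        return self.layers(x)

def feature_distillation_loss(student_feats, teacher_feats):
    """L2 match between intermediate features of the lightweight (student)
    network and the original (teacher) network; shapes assumed equal."""
    return sum(F.mse_loss(s, t.detach())
               for s, t in zip(student_feats, teacher_feats))
```

In training, such a distillation term would typically be weighted and added to the usual rate-distortion objective while both networks are trained simultaneously, matching the procedure the abstract outlines; if student and teacher features differ in shape, a 1x1 projection would be needed before taking the loss.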