LAYERWISE PRUNING STRATEGY FOR ROBUST NETWORKS BASED ON SPARSITY SENSITIVITY


Authors: Li Ping [1]; Yuan Xiaotong (School of Automation, Nanjing University of Information Science and Technology, Nanjing 210044, Jiangsu, China; Jiangsu Key Laboratory of Big Data Analysis Technology, Collaborative Innovation Center of Atmospheric Environment and Equipment Technology, Nanjing 210044, Jiangsu, China)

Affiliations: [1] School of Automation, Nanjing University of Information Science and Technology, Nanjing 210044, Jiangsu, China; [2] Jiangsu Key Laboratory of Big Data Analysis Technology, Collaborative Innovation Center of Atmospheric Environment and Equipment Technology, Nanjing 210044, Jiangsu, China

Source: Computer Applications and Software, 2023, No. 5, pp. 200-206 (7 pages)

Funding: National Major Project on New Generation Artificial Intelligence (2018AAA0100400); National Natural Science Foundation of China (61876090, 61936005).

Abstract: Deep neural networks are vulnerable to carefully crafted adversarial examples. Although adversarial training based on min-max optimization can improve the robustness of a network, it requires a model of larger capacity, with more parameters, than natural training. To obtain a network model that is both highly robust and highly sparse, this paper approaches the problem from the perspective of model compression: it experimentally analyzes the relationship between model accuracy, robustness, and sparsity, and, based on the sparsity-sensitivity characteristics of robust networks, proposes an unstructured pruning algorithm for robust networks. White-box attack experiments on the MNIST and CIFAR-10 datasets show that the algorithm maintains high model accuracy and high robustness even at large pruning rates. Under black-box attacks, the robust accuracy of the sparse models produced by the algorithm can even exceed that of the unpruned models.
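The min-max optimization underlying adversarial training, as referenced in the abstract, is conventionally written as the following saddle-point problem (this is the standard formulation from the adversarial-training literature; the paper's exact perturbation model and loss are not specified on this page):

```latex
\min_{\theta} \; \mathbb{E}_{(x,y)\sim\mathcal{D}}
  \left[ \max_{\|\delta\|_{\infty} \le \epsilon}
         \mathcal{L}\bigl(f_{\theta}(x+\delta),\, y\bigr) \right]
```

The inner maximization finds a worst-case perturbation $\delta$ within an $\epsilon$-ball around each input, and the outer minimization trains the parameters $\theta$ against those worst-case inputs.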
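As a minimal sketch of the unstructured-pruning family the abstract's algorithm belongs to, the fragment below zeroes individual weights by magnitude. The function names, the flat weight list, and the magnitude criterion are illustrative assumptions; the paper's actual sensitivity-based pruning criterion is not reproduced on this page.

```python
def prune_unstructured(weights, prune_rate):
    """Zero out the prune_rate fraction of weights with the smallest
    magnitude. Unlike structured pruning, individual weights are
    removed rather than whole filters or channels."""
    k = int(len(weights) * prune_rate)  # number of weights to zero
    # Indices of the k smallest-magnitude weights.
    smallest = sorted(range(len(weights)), key=lambda i: abs(weights[i]))[:k]
    pruned = list(weights)
    for i in smallest:
        pruned[i] = 0.0
    return pruned


def sparsity(weights):
    """Fraction of weights that are exactly zero."""
    return sum(1 for w in weights if w == 0.0) / len(weights)
```

For example, `prune_unstructured([0.5, -0.1, 2.0, 0.05], 0.5)` zeroes the two smallest-magnitude entries (-0.1 and 0.05), yielding a model with 50% sparsity.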

Keywords: robustness; adversarial training; unstructured pruning; sparsity sensitivity

Classification: TP3 [Automation and Computer Technology — Computer Science and Technology]
