Authors: Li Ping[1]; Yuan Xiaotong[2]
Affiliations: [1] School of Automation, Nanjing University of Information Science and Technology, Nanjing 210044, Jiangsu, China; [2] Jiangsu Key Laboratory of Big Data Analysis Technology, Collaborative Innovation Center of Atmospheric Environment and Equipment Technology, Nanjing 210044, Jiangsu, China
Source: Computer Applications and Software (《计算机应用与软件》), 2023, No. 5, pp. 200-206 (7 pages)
Funding: National Major Project of New Generation Artificial Intelligence (2018AAA0100400); National Natural Science Foundation of China (61876090, 61936005).
Abstract: Deep neural networks are vulnerable to carefully crafted adversarial examples. Although adversarial training based on min-max optimization improves network robustness, it requires models with larger capacity and more parameters than natural training. To obtain a network model that is both highly robust and highly sparse, this paper analyzes, from a model-compression perspective, the relationship between model accuracy, robustness, and sparsity through experiments, and, based on the sparsity-sensitivity characteristics of robust networks, proposes a sparsity-sensitivity-based unstructured pruning algorithm for robust networks. White-box attack experiments on the MNIST and CIFAR-10 datasets show that the algorithm maintains high model accuracy and high robustness even at large pruning rates. Under black-box attacks, the robust accuracy of sparse models produced by the algorithm can even exceed that of the unpruned model.
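The abstract does not specify how the paper's sparsity-sensitivity criterion is computed, but the general operation of unstructured pruning it builds on can be illustrated with a minimal sketch: zero out a given fraction of the weights with the smallest magnitudes, leaving the tensor shape intact. The function name `magnitude_prune` and the use of a simple magnitude threshold are assumptions for illustration only, not the paper's method.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, prune_rate: float) -> np.ndarray:
    """Return a copy of `weights` with the smallest-magnitude
    fraction `prune_rate` of entries set to zero (unstructured pruning)."""
    flat = np.abs(weights).ravel()
    k = int(prune_rate * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

# Example: prune 50% of a small weight matrix
w = np.array([[0.1, -0.5], [0.9, 0.05]])
pruned = magnitude_prune(w, 0.5)  # keeps -0.5 and 0.9, zeros the rest
```

In an adversarially trained network, such a mask would typically be applied layer by layer, with the per-layer pruning rate chosen according to how sensitive robust accuracy is to sparsity in that layer.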
Classification: TP3 [Automation and Computer Technology — Computer Science and Technology]