A deep learning aided differential distinguisher improvement framework with more lightweight and universality  


Authors: JiaShuo Liu, JiongJiong Ren, ShaoZhen Chen

Affiliation: [1] Information Engineering University, Zhengzhou, People's Republic of China

Source: Cybersecurity (网络空间安全科学与技术), 2024, Issue 4, pp. 36-51 (16 pages)

Funding: Supported by the National Natural Science Foundation of China [Grant No. 62206312].

Abstract: At CRYPTO 2019, Gohr opened up a new direction for cryptanalysis by successfully applying deep learning to differential cryptanalysis of the NSA block cipher SPECK32/64, achieving higher accuracy than traditional differential distinguishers. Since then, one of the mainstream research directions has been to increase the training sample size and employ different neural networks to improve the accuracy of neural distinguishers. This mindset can lead to a huge number of parameters, a heavy computing load, and high memory consumption during distinguisher training. In the practical application of cryptanalysis, however, the applicability of an attack in resource-constrained environments is very important. We therefore focus on cost optimization and aim to reduce the network parameters of differential neural cryptanalysis. In this paper, we propose two cost-optimized neural distinguisher improvements, addressing the data format and the network structure, respectively. First, we obtain a partial output difference neural distinguisher that uses only a 4-bit training data format, constructed with a new advantage-bit search algorithm based on two key improvement conditions. We also perform an interpretability analysis of the new neural distinguishers, whose results mainly concern the relationship among the neural distinguishers, truncated differentials, and advantage bits. Second, we replace the traditional convolution with depthwise separable convolution to reduce the training cost while affecting accuracy as little as possible. Overall, training neural distinguishers with our new network structure reduces the number of training parameters by less than 50%. Finally, we apply this network structure to the partial output difference neural distinguishers; the combined approach leads to a further reduction in the number of parameters, to approximately 30% of that of Gohr's distinguishers for SPECK.
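To illustrate the second optimization, the snippet below is a minimal sketch (not the authors' code) contrasting the parameter count of a standard 1D convolution with that of a depthwise separable 1D convolution in Keras. The 32 filters and kernel size 3 follow Gohr's published SPECK32/64 network; the input shape and the single-layer arrangement are illustrative assumptions only.

```python
# Minimal sketch: parameter count of a standard Conv1D block versus a
# depthwise separable Conv1D block of the same width. Filter count (32) and
# kernel size (3) follow Gohr's distinguisher; the input shape (16 positions,
# 4 channels, i.e. a ciphertext pair split into four 16-bit words) is an
# illustrative assumption, not taken from the paper.
import tensorflow as tf


def conv_block(separable: bool) -> tf.keras.Model:
    """Build a one-layer convolutional block, standard or depthwise separable."""
    conv = tf.keras.layers.SeparableConv1D if separable else tf.keras.layers.Conv1D
    inputs = tf.keras.Input(shape=(16, 4))
    outputs = conv(32, kernel_size=3, padding="same", activation="relu")(inputs)
    return tf.keras.Model(inputs, outputs)


standard = conv_block(separable=False)
depthwise = conv_block(separable=True)
print("standard Conv1D parameters:           ", standard.count_params())
print("depthwise separable Conv1D parameters:", depthwise.count_params())
```

A depthwise separable layer factors the convolution into a per-channel (depthwise) filter followed by a pointwise 1x1 mixing step, which is where the parameter saving comes from; the saving grows with the number of input channels, so the effect is larger in the deeper 32-to-32-filter layers of Gohr-style networks.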

Keywords: Deep learning; Block cipher; Neural distinguisher; Depthwise separable convolution; SPECK

CLC Number: TP181 [Automation and Computer Technology / Control Theory and Control Engineering]

 
