Deterministic conversion rule for CNNs to efficient spiking convolutional neural networks (Cited by: 2)


Authors: Xu YANG, Zhongxing ZHANG, Wenping ZHU, Shuangming YU, Liyuan LIU, Nanjian WU

Affiliations: [1] State Key Laboratory of Superlattices and Microstructures, Institute of Semiconductors, Chinese Academy of Sciences, Beijing 100083, China; [2] Center of Materials Science and Optoelectronics Engineering, University of Chinese Academy of Sciences, Beijing 100049, China; [3] Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Beijing 100083, China

Source: Science China (Information Sciences), 2020, No. 2, pp. 196-214 (19 pages)

Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 61704167, 61434004); Beijing Municipal Science and Technology Project (Grant No. Z181100008918009); Youth Innovation Promotion Association Program, Chinese Academy of Sciences (Grant No. 2016107); Strategic Priority Research Program of the Chinese Academy of Sciences (Grant No. XDB32050200).

Abstract: This paper proposes a general conversion theory that reveals the relations between a convolutional neural network (CNN) and its spiking counterpart (spiking CNN), from structure to information processing. Based on this theory and the statistical features of the activation distribution in the CNN, we establish a deterministic conversion rule that converts CNNs into spiking CNNs with a definite conversion procedure and the optimal setting of all parameters. As part of the conversion rule, we propose a novel "n-scaling" weight mapping method that achieves high-accuracy, low-latency, and power-efficient object classification on hardware. For the first time, the minimum dynamic range of the spiking neuron's membrane potential is studied, which helps balance the trade-off between the representation range and the precision of the data type adopted by dedicated hardware running spiking CNNs. Simulation results demonstrate that the converted spiking CNNs perform well on the MNIST, SVHN, and CIFAR-10 datasets: the accuracy loss over the three datasets is no more than 0.4%, processing time is shortened by up to 39%, and the lower latency achieved by our conversion rule also reduces power consumption. Furthermore, noise-robustness experiments indicate that a spiking CNN inherits the robustness of its corresponding CNN.
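
The abstract describes the conversion pipeline only at a high level. The sketch below illustrates the general idea behind such conversions: scale the CNN weights using statistics of the activation distribution, then replace ReLU units with integrate-and-fire (IF) neurons driven by rate-coded inputs. It is a minimal, generic illustration; the percentile-based scaling, the layer shapes, and the Poisson-style input coding are assumptions made for demonstration, not the paper's "n-scaling" rule or its optimal parameter settings.

# Generic sketch of ReLU-CNN -> spiking-CNN conversion via
# activation-statistics-based weight scaling and IF neurons.
# NOTE: percentile_scale is a common stand-in, not the paper's method.
import numpy as np

def percentile_scale(activations, p=99.9):
    """Scale factor taken from one layer's recorded activation distribution."""
    return np.percentile(activations, p)

def convert_layer(weights, bias, prev_scale, this_scale):
    """Map CNN weights/bias so spike rates stay within the neuron's range."""
    w = weights * prev_scale / this_scale
    b = bias / this_scale
    return w, b

class IFNeuron:
    """Integrate-and-fire neuron: accumulate input, spike and reset on threshold."""
    def __init__(self, shape, threshold=1.0):
        self.v = np.zeros(shape)            # membrane potential
        self.threshold = threshold

    def step(self, input_current):
        self.v += input_current
        spikes = (self.v >= self.threshold).astype(float)
        self.v -= spikes * self.threshold   # reset by subtraction
        return spikes

# Toy usage: one fully connected layer driven by a rate-coded input.
rng = np.random.default_rng(0)
w = rng.normal(size=(10, 784)) * 0.05
b = np.zeros(10)
train_acts = np.abs(rng.normal(size=10000))          # stand-in activation samples
w, b = convert_layer(w, b, prev_scale=1.0,
                     this_scale=percentile_scale(train_acts))

neuron = IFNeuron(shape=10)
x = rng.random(784)                                  # pixel intensities in [0, 1]
counts = np.zeros(10)
for t in range(100):                                 # simulation time steps
    spikes_in = (rng.random(784) < x).astype(float)  # Poisson-like rate coding
    counts += neuron.step(w @ spikes_in + b)
print("output spike counts:", counts)

Classification would then be read out from the output spike counts; in practice the trade-off the abstract mentions (latency versus accuracy) shows up in how many time steps such a simulation is run for and in the dynamic range needed to store the membrane potential.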

Keywords: convolutional neural networks (CNN); spiking neural networks (SNN); image classification; conversion rule; noise robustness; neuromorphic hardware

Classification code: TP183 [Automation and Computer Technology: Control Theory and Control Engineering]

 
