Convolutional Neural Network Accelerator Based on Computing-in-Memory

Authors: LU Yingying, SUN Xiangyu, JI Weiliang, XING Zhanqiang (1. Institute of Electronic Engineering, China Academy of Engineering Physics, Mianyang, Sichuan 621999, China; 2. Microsystem & Terahertz Research Center, China Academy of Engineering Physics, Chengdu, Sichuan 610200, China; 3. Graduate School of China Academy of Engineering Physics, Beijing 100088, China)


Source: Journal of Terahertz Science and Electronic Information Technology, 2025, No. 2, pp. 170-174.

Abstract: Convolutional Neural Network (CNN) implementations on the conventional von Neumann architecture struggle to meet the combined demands of high performance and low power consumption. This paper presents a CNN accelerator built on a computing-in-memory architecture. A Resistive Random Access Memory (RRAM) array realizes the in-memory computation, and an efficient data input pipeline together with dedicated hardware processing units handles large batches of image data, achieving high-performance digital image recognition. Simulation results show that the accelerator delivers faster computation, reaching a clock frequency of 100 MHz; in addition, the synthesized area of the structure is 300,742 μm², 56.6% of that of the conventional design method. The acceleration module substantially increases the throughput and reduces the energy consumption of the CNN accelerator, and the simulation results offer guidance and a reference for the design of high-performance neural network accelerators.
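The core idea behind an RRAM-based computing-in-memory accelerator can be illustrated with a minimal numerical sketch (not taken from the paper; all names here are illustrative). An idealized crossbar stores weights as cell conductances; applying input voltages on the word lines yields, by Ohm's law and Kirchhoff's current law, bit-line currents equal to the weighted sums, so a convolution's multiply-accumulate happens inside the memory array in one analog step rather than through repeated memory fetches:

```python
import numpy as np

def crossbar_mvm(conductances: np.ndarray, voltages: np.ndarray) -> np.ndarray:
    """Idealized in-memory matrix-vector multiply on an RRAM crossbar.

    Each row of `conductances` is one bit line; the returned currents are
    I[i] = sum_j G[i][j] * V[j], i.e. one multiply-accumulate per row.
    (Real cells cannot hold negative conductance; hardware typically uses
    differential cell pairs, which this sketch abstracts away.)
    """
    return conductances @ voltages

# A 3x3 convolution kernel flattened into one crossbar row: the convolution
# at each output pixel reduces to a single analog MVM instead of nine
# separate memory reads and digital multiplies.
kernel = np.array([[1.0, 0.0, -1.0,
                    2.0, 0.0, -2.0,
                    1.0, 0.0, -1.0]])      # Sobel-x weights as conductances
patch = np.arange(9, dtype=float)          # a flattened 3x3 image patch

print(crossbar_mvm(kernel, patch))
```

This abstraction also makes clear why the architecture saves energy: the weights never move, so the dominant von Neumann cost of shuttling operands between memory and compute units is eliminated.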

Keywords: computing in memory; Convolutional Neural Network (CNN); accelerator; input pipeline; processing unit

CLC number: TN79 (Electronics and Telecommunications: Circuits and Systems)

 
