Authors: Miao Li, Feng Zhang, Cuiting Zhang
Source: Machine Intelligence Research, 2024, No. 6, pp. 1192-1200 (9 pages)
Abstract: Quantization is an active research topic for lightweight, edge-deployed convolutional neural networks (CNNs). Usually, the activation and weight bit-widths differ between layers to preserve CNN accuracy, which means dedicated hardware must be designed for specific layers. In this work, we explore a unified quantization method with extremely low-bit quantized weights for all layers. We use thermometer coding to convert the 8-bit RGB input images to the same bit-width as the activations of the middle layers. For quantizing the results of the last layer, we propose a branch convolution quantization (BCQ) method. Together with the extremely low-bit quantization of the weights, deploying the network on circuits becomes simpler than in other works and consistent across all layers, including the first and the last. Taking tiny_yolo_v3 and yolo_v3 on the VOC and COCO datasets as examples, we verify the feasibility of thermometer coding on input images and branch convolution quantization on output results. Finally, tiny_yolo_v3 is deployed on an FPGA, further demonstrating the high performance of the proposed algorithm on hardware.
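To illustrate the thermometer-coding step mentioned in the abstract, the sketch below shows the generic technique: an 8-bit intensity is expanded into a vector of binary channels, where channel i fires if the value exceeds the i-th evenly spaced threshold. This is a minimal illustration of thermometer coding in general; the exact number of levels and threshold placement used in the paper are not specified here, so `levels` and the threshold formula are assumptions.

```python
def thermometer_encode(value, levels=15, max_value=255):
    """Thermometer-code an 8-bit intensity into `levels` binary channels.

    Channel i is 1 if `value` exceeds the i-th evenly spaced threshold,
    so larger values light up a longer prefix of 1s, like mercury rising
    in a thermometer. `levels` would be chosen to match the activation
    bit-width of the middle layers (an assumption for this sketch).
    """
    thresholds = [(i + 1) * max_value / (levels + 1) for i in range(levels)]
    return [1 if value > t else 0 for t in thresholds]

# A mid-range pixel lights up roughly the first half of the channels.
print(thermometer_encode(128, levels=7))  # → [1, 1, 1, 1, 0, 0, 0]
```

Applied per pixel and per color channel, this turns an 8-bit RGB image into a stack of binary maps with the same bit-width as the quantized middle-layer activations, so the first layer needs no special-case hardware.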
Keywords: branch convolution quantization, thermometer coding, extremely low-bit quantization, hardware deployment, object detection