Point cloud classification based on Ghost convolution and adaptive attention


Authors: SHU Mi; WANG Zhangang [1] (School of Information and Communication Engineering, Beijing Information Science and Technology University, Beijing 100011, China)

Affiliation: [1] School of Information and Communication Engineering, Beijing Information Science and Technology University, Beijing 100011, China

Source: Modern Electronics Technique, 2025, No. 6, pp. 106-112 (7 pages)

Abstract: The point cloud Transformer network exhibits remarkable feature learning capability by extracting local features of three-dimensional point clouds and employing a multi-level self-attention mechanism. However, the multi-level self-attention layers place heavy demands on computation and memory, and the distinctiveness and correlation between levels and between channels are not fully considered during feature fusion. To address these problems, a lightweight point cloud Transformer classification network with enhanced feature fusion (EFF-LPCT) is proposed. EFF-LPCT reconstructs the original network with one-dimensional Ghost convolution to reduce computational complexity and memory requirements, introduces adaptive branch weights to realize multi-scale feature fusion across attention levels, and uses multiple channel attention modules to enhance channel interaction information, thereby improving classification performance. Experimental results on the ModelNet40 dataset show that EFF-LPCT reaches a high accuracy of 93.3% while reducing floating-point computation by 1.11 GFLOPs and the parameter count by 0.86×10^6 compared with the point cloud Transformer.
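The abstract names two lightweight components but this page carries no implementation details, so the following PyTorch-style sketch is only a plausible reading of them: a one-dimensional Ghost convolution that generates part of its output channels with a cheap depthwise convolution, and an adaptive branch weighting that fuses multi-level attention outputs with learnable, softmax-normalized scalars. All names (GhostConv1d, AdaptiveBranchFusion, ratio, dw_kernel, num_branches) and design choices are illustrative assumptions, not the authors' EFF-LPCT code.

```python
# Hedged sketch only: a plausible 1-D Ghost convolution and adaptive branch
# fusion for point features, NOT the authors' EFF-LPCT implementation.
import torch
import torch.nn as nn


class GhostConv1d(nn.Module):
    """Produce out_ch channels, generating a share of them with a cheap depthwise conv."""

    def __init__(self, in_ch, out_ch, ratio=2, dw_kernel=3):
        super().__init__()
        primary_ch = out_ch // ratio            # channels from the ordinary 1x1 conv
        cheap_ch = out_ch - primary_ch          # "ghost" channels from the cheap op
        self.primary = nn.Sequential(
            nn.Conv1d(in_ch, primary_ch, kernel_size=1, bias=False),
            nn.BatchNorm1d(primary_ch),
            nn.ReLU(inplace=True),
        )
        # Depthwise conv: each primary channel cheaply spawns one ghost channel
        # (assumes out_ch is even so cheap_ch == primary_ch).
        self.cheap = nn.Sequential(
            nn.Conv1d(primary_ch, cheap_ch, kernel_size=dw_kernel,
                      padding=dw_kernel // 2, groups=primary_ch, bias=False),
            nn.BatchNorm1d(cheap_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):                        # x: (B, C_in, N) point features
        y = self.primary(x)
        return torch.cat([y, self.cheap(y)], dim=1)


class AdaptiveBranchFusion(nn.Module):
    """Fuse the outputs of several attention levels with learnable softmax weights."""

    def __init__(self, num_branches=4):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(num_branches))

    def forward(self, feats):                    # feats: list of (B, C, N) tensors
        w = torch.softmax(self.logits, dim=0)    # normalized per-branch weights
        return torch.cat([w[i] * f for i, f in enumerate(feats)], dim=1)


if __name__ == "__main__":
    x = torch.randn(2, 64, 1024)                 # batch of 1024-point features
    ghost = GhostConv1d(64, 128)
    fuse = AdaptiveBranchFusion(num_branches=4)
    feats = [torch.randn(2, 128, 1024) for _ in range(4)]
    print(ghost(x).shape, fuse(feats).shape)     # (2, 128, 1024) (2, 512, 1024)
```

The Ghost design trades part of the dense 1×1 convolution for a grouped (depthwise) one, which is where the reported GFLOPs and parameter savings would come from; the softmax-normalized branch weights let the network learn how much each attention level contributes before concatenation.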

Keywords: point cloud classification; Transformer network; Ghost convolution; feature enhancement fusion module; ECA channel attention; feature learning
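One of the keywords is ECA channel attention. For orientation only, below is a minimal sketch of the published ECA module (Efficient Channel Attention, Wang et al., CVPR 2020) adapted to 1-D point-feature tensors; the adaptive kernel-size heuristic follows the ECA paper, and how EFF-LPCT actually instantiates or places these modules is not stated on this page.

```python
# Minimal ECA-style channel attention for (B, C, N) point features -- a generic
# sketch of the published ECA module, not the paper's exact configuration.
import math
import torch
import torch.nn as nn


class ECA1d(nn.Module):
    def __init__(self, channels, gamma=2, b=1):
        super().__init__()
        t = int(abs((math.log2(channels) + b) / gamma))
        k = t if t % 2 else t + 1                # kernel size must be odd
        self.pool = nn.AdaptiveAvgPool1d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):                        # x: (B, C, N)
        y = self.pool(x)                         # (B, C, 1) per-channel descriptor
        y = self.conv(y.transpose(1, 2))         # 1-D conv across the channel axis
        y = self.sigmoid(y.transpose(1, 2))      # (B, C, 1) channel weights
        return x * y                             # reweight channels, broadcast over N
```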

CLC numbers: TN249-34 [Electronics and Telecommunications: Physical Electronics]; TP391 [Automation and Computer Technology: Computer Application Technology]

 
