Sparse reconstruction of CT images based on Uformer with fused channel attention (Cited by: 2)

Authors: CHEN Mengmeng; QIAO Zhiwei[1] (School of Computer and Information Technology, Shanxi University, Taiyuan, Shanxi 030006, China)

Affiliation: [1] School of Computer and Information Technology, Shanxi University, Taiyuan 030006, China

Source: Journal of Computer Applications, 2023, No. 9, pp. 2948-2954 (7 pages)

Funding: National Natural Science Foundation of China (62071281).

Abstract: To address the streak artifacts produced by analytic sparse-view reconstruction, a Channel Attention U-shaped Transformer (CA-Uformer) is proposed to achieve high-precision sparse reconstruction in Computed Tomography (CT). CA-Uformer fuses channel attention with the spatial attention of the Transformer; this dual-attention mechanism makes image detail information easier for the network to learn. It adopts a U-shaped architecture to fuse multi-scale image information, and implements the feed-forward network with convolutional operations, further coupling the local information association ability of Convolutional Neural Networks (CNNs) with the global information capturing ability of the Transformer. Experimental results show that, compared with the classical U-Net, CA-Uformer improves Peak Signal-to-Noise Ratio (PSNR) by 3.27 dB and Structural Similarity (SSIM) by 3.14%, and reduces Root Mean Square Error (RMSE) by 35.29%, a marked improvement. CA-Uformer therefore reconstructs sparse-view CT images with higher accuracy and suppresses artifacts more effectively.
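The channel-attention branch described in the abstract can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's implementation: it assumes the module follows the common squeeze-and-excitation pattern (per-channel global average pooling, a two-layer bottleneck, and a sigmoid gate that rescales channels); all function and variable names here are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    """SE-style channel attention on a (C, H, W) feature map.

    Squeeze: global average pooling collapses each channel to a scalar.
    Excitation: a two-layer bottleneck (ReLU, then sigmoid) produces a
    per-channel gate in (0, 1) that rescales the input channels, letting
    the network emphasize informative channels and suppress others.
    """
    squeeze = x.mean(axis=(1, 2))            # (C,) channel descriptors
    hidden = np.maximum(0.0, w1 @ squeeze)   # (C//r,) bottleneck, ReLU
    gate = sigmoid(w2 @ hidden)              # (C,) gate values in (0, 1)
    return x * gate[:, None, None]           # rescale each channel

# Toy example: 8 channels, reduction ratio r = 2 (weights are random
# stand-ins for learned parameters).
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16, 16))
w1 = rng.standard_normal((4, 8)) * 0.1
w2 = rng.standard_normal((8, 4)) * 0.1
y = channel_attention(x, w1, w2)
print(y.shape)  # same shape as the input: (8, 16, 16)
```

In CA-Uformer this channel gating is fused with the Transformer's window-based spatial attention, so each token is reweighted both across spatial positions and across feature channels.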

Keywords: computed tomography; sparse reconstruction; streak artifacts; Transformer; channel attention

Classification: TP391.41 [Automation and Computer Technology - Computer Application Technology]

 
