Research on predistortion of 5G power amplifier based on Transformer model


Authors: Wang Jingyi; Chen Jinghao; Xu Gaoming[1] (Faculty of Electrical Engineering and Computer Science, Ningbo University, Ningbo 315211)

Affiliation: [1] Faculty of Electrical Engineering and Computer Science, Ningbo University, Ningbo 315211

Source: Wireless Communication Technology, 2024, No. 4, pp. 51-54, 62 (5 pages)

Abstract: In recent years, the Transformer model, built on the self-attention mechanism, has been developed and applied to natural language processing (NLP) and has since migrated to computer vision and machine translation. The ability of Transformer models to capture long-range dependencies and interactions makes them well suited to time-series analysis. To compensate for the nonlinear distortion and memory effects of power amplifiers, this paper proposes a digital predistortion algorithm that uses a deep-learning Transformer model for nonlinear modeling of RF power amplifiers. The model's long-range dependency capture and interaction capabilities allow it to characterize the strong nonlinear distortion and memory effects of power amplifiers well. To verify the model's modeling performance and linearization effect, it was compared with popular digital predistorter models. The experimental results show that, relative to the FFNN and LSTM models, modeling accuracy improves by about 2.1 dB while the number of model parameters is reduced by about 21%.
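The abstract attributes the model's ability to represent PA memory effects to self-attention, which lets each output sample weight all past input samples. The following is a minimal numpy sketch of single-head scaled dot-product self-attention over a sequence of PA input features, not the paper's implementation; the sequence length, feature dimension, and weight matrices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def self_attention(x, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.
    x: (T, d) sequence of PA input features (e.g. I/Q samples and envelope history).
    Returns (T, d) memory-aware features: each time step is a weighted mix of all steps."""
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])        # (T, T) pairwise sample interactions
    # Row-wise softmax: attention weights over the whole sequence sum to 1 per step,
    # which is what lets one output sample depend on distant past inputs (memory effect).
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ V

T, d = 8, 4                                       # 8 time steps, 4 features per step (assumed)
x = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
y = self_attention(x, Wq, Wk, Wv)
print(y.shape)  # (8, 4)
```

In a full predistorter this block would be stacked with feed-forward layers and trained on measured PA input/output pairs; the sketch only shows the attention operation that distinguishes the Transformer from the FFNN and LSTM baselines.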

Keywords: Transformer; power amplifier; memory effect; deep learning

Classification: TN92 [Electronics and Telecommunications: Communication and Information Systems]

 
