Exploring Polarizing Political Discourse Among U.S. Congressional Members with Large Language Models


Authors: Cheng FU, Zili HUANG, Xiaoqiang CAI

Affiliations: [1] School of Integrated Circuit Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China; [2] School of Science and Engineering, The Chinese University of Hong Kong, Shenzhen 518172, China; [3] School of Data Science, The Chinese University of Hong Kong, Shenzhen 518172, China

Source: Journal of Systems Science and Information, 2025, Issue 1, pp. 102-115 (14 pages)

Abstract: In July 2024, a shooting incident involving President Trump drew widespread public attention, highlighting the need for a deeper understanding of political rhetoric among U.S. Congressional members. This study analyzes discourse patterns on Twitter using large language models (LLMs), specifically ChatGPT and bidirectional encoder representations from transformers (BERT), to explore underlying factors that may contribute to polarizing political language. After collecting and preprocessing Twitter data, we initially labeled 20,000 tweets using ChatGPT and then used the BERT-large model to classify the remaining 980,000 tweets. The analysis identified party affiliation and geographic region as significant factors influencing political rhetoric. Republican lawmakers exhibited a higher prevalence of polarizing language, while New Jersey recorded the highest rate among the states. Newly elected Congressional members also tended to adopt more provocative language, potentially as a strategy to engage their voter base or to distinguish themselves in a competitive political environment. Temporal analysis revealed spikes in polarizing rhetoric corresponding to events such as discussions of the new fiscal year budget. This study offers insights into the dynamics of political discourse, providing a foundation for promoting constructive dialogue and fostering institutional resilience.

Keywords: U.S. Congressional members; polarizing discourse; Twitter; large language models

Classification Code: H31 [Linguistics: English]
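The abstract describes a two-stage annotation pipeline: an expensive labeler (ChatGPT) labels a small seed set, then a cheaper supervised model (BERT-large) scales those labels to the full corpus. The sketch below illustrates that structure in miniature only; the keyword heuristic stands in for the LLM labeler, a pure-Python Naive Bayes stands in for BERT-large, and all tweet texts and the `polarizing_terms` set are invented for illustration, not taken from the paper's data.

```python
# Sketch of a two-stage weak-supervision pipeline (stand-ins, not the paper's code).
# Stage 1: an expensive labeler annotates a small seed set (ChatGPT in the paper).
# Stage 2: a cheap classifier trained on the seed labels covers the rest (BERT in the paper).
import math
from collections import Counter

def seed_labeler(tweet):
    """Stand-in for the LLM labeler: 1 = polarizing, 0 = neutral."""
    polarizing_terms = {"disgrace", "radical", "corrupt"}  # hypothetical lexicon
    return int(any(w in polarizing_terms for w in tweet.lower().split()))

class NaiveBayes:
    """Minimal unigram Naive Bayes with add-one smoothing."""
    def __init__(self):
        self.counts = {0: Counter(), 1: Counter()}
        self.docs = Counter()
    def fit(self, texts, labels):
        for text, y in zip(texts, labels):
            self.docs[y] += 1
            self.counts[y].update(text.lower().split())
    def predict(self, text):
        vocab = len(set(self.counts[0]) | set(self.counts[1]))
        scores = {}
        for y in (0, 1):
            total = sum(self.counts[y].values())
            score = math.log(self.docs[y] / sum(self.docs.values()))
            for w in text.lower().split():
                score += math.log((self.counts[y][w] + 1) / (total + vocab))
            scores[y] = score
        return max(scores, key=scores.get)

# Stage 1: label a small seed set with the expensive labeler.
seed = ["this bill is a disgrace", "happy to meet local students",
        "radical agenda hurts families", "great town hall today"]
seed_labels = [seed_labeler(t) for t in seed]

# Stage 2: train the cheap classifier on the seed and apply it to the unlabeled pool.
clf = NaiveBayes()
clf.fit(seed, seed_labels)
pool = ["a disgrace to this chamber", "enjoyed the community picnic"]
predictions = [clf.predict(t) for t in pool]  # → [1, 0]
```

The design point is cost: the first-stage labeler is called only 20,000 times in the paper, while the trained second-stage model handles the remaining 980,000 tweets cheaply.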

 
