A predictive human model of language challenges traditional views in linguistics and pretrained transformer research  


Author: Sergio Torres-Martínez

Affiliation: [1] Universidad de Antioquia, Cll. 67 # 53-108, Medellín, Antioquia, Colombia

Source: Language and Semiotic Studies, 2024, Issue 4, pp. 562-592 (31 pages)

Abstract: This paper introduces a theory of mind that positions language as a cognitive tool in its own right for the optimization of biological fitness. I argue that the human linguistic reconstruction of reality results from biological memory and adaptation to uncertain environmental conditions for the reaffirmation of the Self-as-symbol. I demonstrate that pretrained language models, such as ChatGPT, lack embodied grounding, which compromises their ability to adequately model the world through language owing to the absence of subjecthood and conscious states for event recognition and partition. At a deep level, I challenge the notion that the constitution of a semiotic Self relies on computational reflection, arguing against reducing human representation to data structures and emphasizing the importance of positing accurate models of human representation through language. This underscores the distinction between transformers as posthuman agents and humans as purposeful biological agents, and it highlights the human capacity for purposeful biological adjustment and optimization. One of the main conclusions is that the capacity to integrate information does not amount to phenomenal consciousness, as argued by Information Integration Theory. Moreover, while language models exhibit superior computational capacity, they lack the real consciousness that would provide them with multiscalar experience anchored in the physical world, a characteristic of human cognition. Nevertheless, the paper anticipates the emergence of new in silico conceptualizers capable of defining themselves as phenomenal agents with symbolic contours and specific goals.

Keywords: active inference; Agentive Cognitive Construction Grammar; ChatGPT; embodiment; essentialist concept formation; large language models

Classification: H31 [Languages and Scripts: English]

 
