Authors: Wenjun KE, Ziyu SHANG, Zhizhao LUO, Peng WANG, Yikai GUO, Qi LIU, Yuxuan CHEN
Affiliations: [1] School of Computer Science and Engineering, Southeast University, Nanjing 210096, China; [2] Key Laboratory of New Generation Artificial Intelligence Technology and Its Interdisciplinary Applications (Southeast University), Nanjing 210096, China; [3] Beijing Institute of Technology Zhuhai, Zhuhai 519088, China; [4] Beijing Institute of Computer Technology and Application, Beijing 100048, China
Source: Science China (Information Sciences), 2024, Issue 10, pp. 385-386 (2 pages)
Funding: Supported by the National Science Foundation of China (Grant No. 62376057).
Abstract: Large language models (LLMs) have demonstrated remarkable effectiveness across various natural language processing (NLP) tasks, as evidenced by recent studies [1, 2]. However, these models often produce responses that conflict with reality due to the unreliable distribution of facts within their training data, which is particularly critical for applications requiring high credibility and accuracy [3].