supported by the National Key R&D Program of China (No. 2023YFB3308601); the Sichuan Science and Technology Program (2024NSFJQ0035, 2024NSFSC0004); and the Talent Program of the Sichuan Provincial Party Committee Organization Department.
In natural language processing (NLP), managing multiple downstream tasks by fine-tuning pre-trained models often requires maintaining separate task-specific models, leading to practical inefficiencies. To address this...
supported by the National Natural Science Foundation of China (Grant Nos. 62276143, 62302233); the Natural Science Foundation of Jiangsu Province (Grant No. BK20231287); and the National Science Fund for Distinguished Young Scholars of China (Grant No. 62125203).
Large language models (LLMs) have recently shown remarkable performance on a variety of natural language processing (NLP) tasks. To further explore LLMs' reasoning abilities in solving complex problems, recent research [1–3...
supported by the National Key Research and Development Program of China (Grant No. 2022ZD0160403); the National Natural Science Foundation of China (Grant No. 62176178).
Driven by the expansion of foundation models and the increasing variety of downstream tasks, parameter-efficient fine-tuning (PEFT) methods have exhibited remarkable efficacy in the unimodal domain, effectively mitigatin...
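As a point of reference for the PEFT methods mentioned above, the sketch below shows one common parameter-efficient setup, LoRA via Hugging Face's peft library, in which only low-rank adapter matrices and the task head are trained while the pre-trained weights stay frozen. The model name, rank, target modules, and classification task are illustrative assumptions, not details taken from this abstract.

# Minimal PEFT sketch: LoRA adapters on a frozen pre-trained encoder.
# Model name, LoRA rank, and the sequence-classification task are
# illustrative assumptions, not details from the abstract.
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, TaskType, get_peft_model

base = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

lora_cfg = LoraConfig(
    task_type=TaskType.SEQ_CLS,         # keeps the classification head trainable
    r=8,                                # rank of the low-rank update
    lora_alpha=16,
    lora_dropout=0.1,
    target_modules=["query", "value"],  # attention projections in BERT
)

model = get_peft_model(base, lora_cfg)
model.print_trainable_parameters()      # only a small fraction of weights train

In this configuration the frozen backbone is shared across tasks, and each downstream task only stores its own small set of LoRA weights, which is the efficiency gain PEFT methods target.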
We present an approach to automatically classify medical text at the sentence level. Given the inherent complexity of medical text classification, we employ adapters based on pre-trained language models to extract informa...
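A minimal sketch of adapter-based sentence classification of the kind this abstract describes is given below, assuming a frozen pre-trained encoder with a single bottleneck adapter and a linear head; in full adapter tuning such bottlenecks are inserted inside every transformer layer, and the model name, bottleneck size, and label count here are assumptions for illustration only.

# Sketch of adapter-style sentence classification on a frozen encoder.
# Only the adapter and the classification head are trained.
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class BottleneckAdapter(nn.Module):
    """Down-project, nonlinearity, up-project, residual connection."""

    def __init__(self, hidden: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden, bottleneck)
        self.up = nn.Linear(bottleneck, hidden)
        self.act = nn.GELU()

    def forward(self, x):
        return x + self.up(self.act(self.down(x)))


class AdapterSentenceClassifier(nn.Module):
    def __init__(self, model_name: str = "bert-base-uncased", num_labels: int = 5):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        for p in self.encoder.parameters():       # freeze the pre-trained LM
            p.requires_grad = False
        hidden = self.encoder.config.hidden_size
        self.adapter = BottleneckAdapter(hidden)  # trainable adapter
        self.head = nn.Linear(hidden, num_labels) # trainable task head

    def forward(self, **inputs):
        cls = self.encoder(**inputs).last_hidden_state[:, 0]  # [CLS] vector
        return self.head(self.adapter(cls))


tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AdapterSentenceClassifier()
batch = tokenizer(["Patient denies chest pain."], return_tensors="pt")
logits = model(**batch)  # shape: (1, num_labels)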
supported by the National Natural Science Foundation of China (Grant Nos. 52293413 and 52076161).
Fault detection and diagnosis (FDD) of heating, ventilation, and air conditioning (HVAC) systems can help improve energy savings in building energy systems. However, most data-driven FDD models have limited gene...