Authors: LU Gang (陆刚), XIAO Jinmei (肖金梅), WANG Xiangwen (王向文), JIANG Yun (蒋芸), LIN Xianghong (蔺想红)
Affiliations: [1] College of Computer Science and Engineering, Northwest Normal University, Lanzhou 730070, Gansu, China; [2] Department of Radiology, Dingxi People's Hospital, Dingxi 743000, Gansu, China
Source: Computer Engineering & Science (《计算机工程与科学》), 2025, No. 2, pp. 308-316 (9 pages)
Funding: Young Teachers' Scientific Research Capability Promotion Program of Northwest Normal University (NWNU-LKQN2024-23); Natural Science Foundation of Gansu Province (24JRRA127).
Abstract: Existing deep learning models still cannot accurately and reliably locate anatomical landmarks on 2D cephalometric X-ray images. To address this issue, this paper proposes a localization model for cephalometric measurement based on appearance tokens and landmark tokens. First, fixed-size image patches at different resolutions are sampled from the original image. Next, these patches are fed into a feature extraction network to extract multi-scale features. The multi-scale features are then converted into appearance tokens through linear projection and, together with the landmark tokens, input into a relational reasoning layer, where the landmark tokens learn their intrinsic relationships with the appearance tokens. Finally, through multiple iterations of reasoning, the initial points move toward the targets step by step in a cascaded, coarse-to-fine manner. Compared with state-of-the-art baseline models, the proposed model demonstrates superior localization performance on public cephalometric X-ray images.
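The abstract only outlines the pipeline (multi-resolution patch sampling, appearance/landmark tokens, relational reasoning, cascaded refinement), so the following is a minimal PyTorch sketch of that flow. All module names, dimensions, the placeholder CNN backbone, the cross-attention reasoning layer, and the offset regression head are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of the described pipeline: sample fixed-size patches at
# several resolutions, extract features, project them into appearance tokens,
# let learnable landmark tokens attend to them, and iteratively refine the
# landmark coordinates in a cascade. Not the paper's real architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ReasoningLayer(nn.Module):
    """Landmark tokens query appearance tokens via cross-attention."""
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.mlp = nn.Sequential(nn.Linear(dim, dim * 2), nn.GELU(),
                                 nn.Linear(dim * 2, dim))
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, lm_tokens, app_tokens):
        out, _ = self.attn(self.norm1(lm_tokens), app_tokens, app_tokens)
        lm_tokens = lm_tokens + out
        return lm_tokens + self.mlp(self.norm2(lm_tokens))


class CascadedLandmarkLocalizer(nn.Module):
    def __init__(self, num_landmarks: int = 19, dim: int = 256,
                 patch_size: int = 64, scales=(1.0, 0.5, 0.25), iters: int = 3):
        super().__init__()
        self.patch_size, self.scales, self.iters = patch_size, scales, iters
        # Shared placeholder CNN applied to every sampled patch.
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        # Linear projection turning patch features into appearance tokens.
        self.to_app_token = nn.Linear(64, dim)
        # Learnable landmark tokens, one per anatomical landmark.
        self.lm_tokens = nn.Parameter(torch.randn(num_landmarks, dim))
        self.reasoning = ReasoningLayer(dim)
        # Regression head mapping each landmark token to a coordinate offset.
        self.to_offset = nn.Linear(dim, 2)

    def sample_patches(self, image, centers):
        """Crop fixed-size patches around the current landmark estimates at
        several resolutions (coarse context down to fine detail)."""
        patches = []
        for s in self.scales:
            size = int(self.patch_size / s)
            for cx, cy in centers.tolist():
                x0, y0 = int(cx) - size // 2, int(cy) - size // 2
                crop = image[..., max(y0, 0):y0 + size, max(x0, 0):x0 + size]
                crop = F.interpolate(crop, size=(self.patch_size,) * 2,
                                     mode="bilinear", align_corners=False)
                patches.append(crop)
        return torch.cat(patches, dim=0)            # (num_patches, 1, P, P)

    def forward(self, image, init_points):
        points = init_points.clone()                # (num_landmarks, 2)
        for _ in range(self.iters):                 # cascaded refinement
            feats = self.backbone(self.sample_patches(image, points))
            app = self.to_app_token(feats).unsqueeze(0)        # (1, N, dim)
            lm = self.reasoning(self.lm_tokens.unsqueeze(0), app)
            points = points + self.to_offset(lm).squeeze(0)    # move toward target
        return points


if __name__ == "__main__":
    model = CascadedLandmarkLocalizer()
    img = torch.randn(1, 1, 512, 512)               # dummy cephalogram
    init = torch.full((19, 2), 256.0)               # start at the image center
    print(model(img, init).shape)                   # torch.Size([19, 2])
```

In this sketch the "relational reasoning layer" is realized as a single cross-attention block and the cascade simply re-samples patches around the updated coordinates each iteration; the paper's actual backbone, token dimensions, and number of reasoning steps may differ.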
Classification: TP391.4 (Automation and Computer Technology: Computer Application Technology)