Authors: Wang Ling (王岭)[1,2], Chen Xi (陈曦)[2], Dong Feng (董峰)[1], Yang Guowen (杨国文)[2], Yu Daoyin (郁道银)[1]
Affiliations: [1] School of Electrical Engineering and Automation, Tianjin University, Tianjin 300072, China; [2] The Third Department, Tianjin Navigation Instrument Research Institute, Tianjin 300451, China
Source: Journal of Computer-Aided Design & Computer Graphics, 2016, No. 9, pp. 1506-1511 (6 pages)
Funding: National Natural Science Foundation of China (Grant No. 30500129)
Abstract: Reconstruction of a vascular model allows more intuitive and accurate diagnosis of coronary atherosclerosis, providing a direct reference for the early diagnosis of coronary artery disease. The vascular model is reconstructed by computing the position and orientation of an intravascular ultrasound (IVUS) image sequence on the guide wire. First, the inner and outer vessel membranes are segmented in the IVUS image sequence and the 3D vessel skeleton is reconstructed from X-ray coronary angiography; a spatial geometric transformation then positions each IVUS image on the vessel skeleton. Second, the relative orientation between successive guide-wire points is computed from the spatial relationship of adjacent points on the wire. Finally, the IVUS image sequence is oriented on the guide wire according to the spatial relationship between the guide wire and the vessel skeleton. Experimental results show that the mean error and standard deviation of the relative length in the fused vascular model data are 0.50 mm and 0.57 mm, respectively, and those of the relative angle are 6.03° and 7.86°, respectively, which meets the data-fusion accuracy required for vascular model reconstruction.
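The first step of the pipeline above places each IVUS cross-section at its correct position along the reconstructed 3D vessel skeleton. The paper does not give its implementation details, but the core geometric idea, distributing frames by arc length along the skeleton polyline and taking the local tangent as each frame's normal, can be sketched as follows. The function name, the equal-spacing assumption (a uniform pullback), and the use of the segment tangent are all illustrative assumptions, not the authors' exact method.

```python
import math

def position_frames_on_skeleton(skeleton, n_frames):
    """Place n_frames IVUS cross-sections at equal arc-length intervals
    along a 3D vessel-skeleton polyline.

    skeleton: list of (x, y, z) points defining the skeleton polyline.
    Returns a list of (position, tangent) pairs; the tangent is a unit
    vector that serves as the normal of the IVUS image plane.
    Assumes a uniform pullback, i.e. frames are equally spaced in arc
    length (an illustrative simplification, not the paper's method).
    """
    # Cumulative arc length along the polyline.
    seg = [math.dist(a, b) for a, b in zip(skeleton, skeleton[1:])]
    s = [0.0]
    for d in seg:
        s.append(s[-1] + d)
    total = s[-1]

    frames = []
    for i in range(n_frames):
        t = total * i / (n_frames - 1) if n_frames > 1 else 0.0
        # Find the polyline segment containing arc length t.
        j = 0
        while j < len(seg) - 1 and s[j + 1] < t:
            j += 1
        frac = (t - s[j]) / seg[j] if seg[j] > 0 else 0.0
        # Linear interpolation within the segment gives the position.
        p = tuple(a + frac * (b - a)
                  for a, b in zip(skeleton[j], skeleton[j + 1]))
        # The segment direction approximates the local tangent.
        d = [b - a for a, b in zip(skeleton[j], skeleton[j + 1])]
        n = math.sqrt(sum(c * c for c in d))
        tangent = tuple(c / n for c in d)
        frames.append((p, tangent))
    return frames

# Example: 5 frames along a straight 10 mm skeleton segment.
frames = position_frames_on_skeleton([(0, 0, 0), (0, 0, 10)], 5)
```

A full reconstruction would additionally resolve each frame's in-plane rotation from the relative orientation of adjacent guide-wire points, which is the second step the abstract describes.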
Classification: TP391.41 (Automation and Computer Technology: Computer Application Technology)