Authors: FAN Huaiyu; MA Junshan; LIU Yutang; DU Caihong (School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China; Department of Medical Information Engineering, Jining Medical University, Rizhao 276826, China)
Affiliations: [1] School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China; [2] Department of Medical Information Engineering, Jining Medical University, Rizhao 276826, Shandong, China
Source: Optical Instruments (光学仪器), 2019, No. 6, pp. 20-25 (6 pages)
Funds: Industry-University Cooperative Education Program of the Higher Education Department, Ministry of Education (201701020089); National Undergraduate Innovation and Entrepreneurship Training Program (201610443082); Jining Medical University Student Research Project (JYXS2017KJ031)
Abstract: Segmentation of ultrasound breast tumor images by seeded region growing is a common computer-aided diagnosis method. To locate the seed point automatically and quickly and to meet the demand for real-time online image segmentation, an algorithm based on iterative quadtree decomposition is proposed, which exploits the structural characteristics of ultrasound breast tumor images and combines the gray-level and spatial factors of the image. The algorithm converts the splitting of image blocks that satisfy a specific threshold into a search for the seed region, so that the seed point is located automatically. Experiments on 105 ultrasound breast tumor images show that the algorithm reaches an accuracy of 94.28% with an average runtime of 2.97 s. It automatically places the seed point inside the tumor region, requires few parameters to be adjusted, and is more efficient than manual seed selection.
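The abstract describes the algorithm only at a high level. Below is a minimal, hypothetical Python sketch of the general idea: recursively split the image into blocks until each block is homogeneous under an intensity-range threshold, then pick a seed point from the block that best combines a gray-level factor with spatial factors. The scoring heuristic (darker, larger, more central blocks preferred), the helper names, and all parameter values are illustrative assumptions, not the authors' published criteria.

```python
# Illustrative sketch only: quadtree-based seed localization under assumed criteria.
import numpy as np

def quadtree_blocks(img, threshold=0.15, min_size=8):
    """Recursively split the image into blocks whose intensity range
    (max - min, on a 0-1 scale) stays below `threshold`."""
    h, w = img.shape
    blocks = []

    def split(y, x, bh, bw):
        block = img[y:y + bh, x:x + bw]
        if (block.max() - block.min() <= threshold) or bh <= min_size or bw <= min_size:
            blocks.append((y, x, bh, bw))
            return
        hh, hw = bh // 2, bw // 2
        for dy, dx, sh, sw in [(0, 0, hh, hw), (0, hw, hh, bw - hw),
                               (hh, 0, bh - hh, hw), (hh, hw, bh - hh, bw - hw)]:
            split(y + dy, x + dx, sh, sw)

    split(0, 0, h, w)
    return blocks

def locate_seed(img, threshold=0.15, min_size=8):
    """Pick the homogeneous block that best balances darkness (tumors are
    hypoechoic), block area, and closeness to the image center; return its center.
    This scoring rule is an assumption made for illustration."""
    img = img.astype(np.float64)
    img = (img - img.min()) / (np.ptp(img) + 1e-12)   # normalize to [0, 1]
    cy, cx = img.shape[0] / 2, img.shape[1] / 2
    best, best_score = None, -np.inf
    for y, x, bh, bw in quadtree_blocks(img, threshold, min_size):
        block = img[y:y + bh, x:x + bw]
        darkness = 1.0 - block.mean()                      # gray-level factor
        area = (bh * bw) / (img.shape[0] * img.shape[1])   # size factor
        by, bx = y + bh / 2, x + bw / 2
        dist = np.hypot(by - cy, bx - cx) / np.hypot(cy, cx)
        score = darkness + area - dist                     # spatial factor
        if score > best_score:
            best_score, best = score, (int(by), int(bx))
    return best  # (row, col) of the automatically located seed point

if __name__ == "__main__":
    # Toy example: a dark circular "tumor" on a brighter speckled background.
    rng = np.random.default_rng(0)
    img = 0.7 + 0.1 * rng.standard_normal((256, 256))
    yy, xx = np.ogrid[:256, :256]
    img[(yy - 128) ** 2 + (xx - 120) ** 2 < 40 ** 2] = 0.15
    print("seed point:", locate_seed(img))
```

The seed point returned by such a sketch would then serve as the starting pixel for a conventional seeded region-growing segmentation step, which is not reproduced here.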
Keywords: seeded region growing; iterative quadtree decomposition; gray-level equalization; computer-aided diagnosis
CLC number: TP391.9 [Automation and Computer Technology: Computer Application Technology]