Affiliation: [1] School of Information Science and Engineering, Lanzhou University, Lanzhou 730000, China
Source: Computer Engineering and Applications, 2012, No. 7, pp. 201-204, 241 (5 pages)
Abstract: To improve the effectiveness of texture feature extraction in content-based image retrieval (CBIR) and thereby the overall performance of CBIR systems, a texture image retrieval method based on the pulse-coupled neural network (PCNN) is proposed. The PCNN is a new generation of artificial neural network with strong data-processing capabilities: the features it extracts are invariant to translation, rotation, scaling and distortion, and are robust to noise, which makes it well suited to image retrieval. Using the PCNN and its simplified model, the intersecting cortical model (ICM), a sequence of binary images corresponding to different gray levels is generated; the entropy of each image in the sequence is computed and assembled into a one-dimensional feature vector that serves as the texture feature. Similarity between images is measured by the Euclidean distance, and a query-by-example texture image retrieval system is built on this basis. Experimental results show that, compared with feature extraction methods such as the wavelet packet transform, the proposed method is more robust to noise, reduces the dimensionality of the feature vector, possesses scale, translation and rotation invariance, and achieves a higher retrieval rate.
Keywords: content-based image retrieval (CBIR); pulse-coupled neural network (PCNN); intersecting cortical model (ICM); feature extraction
Classification: TP391 [Automation and Computer Technology / Computer Application Technology]
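As a reading aid to the abstract above, the following is a minimal sketch of the pipeline it describes: an ICM (simplified PCNN) produces a sequence of binary pulse images, the Shannon entropy of each pulse image forms a one-dimensional feature signature, and retrieval ranks database images by Euclidean distance to the query signature. The iteration count, the decay/gain constants f, g, h and the 3x3 linking kernel below are illustrative assumptions, not the authors' published settings.

```python
import numpy as np
from scipy.ndimage import convolve

def icm_entropy_signature(img, iterations=20, f=0.9, g=0.8, h=20.0):
    """Entropy-sequence texture signature from a simplified PCNN (ICM).

    img: 2-D grayscale array scaled to [0, 1]. The iteration count and the
    constants f, g, h are assumed values for illustration only.
    """
    S = img.astype(np.float64)
    F = np.zeros_like(S)           # internal (feeding) activity
    Y = np.zeros_like(S)           # binary pulse output
    Theta = np.ones_like(S)        # dynamic threshold
    # 3x3 linking kernel coupling each neuron to its neighbours (assumed)
    W = np.array([[0.5, 1.0, 0.5],
                  [1.0, 0.0, 1.0],
                  [0.5, 1.0, 0.5]])
    signature = []
    for _ in range(iterations):
        F = f * F + S + convolve(Y, W, mode='constant')
        Y = (F > Theta).astype(np.float64)   # binary pulse image
        Theta = g * Theta + h * Y            # threshold decay and recharge
        # Shannon entropy of the binary pulse image for this iteration
        p1 = Y.mean()
        p0 = 1.0 - p1
        ent = 0.0
        if 0.0 < p1 < 1.0:
            ent = -(p1 * np.log2(p1) + p0 * np.log2(p0))
        signature.append(ent)
    return np.array(signature)

def retrieve(query_sig, database_sigs, top_k=10):
    """Rank database signatures by Euclidean distance to the query."""
    dists = np.linalg.norm(database_sigs - query_sig, axis=1)
    return np.argsort(dists)[:top_k]
```

Because the signature is a short 1-D vector (one entropy value per iteration), its dimensionality is much lower than typical wavelet-packet feature sets, which is consistent with the dimensionality-reduction claim in the abstract.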