Affiliations: [1] Department of Computer Science, School of Mathematics and Computer Science, Fujian Normal University; [2] Department of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics, Nanjing 210016; [3] Department of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics
Source: 《计算机科学》 (Computer Science), 2006, No. 12, pp. 189-195 (7 pages)
Funding: National Natural Science Foundation of China (No. 49971063); National High-Tech R&D Program of China (863 Program) (No. 2001AA6330101-04); Aeronautical Science Foundation of China (02F52033); Natural Science Foundation of Jiangsu Province (No. BK2001045)
Abstract: Spatial data are massive, complex, and continuous, exhibit spatial autocorrelation, and often contain missing values and errors. A good spatial clustering algorithm must therefore be highly efficient, able to detect clusters of complicated shapes, insensitive to the order in which the points are examined, and robust to outliers; existing algorithms fail to satisfy all of these requirements at once. This paper proposes a data structure suited to massive, complex spatial data, the multi-representation feature tree, which combines the strengths of BIRCH and CURE, and, based on it, a clustering algorithm named CAMFT. CAMFT uses the multi-representation feature tree to compress massive data and incorporates random sampling to further strengthen its ability to handle very large datasets; at the same time, the tree preserves the features of complicated cluster shapes, making it suitable for complex spatial data. Experiments show that CAMFT quickly clusters spatial data that contain outliers and complicated cluster shapes, that its results are independent of the spatial order of the objects, and that it outperforms the comparable existing algorithms BIRCH and CURE in efficiency.
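The abstract names the multi-representation feature tree but this record carries no definition of it. The following is a minimal sketch, assuming the structure combines BIRCH-style summary statistics (point count and linear sum) with CURE-style well-scattered representative points shrunk toward the centroid; the class name MRFeature and the parameters max_reps and shrink are hypothetical, and the paper's actual design may differ.

```python
# Hypothetical sketch of one sub-cluster summary in a multi-representation
# feature tree: BIRCH-like statistics (n, ls) plus CURE-like representatives.
import math
import random


def _dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


class MRFeature:
    """Summary of one sub-cluster (names and parameters are assumptions)."""

    def __init__(self, dim, max_reps=5, shrink=0.3):
        self.n = 0                 # number of absorbed points
        self.ls = [0.0] * dim      # linear sum, used to compute the centroid
        self.reps = []             # up to max_reps representative points
        self.max_reps = max_reps
        self.shrink = shrink       # shrink factor toward the centroid

    def centroid(self):
        return [s / self.n for s in self.ls]

    def absorb(self, point):
        """Add one point, updating the statistics and the representatives."""
        self.n += 1
        self.ls = [s + x for s, x in zip(self.ls, point)]
        self.reps.append(list(point))
        if len(self.reps) > self.max_reps:
            self.reps = self._scatter(self.reps, self.max_reps)

    def _scatter(self, points, k):
        """Keep k well-scattered points (farthest-point heuristic), each
        shrunk toward the centroid to damp the influence of outliers."""
        c = self.centroid()
        chosen = [max(points, key=lambda p: _dist(p, c))]
        while len(chosen) < k:
            chosen.append(max(points,
                              key=lambda p: min(_dist(p, q) for q in chosen)))
        return [[x + self.shrink * (cx - x) for x, cx in zip(p, c)]
                for p in chosen]


# Usage: compress a random sample into one summary, mirroring the abstract's
# "compression plus random sampling" idea on synthetic data.
random.seed(0)
data = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(10_000)]
sample = random.sample(data, 1_000)    # random sampling step
feature = MRFeature(dim=2)
for p in sample:
    feature.absorb(p)
print(feature.n, [round(x, 3) for x in feature.centroid()])
```

Because each sub-cluster keeps several representative points rather than a single centroid, a tree of such summaries can describe elongated or otherwise non-spherical clusters, which is the property the abstract credits for CAMFT's handling of complicated shapes.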
Classification: TP391.41 [Automation and Computer Technology / Computer Application Technology]