Affiliation: [1] School of Resources and Environmental Engineering, Wuhan University of Technology, Wuhan 430070, China
Source: Journal of Wuhan University of Technology (Transportation Science & Engineering), 2013, No. 4, pp. 695-698 (4 pages)
Funding: National Natural Science Foundation of China (Grant Nos. 41171319, 40571128, 40901214, 60904073, 41071104, 41071283); State Key Laboratory Program (Grant No. A0703); National 863 Program (Grant No. 2009AA12Z201); Wuhan Youth Science and Technology Chenguang Program (Grant No. 200950431203); Independent Innovation Research Fund of Wuhan University of Technology (Grant No. 2013-zy-071)
Abstract: As the applications of remote sensing technology continue to expand, users expect it to support automatic extraction of functional land use. In the current classification standard, however, many surface features are classified by their application function; in remote sensing characteristics such as spectrum and texture they show no essential difference from ordinary urban buildings, so they cannot be extracted from remote sensing data alone. Taking teaching land as an example, this paper therefore presents a method that identifies and classifies such land using a geographic ontology and relative elevation. First, playground and building objects are extracted from the remote sensing image; the relative elevation of each building is computed from its shadow, and its construction area is determined. The school's construction area is then estimated from the playground area using remote sensing domain knowledge, and teaching land is classified through buffer analysis (both steps are sketched in code below). This explores a new way toward automatic recognition and classification of such functional land targets.
CLC Number: TP72 [Automation and Computer Technology: Detection Technology and Automation Devices]
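The abstract describes a two-step pipeline: estimating a building's relative elevation from its shadow, then classifying teaching land by buffering around the playground. The following is a minimal Python sketch of those two steps using shapely; the function names, the flat-terrain assumption, and the area_ratio value are illustrative assumptions for this sketch, not values or code from the paper.

```python
import math
from shapely.geometry import Polygon

def building_height_from_shadow(shadow_length_m: float, sun_elevation_deg: float) -> float:
    """Relative building height from shadow length: H = L * tan(theta).

    Assumes flat terrain and a sun elevation angle read from the image
    metadata; both are assumptions of this sketch, not the paper.
    """
    return shadow_length_m * math.tan(math.radians(sun_elevation_deg))

def classify_teaching_land(playground: Polygon, buildings: list[Polygon],
                           area_ratio: float = 4.0) -> list[Polygon]:
    """Return the candidate buildings falling inside a buffer zone around
    the playground, sized from the playground area.

    `area_ratio` (school construction area / playground area) stands in
    for the paper's domain knowledge; the value 4.0 is a placeholder.
    """
    # Expected school construction area inferred from the playground area.
    target_area = playground.area * area_ratio
    # Radius of a circle with that area, so the buffered zone roughly
    # covers the expected school footprint.
    radius = math.sqrt(target_area / math.pi)
    zone = playground.buffer(radius)
    return [b for b in buildings if b.intersects(zone)]

# Usage: a 100 m x 60 m playground and two candidate buildings (metric coords).
playground = Polygon([(0, 0), (100, 0), (100, 60), (0, 60)])
near = Polygon([(110, 0), (130, 0), (130, 20), (110, 20)])
far = Polygon([(900, 900), (920, 900), (920, 920), (900, 920)])
print(building_height_from_shadow(15.0, 45.0))               # 15.0 m
print(len(classify_teaching_land(playground, [near, far])))  # 1 (only `near`)
```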