Authors: 丁海燕 [1,2,3], 马灵玲 [1,3], 李子扬 [1,2,3], 唐伶俐 [1,3]
Affiliations: [1] Academy of Opto-Electronics, Chinese Academy of Sciences, Beijing 100094; [2] University of Chinese Academy of Sciences, Beijing 100049; [3] Key Laboratory of Quantitative Remote Sensing Information Technology, Academy of Opto-Electronics, Chinese Academy of Sciences, Beijing 100094
Source: Remote Sensing Technology and Application (《遥感技术与应用》), 2013, No. 1, pp. 52-57 (6 pages)
Funding: National Key Technology R&D Program of China, "Intelligent Observation Technology and Validation for Small Remote Sensing Satellites" (2011BAH23B01)
Abstract: Because the spectral signatures of cloud and snow are similar in the visible bands, cloud detection and cloud/snow discrimination in panchromatic imagery have long been among the difficulties in the preprocessing and application of Earth-observation remote sensing data. The texture features of cloud and snow are analyzed in detail, and the statistical behavior of the fractal dimension values characterizing these textures is obtained by training on a large number of experimental samples. On this basis, an automatic cloud and snow identification method for panchromatic images based on the fractal dimension is proposed, taking into account both the texture features and the coverage distribution of cloud and snow. Tests on actual imagery from the Beijing-1 small satellite show that the method is an effective means of automatically identifying cloud and snow in panchromatic images.
English abstract: The similarity of the spectral features of cloud and snow in the visible/near-infrared bands degrades the accuracy of cloud and snow recognition, especially in panchromatic images. In this paper, a feasible method is presented for automatically identifying cloud and snow in panchromatic images. The method combines two analytical techniques, spectral threshold segmentation and texture analysis, which discriminate the image from two different aspects. First, cloud or snow is separated from the background using the difference in spectral features, which yields the proportion of cloud or snow in the image. Then the fractal dimensions of samples, which reflect the texture features of cloud and snow, are calculated to obtain the distribution of fractal dimension values. Finally, cloud and snow are automatically identified by comparing the proportion with this distribution. Experimental results on actual Beijing-1 panchromatic images demonstrate the feasibility and accuracy of the method; owing to the universality of the texture features, it can also be applied to other high-resolution panchromatic imagery.
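The full paper is not reproduced in this record, but the abstract outlines the pipeline clearly enough for a rough illustration. The following is a minimal Python sketch, assuming a standard box-counting estimator of the fractal dimension; the parameters dn_threshold and d_split are hypothetical, and the actual segmentation thresholds, fractal estimator, and decision rule in the paper come from its trained sample statistics rather than from this code.

import numpy as np

def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    # Estimate the box-counting (fractal) dimension of a binary mask,
    # e.g. bright cloud/snow candidates segmented from a panchromatic image.
    assert mask.any(), "mask must contain at least one foreground pixel"
    counts = []
    for s in box_sizes:
        h = (mask.shape[0] // s) * s          # trim so the image tiles evenly
        w = (mask.shape[1] // s) * s
        tiled = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(tiled.any(axis=(1, 3)).sum())  # boxes touching the mask
    # Fit log N(s) = -D * log(s) + c; the slope magnitude estimates the dimension.
    slope, _ = np.polyfit(np.log(box_sizes), np.log(counts), 1)
    return -slope

def classify_scene(image, dn_threshold, d_split):
    # Hypothetical decision rule: spectral threshold segmentation first,
    # then a texture (fractal dimension) test on the bright region.
    bright = image > dn_threshold             # candidate cloud/snow pixels
    coverage = bright.mean()                  # proportion of the scene covered
    d = box_counting_dimension(bright)        # texture descriptor
    # Which side of d_split corresponds to cloud vs. snow would follow from
    # the trained sample statistics; ">=" here is only an illustrative choice.
    label = "cloud" if d >= d_split else "snow"
    return label, coverage, d

With a panchromatic scene loaded as a NumPy array, classify_scene(image, dn_threshold, d_split) would return an illustrative label together with the coverage proportion and fractal dimension; tuning the two parameters against labeled samples mirrors, in spirit, the statistical training step described in the abstract.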
Classification code: TP79 [Automation and Computer Technology - Detection Technology and Automation Devices]