Authors: LI Hongliang, DAI Feng, ZHAO Qiang, MA Yike, CAO Juan, ZHANG Yongdong
Affiliations: [1] University of Chinese Academy of Sciences, Beijing 100049, China; [2] Key Lab of Intelligent Information Processing of Chinese Academy of Sciences, Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190, China; [3] University of Science and Technology of China, Hefei 230027, China
Source: Chinese Journal of Electronics (电子学报(英文版)), 2023, No. 1, pp. 159-165 (7 pages)
Abstract: For more effective image sampling, compressive sensing (CS) imaging methods based on image saliency have been proposed in recent years. These methods assign higher measurement rates to salient regions and lower measurement rates to non-salient regions to improve the performance of CS imaging. However, they are block-based, which makes them difficult to apply in actual CS sampling, as each photodiode must strictly correspond to one block of the scene. In our work, we propose a non-uniform CS imaging method based on image saliency, which assigns higher measurement density to salient regions and lower density to non-salient regions, where measurement density is the number of pixels measured per unit area. As the dimension of the signal is reduced, the quality of the reconstructed image improves in theory, which is confirmed by our experiments. Since the scene is sampled as a whole, our method can easily be applied to actual CS sampling. To verify the feasibility of our approach, we design and implement a hardware sampling system that applies our non-uniform sampling method to obtain measurements and reconstruct the images. To the best of our knowledge, this is the first CS hardware sampling system based on image saliency. (A minimal illustrative sketch of the sampling idea appears after this record.)
Keywords: Compressive sensing; Non-uniform; Measurement density; Image saliency
CLC number: TP391.41 [Automation and Computer Technology - Computer Application Technology]
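
The sketch below is a rough, hypothetical Python illustration of the non-uniform, saliency-driven sampling idea described in the abstract; it is not the authors' implementation or hardware pipeline. The function name nonuniform_cs_sample, the strategy of drawing pixel locations with probability proportional to saliency, and all parameters are assumptions made only for illustration.

    # Illustrative sketch (not the paper's code): non-uniform compressive
    # sampling driven by a saliency map. All names and parameters here are
    # assumptions for illustration.
    import numpy as np

    def nonuniform_cs_sample(image, saliency, total_measurements, seed=None):
        """Sample more pixels where saliency is high, fewer where it is low.

        "Measurement density" follows the abstract: the number of pixels
        measured per unit area. Here it is realized by drawing distinct
        pixel locations with probability proportional to saliency.
        """
        rng = np.random.default_rng(seed)
        h, w = image.shape
        # Normalize the saliency map into a sampling distribution over pixels.
        p = saliency.ravel().astype(float)
        p = p / p.sum()
        # Draw distinct pixel indices; salient regions receive higher density.
        idx = rng.choice(h * w, size=total_measurements, replace=False, p=p)
        measurements = image.ravel()[idx]
        return idx, measurements

    # Toy usage: a synthetic 64x64 image with a bright (salient) square.
    img = np.zeros((64, 64))
    img[20:40, 20:40] = 1.0
    sal = 0.1 + img                  # crude stand-in for a saliency model
    idx, y = nonuniform_cs_sample(img, sal, total_measurements=512, seed=0)
    in_salient = np.isin(idx, np.flatnonzero(img.ravel() > 0)).mean()
    print(f"fraction of samples landing in the salient region: {in_salient:.2f}")

In this toy example the salient square covers under 10% of the pixels but attracts roughly half the samples, showing the intended density skew. Reconstructing the image from such measurements would require a standard sparse-recovery solver, which is beyond the scope of this sketch.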