Authors: LI Yuanli; LIU Wei [1]; LI Runsheng [1]; NIU Chaoyang [1]; LI Fangrun; LU Wanjie [1] (Information Engineering University, Zhengzhou 450001, China)
Affiliation: [1] Information Engineering University, Zhengzhou 450001, Henan, China
Source: Journal of Information Engineering University, 2024, No. 5, pp. 532-537 (6 pages)
Funding: National Natural Science Foundation of China (42201472); Fujian Provincial Natural Resources Science and Technology Innovation Project (KY-080000-04-2021-030).
Abstract: Semantic description of remote sensing images is a cross-modal task that explains or annotates the types, states, and features of ground objects and scenes in remote sensing images. It deepens the interpretation and understanding of remote sensing imagery and has become a research hotspot in the remote sensing field. First, from the perspective of the techniques used in current research, this paper mainly reviews remote sensing image description work under two approaches: pixel-based and target-based. Second, according to the decoder employed, these two approaches are further subdivided into CNN-RNN and CNN-Transformer methods. Although research on remote sensing image description has made remarkable progress, many problems remain to be overcome in the face of complex background interference, scale variation, blurred targets, and inter-class similarity. In the future, research on remote sensing image semantic description should focus on innovations in exploiting image visual information, feature enhancement, and integration of large models to improve the robustness and accuracy of models.
Classification: TP752 [Automation and Computer Technology: Detection Technology and Automatic Devices]
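The CNN-Transformer family mentioned in the abstract pairs a convolutional encoder (which maps the image to a grid of visual features) with a Transformer decoder (which attends over those features while generating caption tokens). A minimal sketch in PyTorch is shown below; it is not the paper's model, and all layer sizes, names, and hyperparameters are illustrative assumptions.

```python
# Minimal CNN-Transformer captioning sketch (illustrative only, not the
# surveyed models). Assumes PyTorch is installed.
import torch
import torch.nn as nn

class CNNTransformerCaptioner(nn.Module):
    def __init__(self, vocab_size=1000, d_model=128):
        super().__init__()
        # CNN encoder: compresses the image into a 7x7 grid of d_model features.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, d_model, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((7, 7)),
        )
        self.embed = nn.Embedding(vocab_size, d_model)
        # Transformer decoder: cross-attends over the visual feature grid
        # while generating the caption token sequence.
        layer = nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=2)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, image, caption_tokens):
        feats = self.cnn(image)                   # (B, d_model, 7, 7)
        feats = feats.flatten(2).transpose(1, 2)  # (B, 49, d_model) memory
        tgt = self.embed(caption_tokens)          # (B, T, d_model)
        h = self.decoder(tgt, feats)              # decode with cross-attention
        return self.out(h)                        # (B, T, vocab_size) logits

model = CNNTransformerCaptioner()
img = torch.randn(2, 3, 224, 224)                # batch of 2 RGB images
caps = torch.randint(0, 1000, (2, 12))           # 12 caption tokens each
logits = model(img, caps)
print(logits.shape)  # torch.Size([2, 12, 1000])
```

A CNN-RNN variant would replace the Transformer decoder with an LSTM/GRU that consumes a pooled image feature as its initial state; the encoder side is the same.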