Authors: 卫禹帆 (WEI Yufan); 张丽红 (ZHANG Lihong) (College of Physical and Electronic Engineering, Shanxi University, Taiyuan 030006, China)
Source: Network New Media Technology (网络新媒体技术), 2024, No. 4, pp. 16-25 (10 pages)
Funding: Shanxi Province Postgraduate Innovation Project (No. 2023SJ012); Shanxi Province Higher Education Teaching Reform and Innovation Project (No. J2021086).
Abstract: In nighttime object detection, object visibility is low and it is difficult to annotate large amounts of image data, which makes large-scale supervised training impractical; when trained on small-scale annotated data, supervised methods tend to overfit, resulting in poor prediction accuracy. To address these issues, this paper proposes an unsupervised domain adaptation model for nighttime object detection that is trained on labeled daytime images and unlabeled nighttime images. The model employs a day-night image enhancement method to reduce the day-night domain gap and increase the complexity of the nighttime training data, thereby enriching feature learning; it introduces multi-scale channel attention into the Faster-RCNN model to improve its ability to perceive multi-scale features; and it uses class-level contrastive learning to obtain discriminative, domain-invariant class features. Experiments on the urban traffic datasets BDD100K and SODA10M show that the proposed method outperforms commonly used domain adaptation object detection methods.
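To illustrate the class-level contrastive learning idea mentioned in the abstract, the following is a minimal PyTorch-style sketch, not the paper's actual implementation: the function name class_contrastive_loss, the temperature value, and the exact construction of positive pairs are assumptions. The sketch pulls together ROI features that share a class label, regardless of whether they come from daytime or nighttime images, and pushes apart features of different classes.

import torch
import torch.nn.functional as F

def class_contrastive_loss(features, labels, temperature=0.1):
    # features: (N, D) ROI features pooled from both labeled daytime and
    #           unlabeled nighttime images (nighttime labels would typically
    #           come from pseudo-labels in a domain-adaptation setting).
    # labels:   (N,) class index for each ROI feature.
    feats = F.normalize(features, dim=1)                  # work in cosine-similarity space
    sim = feats @ feats.t() / temperature                 # pairwise similarity logits
    n = feats.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=feats.device)
    sim = sim.masked_fill(self_mask, -1e9)                # exclude self-pairs
    # positives: pairs that share the same class label, across both domains
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_counts = pos_mask.sum(dim=1)
    valid = pos_counts > 0                                # anchors with at least one positive
    if not valid.any():
        return features.new_tensor(0.0)
    loss = -(log_prob * pos_mask.float()).sum(dim=1)[valid] / pos_counts[valid]
    return loss.mean()

In training, a loss of this kind would be added to the standard Faster-RCNN detection losses so that the shared class representations become both discriminative and insensitive to the day/night domain shift.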
Keywords: unsupervised learning; domain adaptation; nighttime object detection; contrastive learning; Faster-RCNN
Classification: TP391.41 [Automation and Computer Technology / Computer Application Technology]