Author: 翟记锋 ZHAI Ji-Feng (School of Computer Science, Fudan University, Shanghai 200438, China)
Affiliation: [1] School of Computer Science, Fudan University, Shanghai 200438, China
Source: Computer Systems & Applications, 2023, No. 8, pp. 19-30 (12 pages)
Fund: National Natural Science Foundation of China (61771146).
Abstract: In binocular stereo matching for computer vision, deep-learning algorithms based on neural networks require scene datasets for training and generalize poorly. To address these two problems, this paper exploits the ability of neural networks to approximate functions and proposes an iterative optimization algorithm over depth-scene-compatible solutions that requires no training on any dataset and uses the two binocular images to supervise each other. The algorithm uses a scene location guessing network to model the space of depth-scene-compatible positions for the current binocular image pair; a mutually supervised loss function matched to this network guides it, via gradient descent, to learn iteratively on the input image pair and search that space for a feasible solution, so the whole procedure involves no dataset training. Comparison experiments with CREStereo, PCW-Net, CFNet, and other algorithms on images from the Middlebury benchmark show that the proposed algorithm achieves an average mismatching rate of 2.52% in non-occluded regions and 7.26% over all regions, lower than the other algorithms in the comparison.
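The abstract compresses the method into a few sentences; a minimal sketch may make the loop concrete. Assuming a PyTorch setting, the snippet below illustrates the general idea of training-free, mutually supervised per-pair optimization: a small network guesses a disparity map, the right image is warped into the left view with it, and a photometric loss between the warp and the real left image drives gradient descent on that single pair. The DisparityGuesser architecture, the warping scheme, and the L1 loss here are illustrative assumptions, not the authors' published scene location guessing network or loss function.

```python
# Illustrative sketch of training-free, mutually supervised stereo
# optimization on a single image pair. All architectural and loss
# choices below are assumptions, not the paper's exact design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DisparityGuesser(nn.Module):
    """Hypothetical stand-in for the scene location guessing network:
    maps a stereo pair to a dense disparity map for the left view."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1), nn.Softplus(),  # disparity >= 0
        )

    def forward(self, left, right):
        return self.net(torch.cat([left, right], dim=1))

def warp_right_to_left(right, disp):
    """Sample the right image at x - d(x) to synthesize the left view."""
    b, _, h, w = right.shape
    ys, xs = torch.meshgrid(
        torch.arange(h, device=right.device, dtype=right.dtype),
        torch.arange(w, device=right.device, dtype=right.dtype),
        indexing="ij",
    )
    x_src = xs.unsqueeze(0) - disp.squeeze(1)        # shifted x coordinates
    grid = torch.stack(
        [2 * x_src / (w - 1) - 1,                    # normalize x to [-1, 1]
         (2 * ys / (h - 1) - 1).expand_as(x_src)],   # y stays on its row
        dim=-1,
    )
    return F.grid_sample(right, grid, align_corners=True)

def optimize_pair(left, right, iters=500, lr=1e-3):
    """Per-pair optimization: no dataset, the two views supervise each other."""
    model = DisparityGuesser().to(left.device)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(iters):
        disp = model(left, right)
        recon = warp_right_to_left(right, disp)
        loss = F.l1_loss(recon, left)   # photometric mutual supervision
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model(left, right).detach()
```

A real system along these lines would also need to handle occluded pixels, where no consistent match exists between the two views; the abstract's separate error figures for non-occluded versus all regions reflect exactly that distinction.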