Authors: Li Wenshu; Zhao Peng; Yin Lingzhi; Li Shenhao (College of Information Science and Technology, Zhejiang Sci-Tech University, Hangzhou 310018)
Source: Journal of Computer-Aided Design & Computer Graphics (《计算机辅助设计与图形学学报》), 2022, Issue 5, pp. 743-750 (8 pages)
Funding: National Natural Science Foundation of China (31771224); National Key R&D Program of China, Ministry of Science and Technology (2018YFB1004901); Zhejiang Provincial Natural Science Foundation (LY17C090011, LGF19F020009).
Abstract: With the rapid development of deep learning, image style transfer has become one of the most actively explored topics in computer vision. To address the problem that existing methods struggle to transfer style effectively in locally similar regions of the content image, a region-diversified image style transfer method based on Gaussian sampling is proposed. First, image features are extracted by an encoder. Then, in the feature space, the content features are fused with the style features and with style features sampled from the Gaussian distribution of the style image. Finally, the stylized image is reconstructed by a decoder. Experiments are conducted on the WikiArt and Microsoft COCO datasets, with content loss and multi-scale style loss used as quantitative metrics. The results show that, compared with existing methods, the proposed method effectively reduces the style loss of the generated images, makes their overall style more unified, and yields better visual effects.
Keywords: image style transfer; convolutional neural network; feature transformation; Gaussian distribution; Gaussian sampling
Classification: TP391.41 [Automation and Computer Technology: Computer Application Technology]
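The abstract outlines a three-stage pipeline: encode, fuse content features with style statistics sampled from a Gaussian fitted to the style image, then decode. The paper's exact fusion rule is not given in this record, so the sketch below is only a hypothetical illustration of the middle step, in the spirit of AdaIN-style feature transforms: per-channel Gaussian statistics are estimated from the style features, alternative statistics are sampled from that Gaussian, and the normalized content features are re-scaled with a blend of the two. The function name `gaussian_sampled_fusion` and the blend weight `alpha` are assumptions, not the authors' API.

```python
import numpy as np

def gaussian_sampled_fusion(content_feat, style_feat, alpha=0.5, rng=None):
    """Hypothetical sketch of the fusion step on (C, H, W) encoder features:
    normalize the content features per channel, then re-scale them with a
    blend of the style statistics and statistics sampled from the Gaussian
    fitted to the style features."""
    rng = np.random.default_rng() if rng is None else rng
    eps = 1e-5

    # Per-channel Gaussian statistics of the style features.
    s_mu = style_feat.mean(axis=(1, 2), keepdims=True)
    s_sigma = style_feat.std(axis=(1, 2), keepdims=True) + eps

    # Sample alternative per-channel means from that Gaussian
    # (this is the "Gaussian sampling" ingredient of the pipeline).
    sampled_mu = rng.normal(loc=s_mu, scale=s_sigma)

    # Instance-normalize the content features per channel.
    c_mu = content_feat.mean(axis=(1, 2), keepdims=True)
    c_sigma = content_feat.std(axis=(1, 2), keepdims=True) + eps
    normalized = (content_feat - c_mu) / c_sigma

    # Blend direct and sampled style means, then re-apply style scale.
    mu = (1.0 - alpha) * s_mu + alpha * sampled_mu
    return normalized * s_sigma + mu

# Toy feature maps standing in for encoder outputs.
C, H, W = 4, 8, 8
content = np.random.default_rng(0).normal(size=(C, H, W))
style = np.random.default_rng(1).normal(loc=2.0, size=(C, H, W))
out = gaussian_sampled_fusion(content, style, alpha=0.3,
                              rng=np.random.default_rng(2))
print(out.shape)
```

In a full system this fused feature map would be passed to the decoder for image reconstruction; varying `alpha` (or re-sampling) would diversify how locally similar regions are stylized.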