Authors: 罗敦浪 (LUO Dunlang); 蒋旻 (JIANG Min)[1]; 袁琳君 (YUAN Linjun); 江佳俊 (JIANG Jiajun); 郭嘉 (GUO Jia)
Affiliation: [1] School of Computer Science and Technology, Wuhan University of Science and Technology, Wuhan 430065, China
Source: Computer Engineering and Applications, 2021, No. 13, pp. 193-198 (6 pages)
Funding: National Natural Science Foundation of China (61702385).
Abstract: With the development of multimedia technology, demand for image colorization applications such as colorizing black-and-white photographs, rendering medical images, and coloring hand-drawn sketches has grown steadily. Most traditional colorization algorithms suffer from drawbacks such as a single coloring mode, poor results on certain kinds of data, or reliance on manually supplied information. To address this, an image colorization method is designed that combines a conditional generative adversarial network (GAN) with a color distribution prediction model. The GAN generates the colorized image, and the prediction model's output is used to correct the generator's result, alleviating the tendency of GAN-generated images toward monotonous colors. Finally, a color contrast loss is introduced, further improving colorization quality on certain image classes with low contrast. Comparative experiments on the ImageNet dataset show that, compared with other traditional methods, the proposed method achieves better colorization results across a wider range of image classes.
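The abstract describes a generator objective with three parts: an adversarial term, a correction term driven by the color distribution prediction model, and a color contrast term. The paper's exact formulation is not reproduced in this record, so the PyTorch sketch below is only an illustration under assumptions of my own: the ab channels are compared in Lab space, the prediction model outputs logits over Q color bins (Q = 313 is a placeholder), the correction term is taken as a KL divergence, the contrast term is approximated by matching per-image standard deviation, and the weights lambda_dist and lambda_contrast are arbitrary.

```python
# Minimal sketch of a combined generator loss in the spirit of the abstract.
# All shapes, the binning of ab values, and the loss weights are illustrative
# assumptions, not the paper's actual settings.
import torch
import torch.nn.functional as F

def generator_loss(d_fake_logits, fake_ab, real_ab,
                   pred_dist_logits, gen_dist_logits,
                   lambda_dist=1.0, lambda_contrast=0.1):
    """Combine the three terms sketched in the abstract.

    d_fake_logits:    discriminator output on generated (L, ab) pairs, (N, 1)
    fake_ab, real_ab: generated / ground-truth ab channels, (N, 2, H, W)
    pred_dist_logits: color-distribution prediction model output, (N, Q, H, W)
    gen_dist_logits:  color distribution implied by the generator, (N, Q, H, W)
    """
    # Adversarial term: the generator tries to make the discriminator
    # label its output as real.
    adv = F.binary_cross_entropy_with_logits(
        d_fake_logits, torch.ones_like(d_fake_logits))

    # Correction term (assumed form): pull the generator's color distribution
    # toward the prediction model's distribution over Q color bins.
    dist = F.kl_div(F.log_softmax(gen_dist_logits, dim=1),
                    F.softmax(pred_dist_logits, dim=1),
                    reduction="batchmean")

    # Color-contrast term (assumed proxy): match the per-image spread of the
    # ab channels to that of the ground truth.
    contrast = (fake_ab.std(dim=(2, 3)) - real_ab.std(dim=(2, 3))).abs().mean()

    return adv + lambda_dist * dist + lambda_contrast * contrast

if __name__ == "__main__":
    # Smoke test with random tensors.
    N, Q, H, W = 4, 313, 32, 32
    loss = generator_loss(torch.randn(N, 1),
                          torch.randn(N, 2, H, W), torch.randn(N, 2, H, W),
                          torch.randn(N, Q, H, W), torch.randn(N, Q, H, W))
    print(loss.item())
```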
CLC Number: TP183 [Automation and Computer Technology - Control Theory and Control Engineering]