Authors: DONG Zhiqiang (董智强); XIAO Yun (肖云)[1]; DUAN Jiashun (段佳顺) (School of Information Science and Technology, Northwest University, Xi'an 710127, China)
Affiliation: [1] School of Information Science and Technology, Northwest University, Xi'an 710127, Shaanxi, China
Source: Journal of Northwest University (Natural Science Edition), 2025, No. 1, pp. 193-200 (8 pages)
Funding: National Natural Science Foundation of China (62372371); Key Project of the Shaanxi Provincial International Science and Technology Cooperation Program (2022KWZ-14).
Abstract: The virtual restoration of damaged calligraphy images with image harmonization algorithms is of great significance for cultural relic protection. Most existing image harmonization methods focus on resolving the disharmony between foreground and background. To address the large visual feature differences and resulting disharmony between foreground and background that arise during the virtual restoration of calligraphy images, this paper proposes LSPNet, a calligraphy image harmonization algorithm based on local light and shadow perception. By introducing reference masks and content masks, LSPNet precisely locates the regions to be restored, ensuring that the synthesized foreground remains stylistically consistent with the background. To verify the effectiveness of the algorithm, experiments were conducted on damaged calligraphy images. Experimental comparisons show that, relative to other algorithms, LSPNet markedly reduces the differences in brightness, contrast, and image structure between foreground and background, making the restored calligraphy works appear more natural and visually unified.
Classification Code: TP391 [Automation and Computer Technology - Computer Application Technology]
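As an illustration of the kind of foreground-background discrepancy the abstract refers to, the following minimal sketch measures brightness and contrast gaps between a masked foreground patch and the surrounding background using NumPy. The function names, mask convention, and the restriction to brightness and contrast are assumptions for illustration only; this is not the paper's evaluation code, which additionally compares image structure.

    import numpy as np

    def masked_stats(gray, mask):
        # Mean (brightness) and standard deviation (contrast) of the pixels selected by a binary mask.
        vals = gray[mask > 0]
        return float(vals.mean()), float(vals.std())

    def foreground_background_gap(gray, fg_mask, bg_mask):
        # gray: float grayscale image in [0, 1]; fg_mask / bg_mask: binary masks marking
        # the composited (restored) patch and the surrounding original background.
        fg_mean, fg_std = masked_stats(gray, fg_mask)
        bg_mean, bg_std = masked_stats(gray, bg_mask)
        return {
            "brightness_gap": abs(fg_mean - bg_mean),  # luminance mismatch
            "contrast_gap": abs(fg_std - bg_std),      # contrast mismatch
        }

    # Example with synthetic data: a darker patch pasted into a brighter page.
    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        page = 0.8 + 0.05 * rng.standard_normal((256, 256))
        patch_mask = np.zeros((256, 256), dtype=np.uint8)
        patch_mask[96:160, 96:160] = 1
        page[patch_mask > 0] -= 0.2  # simulate an unharmonized insert
        print(foreground_background_gap(page, patch_mask, 1 - patch_mask))

A well-harmonized composite would drive both gap values toward zero, which is the intuition behind the brightness and contrast comparisons reported in the abstract.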