Cite this article: MI Zhipeng. Image Style Transfer Based on the Distribution Matching of the Style Features[J]. Journal of Chongqing Technology and Business University (Natural Science Edition), 2023, 40(2): 51-56.
Image Style Transfer Based on the Distribution Matching of the Style Features
MI Zhipeng
School of Computer Science and Technology, Anhui University of Technology, Maanshan, Anhui 243000, China
Abstract:
Existing style-transfer methods can extract only the low-order statistics of image features. To address this problem, this study models the style-transfer process as a feature-distribution matching process, proposes a discriminator network based on the Wasserstein distance, and uses it to define a style loss function. The Wasserstein discriminator fits the Wasserstein distance between feature distributions more accurately, and the resulting style loss better distinguishes differences in the higher-order statistics of image features. To achieve real-time generation, a generation network consisting of an encoder-decoder structure and an attention-based style-transfer module is introduced; this network effectively fuses the content and style features and synthesizes stylized images. Specifically, the style loss is computed by attaching the Wasserstein discriminator after the convolutional layers of the loss-computation module, and it supervises the training of the generation network together with the content loss, computed as the mean squared error as in conventional methods. Once training is complete, arbitrary images can be input for style-transfer testing. Finally, the network is trained and evaluated on the benchmark MSCOCO and WikiArt datasets. Qualitative and quantitative experimental results show that, compared with existing methods, the proposed method achieves real-time style transfer and generates high-quality stylization results.
Key words: attention mechanism; Wasserstein distance; feature distributions; style transfer
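The core idea of the abstract, matching full feature distributions rather than only their low-order statistics, can be illustrated with a minimal sketch. This is not the paper's implementation: the paper trains a Wasserstein discriminator (critic) network to estimate the distance, whereas the sketch below substitutes the exact closed form of the 1-D Wasserstein-1 distance (mean absolute difference of sorted samples), applied per feature channel. All names and shapes are illustrative assumptions.

```python
import numpy as np

# Sketch only: the paper learns a critic network to estimate the Wasserstein
# distance; here we use the exact 1-D closed form (sorted-sample differences)
# per channel to show why distribution matching sees more than low-order stats.

def channelwise_wasserstein(feats_a, feats_b):
    """feats_*: (C, N) arrays, N activation samples per channel.
    Returns the mean 1-D Wasserstein-1 distance across channels."""
    a = np.sort(feats_a, axis=1)
    b = np.sort(feats_b, axis=1)
    return np.abs(a - b).mean()

def content_loss(content_feats, stylized_feats):
    """Content loss computed as mean squared error, as in conventional methods."""
    return np.mean((content_feats - stylized_feats) ** 2)

# Two feature distributions with identical mean and standard deviation
# (the low-order statistics) but very different shapes:
rng = np.random.default_rng(0)
gauss = rng.normal(0.0, 1.0, size=(1, 10000))          # unimodal channel
mix = np.concatenate([rng.normal(-1, 0.1, 5000),
                      rng.normal(1, 0.1, 5000)])[None, :]  # bimodal channel
mix = (mix - mix.mean()) / mix.std()  # match mean/std exactly

d = channelwise_wasserstein(gauss, mix)  # clearly nonzero
```

A style loss built on means and standard deviations alone would score these two distributions as identical, while the Wasserstein distance separates them; this is the sense in which a Wasserstein-based style loss can capture higher-order statistical differences.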
Copyright © Journal of Chongqing Technology and Business University (Natural Science Edition).