
Image Style Transfer Based on the Distribution Matching of the Style Features

    Abstract:

    Existing style transfer methods can extract only the low-order statistics of image features. To address this problem, this study models the style transfer process as a feature distribution matching process, proposes a discriminator network based on the Wasserstein distance, and uses it to define a style loss function. The Wasserstein discriminator fits the Wasserstein distance between feature distributions more closely, and the resulting style loss better distinguishes differences in the higher-order statistics of image features. In addition, to achieve real-time generation, a generation network is introduced that combines an encoder-decoder structure with an attention-based style transfer module; this network effectively fuses the features of the original images and synthesizes the stylized result. Specifically, the style loss is computed by attaching the Wasserstein discriminator after the convolutional layers (CNN) of the loss module, and this style loss, together with the content loss computed as a mean squared error as in traditional methods, supervises the training of the generation network. Once training is complete, arbitrary images can be input for style transfer. Finally, the network is trained and evaluated on the benchmark MSCOCO and WikiArt datasets. Qualitative and quantitative experiments show that, compared with existing methods, the proposed approach achieves real-time style transfer and produces high-quality stylization results.
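    The core claim above — that losses built on low-order statistics cannot separate feature distributions that a distribution-matching (Wasserstein) criterion can — is easy to demonstrate in one dimension. The sketch below is purely illustrative and not the paper's implementation: it compares two samples with identical mean and variance but different higher-order moments, using the closed-form empirical 1-Wasserstein distance for equal-size 1-D samples (sorted-sample mean absolute difference); the function name `wasserstein_1d` is our own.

```python
import numpy as np

def wasserstein_1d(x, y):
    """Empirical 1-Wasserstein distance between two equal-size 1-D samples.

    In one dimension the optimal transport plan is monotone, so W1 reduces
    to the mean absolute difference of the sorted samples.
    """
    return np.mean(np.abs(np.sort(x) - np.sort(y)))

rng = np.random.default_rng(0)
n = 10_000

# Two "feature distributions" with identical mean (0) and variance (1)
# but different higher-order statistics: a two-point mass vs. a Gaussian.
a = rng.choice([-1.0, 1.0], size=n)   # Rademacher: all mass at +/-1
b = rng.standard_normal(n)            # standard normal

# Low-order statistics cannot tell them apart ...
assert abs(a.mean() - b.mean()) < 0.05
assert abs(a.var() - b.var()) < 0.05

# ... but the Wasserstein distance separates them clearly.
print(wasserstein_1d(a, b))  # clearly nonzero
```

    A Gram-matrix or mean/variance style loss would score these two samples as nearly identical, while the distribution-level distance does not — which is the gap the Wasserstein discriminator in the paper is designed to close for high-dimensional CNN features.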

Cite this article:

MI Zhipeng. Image Style Transfer Based on the Distribution Matching of the Style Features[J]. Journal of Chongqing Technology and Business University(Natural Science Edition), 2023, 40(2): 51-56.

Online publication date: 2023-04-06