Incremental Aggregated Gradient Algorithm with the Divergence Step Size

Abstract:

For optimization problems whose objective function is the sum of several smooth functions, an incremental aggregated gradient algorithm with a divergent step-size rule is proposed. As with the incremental gradient algorithm, each iteration of the incremental aggregated gradient algorithm requires computing the gradient of only one component function. Existing studies of the incremental aggregated gradient algorithm mainly employ a constant step size, which requires the objective function to be twice continuously differentiable and strongly convex; moreover, the choice of the constant step size depends on the second-order derivative at the optimal point, whereas the divergent step-size rule does not depend on the objective function. Under the assumption that the gradients of the objective function are bounded and Lipschitz continuous, the convergence of the incremental aggregated gradient algorithm with the divergent step size is proved. Finally, numerical examples illustrate the convergence of the algorithm and compare it with the incremental gradient algorithm under the same step-size rule. The numerical results show that, for some optimization problems, the incremental aggregated gradient algorithm is more effective than the incremental gradient algorithm with the same step size.
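To make the iteration concrete, here is a minimal sketch of an incremental aggregated gradient (IAG) step with a divergent step-size rule. The choice alpha_k = c/(k+1) (so that the step sizes sum to infinity while tending to zero), the cyclic component selection, and the demo below are illustrative assumptions, not the paper's exact setup; they show how each iteration refreshes only one component gradient while stepping with the full aggregate.

```python
import numpy as np

def iag(grads, x0, n_iter=10000, c=1.0):
    """Sketch of the incremental aggregated gradient (IAG) method.
    `grads` is a list of gradient functions of the component functions f_i.
    Step-size rule alpha_k = c / (k + 1) is a divergent (diminishing)
    rule: sum_k alpha_k = infinity and alpha_k -> 0."""
    m = len(grads)
    x = np.asarray(x0, dtype=float)
    # Stored (possibly stale) gradient of each component at a past iterate.
    memory = [g(x) for g in grads]
    aggregate = np.sum(memory, axis=0)  # running sum of stored gradients
    for k in range(n_iter):
        i = k % m                       # cyclic choice of one component
        new_g = grads[i](x)             # only ONE gradient evaluated per step
        aggregate += new_g - memory[i]  # update the aggregate incrementally
        memory[i] = new_g
        alpha = c / (k + 1)             # divergent step size
        x = x - alpha * aggregate
    return x
```

For example, with component functions f_i(x) = 0.5 * (x - t_i)^2 the minimizer of the sum is the mean of the t_i, and the iterates approach it under this step-size rule.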

Cite this article

QIAN Xiao-hui, WANG Xiang-mei. Incremental Aggregated Gradient Algorithm with the Divergence Step Size[J]. Journal of Chongqing Technology and Business University(Natural Science Edition), 2020, 37(2): 6-11

History
  • Published online: 2020-06-08