QIAN Xiao-hui, WANG Xiang-mei. Incremental Aggregated Gradient Algorithm with the Divergence Step Size[J]. Journal of Chongqing Technology and Business University (Natural Science Edition), 2020, 37(2): 6-11.
Incremental Aggregated Gradient Algorithm with the Divergence Step Size
  
Keywords: smooth optimization; incremental gradient algorithm; incremental aggregated gradient algorithm; divergence step size
Affiliation:
QIAN Xiao-hui, WANG Xiang-mei (School of Mathematics and Statistics, Guizhou University, Guiyang 550025, China)
Abstract:
For optimization problems whose objective function is the sum of several smooth functions, an incremental aggregated gradient algorithm with a divergence step size rule is proposed. Like the incremental gradient algorithm, each iteration of the incremental aggregated gradient algorithm only requires computing the gradient of one component function. Existing studies of the incremental aggregated gradient algorithm mainly use a constant step size, which requires the objective function to be twice continuously differentiable and strongly convex, and the choice of the constant step size depends on the second-order derivative at the optimal point; the divergence step size rule, in contrast, does not depend on the objective function. Under the assumption that the gradients of the objective function are bounded and Lipschitz continuous, the convergence of the incremental aggregated gradient algorithm with the divergence step size is proved. Finally, numerical examples are given to illustrate the convergence of the algorithm and to compare it with the incremental gradient algorithm under the same step size rule. The numerical results show that, for some optimization problems, the incremental aggregated gradient algorithm is more effective than the incremental gradient algorithm with the same step size.
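To make the scheme in the abstract concrete, the following is a minimal NumPy sketch of an incremental aggregated gradient iteration under a divergent step size rule. It is an illustration only, not the paper's algorithm statement: the function names, the cyclic order in which components are refreshed, the particular rule alpha_k = 1/(k+1) (which satisfies alpha_k -> 0 and sum_k alpha_k = infinity), and the division of the aggregated gradient by the number of components are assumptions of this sketch.

import numpy as np

def iag(component_grads, x0, num_epochs=100, step=lambda k: 1.0 / (k + 1)):
    """Sketch of the incremental aggregated gradient (IAG) method with a
    divergent step size rule (assumed here: alpha_k = 1/(k+1)).

    component_grads : list of callables; component_grads[i](x) returns the
                      gradient of the i-th smooth component f_i at x.
    x0              : initial point (1-D NumPy array).
    """
    m = len(component_grads)
    x = x0.astype(float).copy()
    # Table of the most recently computed gradient of every component,
    # initialized at the starting point.
    grad_table = [g(x) for g in component_grads]
    agg = np.sum(grad_table, axis=0)      # aggregated (summed) gradient
    k = 0
    for _ in range(num_epochs):
        for i in range(m):                # cyclic order: refresh one component per iteration
            new_gi = component_grads[i](x)
            agg = agg - grad_table[i] + new_gi
            grad_table[i] = new_gi
            # Step along the aggregated gradient; all other entries are "stale".
            # Averaging by m is a normalization choice of this sketch.
            x = x - step(k) * agg / m
            k += 1
    return x

# Hypothetical usage on a least-squares problem, f_i(x) = 0.5 * (a_i @ x - b_i)**2:
rng = np.random.default_rng(0)
A, b = rng.normal(size=(20, 5)), rng.normal(size=20)
grads = [lambda x, a=A[i], bi=b[i]: a * (a @ x - bi) for i in range(20)]
x_approx = iag(grads, np.zeros(5), num_epochs=500)

The key point, matching the abstract, is that each iteration evaluates only one new component gradient while reusing the stored gradients of the remaining components, and the step size rule is fixed in advance rather than derived from curvature information of the objective function.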