Abstract: For minimizing an objective function that is the sum of several smooth functions, an incremental aggregated gradient algorithm employing a divergent step size is proposed. Like the incremental gradient algorithm, each iteration of the incremental aggregated gradient algorithm requires the computation of the gradient of only a single component function. Existing studies of the incremental aggregated gradient algorithm mainly employ a constant step size under the assumption that the objective function is twice continuously differentiable and strongly convex, and the choice of this constant step size depends on the second-order derivative at the optimal point. In contrast, the divergent step size is independent of the objective function. The convergence of the incremental aggregated gradient algorithm with the divergent step size is established under the assumption that the gradients of the objective function are bounded and Lipschitz continuous. Finally, numerical examples are provided to illustrate the theoretical result and to compare the algorithm with the incremental gradient algorithm using the same step size. The numerical results show that, for some optimization problems, the incremental aggregated gradient algorithm is more effective than the incremental gradient algorithm with the same step size.
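For concreteness, the following is a minimal sketch of the iteration described above, written in the commonly used incremental aggregated gradient notation; the symbols $f_i$, $x_k$, $\alpha_k$ and $\tau_i^k$ are introduced here for illustration and do not appear in the abstract, and the divergent step size is interpreted here as a divergent-series (diminishing) step size:

\[
  f(x) = \sum_{i=1}^{m} f_i(x), \qquad
  x_{k+1} = x_k - \alpha_k \sum_{i=1}^{m} \nabla f_i\bigl(x_{\tau_i^k}\bigr),
\]

where $\tau_i^k \le k$ denotes the most recent iteration at which the gradient of the $i$-th component was evaluated, so each iteration recomputes only one component gradient while reusing the stored gradients of the other components. A divergent step size, under this interpretation, is a sequence satisfying $\alpha_k > 0$, $\alpha_k \to 0$ and $\sum_{k=0}^{\infty} \alpha_k = \infty$ (for example, $\alpha_k = c/(k+1)$ for some constant $c > 0$), which requires no knowledge of the objective function, in contrast to a constant step size chosen from second-order information at the optimal point.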