
Author(s): 

Данилова М.Ю.

Number of authors: 

1

Publication details

Publication type: 

Article in a journal/collection

Title: 

On the Convergence Analysis of Aggregated Heavy-Ball Method

ISBN/ISSN: 

978-3-031-09607-5

DOI: 

10.1007/978-3-031-09607-5_1

Source: 

  • Lecture Notes in Computer Science

Volume: 

V. 13367

City: 

  • Cham

Publisher: 

  • Springer

Year: 

2022

Pages: 

3-17
Abstract
Momentum first-order optimization methods are the workhorses in various optimization tasks, e.g., in the training of deep neural networks. Recently, Lucas et al. (2019) proposed a method called Aggregated Heavy-Ball (AggHB) that maintains multiple momentum vectors corresponding to different momentum parameters and averages these vectors to compute the update direction at each iteration. Lucas et al. (2019) show that AggHB is more stable than the classical Heavy-Ball method even with large momentum parameters and performs well in practice. However, the method was analyzed only for quadratic objectives and for online optimization tasks under the uniformly bounded gradients assumption, which does not hold for many practically important problems. In this work, we address this issue and propose the first analysis of AggHB for smooth objective functions in the non-convex, convex, and strongly convex cases without additional restrictive assumptions. Our complexity results match the best-known ones for the Heavy-Ball method. We also illustrate the efficiency of AggHB numerically on several non-convex and convex problems.
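The aggregation idea described in the abstract — one heavy-ball momentum buffer per momentum parameter, with the averaged buffers forming the update direction — can be sketched as follows. This is a minimal NumPy illustration, not the authors' exact formulation; the function name, the quadratic test objective, the step size, and the momentum grid are all illustrative assumptions:

```python
import numpy as np

def agghb(grad, x0, betas, lr, n_iters):
    """Sketch of an Aggregated Heavy-Ball-style update (illustrative,
    not the exact method of Lucas et al., 2019): keep one momentum
    vector per beta and average them to get the step direction."""
    x = np.asarray(x0, dtype=float)
    vs = [np.zeros_like(x) for _ in betas]  # one momentum buffer per parameter
    for _ in range(n_iters):
        g = grad(x)
        for k, beta in enumerate(betas):
            vs[k] = beta * vs[k] + g        # heavy-ball momentum accumulation
        direction = sum(vs) / len(vs)       # aggregate buffers by averaging
        x = x - lr * direction
    return x

# Illustrative run on a simple quadratic f(x) = ||x||^2 (gradient 2x);
# a small step size keeps even the beta = 0.9 channel stable.
x_star = agghb(lambda x: 2 * x, [1.0, -2.0],
               betas=[0.1, 0.5, 0.9], lr=0.01, n_iters=2000)
```

Averaging across a spread of momentum parameters is what lets the method tolerate a large beta in the mix: the low-momentum buffers damp the oscillations that a single high-momentum heavy-ball iteration would exhibit.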

Citation: 

Данилова М.Ю. On the Convergence Analysis of Aggregated Heavy-Ball Method // Lecture Notes in Computer Science. 2022. V. 13367. P. 3-17.