K. Antonakopoulos, E. V. Belmega, and P. Mertikopoulos. An adaptive Mirror-Prox method for variational inequalities with singular operators. In NeurIPS '19: Proceedings of the 33rd International Conference on Neural Information Processing Systems, 2019.
Lipschitz continuity is a central requirement for achieving the optimal $\mathcal{O}(1/T)$ rate of convergence in monotone, deterministic variational inequalities (a setting that includes convex minimization, convex-concave optimization, nonatomic games, and many other problems). However, in many cases of practical interest, the operator defining the variational inequality may exhibit singularities at the boundary of the feasible region, thereby precluding the use of fast gradient methods that attain this optimal rate (such as Nemirovski's mirror-prox algorithm and its variants). To address this issue, we propose a regularity condition, which we call Bregman continuity, that relates the variation of the operator to that of a suitably chosen Bregman function. Leveraging this condition, we derive an adaptive mirror-prox algorithm that attains the optimal $\mathcal{O}(1/T)$ rate of convergence in problems with possibly singular operators, without any prior knowledge of the degree of smoothness (the Bregman analogue of the Lipschitz constant). We also present an extension of our algorithm to stochastic variational inequalities, where it achieves an $\mathcal{O}(1/\sqrt{T})$ convergence rate.
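To make the mirror-prox template concrete: given a Bregman divergence $D(x', x) = h(x') - h(x) - \langle \nabla h(x), x' - x \rangle$ induced by a regularizer $h$, each iteration of Nemirovski's scheme takes an extrapolation step followed by an update step,
$$X_{t+1/2} = P_{X_t}\big(\gamma_t A(X_t)\big), \qquad X_{t+1} = P_{X_t}\big(\gamma_t A(X_{t+1/2})\big),$$
where $P_x(v) = \arg\min_{x' \in \mathcal{X}} \{ \langle v, x' \rangle + D(x', x) \}$ is the prox-mapping. The sketch below instantiates this with the entropic regularizer on the simplex for a bilinear (matrix-game) operator; the step-size rule that shrinks with the accumulated operator variation is a hypothetical stand-in for the paper's adaptive scheme, included only for illustration.

```python
import numpy as np

def entropic_prox(x, v, step):
    """Bregman prox-map for the entropic regularizer on the simplex:
    argmin_z { step * <v, z> + KL(z || x) }, computed in closed form as a
    multiplicative update followed by renormalization."""
    w = -step * v
    w -= w.max()              # shift for numerical stability
    z = x * np.exp(w)
    return z / z.sum()

def mirror_prox_matrix_game(M, T=5000, gamma0=1.0):
    """Mirror-prox for the saddle point of min_x max_y <x, M y> over a product
    of simplices; the associated monotone operator is A(x, y) = (M y, -M^T x).
    The step size shrinks with the accumulated operator variation -- a
    hypothetical adaptive rule used here for illustration, not the paper's."""
    m, n = M.shape
    x, y = np.full(m, 1.0 / m), np.full(n, 1.0 / n)
    x_avg, y_avg = np.zeros(m), np.zeros(n)
    var_sum = 1.0                         # accumulated squared variation
    for _ in range(T):
        gamma = gamma0 / np.sqrt(var_sum)
        gx, gy = M @ y, -M.T @ x          # operator at the base point
        xh = entropic_prox(x, gx, gamma)  # extrapolation (leading) step
        yh = entropic_prox(y, gy, gamma)
        gxh, gyh = M @ yh, -M.T @ xh      # operator at the leading point
        x = entropic_prox(x, gxh, gamma)  # update step from the base point
        y = entropic_prox(y, gyh, gamma)
        var_sum += np.sum((gxh - gx) ** 2) + np.sum((gyh - gy) ** 2)
        x_avg += xh
        y_avg += yh
    return x_avg / T, y_avg / T           # ergodic averages

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 5))
    x, y = mirror_prox_matrix_game(M)
    # Duality gap of the averaged iterates; it should be close to zero.
    gap = (M.T @ x).max() - (M @ y).min()
    print(f"duality gap: {gap:.4f}")
```

Under the usual Lipschitz-continuity assumption, the ergodic average of the leading points converges at the optimal $\mathcal{O}(1/T)$ rate; the paper's contribution is to recover this rate under the weaker Bregman-continuity condition, which tolerates operators that blow up at the boundary of the feasible region.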