Faster Rates for Federated Variational Inequalities

In this paper, we study federated optimization for solving stochastic variational inequalities (VIs), a problem that has attracted increasing attention in recent years. Despite significant progress, a large gap remains between the best known convergence rates for federated VIs and the state-of-the-art rates established for federated convex minimization. We address this limitation by establishing a series of improved convergence rates. First, we show that, for smooth VIs with monotone operators, the classical Local Extra SGD algorithm admits stronger guarantees under a refined analysis. Next, we identify an inherent limitation of Local Extra SGD: its local updates can cause excessive client drift. Motivated by this observation, we propose a new algorithm, the Local Inexact Proximal Point Algorithm with an Extra Step (LIPPAX), and show that it reduces client drift and achieves improved guarantees in several settings, including the bounded-Hessian, bounded-operator, and low-variance settings. Finally, we extend our results to composite VIs and establish improved convergence guarantees there as well.
† Georgia Institute of Technology
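
To make the local-update scheme discussed above concrete, the following is a minimal sketch of a local stochastic extragradient loop in the spirit of Local Extra SGD. The function name `local_extra_sgd`, the oracle interface, and the bilinear test problem are illustrative assumptions rather than the paper's implementation; the sketch only shows how each client's local extrapolation/update steps interleave with server averaging, and how heterogeneous client operators induce the client drift noted in the abstract.

```python
import numpy as np

def local_extra_sgd(oracles, x0, step, num_rounds, local_steps, rng):
    """Sketch of local stochastic extragradient (hypothetical interface).

    oracles : list of callables F_m(z, rng) returning a stochastic estimate
              of client m's operator at the point z.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(num_rounds):
        local_iterates = []
        for F in oracles:                      # each client starts the round at the server point
            z = x.copy()
            for _ in range(local_steps):       # K local extragradient steps
                z_half = z - step * F(z, rng)  # extrapolation (look-ahead) step
                z = z - step * F(z_half, rng)  # update with the look-ahead operator value
            local_iterates.append(z)           # heterogeneous F_m => iterates drift apart
        x = np.mean(local_iterates, axis=0)    # server averages the client iterates
    return x

# Toy heterogeneous problem (assumed for illustration): client m holds the
# bilinear saddle point min_u max_v u^T B_m v, whose monotone VI operator
# is F_m(u, v) = (B_m v, -B_m^T u).
rng = np.random.default_rng(0)
d = 2
Bs = [rng.standard_normal((d, d)) for _ in range(4)]  # heterogeneous client matrices

def make_oracle(B, noise=0.1):
    def F(z, rng):
        u, v = z[:d], z[d:]
        g = np.concatenate([B @ v, -B.T @ u])          # exact bilinear VI operator
        return g + noise * rng.standard_normal(2 * d)  # additive stochastic noise
    return F

z_out = local_extra_sgd([make_oracle(B) for B in Bs], np.ones(2 * d),
                        step=0.05, num_rounds=200, local_steps=5, rng=rng)
print(z_out)  # should approach a noise-dominated neighborhood of the solution z* = 0
```

With a constant step size, the averaged iterate settles into a neighborhood of the solution whose radius is governed by the noise level and by the drift accumulated over the K local steps; shrinking either one tightens the estimate, which is the behavior the abstract's improved rates and the LIPPAX drift-reduction argument concern.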


