2026-03-01
Subgradient methods for nonsmooth convex functions with adversarial errors
Publication
ACM SIGMETRICS Performance Evaluation Review, Volume 53, Issue 4, p. 109–110
We consider minimizing nonsmooth convex functions with bounded subgradients. However, instead of directly observing a subgradient at every step, the optimizer receives an adversarially corrupted subgradient. The adversary's power is limited to a finite corruption budget, with which the adversary can strategically time their perturbations. We show that the averaged subgradient descent method, which is optimal in the noiseless case, has worst-case performance that deteriorates quadratically with the corruption budget. Using performance optimization programming, (i) we construct and analyze the performance of three novel subgradient descent methods, and (ii) we propose a novel lower bound on the worst-case suboptimality gap of any first-order method satisfying a mild cone condition proposed by [5]. The worst-case performance of each of our methods degrades only linearly with the corruption budget. Furthermore, we prove that all three proposed subgradient descent methods are near-optimal and suffer a worst-case suboptimality gap which asymptotically matches our lower bound. Our subgradient descent methods achieve near-optimal performance without needing momentum or averaging. This suggests that these techniques are not necessary in this context, which is in line with recent results by [9].
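To make the setting concrete, the following is a minimal sketch of averaged subgradient descent with an optional adversarially corrupted oracle. It is an illustration of the general scheme discussed in the abstract, not the paper's specific methods: the step size, the objective `f(x) = |x|`, and the `corrupt` hook are all assumptions chosen for the example.

```python
def averaged_subgradient_descent(subgrad, x0, step, n_steps, corrupt=None):
    """Run constant-step subgradient descent and return the averaged iterate.

    subgrad(x) returns a subgradient of the objective at x. If `corrupt`
    is given, it maps (step index, subgradient) to a perturbed subgradient,
    modelling an adversary that times its (budget-limited) corruptions.
    """
    x = x0
    total = 0.0
    for t in range(n_steps):
        g = subgrad(x)
        if corrupt is not None:
            g = corrupt(t, g)  # adversary may perturb the observed subgradient
        x = x - step * g       # subgradient update
        total += x             # accumulate iterates for averaging
    return total / n_steps

# Example objective: f(x) = |x| with subgradient sign(x); minimizer x* = 0.
sign = lambda x: (x > 0) - (x < 0)
x_bar = averaged_subgradient_descent(sign, x0=1.0, step=0.05, n_steps=200)
```

In the uncorrupted run above, the averaged iterate `x_bar` ends up close to the minimizer; supplying a `corrupt` hook lets one probe how a timed perturbation budget degrades that guarantee.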
| Additional Metadata | |
|---|---|
| DOI | doi.org/10.1145/3797823.3797856 |
| Journal | ACM SIGMETRICS Performance Evaluation Review |
| Organisation | Centrum Wiskunde & Informatica, Amsterdam (CWI), The Netherlands |
| Citation | Gösgens, M., & van Parys, B. (2026). Subgradient methods for nonsmooth convex functions with adversarial errors. ACM SIGMETRICS Performance Evaluation Review, 53(4), 109–110. doi:10.1145/3797823.3797856 |