We give bounds on the additive gap between the value of a random integer program $\max c^\mathsf{T} x,\ Ax \leq b,\ x \in \{0,1\}^n$ with $m$ constraints and that of its linear programming relaxation, for a range of distributions on $(A,b,c)$. Dyer and Frieze (MOR '89) and Borst et al. (IPCO '21) showed, for random packing and Gaussian IPs respectively, where the entries of $A,c$ are independently distributed according to either the uniform distribution on $[0,1]$ or $\mathcal{N}(0,1)$, that the integrality gap is bounded by $O_m(s \log^2 n / n)$ with probability at least $1-1/n-e^{-\Omega_m(s)}$ for $s \geq 1$. In this paper, we extend these results to the case where $A$ is discretely distributed (e.g., with entries in $\{-1,0,1\}$), and to the case where the columns of $A$ have logconcave distributions. Second, we improve the success probability from constant, for fixed $s$ and $m$, to $1-1/\mathrm{poly}(n)$. Via a connection between integrality gaps and Branch-and-Bound due to Dey, Dubey and Molinaro (SODA '21), our gap results imply that Branch-and-Bound runs in polynomial time on these IPs.

Our main technical contribution, and the key to achieving the above results, is a new discrepancy-theoretic theorem giving general conditions under which a target $t$ is equal, or very close, to a $\{0,1\}$ combination of the columns of a random matrix $A$. Compared to prior results, our theorem handles a much wider range of distributions on $A$, both continuous and discrete, and achieves success probability exponentially close to $1$ rather than constant. We prove it using a Fourier-analytic approach, building on the work of Hoberg and Rothvoss (SODA '19) and Franks and Saks (RSA '20), who studied similar questions for $\{-1,1\}$ combinations.
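To make the additive integrality gap concrete, the following is a minimal pure-Python sketch (illustrative only, not the paper's method) for the single-constraint packing case $m=1$ with uniform $[0,1]$ data: the LP relaxation is a fractional knapsack solved greedily, the IP is solved by brute force over $\{0,1\}^n$, and the gap is their difference, which is always nonnegative.

```python
import random
from itertools import product

def lp_value(c, a, b):
    """Greedy fractional knapsack: max c.x s.t. a.x <= b, 0 <= x <= 1."""
    order = sorted(range(len(c)), key=lambda i: c[i] / a[i], reverse=True)
    cap, val = b, 0.0
    for i in order:
        take = min(1.0, cap / a[i])  # fill items by value/weight ratio
        if take <= 0:
            break
        val += take * c[i]
        cap -= take * a[i]
    return val

def ip_value(c, a, b):
    """Brute-force max c.x s.t. a.x <= b, x in {0,1}^n (small n only)."""
    best = 0.0
    for x in product((0, 1), repeat=len(c)):
        if sum(xi * ai for xi, ai in zip(x, a)) <= b:
            best = max(best, sum(xi * ci for xi, ci in zip(x, c)))
    return best

random.seed(0)
n = 12
a = [random.random() for _ in range(n)]  # entries of A ~ uniform [0,1]
c = [random.random() for _ in range(n)]
b = n / 4  # capacity proportional to n, as in the packing model
gap = lp_value(c, a, b) - ip_value(c, a, b)
print(f"additive integrality gap: {gap:.4f}")  # nonnegative
```

Every IP-feasible point is LP-feasible, so the gap is nonnegative; the paper's results concern how quickly it shrinks (roughly as $\log^2 n / n$ for fixed $m$) as $n$ grows.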

Networks and Optimization

Borst, S.J., Dadush, D.N., & Mikulincer, D. (2022). Integrality gaps for random integer programs via discrepancy.