In Differentially Private SGD (DP-SGD), Gaussian noise with standard deviation σ is added to local SGD updates, after a clipping operation, in order to provide differential privacy. By non-trivially improving the accountant method we prove a simple and easy-to-evaluate closed-form (ε, δ)-DP guarantee: DP-SGD is (ε, δ)-DP if σ = √(2(ε + ln(1/δ)))/ε and T is at least ≈ 2k²/ε, where T is the total number of rounds and K = kN is the total number of gradient computations, with k measuring K in epochs over a local data set of size N. We prove that our expression is close to tight in that if T is more than a constant factor ≈ 8 smaller than the lower bound ≈ 2k²/ε, then the (ε, δ)-DP guarantee is violated. Choosing the smallest possible value T ≈ 2k²/ε not only leads to a close-to-tight DP guarantee, but also minimizes the total number of communicated updates; this means that the least amount of noise is aggregated into the global model and accuracy is optimized, as confirmed by simulations. In addition, this minimizes round communication.
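As a minimal sketch of the closed-form guarantee stated above, the following assumes the abstract's formulas directly: the noise multiplier σ = √(2(ε + ln(1/δ)))/ε and the approximate lower bound T ≈ 2k²/ε on the number of rounds. The function name and parameters are illustrative, not from the paper.

```python
import math

def dp_sgd_parameters(eps, delta, k):
    """Evaluate the closed-form (eps, delta)-DP guarantee from the abstract.

    Assumptions (illustrative, not the paper's code):
      eps, delta -- target (eps, delta)-DP parameters
      k          -- number of epochs, so K = k*N total gradient computations

    Returns the noise standard deviation sigma and the approximate
    minimum number of rounds T needed for the guarantee to hold.
    """
    # sigma = sqrt(2*(eps + ln(1/delta))) / eps
    sigma = math.sqrt(2 * (eps + math.log(1.0 / delta))) / eps
    # T should be at least ~ 2*k^2/eps
    t_min = 2 * k**2 / eps
    return sigma, t_min

# Example: eps = 1, delta = 1e-5, k = 5 epochs
sigma, t_min = dp_sgd_parameters(1.0, 1e-5, 5)
print(f"sigma ~ {sigma:.3f}, minimum rounds T ~ {t_min:.0f}")
```

Under these assumptions, choosing T at this lower bound both satisfies the DP guarantee (up to the stated constant-factor tightness) and minimizes the number of noisy updates aggregated into the global model.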