Differentially Private Stochastic Optimization with Heavy-tailed Data:
Towards Optimal Rates
We study convex optimization problems under differential privacy (DP). In the presence of heavy-tailed gradients, existing works achieve only suboptimal rates. The main obstacle is that existing gradient estimators have suboptimal tail properties, which introduce a superfluous factor of $d$ in the union bound. In this paper, we explore algorithms achieving optimal rates of …