Depth Separations in Neural Networks: Separating the Dimension from the Accuracy
We prove an exponential separation between depth 2 and depth 3 neural networks, when approximating an $\mathcal{O}(1)$-Lipschitz target function to constant accuracy, with respect to a distribution with support in $[0,1]^{d}$, assuming exponentially bounded weights. This addresses an open problem posed in \citet{safran2019depth}, and proves that the curse of dimensionality …
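For concreteness, the separation claimed above can be written schematically as follows. This is only a paraphrase of the abstract: the function $f$, the distribution $\mu$, the accuracy threshold $\epsilon$, and the size and weight bounds below are placeholders for the quantities constructed precisely in the paper.

% Schematic form of the depth-2 vs. depth-3 separation (paraphrasing the abstract;
% the symbols below stand in for the quantities defined in the paper).
\begin{quote}
There exist an $\mathcal{O}(1)$-Lipschitz function $f \colon [0,1]^d \to \mathbb{R}$,
a distribution $\mu$ supported on $[0,1]^d$, and a constant $\epsilon > 0$ such that:
\begin{itemize}
  \item some depth-3 network $N_3$ of size $\mathrm{poly}(d)$ satisfies
        $\mathbb{E}_{x \sim \mu}\bigl(N_3(x) - f(x)\bigr)^2 \le \epsilon$, while
  \item every depth-2 network $N_2$ with weight magnitudes at most $\exp(\mathrm{poly}(d))$
        and width smaller than $\exp(\Omega(d))$ satisfies
        $\mathbb{E}_{x \sim \mu}\bigl(N_2(x) - f(x)\bigr)^2 > \epsilon$.
\end{itemize}
\end{quote}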