Approximation Error and Complexity Bounds for ReLU Networks on
Low-Regular Function Spaces
In this work, we consider the approximation by ReLU neural networks of a large class of bounded functions under minimal regularity assumptions. We show that the approximation error can be bounded from above by a quantity proportional to the uniform norm of the target function and inversely proportional to the …
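To illustrate the setting (not the paper's construction), the following minimal sketch approximates a bounded, low-regularity target in the uniform norm with a one-hidden-layer ReLU network that realizes piecewise-linear interpolation on [0, 1]; the target function, the knot placement, and the grid used to estimate the sup-norm error are illustrative assumptions.

```python
# A minimal numerical sketch (not the paper's construction): approximate a
# bounded, low-regularity target in the uniform norm with a shallow ReLU
# network realizing piecewise-linear interpolation on [0, 1].
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def relu_interpolant(f, n):
    """One-hidden-layer ReLU network interpolating f at n+1 equispaced knots."""
    knots = np.linspace(0.0, 1.0, n + 1)
    vals = f(knots)
    slopes = np.diff(vals) / np.diff(knots)       # slope on each sub-interval
    coeffs = np.diff(slopes, prepend=slopes[0])   # slope changes = ReLU weights
    coeffs[0] = slopes[0]
    def net(x):
        # network output: f(0) + sum_k coeffs[k] * relu(x - knots[k])
        return vals[0] + relu(x[:, None] - knots[:-1][None, :]) @ coeffs
    return net

# Illustrative target: bounded but only Hoelder-1/2 regular around x = 0.5.
f = lambda x: np.sqrt(np.abs(x - 0.5))
grid = np.linspace(0.0, 1.0, 100_001)

for n in (8, 32, 128, 512):
    err = np.max(np.abs(relu_interpolant(f, n)(grid) - f(grid)))  # uniform-norm error
    print(f"hidden units = {n:4d}   sup-norm error = {err:.4f}")
```

Running the sketch shows the empirical sup-norm error decaying as the number of hidden units grows, which is the kind of quantitative relationship the abstract's bound formalizes.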