Quantitative approximation results for complex-valued neural networks
We show that complex-valued neural networks with the modReLU activation function σ(z) = ReLU(|z| - 1) · z / |z| can uniformly approximate complex-valued functions of regularity C^n on compact subsets of ℂ^d, giving explicit bounds on the approximation rate.
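The modReLU activation applies a real ReLU to the modulus while preserving the phase of the input. A minimal NumPy sketch (the function name `modrelu` and the bias parameter `b`, defaulting to −1 to match the formula above, are illustrative assumptions):

```python
import numpy as np

def modrelu(z, b=-1.0):
    """modReLU: ReLU(|z| + b) * z/|z|; with b = -1 this is ReLU(|z| - 1) * z/|z|.

    The output is zero wherever |z| + b <= 0 (including z = 0, which also
    avoids division by zero), and otherwise shrinks the modulus by |b|
    while keeping the phase z/|z| unchanged.
    """
    z = np.asarray(z, dtype=complex)
    r = np.abs(z)
    out = np.zeros_like(z)
    mask = r + b > 0
    out[mask] = (r[mask] + b) * z[mask] / r[mask]
    return out
```

For example, an input of modulus 2 is mapped to the same phase with modulus 1, while any input inside the unit disk is mapped to 0.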