acdc4life | 6 years ago
Okay, so what? You need more and more neurons (i.e., parameters) to approximate your function better and better. You can do the same with piecewise-constant functions (Riemann sums), or with trig functions (Fourier series).
"This result tells us that neural networks have a kind of universality."
I don't know what this statement means. What mathematical properties do neural networks have that other function classes don't? The ability to approximate continuous functions isn't special. Given 5 points, I can fit an elephant to your function perfectly. And it's not as if you're fitting the function with as few parameters as possible.
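The piecewise-constant point is easy to demonstrate: accuracy improves simply by adding pieces, exactly as it does by adding neurons. A minimal sketch (the target function sin and the grid sizes are arbitrary choices, not from the article):

```python
import numpy as np

def piecewise_constant_approx(f, a, b, n_pieces, x):
    """Riemann-sum-style approximation of f on [a, b]:
    constant on each of n_pieces equal subintervals, using the
    value of f at the midpoint of the interval containing x."""
    edges = np.linspace(a, b, n_pieces + 1)
    # Index of the subinterval each x falls into (clipped at the ends).
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_pieces - 1)
    midpoints = (edges[:-1] + edges[1:]) / 2
    return f(midpoints[idx])

f = np.sin
x = np.linspace(0, np.pi, 10_000)
# Sup-norm error shrinks as the number of pieces (parameters) grows.
errors = {n: np.max(np.abs(f(x) - piecewise_constant_approx(f, 0, np.pi, n, x)))
          for n in (4, 16, 64)}
print(errors)
```

More pieces, smaller error, with no neural network in sight; the same "more parameters, better approximation" behavior the article attributes to hidden neurons.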
dual_basis | 6 years ago