top | item 19713200


acdc4life | 6 years ago

"No matter what the function, there is guaranteed to be a neural network so that for every possible input, x , the value f(x) or some close approximation) is output from the network"

Okay, so what? You require more and more neurons (i.e. parameters) to approximate your function better and better. You can do the same with piecewise constant functions (Riemann sums). You can do it with trig functions too (Fourier series).
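The piecewise-constant point can be made concrete in a few lines. This is just an illustrative sketch (the function, interval, and piece counts are arbitrary choices): the max error of a midpoint step-function approximation shrinks as you add pieces, exactly the "more parameters, better fit" behavior the comment describes.

```python
import math

def piecewise_const(f, a, b, n):
    """Step-function (Riemann-style) approximation of f on [a, b] with n pieces."""
    w = (b - a) / n
    # sample f at the midpoint of each piece
    heights = [f(a + (i + 0.5) * w) for i in range(n)]
    def g(x):
        i = min(int((x - a) / w), n - 1)  # clamp b into the last piece
        return heights[i]
    return g

def max_error(f, g, a, b, samples=1000):
    """Largest |f - g| over an evenly spaced sample grid."""
    pts = (a + (b - a) * k / samples for k in range(samples + 1))
    return max(abs(f(x) - g(x)) for x in pts)

# error drops as the number of pieces (parameters) grows
for n in (4, 16, 64):
    g = piecewise_const(math.sin, 0.0, math.pi, n)
    print(n, max_error(math.sin, g, 0.0, math.pi))
```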

"This result tells us that neural networks have a kind of universality."

I don't know what this statement means. What mathematical property do neural networks have that other function classes don't? The ability to approximate continuous functions isn't special. Given enough parameters, I can fit an elephant to your data. And it's not like you are fitting the function with as few parameters as possible.


dual_basis | 6 years ago

It's a baseline desirable property. It tells us that, at the very least, neural networks are capable of approximating any continuous function. This isn't true for linear functions, for example, so we wouldn't want to try to model everything using linear functions.
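The standard universality argument can even be carried out by hand, with no training at all: a very steep sigmoid acts like a step function, and a sum of weighted steps approximates any continuous function on an interval. The sketch below builds such a one-hidden-layer "network" directly (the steepness and grid size are illustrative choices, not tuned values). No purely linear model could do this, since sums of linear functions are linear.

```python
import math

def sigmoid(z):
    """Logistic sigmoid, clamped to avoid math.exp overflow for large |z|."""
    if z > 30:
        return 1.0
    if z < -30:
        return 0.0
    return 1.0 / (1.0 + math.exp(-z))

def build_net(f, n, steep=1000.0):
    """One hidden layer of n steep sigmoids, hand-wired to approximate f on [0, 1].

    Hidden unit j switches on near x = j/n; its output weight is the
    increment of f across that grid cell, so the sum telescopes to f.
    """
    xs = [i / n for i in range(n + 1)]
    steps = [(xs[j], f(xs[j + 1]) - f(xs[j])) for j in range(n)]
    def net(x):
        return f(0.0) + sum(h * sigmoid(steep * (x - b)) for b, h in steps)
    return net

# approximate sin(pi * x) on [0, 1] with 50 hidden units
target = lambda x: math.sin(math.pi * x)
net = build_net(target, 50)
err = max(abs(net(k / 500) - target(k / 500)) for k in range(501))
print("max error:", err)
```

With more hidden units the increments shrink, so the worst-case error goes down, which is the whole content of the universality claim: capacity to approximate, not efficiency at it.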