This is laughable... Neural networks also have essentially unknowable abstractions encoded in their weights. There was some work not long ago that trained an ANN to do modular arithmetic and found that the learned weights were implementing Fourier transforms.
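To illustrate the mechanism that work reported (this is a hand-built sketch with made-up names, not the trained network itself): each input is embedded as cosine/sine features at several frequencies, trig identities combine the two inputs multiplicatively, and a readout correlating against each candidate residue peaks at (a + b) mod p.

```python
import numpy as np

p = 13                       # toy modulus
ks = np.arange(1, p)         # nonzero frequencies
cs = np.arange(p)            # candidate residues

def feats(x):
    # per-input Fourier features, analogous to the learned embeddings
    ang = 2 * np.pi * ks * x / p
    return np.cos(ang), np.sin(ang)

def logits(a, b):
    ca, sa = feats(a)
    cb, sb = feats(b)
    # trig identities combine the two inputs' features multiplicatively:
    cab = ca * cb - sa * sb      # cos(2*pi*k*(a+b)/p)
    sab = sa * cb + ca * sb      # sin(2*pi*k*(a+b)/p)
    ang_c = 2 * np.pi * np.outer(ks, cs) / p
    # readout: correlate with each candidate c; the sum over frequencies
    # of cos(2*pi*k*(a+b-c)/p) peaks exactly at c == (a+b) mod p
    return cab @ np.cos(ang_c) + sab @ np.sin(ang_c)

assert int(np.argmax(logits(7, 9))) == (7 + 9) % p  # 16 mod 13 = 3
```

The point is that "doing a Fourier transform" here is just a compact, trig-identity-friendly way to make modular addition linear in feature space, which is why it is a natural thing for gradient descent to find.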
When a human has done the same thing many times, they tend to generalize and take shortcuts. And make tools. Perhaps I missed something, but I haven't seen a neural net do that.
Is that very different from the distillation-and-amplification process that happens during training, where the neural net learns to predict in one step what initially required several steps of iterated execution?
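A toy linear analogy for that "compress several steps into one" idea (an illustration only, not the actual amplification/distillation training loop): if one step of the slow procedure is a fixed linear map, the distilled operator that reproduces k steps in a single application is just the k-th matrix power.

```python
import numpy as np

rng = np.random.default_rng(0)
step = rng.normal(size=(4, 4)) / 4   # one "step" of the iterated computation
k = 5                                # number of steps the slow procedure runs

def iterate(x, n=k):
    # the slow procedure: apply the step n times in sequence
    for _ in range(n):
        x = step @ x
    return x

# the "distilled" operator: the same input-to-output map in one application
distilled = np.linalg.matrix_power(step, k)

x = rng.normal(size=4)
assert np.allclose(iterate(x), distilled @ x)
```

In the real setting the student network is trained on input/output pairs produced by running the teacher for several steps, but the shape of the shortcut is the same.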
skepticATX|2 years ago
It’s just that usually these abstractions are fuzzy and hard to formalize, so they aren’t shared. It doesn’t mean that they don’t exist.
sdenton4|2 years ago
amelius|2 years ago
mitthrowaway2|2 years ago
littlestymaar|2 years ago