crdrost|5 years ago
That is genuinely a really fun way to look at it, thank you for linking this! It fits especially nicely with Markov matrices, where you have N input nodes and N output nodes and the probabilities on the arrows leaving any one node must sum to 1.
What I might find a little more difficult to teach to people through this lens is the phenomenon of eigenvectors, but I suppose that's to be expected; nothing will work well for all purposes.
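To make the Markov-matrix point concrete, here is a minimal NumPy sketch (the 3-state matrix is made up for illustration): each column lists the arrows leaving one node, so each column sums to 1, and pushing a distribution through the network preserves total probability.

```python
import numpy as np

# Column-stochastic Markov matrix on 3 states: P[i, j] is the "arrow"
# from node j to node i, weighted by the transition probability.
# Each column sums to 1, so all probability leaving a node is accounted for.
P = np.array([
    [0.9, 0.2, 0.0],
    [0.1, 0.7, 0.3],
    [0.0, 0.1, 0.7],
])
assert np.allclose(P.sum(axis=0), 1.0)

# Pushing a probability distribution through the network keeps total mass at 1.
p = np.array([1.0, 0.0, 0.0])   # start with all mass on node 0
for _ in range(5):
    p = P @ p
print(p.sum())  # still 1.0
```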
layoutIfNeeded|5 years ago
> What I might find a little more difficult to teach to people through this lens is the phenomenon of eigenvectors

Why? Eigenvectors are simply inputs to the network for which the output keeps its shape: at most it gets rescaled, as if you had applied a uniform gain to every component, but otherwise it is identical to the input.
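That "uniform gain" picture can be checked in a few lines of NumPy (the 2x2 matrix is an arbitrary example): feed each eigenvector through the network and the output is the same vector, scaled by its eigenvalue.

```python
import numpy as np

# An arbitrary 2x2 example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Each eigenvector, fed through the "network", comes out with its shape
# unchanged: the output is the input rescaled by a uniform gain (the eigenvalue).
eigenvalues, eigenvectors = np.linalg.eig(A)
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)   # same direction, gain lam
```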
ssivark|5 years ago
Was just coming in to point to this. While the geometric intuition in the post being discussed is quite visual, thinking of matrix elements as directed arrows generalizes easily to high-dimensional vector spaces, makes it very easy to understand the behavior of sparse transformation matrices, and motivates a nice correspondence between linear algebra and graph algorithms (cf. GraphBLAS).
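A small sketch of that linear-algebra/graph correspondence (the 4-node graph is made up): a sparse matrix is just an adjacency list of arrows, and one matrix-vector product advances a frontier one step along them, which is the idea GraphBLAS builds on.

```python
import numpy as np

# Adjacency matrix of a small directed graph: A[i, j] = 1 means an arrow
# from node j to node i (columns are sources, rows are destinations).
A = np.array([
    [0, 0, 0, 1],
    [1, 0, 0, 0],
    [1, 1, 0, 0],
    [0, 0, 1, 0],
])

# One matrix-vector product pushes a "frontier" one step along the arrows,
# i.e. a single BFS step expressed as linear algebra.
frontier = np.array([1, 0, 0, 0])   # start at node 0
reached = A @ frontier              # nodes one arrow away from node 0
print(np.nonzero(reached)[0])       # -> nodes 1 and 2
```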
This seems more accurate than the original post, which is quite tied to its code/data analogy.
bee_rider|5 years ago
OTOH, it is just vectors in a many-dimensional space. We're nigh-supernatural machines for describing vectors (that's how we throw things better than any other animal), so I don't really know why we need analogies here. Why do we want x'x? It maps to a distance...
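The x'x-as-distance remark in two lines, using an arbitrary example vector: the dot product of a vector with itself is its squared Euclidean length.

```python
import numpy as np

x = np.array([3.0, 4.0])     # arbitrary example vector
print(x @ x)                 # x'x = 25.0, the squared length
print(np.sqrt(x @ x))        # 5.0, the Euclidean distance from the origin
```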