mlnomadpy | 6 months ago

they are one of the reasons neural networks are black boxes: we lose information about the data manifold the deeper we go into the network, making it impossible to trace the output back to the input
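a minimal sketch of that point (my own illustration, not from the preprint): a ReLU layer is non-invertible, so two distinct inputs can produce identical activations, and from that point on the network literally cannot recover which input it saw

```python
import numpy as np

def relu_layer(W, x):
    """One linear layer followed by ReLU."""
    return np.maximum(W @ x, 0.0)

# Toy weight matrix and two distinct inputs.
W = np.array([[1.0, -1.0]])
x1 = np.array([0.0, 1.0])
x2 = np.array([0.0, 2.0])

h1 = relu_layer(W, x1)  # W @ x1 = [-1.0] -> relu -> [0.0]
h2 = relu_layer(W, x2)  # W @ x2 = [-2.0] -> relu -> [0.0]

# Different inputs, identical activations: information about
# where on the input manifold we started is gone after this layer.
print(h1, h2)  # both [0.]
```

stack a few of these and the preimage of any given activation grows, which is one concrete sense in which you "can't trace back the output"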

this preprint isn't coming from the standpoint of optimizing inference/compute, but from trying to build models that we can interpret and control in the future
