lowyek | 1 year ago
Just another thought experiment: sometimes I imagine a neural network as a zip of its training data, where the compression algorithm is backpropagation. Just as we have programs that let us inspect the files inside a zip, I imagine programs that would let us select a particular inference path through the network and see which training data affected it. We could then edit that data to fix our issues, or add more data there, and get live neural network debugging and reprogramming, the same way we edit the contents of a zip.
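Tracing a prediction back to the training data that shaped it is roughly what gradient-based influence methods (e.g. TracIn-style scoring) try to do: a training example's influence on a test example is approximated by the dot product of their loss gradients. Below is a minimal NumPy sketch on a toy linear model; all names and the setup are illustrative, not from any specific library.

```python
import numpy as np

# Toy squared-error model: loss_i = 0.5 * (w @ x_i - y_i)^2.
# TracIn-style influence of training point (x_i, y_i) on test point (x_t, y_t):
#   influence ~= grad_w loss_i . grad_w loss_t
# A large positive score means a gradient step on that training point
# would also reduce the loss on the test point.

def grad(w, x, y):
    # gradient of the squared-error loss for one example
    return (w @ x - y) * x

def influence(w, x_train, y_train, x_test, y_test):
    return grad(w, x_train, y_train) @ grad(w, x_test, y_test)

rng = np.random.default_rng(0)
w = rng.normal(size=3)                      # current model parameters
X = rng.normal(size=(5, 3))                 # 5 training examples
y = X @ np.array([1.0, -2.0, 0.5])          # targets from a "true" model
x_t, y_t = rng.normal(size=3), 0.0          # one test example

# Score every training point against the test point, then rank them:
scores = [influence(w, X[i], y[i], x_t, y_t) for i in range(len(X))]
ranking = np.argsort(scores)[::-1]          # most influential first
```

Real tools along these lines (influence functions, TracIn, representer-point methods) replace the toy gradient with per-example gradients of a deep network, often checkpoint-averaged, but the "which files in the zip produced this output" query is the same shape.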