top | item 38049205

mvcalder | 2 years ago

Whenever this topic comes up I like to provide a citation to some work I've done:

https://towardsdatascience.com/gradient-kernel-regression-e4...

Not out of vanity (ok, a little) but because I think the idea has an importance that has not been fully explored. The article's Bayesian perspective may be the whole story, but somehow I don't think so. Unlike the article's author, I came away from my work feeling that model architecture was the most important thing (after training data), whereas they seem to treat it as ancillary.
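For anyone who doesn't want to click through: as I understand the linked article, "gradient kernel regression" treats the gradient of an untrained network's output with respect to its parameters as a feature map, builds the Gram matrix from those gradients, and runs ordinary kernel ridge regression on it (essentially the empirical neural tangent kernel at initialization). A minimal sketch of that idea, with a toy tanh net and made-up sizes, not the article's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(d_in, d_hidden):
    # Random, *untrained* one-hidden-layer network with scalar output.
    return {"W1": rng.normal(size=(d_hidden, d_in)) / np.sqrt(d_in),
            "W2": rng.normal(size=(d_hidden,)) / np.sqrt(d_hidden)}

def grad_features(params, x):
    # Flattened gradient of the scalar output w.r.t. all parameters.
    h = np.tanh(params["W1"] @ x)              # hidden activations
    g_W2 = h                                   # d out / d W2
    g_W1 = np.outer(params["W2"] * (1 - h**2), x)  # d out / d W1
    return np.concatenate([g_W1.ravel(), g_W2])

# Toy 1-D regression problem.
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0])

params = init_params(d_in=1, d_hidden=64)
G = np.stack([grad_features(params, x) for x in X])  # n x p gradient features
K = G @ G.T                                          # "gradient kernel" Gram matrix

lam = 1e-3
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)  # kernel ridge regression

def predict(x_new):
    k = G @ grad_features(params, x_new)  # kernel row against training set
    return k @ alpha

print(predict(np.array([0.5])))  # compare against np.sin(0.5)
```

The striking part is that the network is never trained: all the predictive power comes from the kernel induced by the architecture's gradients, which is why the result bears on the architecture-vs-training debate.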

sdenton4 | 2 years ago

It's data, then loss, and then, finally, architecture. And including some additional conditioning or metadata to help prediction will often have higher value than an architecture change...

pictureofabear | 2 years ago

Can you explain this idea a little further? Or do you know of some further reading on this topic?