Fast and Easy Infinitely Wide Networks with Neural Tangents (ai.googleblog.com)
15 points | rainboiboi | 6 years ago | 3 comments
dgreensp | 6 years ago:
Can someone explain the implications of this for performance, or what we can do now that we couldn’t do before?
    perl4ever | 6 years ago:
    Well, they seem to be claiming it allows insight or understanding into how the networks work. However, they don't seem to demonstrate that as such.
rllin | 6 years ago:
It really looks more and more like JAX is the internal winner after the tf 2.0 fiasco
miscPerson | 6 years ago: [deleted]
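
On dgreensp's question about what the library actually enables: the linked post's pitch is that Neural Tangents gives you the closed-form NNGP and NTK kernels of an infinitely wide network directly, so you can do exact kernel-based inference instead of training a wide finite network. Below is a minimal sketch of that workflow, assuming the stax-style API the post describes; the architecture, data shapes, and variable names are illustrative placeholders, not code from the post or the thread.

from jax import random
import neural_tangents as nt
from neural_tangents import stax

# A small fully-connected architecture. stax.serial returns
# (init_fn, apply_fn, kernel_fn); kernel_fn is the analytic kernel of the
# corresponding infinitely wide network.
init_fn, apply_fn, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(1),
)

# Placeholder data (shapes are illustrative only).
k1, k2, k3 = random.split(random.PRNGKey(0), 3)
x_train = random.normal(k1, (20, 10))
y_train = random.normal(k2, (20, 1))
x_test  = random.normal(k3, (5, 10))

# The infinite-width NTK between two batches of inputs, in closed form.
ntk_train_train = kernel_fn(x_train, x_train, 'ntk')

# Exact infinite-width predictions on test points: 'nngp' corresponds to
# Bayesian inference with the NNGP kernel, 'ntk' to the outcome of training
# the infinitely wide network with gradient descent on MSE loss.
predict_fn = nt.predict.gradient_descent_mse_ensemble(kernel_fn, x_train, y_train)
y_test_ntk = predict_fn(x_test=x_test, get='ntk')
y_test_nngp = predict_fn(x_test=x_test, get='nngp')

In this reading, the "fast and easy" claim is that the kernel computations above replace any explicit training loop for the wide network.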