fl0ps | 3 years ago

I'm with you there on hoping for more focus on local training and inference, especially on something like the new Orin (open to donations, NVidia!).

I think there's still a financial disincentive to promote self-hosting over cloud capabilities for just about every party involved except the self-hoster. NVidia loses out if it's selling a few Nano or Orin platforms relative to the much pricier datacenter cards sold in huge lots. The cloud hosters make up the cost of hosting through end-user pricing, snarfing all that lovely customer data and likely selling analytics, and probably other measures I can't understand yet. And the large companies that fund research and initial model development want to know what's being done with their models so as to gain any possible competitive advantage; they can't necessarily guarantee that intel from a self-hoster. Almost nobody is willing to spend the time explicitly making it easier for the individual dev at a small lab or at home to do this, because that's essentially a donation rather than a business expense that might yield obvious returns.
