top | item 43226137

Zamiel_Snawley | 1 year ago

I think this is the best economic function of open source—it forces innovation by elimination of rent seeking.

If you rest on your laurels, someone will make a solution that's at least 70% as good for free.


Philpax | 1 year ago

It's a nice sentiment and I agree, but I'm not sure we'll see this applied to AI going forward. These training runs cost a lot of money - at some point, every player in the game will realise that they need to charge something or they'll be left without enough chips to play in the next round.

DeepSeek's altruism has taken them far, but they have costs, too, and High Flyer / their personal warchest can only take them so far. And that's before any potential government intervention - it's very likely that this will become a natsec concern for all nations involved.

ForTheKidz | 1 year ago

There are models with open training data and code as well as open weights. It's not ChatGPT, but such a trained model can be fully audited and reused without further training. I think for most daily use I'd be fine using a greatly outdated model. Open source has always relied on contributors eating costs—even just the opportunity cost of contributing time.

FLOSS software is slow, much slower than VC-funded explosive growth, but it's hard to compete with in the long term.

tecleandor | 1 year ago

Yeah, well, but you should apply that "altruism" label to OpenAI first. They're the ones burning money like crazy, and they could find themselves at the end of their runway quite quickly. It isn't crazy to think DeepSeek could break even and turn a good profit before OpenAI goes broke, if OpenAI doesn't find another investor with even bigger pockets.

baby_souffle | 1 year ago

I think there's always going to be somebody that doesn't want to bother with the complexities of setting up a model to run locally.

Spending way too much time tracking down one very particular GPU driver version, or something similar, just isn't worth it when you can make an API call to a remote endpoint that's already done the heavy lifting.

Plenty of value in handling the hard part so your customer doesn't have to.

I don't know how much of the current focus on local models comes from privacy concerns, but at least some does. Once there's something like the GDPR but for data provided for inference, I think even more people will put down the Docker containers and pick up the REST endpoints.

wavefunction | 1 year ago

Or they'll pursue regulatory capture; that seems more likely.