just-ok|1 year ago
I’m excited to see models become open, but given the curve of progress we’ve seen, even being “a little” behind is a gap that grows exponentially every day.
crocowhile|1 year ago
Most importantly, this is a signal: OpenAI and Meta are trying to build a moat out of massive hardware investments. DeepSeek took the opposite approach, and it not only shows that hardware is no moat, it basically makes a fool of their multibillion-dollar claims. This is massive. If only investors had the brains it takes, we would pop this bubble already.
diego_sandoval|1 year ago
I mean, sure, no one is going to have a monopoly, and we're going to see a race to the bottom in prices. But on the other hand, the AI revolution is going to come much sooner than expected, and it's going to be in everyone's pocket this year. Isn't that a bullish signal for the economy?
riffraff|1 year ago
If people can replicate 90% of your product in 6 weeks you have competition.
chii|1 year ago
The moat for these big models was always expected to be the capital expenditure for training, costing billions. It's why companies like OpenAI are spending massively on compute: it builds a bigger moat (or tries to, at least).
If it can be shown, as it seems to have been, that you can use smarts to make use of compute more efficiently and cheaply while achieving similar (or even better) results, then the hardware moat buoyed by capital is gone.
I'm actually glad, though. An open-sourced version of these weights should ideally spur the kind of innovation that Stable Diffusion did when its weights were released.
Mond_|1 year ago
And this is based on what exactly? OpenAI hides the reasoning steps, so training a model on o1 is very likely much more expensive (and much less useful) than just training it directly on a cheaper model.
karmasimida|1 year ago
R1's biggest contribution, IMO, is R1-Zero. I am fully sold on the idea that they don't need o1's output to be this good. But yeah, o1 is still the herald.
acchow|1 year ago
This theory has yet to be demonstrated. So far, open source seems to stay consistently about 6-10 months behind.
resters|1 year ago
I thought that too before I used it to do real work.
havkom|1 year ago