elgatolopez | 2 months ago

Where did you get that from? Cutoff date says august 2025. Looks like a newly pretrained model

FergusArgyll | 2 months ago

> This stands in sharp contrast to rivals: OpenAI’s leading researchers have not completed a successful full-scale pre-training run that was broadly deployed for a new frontier model since GPT-4o in May 2024, highlighting the significant technical hurdle that Google’s TPU fleet has managed to overcome.

- https://newsletter.semianalysis.com/p/tpuv7-google-takes-a-s...

It's also plainly obvious from using it. The "broadly deployed" qualifier is presumably referring to GPT-4.5.

ric2b | 2 months ago

How is that a technical hurdle if they were obviously able to do it before?

It's probably just a question of cost/benefit: it's very expensive to do, so the benefits need to be significant.

SparkyMcUnicorn | 2 months ago

If the pretraining rumors are true, they're probably using continued pretraining on the older weights. Right?