matusp | 1 month ago
If this was the case, we would see R&D costs dropping for OpenAI. Not sure if that is the case.
Lerc | 1 month ago
They do still train models from scratch, and they are still making larger models. You would expect that to use a lot more resources.

It seems that the GPT-5.x models are all likely extensions of the GPT-5 base model, with similar numbers of parameters. The money spent on extending base models would be dwarfed by the scale increase of the next major version.