dmakian | 2 years ago
A lot of why I tried this out was to test the limits of this belief; you see a lot of talk like this out there, and it sounded like nonsense to me.
Finetuning is fundamentally not much different from continued pretraining; if you feed the model high-quality, high-volume data, I think it's reasonable to expect it to acquire new skills.
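To make the point concrete, here's a minimal toy sketch (not any real LLM stack; the model, data, and hyperparameters are made up for illustration): "pretraining" and "finetuning" call the exact same training loop, and the only thing that changes is the data you feed in (and maybe the learning rate).

```python
# Hypothetical toy example: a 1-D linear model trained with SGD.
# The point is that both phases reuse the same train() function --
# finetuning is just continued training on different data.

def sgd_step(w, x, y, lr):
    """One SGD step on y ~ w * x with squared loss."""
    grad = 2 * (w * x - y) * x
    return w - lr * grad

def train(w, dataset, lr=0.01, epochs=50):
    """Generic training loop, shared by both phases."""
    for _ in range(epochs):
        for x, y in dataset:
            w = sgd_step(w, x, y, lr)
    return w

# "Pretraining": broad data drawn from y = 2x.
pretrain_data = [(x, 2.0 * x) for x in (1.0, 2.0, 3.0)]
w = train(0.0, pretrain_data)

# "Finetuning": continued training on new, targeted data (y = 3x),
# typically with a smaller learning rate.
finetune_data = [(x, 3.0 * x) for x in (1.0, 2.0)]
w = train(w, finetune_data, lr=0.005)
```

After the first phase `w` sits near 2.0; the second phase pulls it toward 3.0 with no change to the training machinery, which is the sense in which finetuning is "just" continued pretraining on new data.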