top | item 36901298

royal__ | 2 years ago

I would agree with other commenters that fine-tuning is very much not obsolete, and for another important reason: many people and domains do not have the resources, or even the desire, to work with extremely large models like GPT-4. The world outside of OpenAI's monoliths is still very much important.

lmeyerov | 2 years ago

Yep, as soon as queries per second matter, and cost per query... Definitely going through that on some work now, where we use GPT-4 on a subset and GPT-3 or a tuned model on the rest. Only slow user-facing and other low-volume / high-latency traffic goes to GPT-4.
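The routing pattern described above can be sketched roughly as below. This is a minimal illustration, not anyone's actual code: the model names, QPS cutoff, and the `pick_model` helper are all hypothetical, and real routing would also weigh per-request token cost and accuracy requirements.

```python
# Hypothetical sketch of the routing idea: high-volume or
# latency-sensitive paths go to a cheaper tuned model, while
# slow, low-volume paths can afford the large model.

from dataclasses import dataclass


@dataclass
class Request:
    expected_qps: float      # expected query volume on this path
    latency_tolerant: bool   # can this path wait on a slow model?


def pick_model(req: Request, qps_cutoff: float = 1.0) -> str:
    # Only low-volume, latency-tolerant traffic gets the big model;
    # everything else is served by the fine-tuned small model.
    if req.latency_tolerant and req.expected_qps < qps_cutoff:
        return "gpt-4-class"          # expensive, slow, most capable
    return "tuned-small-model"        # cheap, fast, fine-tuned


# High-volume interactive path -> tuned model
print(pick_model(Request(expected_qps=50.0, latency_tolerant=False)))
# Rare background job -> large model
print(pick_model(Request(expected_qps=0.1, latency_tolerant=True)))
```

The point is that the routing decision is made per request path, so the large model's cost and latency are only paid where throughput does not matter.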