Definitely give them a go; we use fine-tuned ada a lot for classification work, for example. I personally think the smaller models are overlooked and don't get enough love. If OpenAI increased the context window of a model like babbage to 8k tokens, I feel that would be as big a deal as a marginal improvement to davinci, purely because so many use cases depend on low-latency models that can serve a high volume of requests.
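For context, a minimal sketch of what that classification setup looked like: the legacy OpenAI fine-tuning flow took JSONL records of prompt/completion pairs, with a fixed separator ending each prompt and a short label as the completion. The example texts, labels, and separator below are assumptions for illustration, not anyone's actual training data.

```python
import json

# Hypothetical labeled examples; real training sets would be much larger.
examples = [
    ("I loved this product, works great", "positive"),
    ("Arrived broken and support ignored me", "negative"),
]

def to_finetune_jsonl(examples):
    """Render (text, label) pairs as JSONL lines in the legacy
    prompt/completion format used for classification fine-tunes."""
    lines = []
    for text, label in examples:
        record = {
            "prompt": f"{text} ->",      # fixed separator marks end of input
            "completion": f" {label}",   # leading space, short label completion
        }
        lines.append(json.dumps(record))
    return "\n".join(lines)

print(to_finetune_jsonl(examples))
```

At inference time you'd send the same `" ->"` separator and cap generation at a token or two, which is part of why a small model like ada stays fast and cheap for this kind of workload.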
Buoy|3 years ago