1 - Yes, our current solution does require you to be allowed to use ChatGPT/OpenAI. Unfortunately, accuracy with smaller models (even GPT-3.5) is poor. We don't see a local model (which would be much worse than GPT-3.5), even with fine-tuning, coming anywhere close to good enough (and fine-tuning would also require a really large number of queries). So we are relying on GPT-4 for now.

2 - Agreed, the background isn't why anyone should adopt a tool; we just wanted to share our story. I would add that creating a good wrapper can actually be quite challenging: you need to synthesize many pieces under constraints like memory, compute, speed, and accuracy.