item 40401035


yatz | 1 year ago

I believe it will get better and more efficient as we go. On a side note, OpenAI seems to release products before they are ready and they evolve as they go.


nl | 1 year ago

> I believe it will get better and more efficient as we go.

Yes of course. The point remains: the LLM has to process the data somehow.

If you are concerned about costs and token usage, then switch to a provider that works for your problem (Gemini Flash looks very interesting).