top | item 46998966


kaveh_h | 17 days ago

Have you tried providing multiple pages at a time to the model? It might do better at transcription since it has a bigger context to work with.
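The multi-page idea above can be sketched as simple batching: group consecutive pages and send each group in one request. This is a minimal illustration, assuming a hypothetical `transcribe_batch` callable standing in for whatever vision-model API is actually used; only the batching logic is shown.

```python
def batch_pages(pages, batch_size=4):
    """Split a list of pages into consecutive batches of batch_size."""
    return [pages[i:i + batch_size] for i in range(0, len(pages), batch_size)]

def transcribe_document(pages, transcribe_batch, batch_size=4):
    """Transcribe a document, sending batch_size pages per model call.

    transcribe_batch is a hypothetical stand-in for a real API call
    that accepts a list of pages and returns a list of transcriptions.
    """
    out = []
    for batch in batch_pages(pages, batch_size):
        # One model call per batch, so the model sees several
        # pages of surrounding context at once.
        out.extend(transcribe_batch(batch))
    return out

# Example with a dummy transcriber in place of a real model:
pages = [f"page-{i}" for i in range(1, 10)]
dummy = lambda batch: [p.upper() for p in batch]
result = transcribe_document(pages, dummy, batch_size=4)
```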


netdur | 17 days ago

Gemini 3's long context is not as good as Gemini 2.5's.

ody4242 | 17 days ago

I'm 100% sure that all providers are playing with quantization, KV cache, and other parameters of the models to be able to serve the demand. One of the biggest advantages of running a local model is that you get predictable behavior.