(no title)
saghul | 1 year ago
According to the docs, it only uses that endpoint if the model name doesn't begin with gpt-, in which case it falls back to an old API, so alas it doesn't seem compatible with ollama.
Edit: here is the relevant part of the code: https://github.com/gnachman/iTerm2/blob/a196d31658a8d0aa2dc5...
Looks like adding ollama support is really a matter of changing a few lines of code.
Edit 2: Seems to be fixed in the next beta release: https://github.com/gnachman/iTerm2/commit/fcd212490626f1d8ea...
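For context, the model-name check I'm describing is roughly this (a Python sketch of iTerm2's Swift logic; the function name and paths are mine, not from the source):

```python
def endpoint_for_model(model: str) -> str:
    """Pick an API endpoint based on the model name prefix,
    mirroring the prefix check in iTerm2's AI integration."""
    if model.startswith("gpt-"):
        # Modern chat completions API (what ollama also exposes)
        return "/v1/chat/completions"
    # Legacy completions API, which ollama doesn't implement
    return "/v1/completions"

print(endpoint_for_model("gpt-4"))   # /v1/chat/completions
print(endpoint_for_model("llama2"))  # /v1/completions
```

So a non-gpt model name like "llama2" gets routed to the legacy endpoint, which is why it breaks with ollama.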