top | item 42553926

d3VwsX | 1 year ago

If I understand the distinction correctly, I run llamafile as a backend. I start it with the filename of a model on the command line (it might need a `-m` flag or something), and it starts a chat prompt for interaction in the terminal but also opens a port speaking some protocol that I can connect to with a frontend (in my case, usually gptel in Emacs).
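A minimal sketch of talking to that port from a client, assuming llamafile's server exposes the OpenAI-compatible `/v1/chat/completions` endpoint on the default port 8080 (as the llama.cpp server it bundles does); the helper name and the `"local"` model id are hypothetical, not from the comment:

```python
import json
import urllib.request

def build_chat_request(prompt, host="localhost", port=8080):
    # Assumed endpoint path: llamafile's bundled llama.cpp server
    # serves an OpenAI-compatible chat-completions API here.
    url = f"http://{host}:{port}/v1/chat/completions"
    body = {
        # llamafile serves whatever model it was started with,
        # so the model name here is largely a placeholder.
        "model": "local",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("Hello!")
print(req.full_url)
# Sending it (requires a running llamafile server):
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
```

A frontend like gptel does essentially this under the hood once pointed at the server's host and port.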
