top | item 36868941

nomand | 2 years ago

Is it possible for such a local install to retain conversation history? For example, if you're using it as your assistant on a project across many days, can you continue conversations so the model keeps track of what you and it already know?

simonw | 2 years ago

My LLM command line tool can do that - it logs everything to a SQLite database and has an option to continue a conversation: https://llm.datasette.io
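A minimal sketch of the pattern simonw describes (this is not llm's actual schema, which is documented at https://llm.datasette.io): log every exchange to SQLite, and continue a conversation by replaying the stored history as context.

```python
import sqlite3

# In-memory for the sketch; a real tool would use a file on disk.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE IF NOT EXISTS messages (
    conversation_id TEXT, role TEXT, content TEXT)""")

def log(conversation_id, role, content):
    # Persist one side of an exchange so it survives across sessions.
    db.execute("INSERT INTO messages VALUES (?, ?, ?)",
               (conversation_id, role, content))
    db.commit()

def history(conversation_id):
    # Rebuild the context to prepend to the next prompt from
    # everything logged under this conversation so far.
    rows = db.execute(
        "SELECT role, content FROM messages WHERE conversation_id = ?",
        (conversation_id,))
    return "\n".join(f"{role}: {content}" for role, content in rows)

log("proj-1", "user", "What does this function do?")
log("proj-1", "assistant", "It parses the config file.")
print(history("proj-1"))
```

Continuing a conversation then just means fetching `history(conversation_id)` and feeding it back to the model ahead of the new question.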

jmiskovic | 2 years ago

There is no fully built solution, only bits and pieces. I've noticed that llama outputs tend to degrade as the amount of text grows: the output becomes too repetitive and narrowly focused, and you have to raise the temperature to break the model out of loops.
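Why raising the temperature helps: sampling divides the model's logits by the temperature before the softmax, so a higher temperature flattens the distribution and gives tokens other than the repetitive favourite a real chance. A toy sketch (the logit values are made up for illustration):

```python
import math
import random

def softmax(logits, temperature):
    # Divide logits by temperature, then normalise. Higher temperature
    # flattens the distribution; lower temperature sharpens it.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample(logits, temperature):
    # Draw one token index according to the tempered distribution.
    probs = softmax(logits, temperature)
    return random.choices(range(len(logits)), weights=probs)[0]

# Toy logits where token 0 dominates (the "loop" token).
logits = [4.0, 1.0, 0.5]
print(softmax(logits, 0.5))  # token 0 almost certain
print(softmax(logits, 2.0))  # alternatives become plausible
```

At temperature 0.5 the dominant token is picked nearly every time, which is how loops form; at 2.0 the tail tokens regain enough probability to break the pattern.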

nomand | 2 years ago

Does that mean you can only ask questions and get answers in a single step, and that a long discussion, where the output is refined through back-and-forth conversation, isn't possible?

knodi123 | 2 years ago

llama is just an input/output engine. It takes a big string as input and gives a big string as output.

Save your outputs if you want, you can copy/paste them into any editor. Or make a shell script that mirrors outputs to a file and use that as your main interface. It's up to the user.
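The "big string in, big string out" loop above can be sketched as follows. `generate()` here is a placeholder for an actual llama call; the point is only that a "conversation" is just the accumulated transcript fed back in as the next prompt:

```python
def generate(prompt):
    # Placeholder: a real version would invoke llama.cpp or similar
    # with `prompt` and return the model's completion.
    return f"(model reply given {len(prompt)} chars of context)"

def chat_turn(transcript, user_message):
    # Prepend the whole history so the model "remembers" earlier turns.
    prompt = transcript + f"User: {user_message}\nAssistant:"
    reply = generate(prompt)
    # Append both sides so the next turn sees the full conversation.
    return prompt + " " + reply + "\n"

transcript = ""
transcript = chat_turn(transcript, "Summarise this file.")
transcript = chat_turn(transcript, "Now shorten it.")
print(transcript)
```

This is also why long conversations eventually degrade or hit the context limit: the transcript grows with every turn, and the model only ever sees whatever fits in its context window.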