item 36807794

jasonwcfan | 2 years ago

Yep. We use LangChain's basic text splitter to chunk the documents and the QA chain to stuff it into the prompt. But AFAIK it doesn't check for context length so that's a piece that's still missing.

The upper limit depends on the model; for Llama 2 it's 4k tokens, including the prompt.
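A minimal sketch of the missing check, without depending on LangChain itself: a naive fixed-size splitter (simplified stand-in for LangChain's basic character splitter) plus a crude context-length guard that estimates tokens as characters / 4. The names, the chars-per-token ratio, and the guard itself are assumptions for illustration, not LangChain's API.

```python
def split_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    # Naive fixed-size splitter with overlap -- a simplified stand-in
    # for LangChain's basic character splitter (assumption: no
    # separator-aware splitting, just raw character windows).
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks


def fits_context(prompt: str, chunks: list[str],
                 context_tokens: int = 4096,
                 chars_per_token: int = 4) -> bool:
    # Crude guard: estimate token count as total characters / 4 and
    # compare against the model's window (Llama 2: 4k tokens, prompt
    # included). A real check would use the model's tokenizer.
    total_chars = len(prompt) + sum(len(c) for c in chunks)
    return total_chars / chars_per_token <= context_tokens
```

In practice you'd drop (or summarize) the lowest-ranked chunks until `fits_context` passes before stuffing them into the prompt.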
