top | item 42509465

bubaumba | 1 year ago

Absolutely worth doing. Here is an interesting related video on local RAG:

https://www.youtube.com/watch?v=bq1Plo2RhYI

I'm not an expert, but I'll do it for the learning experience, then open source it if it works. As far as I understand, this approach requires a vector database and an LLM, which doesn't have to be big. Technically it can be implemented as a local web server. It should be easy to use: just type a query and get back a list sorted by relevance.
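The retrieval half of that pipeline can be sketched in a few lines. This is a toy illustration, not a real implementation: the `embed` function here is just a bag-of-words counter standing in for a proper sentence-embedding model, and the in-memory list stands in for a vector database. The idea (embed query, score every document by cosine similarity, return results sorted by relevance) is the same.

```python
import math
from collections import Counter

def embed(text):
    # Toy embedding: bag-of-words term counts. A real local RAG setup
    # would use a sentence-embedding model here instead.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query, docs):
    # Score every document against the query, highest similarity first,
    # dropping documents with no overlap at all.
    q = embed(query)
    scored = [(cosine(q, embed(d)), d) for d in docs]
    return [d for score, d in sorted(scored, reverse=True) if score > 0]

docs = [
    "notes on vector databases and embeddings",
    "recipe for sourdough bread",
    "running a small LLM locally",
]
print(search("local LLM", docs))  # the LLM note ranks first
```

Wrapping `search` in a small HTTP handler would give the "type and get a sorted-by-relevance list" local web server described above; the optional LLM only comes in afterwards, to summarize or answer over the retrieved documents.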


Quizzical4230 | 1 year ago

Perfect!

Although, at the moment I am only using retrieval without any LLM involved. I might try integrating one if it significantly improves the UX without compromising speed.