top | item 40278264

tiomat | 1 year ago

Interesting how long it will take to get this “assistant” functionality on locally running models, e.g. with llama. It seems like many use cases require access to user data.


timm37 | 1 year ago

Running it locally would be impossible, as you would need your computer running 24/7 (since the widget is available on your website, which is always online).

But building a simple RAG/Vector DB on top of llama3 is not very complicated.
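To illustrate the point, here is a minimal sketch of that kind of RAG setup, using only the standard library: a toy bag-of-words "embedding" stands in for a real embedding model, and the retrieved chunks are pasted into a prompt that would then be sent to llama3 (e.g. via llama.cpp or Ollama — those integration details are assumptions, not shown here). The `VectorDB` class and `build_prompt` helper are hypothetical names for illustration.

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy embedding: lower-cased word counts. A real pipeline would
    # use a sentence-embedding model here instead.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class VectorDB:
    """A tiny in-memory vector store: add documents, search by similarity."""

    def __init__(self):
        self.docs = []  # list of (embedding, text) pairs

    def add(self, text: str):
        self.docs.append((embed(text), text))

    def search(self, query: str, k: int = 2):
        q = embed(query)
        ranked = sorted(self.docs, key=lambda d: cosine(q, d[0]), reverse=True)
        return [text for _, text in ranked[:k]]

def build_prompt(query: str, db: VectorDB) -> str:
    # Retrieval-augmented prompt: the top-k chunks become context for
    # the LLM call (the llama3 call itself is out of scope here).
    context = "\n".join(db.search(query))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"

db = VectorDB()
db.add("Our refund policy allows returns within 30 days.")
db.add("Support is available Monday to Friday, 9am-5pm.")
db.add("Shipping takes 3-5 business days within the EU.")

print(build_prompt("When can I get a refund?", db))
```

The same shape carries over to a real stack: swap `embed` for an embedding model, `VectorDB` for an actual vector database, and send the prompt to a local llama3 instance.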