top | item 44670005

keminghe | 7 months ago

You're absolutely right that the root cause is outdated AI knowledge bases/training data. I agree that my solution doesn't address that directly.

Where this actually shines is with local LLMs (Ollama, etc.): smaller models, no API costs, fully offline, and the AI gets fresh docs without waiting months for model retraining cycles. Your point about convincing major providers to integrate something like Dash (https://kapeli.com/dash) would definitely be the ideal solution, though.

I definitely hear you on the broader ecosystem approach. Anything you've been working on in the same space?


imcritic | 7 months ago

We are just trying to adopt an LLM to answer users' questions based on our internal KB/wiki, that's all.
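For that internal-KB use case, a common pattern is retrieval-augmented generation: look up the most relevant wiki pages for a question, then hand them to the model as context. The sketch below shows only the retrieval-and-prompt half with a crude word-overlap ranker; the wiki pages, page names, and the final model call are hypothetical stand-ins (a local model could consume the prompt, e.g. via Ollama's HTTP API, which is omitted since it needs a running server).

```python
# Minimal retrieval-augmented sketch for answering questions from an
# internal wiki. WIKI and its page names are made-up examples.
import re
from collections import Counter

WIKI = {
    "vpn-setup": "To connect to the VPN, install the client and use your LDAP login.",
    "expense-policy": "Expenses under fifty dollars need no receipt; larger ones require approval.",
}

def tokenize(text):
    # Lowercase alphabetic word extraction; good enough for a demo ranker.
    return re.findall(r"[a-z]+", text.lower())

def retrieve(question, pages, k=1):
    """Rank pages by how often their words appear in the question."""
    q = Counter(tokenize(question))
    scored = sorted(
        pages.items(),
        key=lambda kv: sum(q[w] for w in tokenize(kv[1])),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question, pages, k=1):
    # Concatenate the top-k pages and instruct the model to stay grounded.
    context = "\n".join(text for _, text in retrieve(question, pages, k))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# The resulting prompt would then be sent to a local LLM; in production
# you would swap the word-overlap ranker for embeddings + a vector store.
```

This keeps the model's answers tied to your KB without any retraining, which is exactly the freshness problem discussed above.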