sroecker | 2 years ago

We just held a workshop about this a few weeks ago: https://red.ht/llmappdev We created a simple chatbot using local models with Ollama (llamacpp), LlamaIndex and Streamlit. Have a look at the streamlit folder; it's super easy.
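The retrieve-then-generate flow behind such a chatbot can be sketched without any of those libraries. This is a toy illustration of the RAG pattern only, not the workshop's actual LlamaIndex code: the documents, the word-overlap retriever, and the prompt template are all invented for the example (real pipelines use vector embeddings for retrieval).

```python
# Toy sketch of RAG: retrieve relevant context, then build a prompt for the
# LLM. Real stacks (e.g. LlamaIndex + Ollama) use embedding similarity
# instead of this naive word-overlap scoring.

DOCUMENTS = [
    "Ollama runs local LLMs such as Llama 2 on your own machine.",
    "Streamlit turns Python scripts into shareable web apps.",
    "LlamaIndex connects LLMs to external data for retrieval-augmented generation.",
]

def retrieve(question: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank documents by naive word overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question: str, context: list[str]) -> str:
    """Assemble a system-prompted request that pins the model to the context."""
    ctx = "\n".join(context)
    return (
        "Answer ONLY from the context below. If the answer is not there, say so.\n"
        f"Context:\n{ctx}\n\nQuestion: {question}"
    )

context = retrieve("What does Streamlit do?", DOCUMENTS)
prompt = build_prompt("What does Streamlit do?", context)
print(context[0])  # Streamlit turns Python scripts into shareable web apps.
```

The "answer ONLY from the context" instruction is the system-prompt lever the workshop teaches about: it is also exactly what prompt injection tries to override.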

I used this simple example to teach about RAG, the importance of the system prompt and prompt injection. The notebook folder has a few more examples, local models can even do natural language SQL querying now.
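Natural-language SQL querying boils down to the model emitting a SQL string that the app then executes against the database. Here is a minimal self-contained illustration using stdlib sqlite3, where the schema and the "model output" are hard-coded assumptions for the example (a real pipeline, such as LlamaIndex's SQL query engine over a local model, would show the LLM the schema and have it generate the query):

```python
import sqlite3

# Set up a tiny in-memory database standing in for the user's data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cities (name TEXT, population INTEGER)")
conn.executemany(
    "INSERT INTO cities VALUES (?, ?)",
    [("Berlin", 3_700_000), ("Hamburg", 1_900_000), ("Munich", 1_500_000)],
)

question = "Which city has the largest population?"

# The kind of SQL an LLM might generate for the question above; hard-coded
# here so the sketch runs without a model.
llm_generated_sql = "SELECT name FROM cities ORDER BY population DESC LIMIT 1"

result = conn.execute(llm_generated_sql).fetchone()
print(result[0])  # Berlin
```

Executing model-generated SQL is exactly where the prompt-injection concerns above bite hardest; restricting the connection to read-only queries is a common mitigation.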

3abiton|2 years ago

LlamaIndex has so much potential. Any benchmarks on performance compared to fine-tuning?

agilob|2 years ago

Looks very promising. Do you plan to keep this single repo up to date as new things are released?

sroecker|2 years ago

Good question, as you can see I haven't touched it for a month. I wanted to show what was possible at the time with open source and (open) local models, and there's already so much new stuff out there.

I'll probably fix some things this week and then either update it or start from scratch. Guided generation, structured extraction, function calling and multi-modal support are things I wanted to add, and Chainlit looks interesting.