
My Private LLM Setup at Home, Confidentially Accessible on Mobile

4 points | thethindev | 2 years ago | thin.computer

3 comments

thethindev | 2 years ago
I set up a simple homelab on a MacBook Air and added Ollama + Open WebUI. Now I have an LLM I can talk to whenever I'm out.

I'm using llama2 and llama2-uncensored, and I'm going to download llava and mixtral later. Either way, it's been a great experience.
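For anyone wanting to try something similar, here's a rough sketch of the pieces described above. Assumptions on my part: Homebrew for installing Ollama, Docker for running Open WebUI, and the port numbers are the defaults for each tool; the original post doesn't say how the setup is exposed to mobile, so that part is left out.

```shell
# Install and start Ollama (the local LLM runtime mentioned in the post).
brew install ollama
ollama serve &   # serves an API on http://localhost:11434 by default

# Pull the models named in the post.
ollama pull llama2
ollama pull llama2-uncensored

# Quick sanity check from the terminal.
ollama run llama2 "Say hello in one sentence."

# Run Open WebUI (the chat front end) in Docker, pointed at the local
# Ollama API. The UI is then reachable at http://localhost:3000.
docker run -d -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui ghcr.io/open-webui/open-webui:main
```

For remote access from a phone you'd still need something like a VPN or tunnel in front of this; the post doesn't specify which.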

adrimubo96 | 2 years ago
This is exactly the type of setup I'm interested in replicating on my end. Thanks a lot for the post!

Are you planning on fine-tuning the LLM further for your own needs? Have you thought about that?