My Private LLM Setup at Home, Confidentially Accessible on Mobile (thin.computer)
4 points | thethindev | 2 years ago | 3 comments
thethindev | 2 years ago
I set up a simple homelab on a MacBook Air and added Ollama + Open WebUI. Now I have an LLM I can talk to whenever I'm out. I'm using llama2 and llama2-uncensored, but I'm going to download llava and mixtral later. Either way, it's been a great experience.
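For anyone curious how "talking to it" works under the hood: Open WebUI fronts Ollama's local HTTP API. Below is a minimal sketch of querying that API directly with only the standard library. The host, port, and `/api/generate` endpoint are Ollama's documented defaults, but treat the details as assumptions to verify against your own install; the prompt and model name are just placeholders.

```python
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # Ollama's default listen address


def build_generate_request(model: str, prompt: str, host: str = OLLAMA_HOST):
    """Build the URL and JSON payload for Ollama's /api/generate endpoint."""
    url = f"{host}/api/generate"
    # stream=False asks for one complete JSON reply instead of chunked lines.
    payload = {"model": model, "prompt": prompt, "stream": False}
    return url, payload


def ask(model: str, prompt: str) -> str:
    """Send a single prompt to a local Ollama server and return its reply."""
    url, payload = build_generate_request(model, prompt)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (requires `ollama serve` running and the model already pulled):
#   print(ask("llama2", "Why is the sky blue?"))
```

To reach this from a phone while out, people typically put the homelab behind a VPN such as Tailscale or WireGuard rather than exposing the port directly; the post doesn't say which approach thethindev uses.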
adrimubo96 | 2 years ago
This is exactly the type of setup I'm interested in replicating on my end. Thanks a lot for the post! Are you planning on fine-tuning the LLM further for your own needs? Have you thought about that?
thethindev | 2 years ago
> This is exactly the type of setup I'm interested in replicating on my end. Thanks a lot for the post!
My pleasure!
> Are you planning on fine-tuning the LLM further for your own needs? Have you thought about that?
Initially, I wasn't thinking about it, but it looks easy enough [1][2]!
[1] https://iwasnothing.medium.com/llm-fine-tuning-with-macbook-...
[2] https://github.com/ml-explore
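For context on [2]: ml-explore is Apple's MLX project, whose LoRA fine-tuning example trains on JSONL files where each record carries a single "text" field. The thread doesn't show any of this, so the data shape and the `mlx_lm.lora` invocation below are assumptions drawn from the mlx-examples repo; the Q/A pairs are made-up placeholders.

```python
import json
from pathlib import Path


def write_train_jsonl(pairs, path):
    """Write (question, answer) pairs as JSONL records with one "text"
    field per line, the shape used by MLX's LoRA fine-tuning example
    (an assumption; check the repo's data/ folder for the exact format)."""
    with open(path, "w", encoding="utf-8") as f:
        for question, answer in pairs:
            record = {"text": f"Q: {question}\nA: {answer}"}
            f.write(json.dumps(record) + "\n")


# Hypothetical personal Q/A pairs, purely illustrative.
pairs = [
    ("Which models are installed?", "llama2 and llama2-uncensored"),
    ("Which UI fronts the server?", "Open WebUI"),
]
write_train_jsonl(pairs, "train.jsonl")

# Then, on Apple silicon (assumed CLI from the mlx-lm package):
#   python -m mlx_lm.lora --model <hf-model-id> --train --data .
```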