asenna | 2 years ago
Goliath is too big for my system but Mixtral_34Bx2_MoE_60B[1] is giving me some really good results.
PSA to anyone who does not understand what we're talking about: I was new to all of this until two weeks ago as well. If you want to get up to speed with the incredible innovation and home tinkering happening with LLMs, you have to check out https://www.reddit.com/r/LocalLLaMA/
I believe we should reach GPT-4 levels of intelligence locally sometime later this year (possibly with the release of Llama 3 or an open Mistral Medium model).
[1] - https://huggingface.co/TheBloke/Mixtral_34Bx2_MoE_60B-GGUF