asenna | 2 years ago

Dolphin-mixtral is incredible for its size. But I'm curious: have you tried Goliath-120b or the new `Mixtral_34Bx2_MoE_60B`? (It's named Mixtral, but the base is actually Yi.)

Goliath is too big for my system, but Mixtral_34Bx2_MoE_60B[1] is giving me some really good results.
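
For anyone who wants to try it: below is a minimal sketch of running the GGUF quant from [1] locally with llama-cpp-python. The filename, context size, and GPU-offload count are assumptions; pick whichever .gguf quant fits your RAM/VRAM.

    # Minimal sketch, assuming `pip install llama-cpp-python`
    # and a quant downloaded from [1]. The filename is
    # illustrative; use the .gguf file that fits your hardware.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./mixtral_34bx2_moe_60b.Q4_K_M.gguf",  # hypothetical filename
        n_ctx=4096,       # context window
        n_gpu_layers=20,  # layers offloaded to GPU; tune for your VRAM
    )

    out = llm(
        "Q: Summarize mixture-of-experts in one sentence. A:",
        max_tokens=128,
        stop=["Q:"],
    )
    print(out["choices"][0]["text"])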

PSA to anyone who does not understand what we're talking about: I was new to all of this until two weeks ago as well. If you want to get up to speed with the incredible innovation and home-tinkering happening with LLMs, you have to check out https://www.reddit.com/r/LocalLLaMA/

I believe we should be at GPT-4 levels of intelligence locally sometime later this year (possibly with the release of Llama 3 or an open Mistral Medium model).

[1] - https://huggingface.co/TheBloke/Mixtral_34Bx2_MoE_60B-GGUF
