
Mistral AI announces 7B v0.2 base model release

35 points | nefitty | 1 year ago | twitter.com

2 comments


jerpint | 1 year ago

Any news on how this model will compare to Mixtral? Interesting that they aren't releasing an MoE model this time, given the success Mixtral had.

Reubend | 1 year ago

Not yet, but I'm sure they will release some benchmarks soon. As for it not being an MoE model, there's still a ton of value in having a small non-MoE model for many use cases, and improvements discovered while training the small model can potentially feed into the next version of the MoE model down the line.