MiniMax M2.1

5 points| scottyeager | 2 months ago |minimaxi.com

4 comments

SlavikCA|2 months ago

The Hugging Face link is published but not working yet: https://huggingface.co/MiniMaxAI/MiniMax-M2.1

Looks like this has 10 billion active parameters out of 230 billion total.

So this is the biggest open model that can be run on your own hardware at somewhat decent speed. I'm getting 16 t/s on my Intel Xeon W5-3425 / DDR5-4800 / RTX 4090D 48GB.
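A rough sanity check on that throughput: MoE decode is approximately memory-bandwidth-bound, since each generated token has to read the active parameters once. The sketch below is a back-of-envelope estimate under assumed numbers (4-bit quantization and ~100 GB/s effective bandwidth for a mixed RAM/VRAM offload setup are my guesses, not figures from the thread):

```python
# Back-of-envelope decode-speed estimate for a MoE model.
# Decode is roughly memory-bandwidth-bound: every token reads
# the active parameters once from RAM/VRAM.

def est_tokens_per_sec(active_params: float,
                       bytes_per_param: float,
                       bandwidth_gb_s: float) -> float:
    """Upper-bound decode throughput in tokens/second."""
    bytes_per_token = active_params * bytes_per_param
    return bandwidth_gb_s * 1e9 / bytes_per_token

if __name__ == "__main__":
    # Assumptions: 10e9 active params (from the model card figure
    # cited above), 4-bit quant (0.5 bytes/param), ~100 GB/s
    # effective bandwidth (hypothetical for a CPU+GPU split).
    print(f"~{est_tokens_per_sec(10e9, 0.5, 100):.0f} t/s")
```

Under those assumptions the ceiling comes out around 20 t/s, which is in the same ballpark as the 16 t/s reported above; the 230B total parameters matter mainly for memory capacity, not per-token speed.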

And looking at the benchmark scores, it's not far from SOTA (it matches or exceeds Claude Sonnet 4.5).

XCSme|2 months ago

Only 10b active params and close to SOTA?

scottyeager|2 months ago

"Significantly Enhanced Multi-Language Programming, Built for Real-World Complex Tasks"