
joaogui1 | 7 months ago

Mixture of Experts isn't about using multiple models with different specialties; it's more of a sparsity technique: you massively increase the number of parameters but use only a subset of the weights in each forward pass.
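
To make that concrete, here's a minimal numpy sketch of a top-k-routed MoE layer. The names and sizes (d_model, num_experts, top_k), the softmax-over-top-k router, and the ReLU expert MLPs are illustrative assumptions, not any particular model's implementation:

    import numpy as np

    rng = np.random.default_rng(0)

    d_model, d_ff = 16, 32          # hypothetical layer sizes
    num_experts, top_k = 8, 2       # only 2 of the 8 experts run per token

    # Each expert is an independent feed-forward block.
    experts_w1 = rng.normal(size=(num_experts, d_model, d_ff)) * 0.02
    experts_w2 = rng.normal(size=(num_experts, d_ff, d_model)) * 0.02
    # The router scores every expert for every token.
    router_w = rng.normal(size=(d_model, num_experts)) * 0.02

    def moe_forward(x):
        """x: (num_tokens, d_model) -> (num_tokens, d_model).

        Total parameters scale with num_experts, but each token only
        touches top_k experts, so per-token compute stays roughly flat.
        """
        logits = x @ router_w                           # (tokens, experts)
        top = np.argsort(logits, axis=-1)[:, -top_k:]   # indices of top-k experts
        # Softmax over the selected experts' logits only.
        sel = np.take_along_axis(logits, top, axis=-1)
        weights = np.exp(sel - sel.max(-1, keepdims=True))
        weights /= weights.sum(-1, keepdims=True)

        out = np.zeros_like(x)
        for t in range(x.shape[0]):
            for k in range(top_k):
                e = top[t, k]
                h = np.maximum(x[t] @ experts_w1[e], 0.0)   # expert FFN: ReLU MLP
                out[t] += weights[t, k] * (h @ experts_w2[e])
        return out

    tokens = rng.normal(size=(4, d_model))
    print(moe_forward(tokens).shape)  # (4, 16)

Even though the layer stores num_experts full feed-forward blocks, each token is multiplied through only top_k of them, which is why the parameter count can grow much faster than the per-token compute.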
