
leogao | 1 year ago

In domains like ML, people care way more about the half precision FLOPs than single precision.
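
(For a rough sense of the gap, here is a minimal benchmark sketch, not anything claimed in the thread: it times a large matmul in fp32 vs fp16 with PyTorch, assuming a CUDA-capable GPU. The absolute numbers and the size of the fp16 advantage depend entirely on the hardware; the advantage mainly shows up on chips with dedicated half-precision units such as tensor cores.)

    import time
    import torch

    def matmul_tflops(dtype, n=4096, reps=10):
        # Time an n x n matmul on the GPU and report throughput in TFLOP/s.
        a = torch.randn(n, n, device="cuda", dtype=dtype)
        b = torch.randn(n, n, device="cuda", dtype=dtype)
        torch.matmul(a, b)                    # warm-up
        torch.cuda.synchronize()
        t0 = time.perf_counter()
        for _ in range(reps):
            torch.matmul(a, b)
        torch.cuda.synchronize()
        elapsed = time.perf_counter() - t0
        return 2 * n**3 * reps / elapsed / 1e12   # ~2*n^3 FLOPs per matmul

    print("fp32:", matmul_tflops(torch.float32))
    print("fp16:", matmul_tflops(torch.float16))  # usually much higher on tensor-core GPUs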

bee_rider | 1 year ago

They don’t have much application outside ML, at least as far as I know. Just call them ML ops, and then they can include things like those funky shared-exponent floating point formats, or stuff with ints.
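
(For concreteness, here is a toy NumPy sketch of what a shared-exponent / block floating-point representation might look like: one exponent shared across a whole block, with each element stored as a small signed integer mantissa. Purely an illustration; real hardware block formats differ in the details.)

    import numpy as np

    def block_quantize(x, mantissa_bits=8):
        # One shared exponent per block, chosen so the largest magnitude fits;
        # every element is then stored as a small signed integer mantissa.
        max_mag = float(np.max(np.abs(x)))
        exp = int(np.ceil(np.log2(max_mag))) if max_mag > 0 else 0
        scale = 2.0 ** (exp - (mantissa_bits - 1))
        lo, hi = -(2 ** (mantissa_bits - 1)), 2 ** (mantissa_bits - 1) - 1
        mantissas = np.clip(np.round(x / scale), lo, hi).astype(np.int8)
        return mantissas, exp

    def block_dequantize(mantissas, exp, mantissa_bits=8):
        return mantissas.astype(np.float32) * 2.0 ** (exp - (mantissa_bits - 1))

    x = np.array([0.12, -3.4, 0.007, 2.1], dtype=np.float32)
    m, e = block_quantize(x)
    print(block_dequantize(m, e))   # close to x, up to shared-exponent rounding error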

Or they could be measured in bits per second.

Actually I’m pretty interested in figuring out if we can use them for numerical linear algebra stuff, but I think it’d take some doing.
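
(One known route, not something proposed in this thread but the standard trick behind half-precision tensor-core solvers: do the expensive factorization/solve in low precision, then clean up the answer with iterative refinement in double precision. A hedged NumPy sketch of the idea, with fp16 rounding standing in for a real low-precision LU:)

    import numpy as np

    def mixed_precision_solve(A, b, iters=5):
        # Stand-in for a cheap low-precision solver: round A to fp16 and solve
        # with that; a real implementation would reuse a half-precision LU.
        A_lo = A.astype(np.float16).astype(np.float64)
        x = np.linalg.solve(A_lo, b)
        for _ in range(iters):
            r = b - A @ x                      # residual in double precision
            x = x + np.linalg.solve(A_lo, r)   # correction from the cheap solver
        return x

    rng = np.random.default_rng(0)
    n = 200
    A = np.eye(n) + rng.standard_normal((n, n)) / n   # keep it well conditioned
    b = rng.standard_normal(n)
    x = mixed_precision_solve(A, b)
    print(np.linalg.norm(A @ x - b))   # should end up near double-precision level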