top | item 37033979

hpcjoe | 2 years ago

Catastrophic loss of precision is, as the name implies, catastrophic in the context of the calculation. For scientific/engineering codes, and anything that needs to preserve resolution to function correctly, FP32 is rarely sufficient; FP64 is usually better. For ML/AI apps, resolution isn't nearly as important.
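A minimal sketch of the kind of cancellation being described, using only Python's stdlib (`struct`'s `'f'` format to round values to FP32; Python's native floats are FP64). The helper name `f32` is just an illustration, not anything from a particular codebase:

```python
import struct

def f32(x):
    # Round a Python float (FP64) to the nearest FP32 value.
    return struct.unpack('f', struct.pack('f', x))[0]

a = 1e8

# FP64: 1e8 + 1 is exactly representable, so the difference survives.
diff64 = (a + 1.0) - a        # 1.0

# FP32: the spacing between floats near 1e8 is 8, so adding 1 is lost
# entirely and the subtraction returns 0 instead of 1.
diff32 = f32(f32(a + 1.0)) - f32(a)   # 0.0

print(diff64, diff32)
```

The FP32 result isn't just less accurate, it's completely wrong (100% relative error), which is what makes this failure mode "catastrophic" rather than a gradual loss of digits.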

tails4e | 2 years ago

Absolutely. If anything, ML is migrating to lower precision types, like fp16, mx9, mx6, int8, etc.
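A quick stdlib sketch of why that's tolerable for ML: fp16 carries only about 3 decimal digits of precision, which is plenty for a network weight but would be ruinous in a scientific code. This uses `struct`'s half-precision `'e'` format; the `f16` helper name is just illustrative:

```python
import struct

def f16(x):
    # Round a Python float (FP64) to the nearest fp16 value.
    return struct.unpack('e', struct.pack('e', x))[0]

w = 0.7231                 # a typical network-weight-sized value
w_half = f16(w)

# fp16 spacing in [0.5, 1) is 2**-11, so the stored value is off by
# a relative error on the order of 1e-4 -- harmless for a weight.
print(w_half, abs(w_half - w) / w)
```

The training dynamics of neural nets are robust to this level of noise in individual parameters, which is why formats like fp16 and int8 (and block-scaled formats like the MX family) are viable there while FP64 remains the default for simulation codes.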