Exactly, like how a bitcoin-mining ASIC implements SHA-256 directly in hardware.
Thank you. Think we've hit the level of my "run away scared at the first sight of machine code" understanding, but I now vaguely understand what's going on.
lsb|1 year ago
And CPUs and GPUs hide memory latency in different ways: CPUs lean on deep cache hierarchies, while GPUs lean on huge numbers of threads plus raw memory bandwidth, so the two make different tradeoffs between bandwidth and latency.
chimpansteve|1 year ago