top | item 31550095

canarypilot | 3 years ago

Why would you consider prediction based on dynamic conditions to be the sign of a dystopian optimization cycle? Isn’t it mostly intuitive that interesting program executions are either not things you can determine statically (otherwise your compiler would have cleaned them up for you with inlining etc.), or could be determined statically but only at too great a cost to meet execution deadlines (JITs and so on) or resource constraints (you don’t really want N code clones specialising every branch trace into strictly predictable chains)?

Or is the worry on the other side: that processors have gotten so out-of-order that only huge dedication to guesswork can keep the beast sated? I don’t see this as a million miles from software techniques in JIT compilers that optimistically optimize and later de-optimize when an assumption proves wrong.
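The optimize-then-de-optimize pattern mentioned above can be sketched in a few lines. This is a toy illustration, not how any real JIT is implemented: a "compiled" fast path guards on the type it was profiled against, and falls back to a generic path when the assumption breaks.

```python
# Toy sketch of JIT-style guarded speculation (illustrative only):
# optimistically specialise for the type observed so far, guard the
# assumption, and "deoptimize" to the generic path when it fails.

def generic_add(a, b):
    return a + b  # fully general slow path, handles any types

def make_speculative_add(observed_type):
    def fast_add(a, b):
        # Guard: check the optimistic assumption from profiling.
        if type(a) is observed_type and type(b) is observed_type:
            return a + b          # fast path, assumption holds
        return generic_add(a, b)  # assumption broke: fall back
    return fast_add

add = make_speculative_add(int)
add(2, 3)      # takes the specialised fast path
add("a", "b")  # guard fails, transparently de-optimizes
```

A real JIT does this at the machine-code level (guards compile to a compare-and-branch, and deoptimization rebuilds interpreter state), but the control flow is the same shape.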

I think you might be right to be nervous if you wrote programs that took fairly regular data and did fairly regular things to it. But, as Itanium learned the hard way, programs have much more dynamic, emergent and interesting behaviour than that!

amelius | 3 years ago

I guess the fear is that the CPU might start guessing wrong, causing your program to miss deadlines. Also, the heuristics are practically useless for realtime computing, where timings must be guaranteed.
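One standard answer to the realtime concern (not from this thread, just a common technique) is to rewrite data-dependent branches branchlessly, so the instruction stream no longer depends on the data. A minimal sketch, assuming a simple thresholded sum:

```python
# A data-dependent branch the predictor can guess wrong:
def branchy_sum(xs, threshold):
    total = 0
    for x in xs:
        if x > threshold:  # taken or not depending on the data
            total += x
    return total

# Branchless equivalent: in Python, bool is an int (True == 1),
# so multiplying by the comparison replaces the conditional.
def branchless_sum(xs, threshold):
    total = 0
    for x in xs:
        total += x * (x > threshold)  # same result, no data-dependent jump
    return total

data = [3, 9, 1, 7]
branchy_sum(data, 5) == branchless_sum(data, 5)  # both give 16
```

In CPython the interpreter itself still branches, so this only illustrates the idea; in compiled code the same transformation typically becomes a `cmov` or masked arithmetic, which is why WCET-oriented code favours it.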

nine_k | 3 years ago

I suppose that if you assume in-order execution and count the clock cycles, you should get a guaranteed lower bound of performance. It may be, say, 30-40% of the performance you really observe, but having some headroom should feel good.
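A back-of-envelope version of that headroom argument, with invented numbers (the 15-cycle penalty and 95% hit rate are assumptions, roughly typical of modern cores):

```python
# Pessimistic bound: count cycles as if every branch mispredicts.
branch_count = 1_000
mispredict_penalty = 15   # cycles per miss (assumed)
base_cycles = 10_000      # cycles with every instruction retiring in order

worst_case = base_cycles + branch_count * mispredict_penalty

# What you typically observe: the predictor is right ~95% of the time.
observed = base_cycles + int(branch_count * 0.05 * mispredict_penalty)

# Performance scales as 1/cycles, so the guaranteed floor as a
# fraction of observed performance is observed / worst_case.
floor_fraction = observed / worst_case
print(f"guaranteed floor: {floor_fraction:.0%} of observed performance")
```

With these numbers the floor comes out around 43% of observed performance, the same ballpark as the 30–40% figure above.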