I'm impressed by how much the runtime is optimized across so many layers — it's rare to see an interpreted language pushed this far without a JIT. Do you see this approach eventually rivaling JIT performance for real-world workloads, especially where predictability matters?