catgary | 5 months ago

I agree with you that writing kernels isn’t necessarily the most important thing for most ML devs. I think an MLIR-first workflow with robust support for the StableHLO and LinAlg dialects is the best path forward for ML/array programming, so on one hand I do applaud what Mojo is doing.

But I’m much more interested in how MLIR opens the door to “JAX in <x>”. I think Julia is moving in that direction with Reactant.jl, and I think there’s a Rust project doing something similar (I think burn.dev may be using ONNX as an even higher-level IR). In my ideal world, I would be able to write an ML model and training loop in some highly verified language and call it from Python/Rust for training.
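To make the “MLIR-first” idea concrete: JAX already exposes the StableHLO MLIR it lowers a function to, which is the kind of portable IR the comment is describing. A minimal sketch (assuming JAX is installed; `predict` and the array shapes are just illustrative):

```python
# Inspect the StableHLO MLIR that JAX emits when lowering a function.
import jax
import jax.numpy as jnp

def predict(w, x):
    # A tiny "model": a linear layer followed by tanh.
    return jnp.tanh(x @ w)

# Lower the jitted function for concrete input shapes/dtypes.
lowered = jax.jit(predict).lower(jnp.ones((4, 8)), jnp.ones((2, 4)))

# Ask for the StableHLO dialect; the result is an MLIR module.
mlir_text = str(lowered.compiler_ir(dialect="stablehlo"))
print(mlir_text)  # contains ops such as stablehlo.dot_general, stablehlo.tanh
```

Any frontend that can emit this same StableHLO module (Julia via Reactant.jl, a hypothetical verified language, etc.) can hand it to the same compiler backends, which is what makes the “write the model in <x>, train from Python” workflow plausible.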
