burmako's comments

burmako | 3 years ago | on: OpenXLA Is Available Now

If you mean StableHLO, then it has an MLIR dialect: https://github.com/openxla/stablehlo/blob/main/stablehlo/dia....

In the StableHLO spec, we talk about this in more abstract terms, as the "StableHLO opset", so that we can reason unambiguously about the semantics of StableHLO programs. In practice, however, the StableHLO dialect is currently the primary implementation of the opset.

I wrote "primary implementation" because there are others, e.g. ongoing work on adding StableHLO support to the TFLite flatbuffer schema: https://github.com/tensorflow/tensorflow/blob/master/tensorf.... Having an abstract notion of the StableHLO opset gives us a single source of truth that every implementation must conform to.
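To give a concrete sense of what the dialect looks like, here's a tiny StableHLO program in MLIR textual form (a sketch; the function name, shapes, and choice of op are just illustrative):

```mlir
// Element-wise addition of two 4-element f32 tensors, expressed with
// the stablehlo.add op from the StableHLO dialect.
func.func @add(%arg0: tensor<4xf32>, %arg1: tensor<4xf32>) -> tensor<4xf32> {
  %0 = stablehlo.add %arg0, %arg1 : tensor<4xf32>
  return %0 : tensor<4xf32>
}
```

Any implementation of the opset, whether the MLIR dialect or a flatbuffer encoding, is expected to agree with the spec on what this program means.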

burmako | 3 years ago | on: OpenXLA Is Available Now

StableHLO has a serialization format based on MLIR bytecode. https://github.com/openxla/stablehlo/blob/main/docs/bytecode... goes into the details of reading and writing portable artifacts for StableHLO programs and the associated compatibility guarantees.
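As a rough sketch of that workflow using the stablehlo-translate tool (the exact flag names and version string here are assumptions based on the linked docs, so check them before relying on this):

```shell
# Serialize a StableHLO program into a portable artifact, pinning it to a
# specific StableHLO version for forward/backward compatibility.
# (Flag names and target version are assumptions from the bytecode docs.)
stablehlo-translate --serialize my_program.mlir --target=0.9.0 > my_artifact.mlir.bc

# Later, possibly with a newer build of stablehlo-translate, read the
# artifact back into MLIR textual form.
stablehlo-translate --deserialize my_artifact.mlir.bc
```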

I'd also like to comment on our (StableHLO's) relationship with related work. StableHLO was a natural choice for the OpenXLA project, because a very similar operation set, HLO, powers many of its key components. That said, I'd also like to give a shout-out to related opsets in the ML community, including MIL, ONNX, TFLite, TOSA and WebNN.

Bootstrapping from HLO made a lot of sense to get things going, but that's just a starting point. There are many great ideas out there, and we're looking to evolve StableHLO beyond its roots. For example, we want to provide functionality to represent dynamism, quantization and sparsity, and there's so much to learn from related work.

We'd love to collaborate, and from the StableHLO side we can offer production-grade lowerings from TensorFlow, JAX and PyTorch, as well as compatibility with OpenXLA. Some of these connections in the ML ecosystem have already started growing organically, and we're super excited about that.

burmako | 3 years ago | on: OpenXLA Is Available Now

A few months ago, the WebML working group kindly invited us to one of their meetings to present StableHLO. Here are the slides and the meeting minutes: https://www.w3.org/2022/11/17-webmachinelearning-minutes.htm....

Also, OpenXLA is one of the external organizations in the Coordination section of the working group charter: https://w3c.github.io/machine-learning-charter/charter.html. We're looking forward to collaborating with WebML folks!