erwald | 19 days ago
I've only seen information suggesting that you can run inference with Ascends, which is obviously a very different thing. The source you link also just says: "The latest model was developed using domestically manufactured chips for inference, including Huawei's flagship Ascend chip and products from leading industry players such as Moore Threads, Cambricon and Kunlunxin, according to the statement."
cherryteastain | 19 days ago
Note that Z.ai also publicly announced that they trained another model, GLM-Image, entirely on Huawei Ascend silicon a month ago [1].
[1] https://www.scmp.com/tech/tech-war/article/3339869/zhipu-ai-...
erwald | 19 days ago
As I wrote in another comment, I think so for a few reasons:
1. The z.ai blog post says GLM-5 is compatible with Ascends for inference, without mentioning training -- it says they support "deploying GLM-5 on non-NVIDIA chips, including Huawei Ascend, Moore Threads, Cambricon, Kunlun Chip, MetaX, Enflame, and Hygon" -- many different domestic chips. Note "deploying". https://z.ai/blog/glm-5
2. The SCMP piece you linked just says: "Huawei’s Ascend chips have proven effective at training smaller models like Zhipu’s GLM-Image, but their efficacy for training the company’s flagship series of large language models, such as the next-generation GLM-5, was still to be determined, according to a person familiar with the matter."
3. You're right that z.ai trained a small image model on Ascends. They made a big fuss about it too. If they had trained GLM-5 with Ascends, they likely would've shouted it from the rooftops. https://www.theregister.com/2026/01/15/zhipu_glm_image_huawe...
4. Ascends just aren't that good for training at this scale.