top | item 27019249

Dissecting the Apple M1 GPU, Part IV

379 points | caution | 4 years ago | rosenzweig.io

121 comments

_ph_ | 4 years ago
Great work and, even more surprising, a great writeup (writing that as someone who always struggles to document my own work). This really gives me hope for a usable, native Linux on the M1 machines.

I really wish Apple would see how much benefit this would bring to their platform. Their new hardware is really exciting: the first ARM on the desktop that doesn't just compete with current x86 chips, but beats them in many aspects. A lot of the Linux and tech enthusiast crowd would love to jump onto Apple Silicon. And while they might not bring huge profits by themselves, these are the people who come up with great new technologies. Better to give them a home on Apple devices. It doesn't need complete and formal documentation of every aspect; just supporting those projects with a little bit of information would go a long way. A single engineer answering questions from those developers might be sufficient. So come on, Apple, do it! :)

Aaargh20318 | 4 years ago
> the first ARM on the desktop that doesn't just compete with current x86 chips, but beats them in many aspects

I think the first ARM on the desktop title goes to the Acorn Archimedes (https://en.wikipedia.org/wiki/Acorn_Archimedes). It kicked ass back then as well:

> A mid-1987 Personal Computer World preview of the Archimedes based on the "A500 Development System" expressed enthusiasm about the computer's performance, that it "felt like the fastest computer I have ever used, by a considerable margin"

dannyw | 4 years ago
This is a company that actively fights right to repair and implements software DRM to lock out non-Apple authorised replacements.

Don't hold your breath. Apple's stance, judging by its actions, is the opposite.

monopoledance | 4 years ago
As a Linux person, I have to say my iPad almost turned me into an Apple person as well. Let's be real: Apple would not lose a single penny by inviting the Linux crowd directly, and some of those Linux people would become Apple people. I can see two reasons not to open up:

1. Why tho?

2. macOS's increasing lock-down makes even Linux attractive to a wider customer base, threatening their huge, carefree software extortion business.

Personally, I think they do number 2 to their customer base. Not opening up to Linux is an indirect admission of their unfair competition game. Linux support would very much limit how far they can push their DRM, subscription, software extortion, and expropriation mischief. It would allow for consumer choice.

eointierney | 4 years ago
So well written, so engaging, such love in the labour.

"We’re looking for conspicuous gaps in our understanding of the hardware, like looking for black holes by observing the absence of light."

Gorgeous

simondotau | 4 years ago
Seconded. Despite describing a genuinely challenging and complicated process with countless moving parts, it was written to be as approachable as possible. The author didn't feel a need to "prove" how hard the work is by writing from deep within the weeds.
GeekyBear | 4 years ago
I have also enjoyed reading these progress reports.
KirillPanov | 4 years ago
Alyssa is a hero for her work on Panfrost, which gave us open-source 3D graphics on ARM Mali GPUs. I am eternally grateful for this; the Samsung Kevin I use is the only blobless laptop currently in production, and now it has 3D accelerated graphics thanks to her work!

But part of me is sad to see her working on such closed hardware now.

Does anybody think Linux on the M1 will ever be able to touch the internal SSD? Apple has been locking that down with proprietary controllers and signed firmware since their Intel days (i.e. T1 chip). Are people really going to drag around their shiny new macbooks with an external USB-C dongle hanging off of it because that's required in order to run Linux?

I worry that the endgame here is Linux becoming Just Another MacOS App. Apple is quite happy for Linux to be a MacOS app running in their VM. Elite developers will buy Macbooks and run Linux in a VM because "we'll have Linux on the bare metal soon, it's just temporary" and that will just keep getting pushed back and pushed back and pushed back...

mjg59 | 4 years ago
Apple don't block access to their NVMe controllers at all. They do appear to have a, well, interesting approach to spec compliance, but Linux is now entirely capable of handling the SSD on all x86 Apple hardware. In the M1 case the NVMe controller isn't exposed via PCI so the existing driver won't work, but there's already in-kernel abstraction between the actual NVMe code and the PCI interface, so adding an alternative shouldn't be a problem.

The M1 systems depend on some number of blobs, but the amount of non-free code required to boot one looks like it'll end up being less than a typical x86 system requires.

marcan_42 | 4 years ago
Please, please, please drop the internal SSD myth.

That isn't true, and has never been true, ever. That is a complete bullshit story made up by a YouTuber who saw that the SSD didn't show up under Linux back when the T2 Macs were released (because it didn't have a compatible driver) and decided that must mean Apple were "blocking Linux".

All internal Mac SSDs work fine under Linux these days and have for years. I have a local branch with preparatory work to bring up the M1 SSDs already (requires some driver refactoring to do it properly).

kzrdude | 4 years ago
(Starry-eyed idealism.)

Her work on this closed hardware might define it for the future; practicalities matter. If Linux de facto runs on it and, even better, is in wide use, that makes the hardware more open and might help bend this new platform towards openness. Getting in this early might be a benefit there too.

Jasper_ | 4 years ago
> Well-written apps generally require primitive restart, and it’s almost free for apps that don’t need it.

I'm really surprised that primitive restart is forced on in Metal, but they also have a hardware bit for it! Primitive restart makes your whole pipeline slower, because you can't just chunk your index list when building your vertex packets to send to the work distributors; the possibility of a primitive restart index means you have to linearly scan for it. Though I'm not a hardware guy -- maybe the degenerate tris thing makes it just as annoying in practice.

I'd guess that Metal's forcing primitive restart on came from another IHV's limitation (maybe old PVR chips?).

That said, you really shouldn't be using tristrips in 2021 anyway; they have poor locality and are hard to optimize. Index buffers solve the problems that tristrips wanted to solve in the first place.

IME, a well-written app should just ignore tristrips / restartable topologies and use straight tri-lists.
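To make the tri-list advice concrete, here is a minimal Python sketch (the RESTART sentinel value and the function name are hypothetical, purely for illustration) of expanding a restart-delimited triangle strip into the plain triangle list a well-written app would submit directly:

```python
# Expand a primitive-restart triangle strip into a plain triangle list.
# RESTART is the sentinel index (0xFFFF is common for 16-bit index buffers).
RESTART = 0xFFFF

def strip_to_tri_list(indices):
    """Split on the restart index, then unroll each strip into triangles."""
    tris = []
    strip = []
    for idx in indices + [RESTART]:        # trailing RESTART flushes the final strip
        if idx == RESTART:
            for i in range(len(strip) - 2):
                a, b, c = strip[i], strip[i + 1], strip[i + 2]
                # Alternate the winding so every triangle faces the same way
                tris.append((a, b, c) if i % 2 == 0 else (b, a, c))
            strip = []
        else:
            strip.append(idx)
    return tris

# Two strips separated by a restart: [0, 1, 2, 3] and [4, 5, 6]
print(strip_to_tri_list([0, 1, 2, 3, RESTART, 4, 5, 6]))
# [(0, 1, 2), (2, 1, 3), (4, 5, 6)]
```

Once the buffer is a flat tri-list like this, the driver can chunk it freely for the work distributors without ever scanning for a sentinel.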

atq2119 | 4 years ago
In theory you could do parallel scan tricks to implement primitive restart at full speed, but obviously it's going to cost you and it complicates the index fetch immensely since it effectively ends up being split into two parts.

Most likely, Apple simply doesn't build GPUs big enough to run into that problem, but it makes you wonder what they're doing on their hardware with AMD GPUs.
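A rough sketch of the scan trick being alluded to, written serially in Python (the names and the RESTART value are hypothetical): a prefix sum over restart flags assigns every index the ID of its strip, and that prefix sum is exactly the part that a GPU could compute with a parallel scan instead of a linear search:

```python
from itertools import accumulate

RESTART = 0xFFFF  # common 16-bit restart sentinel

def segment_ids(indices):
    """Prefix-sum over restart flags: each index gets the ID of its strip.
    Serial here for clarity; on hardware this scan could run in parallel."""
    flags = [1 if i == RESTART else 0 for i in indices]
    ids = list(accumulate(flags))
    # Drop the restart sentinels themselves; keep (segment, index) pairs
    return [(s, i) for s, i in zip(ids, indices) if i != RESTART]

print(segment_ids([0, 1, 2, RESTART, 3, 4, 5]))
# [(0, 0), (0, 1), (0, 2), (1, 3), (1, 4), (1, 5)]
```

The "two parts" of the index fetch then fall out naturally: one pass to compute segment IDs, a second to fetch indices relative to their segment.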

varispeed | 4 years ago
Hopefully one day companies like Apple will be forced to disclose documentation for the devices they sell, so that consumers can make full use of them and won't have to waste time on reverse engineering.
dapids | 4 years ago
Word of warning: stay away from the Metal shader bitcode (AIR), or they are gonna sue ya. They did that to a couple of academics for publishing on it.
Jasper_ | 4 years ago
That whole stack is conveniently avoided in this research, because they're going from NIR straight to the GPU machine code, without touching AIR.
bronxbomber92 | 4 years ago
Do you have references to the incident(s) or published work(s)?
2OEH8eoCRo0 | 4 years ago
On what grounds do they sue? You can't talk about hardware you bought?
loop0 | 4 years ago
That’s an impressive amount of progress already. Congrats on the work. I’m waiting for Apple’s next silicon to jump to ARM, and hopefully by then I’ll be able to run Linux on it.
tambourine_man | 4 years ago
Natively, it's gonna take a long time, I'd wager. Virtualization works pretty well today, however.
Engineering-MD | 4 years ago
So, given the current trend in progress, when will we have truly usable linux on M1?
mekster | 4 years ago
Would server vendors start putting M1s in data centers once Linux truly runs on them?

Wouldn't opening up the driver specification also let the M1 start ruling in the server space?

theunamedguy | 4 years ago
A big shoutout to Alyssa here. For those of you who don't know, she is doing this work as a college student.
viktorcode | 4 years ago
She's brilliant. Here's hoping she'll leave her mark on the software industry.
atq2119 | 4 years ago
College students are just as smart as graduates, and they tend to have much more time on their hands.

So yeah, this is cool work, kudos and everything, but the fact that a college student is doing it is the least surprising part of it if you really think about it.