> I think Apple should be thanking the Raspberry Pi world for showing what you can do with SoCs, and for driving so much software to already be ported to the ARM processor.
With all due respect, I love my Raspberry Pi, but Apple just needs to thank whoever was in charge of acquiring PA Semi's know-how in 2008, along with the chip mastermind that is Johny Srouji. Or themselves from 1990, when they actually founded ARM as a joint venture with Acorn Computers to make chips for the Newton.
The first ARM-based device for Linux that I remember getting big was the SheevaPlug. It was an ARM-based "plug computer" with an SD card slot, a USB port IIRC, a serial/JTAG port, Ethernet, Wi-Fi, and no display.
I had its next generation, the GuruPlug, which had 2 Ethernet ports, Bluetooth, and a couple of USB ports. Used it as a router for a few years.
I don't know how long Debian's ARM distro had been around before then, but that was my go-to for the GuruPlug. Worked great.
Yes, the Raspberry Pi is more of a byproduct of the world having already moved towards ARM than a driver in itself. In the end it's just reusing an SoC that was developed for other applications.
And before that we had countless applications of Cortex-A series CPUs in mobile phones and automotive infotainment systems, and of Cortex-M series CPUs in other embedded systems.
Also, the Raspberry Pi is based on possibly the shittiest family of SoCs available on the market. As an embedded engineer, it's hard to overstate how bad the BCM283* series is. Peripherals locked to GPU clock. Why!? Watchdog timer is straight up broken. How do you mess that up?
Yes, the chip team at Apple is likely worth tens of billions on their own. Think about how many companies out there would pay top dollar to have the chip performance that Apple does. I know the ARM ISA != Intel's x86, but on a performance-for-performance basis, if Apple-level CPUs were available to the mass market, Intel would be in a world of hurt and would likely have to drop prices even more aggressively than it does when AMD puts out a competent chip.

Jim Keller as much as anyone, I would think.
> The Intel world has stagnated in recent years, and I look forward to seeing the CPU market jump ahead again.
Really? AMD's recent moves with Zen/Zen2/Zen3/EPYC look like a big step forward. Zen2 chiplets are the biggest change in years. Zen3 IPC is supposed to be significantly better than Zen2 (17%). The previous-gen Ryzen was like 15W TDP, whereas the A12Z in the ARM Mac mini is 15W TDP, but the Zen crushes the A12Z Bionic on benchmarks.
It's easy to forget, but ARM came from really terrible performance to a spot where it is getting into the ballpark of x86, so the advances look impressive. But that's like saying that if I go from $100 to $200 my gains look impressive, whereas you only went from $10,000 to $11,000, so you look like you are standing relatively still in comparison, even though your absolute gain was ten times larger.
But x86 architectures have decades of maturity behind them, so someone getting a 17% IPC lift (Zen2 -> Zen3), or a massive reduction in TDP on such a complex chip, isn't stagnation; it's actually MORE impressive, IMHO.
ARM is going to hit diminishing returns, and soon the yearly perf boosts won't look as impressive anymore.
In the end, I think we'll see a convergence of performance. ARM will still have a power advantage on mobile, because it doesn't carry as much backwards-compatibility legacy as x86 has to support. However, are MacBooks and Mac desktops going to be performance- and price-competitive with Linux on x86? I doubt it.
First of all, ARM vendors haven't even come close to the GPU performance of Nvidia's or AMD's discrete GPUs. An Apple A13 or an ARM Mali is not going to compete with a laptop carrying an RTX 2060, 3060, or RDNA1/2 part. And secondly, just looking at AMD, it's still possible to lift performance per watt in x86.
I think the laptop space in the x86 realm is still exciting, because of what AMD and Nvidia are doing.

Further, the average laptop consumer is happy with their Intel integrated graphics. All they do is browse the web and punch some numbers into Excel.
You might find it interesting that Apple made some very suggestive statements at WWDC regarding graphics performance, like “don’t assume discrete graphics is better than integrated”. I think it might be interesting to wait a bit to see what they come out with.
Source for the 15W A12Z TDP? And the benchmarks?

> The previous-gen Ryzen was like 15W TDP, whereas the A12Z in the ARM Mac mini is 15W TDP, but the Zen crushes the A12Z Bionic on benchmarks.
Hmm? The A12X beats the 3700U in Geekbench 5's single-thread, multi-thread and compute benchmarks, and not by trivial margins.
Your power comparisons are also unfair; if the A12X ever draws 15W, it would be way at the edge of its power curve on all cores[1], whereas 15W on the 3700U is a tepid clock for the Ryzen. The A12X is much more reasonably considered an ~8W part.

[1] https://images.anandtech.com/doci/14892/a12-fvcurve.png

There is no chance the A12Z in the DTK is a 15W chip. Do you have a source?
You are looking at x86 from AMD's perspective, which is really just catching up to Intel in many respects while enjoying the advantages of TSMC and chiplets.
Zen3's rumoured IPC improvement, impressive as it is, will still be below Willow Cove (the core in Tiger Lake), a microarchitecture that was supposed to be out in 2018. And Golden Cove, on Intel's 7nm node (in between TSMC's 5nm and 3nm nodes), was supposed to launch this year in Intel's original roadmap.
Intel has gone from roughly 2.5 years ahead of the industry to lagging about 1.5 years behind. That is a swing of 4 years.
That is why the Intel world has stagnated in recent years; 4 years is a very long time in the tech industry.
From the limited data in the Tiger Lake benchmarks floating around, it could be just as much of a revelation as the Zen2 jump you mentioned. The GPU is much more powerful than Ice Lake's, and the CPU seems to have benefited from the higher clock speeds made possible by the more mature 10nm process. TDP is said to be the same, but it's hard to go by Intel watts until someone does a battery-life test.
I was a bit surprised the article didn't mention Windows on ARM at all. Following the Apple announcement, I managed to snag a Windows laptop that uses the Qualcomm Snapdragon 850 ARM SoC for dramatically less than MSRP on eBay. (To be fair, it was being sold for parts since the seller couldn't figure out how to remove the password. Wiping the drive and reinstalling Windows was easy enough.)

For the most part, it feels just like Windows. Every app I've downloaded has _just worked_. That being said, there's definitely at least one app that won't: WireGuard, since it requires an ARM64 driver to work. I've actually been tracking which software provides an ARM64 version[1].

Sadly, it looks like virtually every toolchain still needs to be updated to support ARM64 on Windows. I'm tracking a handful of GitHub issues, and support is definitely in the pipeline, but it's slow going. For example, .NET _still_ doesn't support Windows on ARM, despite the fact that it's the flagship way to build apps on Windows, and ARM64 Windows devices have been available for almost two years.

[1] https://iswindowsonarmready.netlify.app/
Apple has a tight enough grip on their hardware and software ecosystem that they can force major architectural changes onto the user base with a hardline take-it-or-leave-it attitude, which carried them through the previous three transitions. It's much harder to succeed at massive transitions when the user base is not being forced and is perfectly free to stay where it is and avoid the transitional inconveniences (Itanium, IPv6, Windows on ARM, etc., etc., etc.).
Commenting on this post from my Galaxy Book S, which is a Snapdragon ARM laptop running Windows 10 Pro.
It's currently driving two 1080p screens over USB-C with DisplayPort chaining, and it can drive three of them chained this way. Keyboard and mouse are connected to the USB hub on the display, so it's a single-cable connection from my laptop to start working in the morning.
I hope to have a phone at some point that runs Windows 10 Pro, that I can plug into a USB-C port and get straight to work; that would be amazing!
MSFT Edge and Windows Terminal both have ARM64 builds (and so does VS Code Insiders), and WSL works great. I work with tmux+nvim on Ubuntu ARM, which runs almost everything I need.
A lot of stuff _doesn't_ work, though, but the stuff that does works well. I hope we see some inexpensive (200-300 USD?) NUC-style devices that can run Windows 10; they would make for great little computers.
I'm quite interested in what Nvidia is planning. Nvidia purchased Mellanox last year (they make the high-end network gear often used in compute clusters), and Nvidia is heavily involved in machine learning with its GPUs. The only missing piece of the puzzle was the CPU, which meant depending on either Intel or AMD (a direct competitor). ARM changes things and means they're not dependent on those companies (or on the x86 licensing clusterfcuk that keeps newcomers out), and they have some room to tailor it to fit their needs (as Amazon and Google have done recently).
These definitely are exciting times. Most Brits born in the '80s and '90s will have used Acorn computers in school; no one predicted what would grow out of them (they weren't particularly fast).
> One possible downside of the new Macs, is that Apple keeps talking about the new secure boot feature only allowing Apple signed operating systems to boot as a security feature. Does this mean we won’t be able to run Linux on these new Macs, except using virtualization?

No, you can turn off secure boot.
I take delivery of a Raspberry Pi 4 tomorrow. I’m really hoping it will replace my MBP for almost everything I do — namely clerical office work, teaching high school CS, and web browsing. Exciting times.
I would be prepared to be underwhelmed using a Raspberry Pi as a desktop. Graphics drivers still need work.
This is where I think Apple can do it right. They can tune drivers and fully optimize system performance, since they basically own the whole stack, and won't be stuck with proprietary broken blobs or reverse engineering a 3rd party design.
Hindsight is 20/20, but Intel selling off their ARM division (XScale) almost exactly a year before the first iPhone was announced looks like a pretty bad call in retrospect.
The only reason Intel owned it in the first place was that they lost a lawsuit to DEC. Intel has repeatedly shown that it is just not that into ARM.

Neither of which they executed on.
The A13 Bionic wouldn't physically share any components built by ARM? If that's the case, it's like saying AMD uses Intel chips, when really they just license the instruction set.

My fingers are crossed for Steam on ARM and for devs distributing cross-compiled ARM binaries for their games. Given Valve's support for Linux, if an ARM client ran on Android too and I could finally access my library on my phone, I'd be a happy customer.
What size is a memory page on other ARM CPUs? I think Apple's processors use 16KiB pages. Doesn't x86 software assume a 4KiB page size, unless it deals with huge pages?
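Much x86 software does indeed bake in 4 KiB assumptions, but portable code can simply ask the OS at runtime instead of hard-coding the value. A minimal sketch in C using POSIX `sysconf` (nothing Apple-specific assumed); on a typical x86 Linux box it should print 4096, and on a 16 KiB-page ARM system it would print 16384:

```c
#include <stdio.h>
#include <unistd.h>

int main(void) {
    /* Query the page size at runtime instead of assuming 4 KiB. */
    long page_size = sysconf(_SC_PAGESIZE);
    if (page_size < 0) {
        perror("sysconf");
        return 1;
    }
    printf("page size: %ld bytes\n", page_size);
    return 0;
}
```

Code that hard-codes 4096 when aligning mmap offsets or sizing page-granular buffers is exactly the kind of assumption that breaks on larger-page systems.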
> Of course this computer runs Linux and currently is being used to solve protein folding problems around developing a cure for COVID-19, similar to folding@home. This is a truly impressive warehouse of technology and shows where you can go with the ARM CPU and the open source Linux operating system.
Extraordinary claim, but is there any evidence that Fugaku is super useful, apart from making headlines?
I am not sure what to make of the fact that the article never says "x86", but keeps mentioning "Intel" and occasionally "AMD". Was that tailored to the expected audience, or is the author not confident enough about what "x86" means?
Did Apple say they would move their entire line of Macs to Apple Silicon? I thought Federighi said they had "amazing" Intel-powered new Macs in their pipeline.
They will likely have some Apple-specific features, but I doubt they'd want to make them incompatible to the point of requiring a software rewrite compared to other ARM chips. Just recompile your stuff, jump through whatever hoops you need to publish it, and that should be it.
Apple chips have one custom extension (AMX), but IIRC they don't give third-party developers access to it, so from a developer's perspective it's a standard ARMv8 chip.
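As a rough sketch of what "just recompile" means in practice: most portable code needs no source changes at all, and the architecture-specific corners usually sit behind predefined compiler macros. The macro names below are the usual GCC/Clang ones; hand-written assembly and SIMD intrinsics are where the real porting effort shows up:

```c
#include <stdio.h>

int main(void) {
    /* The same source builds for either target; the compiler defines
       these macros based on the architecture it is compiling for. */
#if defined(__aarch64__) || defined(__arm64__)
    puts("compiled for 64-bit ARM");
#elif defined(__x86_64__)
    puts("compiled for x86-64");
#else
    puts("compiled for some other architecture");
#endif
    return 0;
}
```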
Pure speculation on my part, but I think Apple wouldn't want people installing other OSes on their machines, or their OS on other hardware, so I would expect some hardware/firmware/software lock-in to make it impossible or really difficult to build ARM hackintoshes, and the other way around. Hoping to be wrong, though.
"Please don't complain about website formatting, back-button breakage, and similar annoyances. They're too common to be interesting. Exception: when the author is present. Then friendly feedback might be helpful."
Given that Apple was one of the original founders of Arm, and has an ARM architectural license (the most gold-plated license Arm sells, basically allowing the customer to create their own chips implementing the ARM ISA), I'd guess Apple would be the last one to switch from Arm to RISC-V (assuming such a switch ever happens).