top | item 17629855

Macbook eGPU Redux: Sticking a GTX 1080 in an AKiTiO Thunder2

125 points | archagon | 7 years ago | archagon.net

58 comments

[+] fermienrico|7 years ago|reply
If you must have macOS + an Nvidia GPU: I'm running 2x GTX 1080 Tis for ML/DL on a Hackintosh. They work flawlessly, and I also enabled the iGPU on the i7-8086K Coffee Lake CPU. So now my monitors are driven by the iGPU, which keeps the GTXs free for the heavy lifting (Octane rendering, TensorFlow, C4D, etc.).
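A common way to enforce that display/compute split on the software side (a minimal sketch; the device indices are assumptions, on the premise that the two GTX cards enumerate as CUDA devices 0 and 1) is to pin CUDA workloads to specific GPUs via an environment variable, set before the framework initializes:

```python
import os

# Restrict CUDA-based frameworks (TensorFlow, PyTorch, Octane's CUDA
# path, etc.) to the two discrete GTX cards. The iGPU driving the
# displays is not a CUDA device, so it is unaffected. This must be set
# before the framework initializes CUDA.
os.environ["CUDA_VISIBLE_DEVICES"] = "0,1"

# To dedicate one card per job instead, launch each process with a
# different single index, e.g. "0" for training and "1" for rendering.
```

This keeps the desktop responsive while the discrete cards run long jobs, which is the same effect the commenter gets from routing monitors through the iGPU.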

The Hackintosh community is amazing, and I have a blazing-fast 5 GHz 6-core machine that beats the pants off any Mac short of the ones that cost over $10k.

Let me know if anyone wants my EFI.zip & exact specs.

[+] alliecat|7 years ago|reply
I've been considering going back to having a desktop on my desk and carrying a lightweight laptop, versus using a desktop replacement both at my desk and on the move.

I was considering a 5K iMac (and I may still go for one; it's still a great 5K display with a decent computer thrown in) but I'll admit, the Hackintosh route intrigues me.

I remember that in my last few years of high school I had an original EeePC running Lion: it worked great, until the SD card decided it didn't like being thrashed and went a bit melty on me.

Roughly what did your setup cost, and how much of a pain is it to maintain?

[+] pyro2927|7 years ago|reply
Link to full specs? It's been years since I've maintained a Hackintosh, and getting sleep/audio working has always been a nightmare.
[+] jonkiddy|7 years ago|reply
I'm very interested in this. Please share.
[+] ComputerGuru|7 years ago|reply
Has iMessage been made to work on hackintosh or is it still locked to Apple hardware?
[+] pokemongoaway|7 years ago|reply
What would it take for Hackintosh users to move to Linux? You'd think there would be enough Linux users asking for Adobe products these days... And with Apple's hardware choices lately, creatives could be killing it on Linux + whatever hardware they can afford. I think one company - like Adobe - migrating over would create a huge domino effect.
[+] rememberlenny|7 years ago|reply
I got an eGPU too for machine learning work. I have a 1080 Ti, which I put in the Akitio Node Pro. It's nice because it has a handle on top, is portable, and sits on my desk at home.

I also have a desktop machine that I built with another 1080 Ti in it. I use the desktop as a remote machine for longer-running jobs, but I like the convenience of running files directly on my own machine.

If anyone has questions, I'm happy to answer.

[+] edhu2017|7 years ago|reply
I heard the eGPU gets throttled memory-wise because there are fewer PCIe lanes. Does this have a big impact on ML performance?
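For rough context on the question above, the gap can be estimated from nominal link rates (back-of-envelope figures, not measured throughput): Thunderbolt 2 carries 20 Gb/s and Thunderbolt 3 carries 40 Gb/s, while a desktop PCIe 3.0 x16 slot offers roughly 126 Gb/s after 128b/130b encoding overhead.

```python
# Back-of-envelope comparison of host-to-GPU link bandwidth.
# Nominal figures; real throughput is lower due to protocol overhead.

GBIT = 1e9  # bits per second

links = {
    "Thunderbolt 2": 20 * GBIT,
    "Thunderbolt 3": 40 * GBIT,
    # PCIe 3.0: 8 GT/s per lane, 128b/130b encoding -> ~7.88 Gb/s usable per lane
    "PCIe 3.0 x16": 16 * 8e9 * 128 / 130,
}

x16 = links["PCIe 3.0 x16"]
for name, bps in links.items():
    print(f"{name}: {bps / GBIT:.1f} Gb/s ({bps / x16:.0%} of x16)")
```

Whether the gap matters depends on the workload: training loops that keep the data resident in GPU memory are barely affected, while transfer-heavy pipelines that stream batches over the link see a larger hit.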
[+] jonkiddy|7 years ago|reply
Do you have a blog post, or could you recommend one, that covers your setup? Particularly any software that needs to be installed to get ML set up properly. I'm putting together an eGPU in the near future and have read quite a bit, but any pitfalls that could be avoided would be helpful to know up front.
[+] chrischen|7 years ago|reply
The article mentions it briefly at the bottom, but Gigabyte's Gaming Box basically does what the hacked-together Thunder2 setup does, and at a lower cost ($100 premium over the graphics card alone): https://amzn.to/2NPyXEM.

It also comes with Thunderbolt 3 instead of 2. I'm not sure why the author didn't simply buy that unit and be done with it, since it's probably cheaper than the combined cost of all his parts and doesn't have a giant desktop power supply sticking out the back.

[+] archagon|7 years ago|reply
Author here. Unfortunately, according to numerous accounts[1], the TB3 to TB2 adapter does not work when you're trying to use an Nvidia eGPU with a Macbook that has an Nvidia dGPU. (Mine has the 750M.) Additionally, the gaming box uses a custom card (so less resale value) and seems to have limited support for other GPUs (thus reducing its use as a general-purpose case).

But yes, for most people, it's almost certainly the better buy. (Though it's still not officially supported, apparently.)

[1]: https://egpu.io/forums/mac-setup/problem-setting-up-aorus-gt...

[+] nottorp|7 years ago|reply
Every time I see an article like this, I get my hopes up that it's about eGPU behavior on macOS. Turns out it's yet another Boot Camp article :(
[+] oneplane|7 years ago|reply
Works fine on macOS. I'm running an Akitio Thunder 3 with an RX 570 on a 2015 13" MBP using a TB2-to-TB3 adapter, and on a 2017 MBP (also 13") without the adapter. On the TB3 model it's plug and play; on the TB2 model you have to patch DisplayWrangler to make it accept the lower-bandwidth TB2 link, as Apple patched that out. After that, the older MBP is plug and play too.
[+] archagon|7 years ago|reply
It’s happening, but slowly. High Sierra seems to support many AMD GPUs over TB3 out of the box. (I was more interested in gaming performance, though.)
[+] pokemongoaway|7 years ago|reply
We need Adobe and just a few apps to support Linux and we'll create a cascading effect that will lower the cost and increase the power and freedom of many users laptop and desktop choices!
[+] bhouston|7 years ago|reply
I recently got the Intel NUC with the AMD Vega chip. It's small and quite powerful, a much cleaner setup than this one, and less than 1300 USD with 32 GB of RAM and a 512 GB SSD. I play Beat Saber on it all the time with a Vive, plus other Steam games at 4K on the TV.
[+] jotm|7 years ago|reply
eGPU, huh, that brings back memories. So much time spent trying to get a 9800 GTX working via ExpressCard on a laptop; the community work on the whole thing was amazing.
[+] rozenmd|7 years ago|reply
And here I am in 2018, using my Lenovo X220's ExpressCard slot to run a GTX 960, fully plug and play.
[+] chewxy|7 years ago|reply
From time to time I wonder about the cost-benefit of an eGPU versus just sshfs + EC2's G2xLarge, and I don't think there's been a single time in the past year when the eGPU came out ahead for ML dev work.
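The buy-vs-rent trade-off above is easy to sketch as a break-even calculation (all prices are illustrative assumptions, not quotes: roughly $1000 for a GPU plus enclosure, against a per-hour cloud GPU rate):

```python
# Rough break-even between buying an eGPU and renting a cloud GPU.
# All figures are illustrative assumptions, not actual prices.

egpu_cost = 700.0 + 300.0   # GPU + enclosure, USD (assumed)
cloud_rate = 0.65           # USD per hour for a GPU instance (assumed)

breakeven_hours = egpu_cost / cloud_rate
print(f"Break-even after ~{breakeven_hours:.0f} GPU-hours")
```

At a few hours of GPU work per day, that break-even lands on the order of a year of use, which is why the answer depends so heavily on how continuously the hardware is actually busy.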
[+] alliecat|7 years ago|reply
In isolation, it's probably not, but most of the compute-intensive researchers I know also need a reasonably powerful laptop anyway. From there, it's not a particularly expensive hop to getting desktop-grade performance at your desk without needing an entirely different machine. That, for me, is where the eGPU value proposition makes sense.
[+] Baal|7 years ago|reply
Meanwhile, no OpenGL, no DirectX, no Vulkan. Indeed, they are “Pro” machines.
[+] oneplane|7 years ago|reply
The hardware has nothing to do with those software libraries, if that's what you're getting at. And under Windows or Linux, all three are available. On macOS you currently have only OpenGL and Metal, and OpenGL is deprecated.

That doesn't mean you can't use the other APIs, though; it just means that on macOS you'll have to bring the APIs with you, which is entirely possible.

In most cases you're not using those libraries directly but going through an engine, and most popular engines support the major APIs (Mantle, Metal, DirectX, OpenGL, Vulkan, etc.). You simply select the build profile of your choice (or make a build that has them all) and build to taste.