If you must have macOS + an Nvidia GPU, I'm running two GTX 1080 Tis for ML/DL on a Hackintosh. They work flawlessly, and I also enabled the iGPU on the i7-8086 Coffee Lake CPU. So now my monitors are driven by the iGPU, which keeps the GTXs free to do the heavy lifting (Octane rendering, TensorFlow, C4D, etc.)
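A minimal sketch of the "iGPU for display, discrete GPUs for compute" split on the software side, assuming the two 1080 Tis enumerate as CUDA devices 0 and 1 (the framework import shown in the comments is illustrative):

```python
import os

# Restrict CUDA device visibility *before* importing any GPU framework, so
# compute workloads only ever see the two discrete cards and never contend
# with whatever is driving the displays.
os.environ["CUDA_VISIBLE_DEVICES"] = "0,1"  # assumed indices of the 1080 Tis

# Any CUDA framework imported after this point sees only GPUs 0 and 1, e.g.:
# import tensorflow as tf
# tf.config.list_physical_devices("GPU")  # the two 1080 Tis
```

The key design point is ordering: CUDA reads this variable at initialization, so it must be set before the first framework import.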
The Hackintosh community is amazing, and I have a blazing fast 5 GHz six-core machine that beats the pants off any Mac barring the ones that cost over $10k.
Let me know if anyone wants my EFI.zip & exact specs.
I've been considering going back to having a desktop on my desk and carrying a lightweight laptop, versus using a desktop replacement both at my desk and on the move.
I was considering a 5K iMac (and I may still go for one; it's still a great 5K display with a decent computer thrown in) but I'll admit, the Hackintosh route intrigues me.
I remember in my last few years of high school I had an original Eee PC running Lion: it worked great, until the SD card decided it didn't like being thrashed and went a bit melty on me.
Roughly what did your setup cost, and how much of a pain is it to maintain?
What would it take for Hackintosh users to move to Linux? You'd think there'd be enough Linux users asking for Adobe products these days... And with Apple's hardware choices lately, creatives could be killing it on Linux plus whatever hardware they can afford. I think one company - like Adobe - migrating over would create a huge domino effect.
I got an eGPU too for machine learning work. I have a 1080 Ti, which I put in the Akitio Node Pro. It's nice because it has a handle on top, is portable, and sits on my desk at home.
I also have a desktop machine that I built with another 1080 Ti in it. I use the desktop as a remote machine for longer-running jobs, but I like the convenience of running files directly on my own machine.
Do you have a blog post, or could you recommend one, that covers your setup? Particularly any software that needs to be installed to get ML set up properly. I'm putting together an eGPU in the near future and have read quite a bit, but any pitfalls that could be avoided would be helpful to know up front.
The article mentions it briefly at the bottom, but Gigabyte's Gaming Box basically does what the hacked-together Thunder 2 setup does, and at a lower cost ($100 premium over the graphics card alone): https://amzn.to/2NPyXEM.
It also comes with Thunderbolt 3 instead of 2. I'm not sure why the author didn't simply buy that unit and be done with it, since it's probably cheaper than the combined cost of all his parts and doesn't have a giant desktop power supply sticking out the back.
Author here. Unfortunately, according to numerous accounts[1], the TB3-to-TB2 adapter does not work when you're trying to use an Nvidia eGPU with a MacBook that has an Nvidia dGPU. (Mine has the 750M.) Additionally, the Gaming Box uses a custom card (so less resale value) and seems to have limited support for other GPUs (thus reducing its use as a general-purpose case).
But yes, for most people, it's almost certainly the better buy. (Though it's still not officially supported, apparently.)
It works fine on macOS. I'm running an Akitio Thunder 3 with an RX 570 on a 13" MBP 2015 using a TB2-to-TB3 adapter, and on a MBP 2017 (also 13") without the adapter. On the TB3 model it's plug and play; on the TB2 model you have to patch DisplayWrangler to make it accept the lower-bandwidth TB2, as support was patched out by Apple. After that, the older MBP is plug and play too.
We need Adobe and just a few other apps to support Linux, and we'll create a cascading effect that lowers the cost and increases the power and freedom of many users' laptop and desktop choices!
I recently got the Intel NUC with the AMD Vega chip. It's small and quite powerful - a much cleaner setup than this one, and less than $1,300 with 32 GB of RAM and a 512 GB SSD. I play Beat Saber on it all the time with a Vive, and other Steam games at 4K on the TV.
eGPU, huh, that brings back memories. So much time spent trying to get a 9800 GTX to work via ExpressCard on a laptop; the community work on the whole thing was amazing.
From time to time I wonder about the cost-benefit of using an eGPU versus just sshfs plus an EC2 G2 instance, and I don't think there's been a time in the past year when the eGPU was the better value for ML dev work.
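A back-of-envelope way to frame that cost-benefit question; all prices here are illustrative assumptions, not current quotes:

```python
# Break-even between owning an eGPU and renting cloud GPU time.
GPU_COST = 700.0        # ~$ for a 1080 Ti (assumed)
ENCLOSURE_COST = 300.0  # ~$ for a Thunderbolt enclosure (assumed)
CLOUD_RATE = 0.90       # ~$/hour for a single-GPU instance (assumed)

def breakeven_hours(upfront=GPU_COST + ENCLOSURE_COST, rate=CLOUD_RATE):
    """Total rented hours at which cloud spend equals the eGPU's upfront cost."""
    return upfront / rate
```

At these made-up numbers the break-even sits near a thousand rented hours, so the answer turns almost entirely on how many hours per week you actually keep a GPU busy, plus the resale value of the card when you're done.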
In isolation, it's probably not, but most of the compute-intensive researchers I know also need a reasonably powerful laptop anyway. From there, it's not a particularly expensive hop to getting desktop-grade performance at your desk without needing an entirely different machine. That, for me, is where the eGPU value proposition makes sense.
The hardware has nothing to do with those software libraries, if that's what you're getting at. When using Windows or Linux, all three are available. On macOS, you currently only have OpenGL and Metal, and OpenGL is deprecated.
That doesn't mean, however, that you can't use the other APIs; it just means that on macOS you'll have to bring the APIs with you, and that is entirely possible.
In most cases you're not using those libraries directly but going through an engine instead, and most popular engines support the major APIs: Mantle, Metal, DirectX, OpenGL, Vulkan, etc. You simply select the build profile of choice (or make a build that has it all) and build to taste.
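A toy sketch of what "select the build profile" amounts to — the profile table and function are hypothetical, not any real engine's API:

```python
# Each platform profile lists its usable graphics backends in preference order.
PROFILES = {
    "windows": ["DirectX", "Vulkan", "OpenGL"],
    "linux": ["Vulkan", "OpenGL"],
    "macos": ["Metal"],  # OpenGL deprecated; no native Vulkan
}

def pick_backend(platform, preferred=None):
    """Return the preferred backend if the platform profile supports it,
    otherwise fall back to the profile's default (first entry)."""
    backends = PROFILES[platform]
    if preferred in backends:
        return preferred
    return backends[0]
```

So a request for Vulkan is honored on Windows or Linux but silently falls back to Metal on macOS — which is exactly why the engine, not your code, ends up owning the API choice.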
If anyone has questions, I'm happy to answer.

[1]: https://egpu.io/forums/mac-setup/problem-setting-up-aorus-gt...