
soup10 | 10 months ago

I agree. No matter how much wishful thinking Jensen sells to investors about paradigm shifts, the days of everyone rushing out to get six-figure tensor core clusters for their data centers probably won't last forever.


bigyabai | 10 months ago

If Nvidia were at all in a hurry to lock out third parties, I don't think they would support OpenCL and Vulkan compute, or allow customers to write PTX compilers that interface with Nvidia hardware.

In reality, the demand for highly parallelized compute simply blindsided OEMs. AMD, Intel, and Apple were all laser-focused on raster efficiency; none of them has a GPU architecture optimized for GPGPU workloads. AMD and Intel don't have competitive fab access, and Apple can't sell datacenter hardware to save their life. Nvidia's monopoly on attractive TSMC hardware isn't going anywhere.

mlinhares | 10 months ago

The profit margins on Macs must be insane, because it makes no sense otherwise that Apple just doesn't give a fuck about data center workloads when they have some of the best ARM CPUs and whole packages on the market.

imtringued | 10 months ago

I don't know how it happened, but Intel completely dropped out of the AI accelerator market.

There are really only three serious competitors in this market, plus one also-ran: Nvidia, Google, and Tenstorrent.

The also-ran is AMD, whose products are only bought as a hedge against Nvidia. Even though the hardware is better on paper, the software is so bad that you get worse performance than on Nvidia. Hence "also-ran".

Tenstorrent isn't there yet, but it's just a matter of time. They are improving with every generation of hardware and their software stack is 100% open source.

int_19h | 10 months ago

Even if you can squeeze an existing model into smaller hardware, that means you can squeeze a larger (and hence smarter) model into that six-figure cluster. And models aren't anywhere near smart enough for many of the things people attempt to use them for, so I don't see hardware demand for inference subsiding substantially anytime soon.

At least not for these reasons. If demand does subside, it'll be because of a consistent pattern of overhyping and underdelivering on real-world applications of generative AI, like what's going on with Apple right now.

layoric | 10 months ago

He is fully aware; that is why he is selling his stock on the daily.