
AI chipmaker Cerebras files for IPO

219 points | TradingPlaces | 1 year ago | cnbc.com

135 comments


knowitnone|1 year ago

NVIDIA is pretty established but there's also Intel, AMD, Google to contend with. Sure Cerebras is unique in that they make one large chip out of the entire wafer but nothing prevents these other companies from doing the same thing. Currently they are choosing not to because of wafer economics but if they chose to, Cerebras would pretty much lose their advantage. https://www.servethehome.com/cerebras-wse-3-ai-chip-launched... 56x the size of H100 but only 8x the performance improvement isn't something I would brag about. I expected much higher performance since all processing is on one wafer. Something doesn't add up (I'm no system designer). Also, at $3.13 million per node, one could buy 100 H100s at $30k each (not including system, cooling, cluster, etc). Based on price/performance Cerebras loses IMO.
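The price/performance claim can be sanity-checked with the commenter's own figures (the node price, per-GPU price, and 8x speedup are taken from the comment, not verified vendor specs; a reply below also disputes the 8x baseline):

```python
# Rough price/performance check using the figures quoted in the comment.
# All numbers are the commenter's, not verified vendor specs.

cs_node_price = 3.13e6      # quoted price of one Cerebras node, USD
h100_price = 30e3           # quoted price of one H100, USD (chip only)
cs_speedup_vs_h100 = 8      # quoted performance multiple vs a single H100

# How many H100s the same money buys (ignoring systems, cooling, networking):
h100s_for_same_money = cs_node_price / h100_price  # ~104

# Performance per dollar, normalized so one H100 = 1 performance unit:
cs_perf_per_dollar = cs_speedup_vs_h100 / cs_node_price
h100_perf_per_dollar = 1 / h100_price

print(f"{h100s_for_same_money:.0f} H100s for the price of one node")
print(f"H100 advantage: {h100_perf_per_dollar / cs_perf_per_dollar:.1f}x perf/$")
```

Under these assumptions the bare H100s come out roughly an order of magnitude ahead on perf/$, which is the commenter's point; the replies below argue the comparison omits system-level costs on the GPU side.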

fnordpiglet|1 year ago

I think the wafer itself isn’t the whole deal. If you watch their videos and read the link you posted the wafer size allows them to stack them in a block with integrated power and cooling at a higher density than blades and attach enormous amounts of memory. Not including the system, cooling, cluster, etc seems like a relatively unfair comparison too given the node includes all of those things - which are very expensive when considering enterprise grade data center hardware.

I don’t think their value add is simple “single wafer” with all other variables the same. In fact I think the block and system that gets the most out of that form factor is the secret sauce and not as easily replicated - especially since the innovations are almost certainly protected by an enormous moat of patents and guarded by a legion of lawyers.

artemisart|1 year ago

Correction: it's 8x the TFLOPS of a DGX (8 H100), not 1 H100. But it's true that if it stays at $3M it's probably too much and I don't think the memory bottleneck on gpus is large enough to justify this price/performance.

jfoster|1 year ago

> 56x the size of H100 but only 8x the performance improvement isn't something I would brag about.

It doesn't sound like it's too bad for a 9 year old company. Nvidia had a 20-year head start. I would expect that they will continue to shrink it and increase performance. At some point, that might become compelling?

Zandikar|1 year ago

Comparing a WSE-3 to a H100 without considering the systems they go in or the systems, cooling, networking, etc that supports them means little when doing cost analysis, be it CapEx or TCO. A better (but still flawed) comparison would be a DGX H200 (a cluster of H100's and their essential supporting infra) to a CS-3 (a cluster of WSE-3's and their essential supporting infra in a similar form factor/volume of a DGX H200).

Now, is Cerebras going to eventually beat Nvidia, or at least compete healthily with Nvidia and other tech titans in the general market or a given lucrative niche of it? No idea. That'd be a cool plot twist, but hard to say. But it's worth acknowledging that investing in a company and buying their products are two entirely separate decisions. Many of Silicon Valley's success stories are a result of people investing in the potential of what a company could become, not because it was already the best on the market, and if nothing else, Cerebras' approach is certainly novel and promising.

dzhiurgis|1 year ago

> wafer economics

What are they?

Is this related to defects? Can't they disable parts of defective chip just like other CPUs do? Sounds cheaper than cutting up and packaging chips individually!

pants2|1 year ago

Agreed, it just seems like Nvidia chips are going to be easier to produce at scale. Cerebras will be limited to a few niche use-cases, like HFT where hedge funds are using LLMs to analyze SEC filings as fast as possible.

yieldcrv|1 year ago

they don’t need an advantage, they just need orders and inventory

get extorted by nvidia sales people for a 2026 delivery date that gets pushed out if you say anything about it or decline cloud services

or another provider delivering earlier

that's what the market wants, and even then, who cares? this company is trying to IPO at what valuation? this article didn't say, but the last valuation was like $1.5bn? so you mean a ~300x delta between this and Nvidia's valuation if these guys get a handful of orders? ok

hulitu|1 year ago

> Sure Cerebras is unique in that they make one large chip out of the entire wafer

I'm sure they test it thoroughly. /s

tempusalaria|1 year ago

On the one hand, the financials are terrible for an IPO in this market.

On the other, Nvidia is worth 3trn so they can sell a pretty good dream of what success looks like to investors.

Personally I would expect them to get a valuation well above the $4bn from the 2021 round, despite the financials not coming close to justifying it.

imdoxxingme|1 year ago

Saying the financials are terrible is a bit of a stretch. Rapidly growing revenue, decreasing loss/share and a loss/share similar to other companies that IPO'ed this year.

The more concerning thing is just not having diversity of revenue, since most of it comes from G42.

m00x|1 year ago

IPOs are coming back. Expect pretty big ones in 2025.

lobochrome|1 year ago

It’ll pop. Then it’ll rot.

TrapLord_Rhodo|1 year ago

Rev for the last 2 years:

$24.6M (2022) → $78.7M (2023) → $136.4M in H1 2024 (~$270M annualized)

Sounds like a rocketship. You also get a better Sharpe if you take some money off the table in the form of leverage and put it in other firms within the industry, e.g. leveraging your NVDA shares and buying Cerebras.

GeorgeTirebiter|1 year ago

Cerebras is well-known in the AI chip market. They make chips that are an entire wafer.

https://spectrum.ieee.org/cerebras-chip-cs3

alephnerd|1 year ago

Yep! Them, SambaNova, and Groq are super exciting mid-late stage startups imo.

ericd|1 year ago

Interesting that they’ve scaled on-chip memory sublinearly with the growth of transistors between their generations, I would’ve thought they would try to bump that number up. Maybe it’s not a major bottleneck for their training runs?
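The "sublinear" observation checks out against the commonly reported generation specs (the transistor counts and SRAM sizes below are widely cited figures for WSE-2 and WSE-3, treat them as approximate):

```python
# Commonly reported per-generation figures (approximate, not vendor-verified):
wse2 = {"transistors": 2.6e12, "sram_gb": 40}
wse3 = {"transistors": 4.0e12, "sram_gb": 44}

transistor_growth = wse3["transistors"] / wse2["transistors"]  # ~1.54x
sram_growth = wse3["sram_gb"] / wse2["sram_gb"]                # ~1.10x

print(f"transistors: {transistor_growth:.2f}x, SRAM: {sram_growth:.2f}x")
```

So logic grew roughly 1.5x between generations while on-wafer SRAM grew only 1.1x, which is the asymmetry the comment is asking about.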

tonetegeatinst|1 year ago

I'd bet that making a chip the size of the wafer has the benefit of not losing any silicon to dicing the wafer up, the way desktop or GPU chips come off a wafer. The major downside is you need either a massive X and Y exposure size, or to break the wafer into smaller exposures, which means you're still needing to focus on alignment between the steps. And if a defect can't be corrected, is that wafer just scrap?

idiotsecant|1 year ago

Making monolithic silicon 2x as large doesn't just cost 2x as much; bigger silicon is massively more expensive. I'm not sure that making each piece require a large chunk of perfect wafer is a fantastic idea, especially when you're looking to unseat juggernauts who have a great deal of experience making high quality product already.

fancyfredbot|1 year ago

The only way for Cerebras to actually succeed in the market is to raise funds. They need better software, better developer relations, and better hardware too. It's a gamble, but if they can raise enough money then there's a chance of success, whereas if they can't it's pretty hopeless.

touisteur|1 year ago

Time (and the market?) will tell whether all the people clamoring for NVIDIA alternatives actually put their money on it (understanding that NVIDIA's headstart is a long-term heavy investment in software too: compilers, libraries, and of course hardware/software co-design). I still can't fathom how Intel thought Arc and/or Ponte Vecchio would pay for themselves on day one.

zone411|1 year ago

They have a cloud platform. I just ran a test query on their version of Llama 3.1 70B and got 566 tokens/sec.
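A throughput number like this is easy to reproduce with a small timing helper around whatever streaming client the provider offers (the list below is a stand-in for the real token stream, not an actual API call):

```python
import time

def tokens_per_second(token_stream):
    """Consume an iterable of streamed tokens and return (n_tokens, tok/s)."""
    start = time.perf_counter()
    n = sum(1 for _ in token_stream)          # count tokens as they arrive
    elapsed = time.perf_counter() - start
    return n, (n / elapsed if elapsed > 0 else float("inf"))

# In practice token_stream would be the streamed chunks of a chat-completion
# response; here a plain list stands in for the network stream.
n, rate = tokens_per_second(iter(["tok"] * 566))
print(n, f"{rate:.0f} tok/s")
```

Against a real endpoint the interesting numbers are this decode rate plus time-to-first-token, which the helper above deliberately ignores.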

greesil|1 year ago

Is that a lot? Do they have MLPerf submissions?

mlboss|1 year ago

The real winner in chip war is TSMC. Everyone is using them to make chips.

bjornsing|1 year ago

Yeah I also have a feeling more value will gravitate towards the really hard stuff once we’ve got the NN architectures fairly worked out and stable.

To put my money where my mouth is I’m long TSMC and ASML among others, and (moderately) short NVidia. Very long the industry as a whole though.

alecco|1 year ago

If Cerebras keeps improving it will be a decent contender to Nvidia. Nvidia VRAM-SRAM is a bottleneck. For just inference, it needs to download a model at least once per token (divided by batch size). The bottleneck is not Tensor Cores but memory transfers. They say it themselves. Cerebras fixes that (at a cost of software complexity and narrower target solution).
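The "download the model once per token, divided by batch size" observation gives a simple memory-bandwidth ceiling on decode throughput. A sketch, where the 140 GB model size and ~3.35 TB/s bandwidth are illustrative assumptions (roughly a 70B model in 16-bit on an H100 SXM-class part):

```python
def max_tokens_per_sec(model_bytes, mem_bandwidth_bps, batch_size=1):
    """Upper bound on decode throughput when generating each token requires
    streaming all weights from memory once, amortized over the batch."""
    bytes_per_token = model_bytes / batch_size
    return mem_bandwidth_bps / bytes_per_token

model = 70e9 * 2   # 70B params at 2 bytes each (assumption)
bw = 3.35e12       # ~3.35 TB/s HBM bandwidth (assumption)

print(f"batch 1:  {max_tokens_per_sec(model, bw):.0f} tok/s")   # ~24
print(f"batch 32: {max_tokens_per_sec(model, bw, 32):.0f} tok/s")
```

At batch 1 the bound is a couple dozen tokens/sec regardless of compute, which is why keeping weights in on-wafer SRAM (or batching heavily) changes the picture; the bound ignores KV-cache traffic and so is optimistic.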

adrian_b|1 year ago

"the filing puts the spotlight on the company’s financials and its relationship with its biggest customer, Abu Dhabi-based G42, which is also a partner and an investor in the company."

"The documents also say that a single customer, G42, accounted for 83% of revenue in 2023 and 87% in the first half of 2024."

https://www.eetimes.com/cerebras-ipo-paperwork-sheds-light-o...

lamontcg|1 year ago

Kind of vaguely reminds me of Transmeta vs Intel/AMD back in ~2000.

gdiamos|1 year ago

Cerebras has a real technical advantage in development of wafer scale.

Nokinside|1 year ago

They use the whole wafer for a chip (wafer scale). The WSE-3 chip is optimized for sparse linear algebra ops and uses a 5nm TSMC process.

Their idea is to have 44 GB of SRAM per chip. SRAM is _very_ expensive compared to DRAM (about two orders of magnitude).

It's easy to design a larger chip. What determines the price/performance ratio are things like:

- performance per chip area.

- yield per chip area.
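The "yield per chip area" point (and the defect question raised earlier in the thread) can be made concrete with a textbook zero-defect Poisson die-yield model; the defect density and areas below are illustrative assumptions, not foundry data:

```python
import math

def poisson_yield(area_cm2, defects_per_cm2):
    """Fraction of dies with zero defects under a simple Poisson yield model."""
    return math.exp(-area_cm2 * defects_per_cm2)

d0 = 0.1               # assumed defect density, defects/cm^2 (illustrative)
reticle_die = 8.14     # ~814 mm^2, roughly a reticle-limited GPU die
wafer_chip = 462.0     # ~46,000 mm^2, roughly a wafer-scale chip

print(f"reticle die: {poisson_yield(reticle_die, d0):.1%}")
print(f"full wafer:  {poisson_yield(wafer_chip, d0):.2e}")
```

Under these assumptions a reticle-sized die yields a useful fraction of perfect parts, while the chance of a defect-free full wafer is effectively zero, so a wafer-scale design only works if it tolerates defects by disabling bad cores and routing around them rather than demanding a perfect wafer.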

ggm|1 year ago

Wafer scale integration has been a thing since wafers. Yet, I almost never read of anyone taking it the full distance to a product. I don't know if it turns out the yield per die per wafer or the associated technology problems were the glitch, but it feels like a good idea which never quite makes it out the door.

cootsnuck|1 year ago

Concerning in terms of hype bubble now having even more exposure to the stock market. Perhaps less concerning since it's a hardware startup? Nah, nvm, I think this will end up cratered within 3 years.

brk|1 year ago

I’m going to go ahead and predict this flubs long term. Not only is what they are doing very challenging, I’ve had some random brokerage house reach out to me multiple times about investing in this IPO. When your IPO process resorts to cold calling, I don’t think it’s a good sign. Granted, I have some associations with AI startups, but I don’t think that had anything to do with the outreach from the firm.

drcode|1 year ago

Agreed, it seems like NVIDIA would be happy to make whole-wafer chips if it seemed like a good play.

My guess is there are a lot of bespoke limitations that the software has to work around to run on a "whole wafer" chip, and even companies that have 99% similar designs to Nvidia already are struggling to deal with software incompatibilities, even with such a tiny difference.

imdoxxingme|1 year ago

You do realize that brokerages earn commissions on selling shares, so why wouldn't they contact people who may be interested?

gyre007|1 year ago

I don’t know enough to say whether they’ll fail or succeed, but I am wondering who will underwrite this IPO — they must have balls of steel and confidence galore

system2|1 year ago

Is it a good idea to go IPO when the balance sheet looks terrible?

metadat|1 year ago

Does cerebras make gaming GPUs, or is it enterprise-only?

ericd|1 year ago

Very solidly enterprise-only. They make single chips that take an entire wafer, use something like 10 kilowatts, and have liquid cooling channels that go through the chip. Systems are >$1M.

elorant|1 year ago

The chip is huge. It wouldn't fit in any conceivable PC form factor.

eikenberry|1 year ago

They sound more like NPUs or TPUs than GPUs. Though that doesn't answer the question about the market they are targeting.

brcmthrowaway|1 year ago

How does Cerebras compare to D-Matrix?

bloqs|1 year ago

They have zero moat

parentheses|1 year ago

So many things here smell funny...

I have never heard of any models trained on this hardware. How does a company IPO on the basis of having the "best tech" in this industry when all the top models are trained on other hardware?

It just doesn't add up.

ClassyJacket|1 year ago

Plenty of companies IPO before releasing anything, or before building a large audience. That's how lots of things that require a long lead time and large initial investment get made. It's just a bigger risk for the investors.

Tesla IPOed in 2010 after selling only a few hundred Roadsters.

txyx303|1 year ago

Seems like they support training on a bunch of industry standard models. I think most of the customers in the training space tend to be doing fine tuning, right? The P and T in GPT stand for pre-trained - then you tune for your actual specification. I don't think they will take over the insane computational effort of training Llama or GPT from scratch - those companies are using clusters that cost more than Cerebras' last valuation.

cootsnuck|1 year ago

I thought they were for inference, not training... either way, it's kind of concerning that I've heard about them plenty from the hype bubble but apparently still don't really understand what they do.

will-burner|1 year ago

This is the first I've heard of Cerebras Systems.

From the article

>Cerebras had a net loss of $66.6 million in the first six months of 2024 on $136.4 million in sales, according to the filing.

That doesn't sound very good.

What makes them think they can compete with Nvidia, and why IPO right now?

Are they trying to get government money to make chip fabs like Intel or something?

hn_throwaway_99|1 year ago

You seemed surprised that this company is having an IPO to actually raise funds for operations and expansion, vs as just an "exit" where VCs and other insiders can dump their shares onto the broader public.

I might be a bit suspicious if a company in some low-capital-intensive industry was IPOing while unprofitable, but this is chip making. Even if they're not making their own fabs this is still an industry with high capital requirements.

We should be thrilled at a company actually using an IPO for its original intended purpose as opposed to some financialization scheme.

est31|1 year ago

Nvidia's moat is real but not big enough that one can't surpass it with a lot of engineering. It's not the only company making AI accelerators, and this has been the case for many years already. The first TPU was introduced in 2015. Nvidia has just managed to get a leader position in the race.

will-burner|1 year ago

> Taiwan Semiconductor Manufacturing Company makes the Cerebras chips. Cerebras warned investors that any possible supply chain disruptions may hurt the company.

They get their chips from the same company that Nvidia does.

ericd|1 year ago

Compare it to the same period last year ($8.7M in sales). That’s a pretty solid growth rate.

blurbleblurble|1 year ago

Their tech is very impressive, look it up.