alted | 1 year ago
For a minimum run of 100 wafers (~10k chips), Groq may have paid $100M, i.e. $10k/chip, purely in amortized design costs.
Chip design costs (software + engineer time) and fabrication setup (lithography masks) grow exponentially [1][2] as nodes shrink: maybe $100M for Groq's current 14nm chips, rising to ~$500M for their planned 4nm tapeout. Once you reach mass production (>>1000 wafers, each holding ~150 large chips), wafers cost ~$10k apiece. On top of this, it takes ~1 year to design a chip and then have prototypes made. (The same issues exist on older, slower nodes, just not as badly.)
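The amortization arithmetic above can be sketched in a few lines. All of the numbers are the commenter's rough estimates (not measured data), and the helper function is just an illustration:

```python
# Rough per-chip cost: amortized NRE (design + masks) plus wafer cost.
# Figures are the commenter's ballpark estimates, not real Groq data.

def cost_per_chip(nre_dollars, wafers, chips_per_wafer, wafer_cost=10_000):
    """Amortized NRE per chip plus the per-chip share of wafer cost."""
    total_chips = wafers * chips_per_wafer
    return nre_dollars / total_chips + wafer_cost / chips_per_wafer

# Minimum run: ~$100M NRE over 100 wafers at ~100 chips/wafer.
# NRE dominates: ~$10,000/chip from design alone, ~$100/chip from wafers.
small_run = cost_per_chip(100e6, 100, 100)

# Mass production: same NRE spread over 10,000 wafers at ~150 chips/wafer.
large_run = cost_per_chip(100e6, 10_000, 150)

print(f"small run: ${small_run:,.0f}/chip")
print(f"large run: ${large_run:,.0f}/chip")
```

At volume the NRE term shrinks toward zero and the per-chip cost converges on wafer cost divided by chips per wafer, which is why mass production changes the economics so dramatically.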
This could be reduced somewhat if chip design software were cheaper and margins were lower, but maybe only ~20% of this cost reflects fundamental manufacturing difficulty.
(disclosure: I don't work with recent tech nodes myself; this is my best guess)
[1] https://www.semianalysis.com/p/the-dark-side-of-the-semicond... [2] https://www.extremetech.com/computing/272096-3nm-process-nod...
latchkey | 1 year ago
Think about the amount of money being dumped into "AI" at this point. If you've got the technology and people to make stuff faster/better/cheaper, finding investors to pour money into your chip-making business is probably not as hard as it was 2 years ago.
Groq is making this change for reasons other than the expense of taping out chips.
jkachmar | 1 year ago
can’t comment on specifics, but imo our hardware team punches above its weight class in terms of headcount and time spent in design.