roadbuster | 2 months ago
They have no incentive to purchase a rapidly-depreciating asset and then immediately shelve it. None.
They might have to warehouse inventory until they can spin up module-manufacturing capacity, but that's just getting their ducks in a row.
didibus | 2 months ago
I'm not saying it's true, but it is suspicious at the very least. The RAM is unusable as it stands; it's just raw wafer. They'd need a semiconductor fab + PCB assembly to turn it into usable RAM modules. Why does OpenAI want to become a RAM manufacturer, and only for the post-wafer steps?
roadbuster | 2 months ago
The wafers are processed. That means Samsung/Hynix have taken the raw ("blank") wafers, then run them through their DRAM lithography process, etching hundreds of DRAM dies ("chips") onto the wafer.
You could attach test probes to individual chips on the wafer and you'd have a working DRAM chip. In fact, that's how testing is performed: you connect to each die one at a time with a "probe card" which supplies power & ground, plus an electrical interface for functional testing.
If OpenAI takes possession of the processed wafers and wants finished RAM modules, they need to do a few things: test each die (expensive), saw the wafer into individual chips (cheap), package them (moderately expensive), test them again (medium expense), and then assemble the final modules (inexpensive). Modern semiconductor test facilities cost billions of dollars and take years to build, so they'd need to outsource that work immediately (it's typically done in Southeast Asia).
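The post-wafer flow above can be sketched as data. This is purely illustrative: the `Step` structure and step names are hypothetical, and the cost labels are just the qualitative descriptions from this comment, not real figures.

```python
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    relative_cost: str  # qualitative label from the comment, not real data

# Wafer-to-module flow as described: test, saw, package, retest, assemble.
PIPELINE = [
    Step("wafer-level die test (probe card)", "expensive"),
    Step("saw wafer into individual chips", "cheap"),
    Step("package each chip", "moderately expensive"),
    Step("final test of packaged chips", "medium expense"),
    Step("assemble chips into finished modules", "inexpensive"),
]

def describe(pipeline):
    """Return one 'name -> cost' line per step, in process order."""
    return [f"{s.name} -> {s.relative_cost}" for s in pipeline]

if __name__ == "__main__":
    for line in describe(PIPELINE):
        print(line)
```

The point the ordering makes: the expensive steps (test and packaging) sit in the middle of the flow, which is exactly the part OpenAI would have to outsource.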
OpenAI likely doesn't want to do any of this. They probably just want to make sure they're in control of their own destiny with regard to DRAM, and decided the best way to do that was to cut deals directly with the DRAM semiconductor producers. That lets them take the wafers to the existing supply chain and contract with it to turn the wafers into finished modules.
themafia | 2 months ago
It screws up the price for their competitors. That's an incentive. Particularly with so many "AI datacenter" buildouts on the horizon.