
absolutelastone | 9 months ago

I think their point is the billions in private investment which preceded those millions.

I think this is a common issue in computer science, where credit is given to sexy "software applications" like AI when the real advances were in the hardware that enabled them, which everyone just views as an uninteresting commodity.


heylook | 9 months ago

> I think their point is the billions in private investment which preceded those millions.

But the "billions" didn't precede the "millions". They're just completely incorrect, and anyone who knows even a tiny amount about the actual history can see it immediately. That's why these comment sections are so polarized: it's a bunch of people vibe commenting vs. people who have spent even an hour researching the industry.

The history of semiconductor enterprise in the US is just a bunch of private companies lobbying the government for contracts, grants, and legal/trade protections. All of them would've folded at several different points without military contracts or government research grants. Read Chip War.

https://en.wikipedia.org/wiki/Chip_War:_The_Fight_for_the_Wo...

absolutelastone | 9 months ago

You seem to be arguing that the second the government touches anything, everything it does gets credited to the government-funding column. That seems simplistic to me, but you can believe what you like. Go back far enough and there was only private industry; there was essentially no government funding until the space race.

Either way, the fact remains that the billions spent developing GPUs preceded the millions spent to use those GPUs for AI. I'm not sure what that has to do with the polarization of the comment section. I assume it's just people seeking an opportunity to heap abuse on anything resembling a representative of the evil "other side".

Frost1x | 9 months ago

I wonder if it has more to do with the approachability of software. If I even began to think I could compete with NVIDIA by delivering similar hardware, I'd very quickly realize I was an idiot. Meanwhile, as a single individual, there are still a reasonable number of commercial software markets I have some chance of tackling or competing in. As software complexity rises it's becoming far less tractable than it was in, say, the 90s, but there are still areas individuals and small sums of capital can enter. I think that makes the sector alluring in general.

Hardware is just capital intensive in general, even before counting all the intellectual capital needed. So to me it's not that it's uninteresting or merely a commodity; it's a stone wall: whatever is there is there, and that's it.

absolutelastone | 9 months ago

That difference in difficulty is kind of the point. Imagine, as an extreme case, a company makes a machine that performs certain functions depending on which button combinations you press. A second company gets a patent on using the first company's machine for various tasks by pressing various button combinations, which are new uses of the machine no one had thought of yet. Now the second company has all the bargaining power in the market and so earns giant margins, despite doing a tiny fraction of the work it takes to make those tasks possible.

I wonder if our current system ended up this way because it is the most efficient in terms of specialization, or because the patent system drove things in this direction: the people dealing with customers last (i.e., those building the software layer) have the best information about what tasks customers want to do with their computers, and hence patent the solutions first, leaving hardware vendors no choice but to serve the software monopolies (one after another since the '80s).