
The rise of AI is creating new variety in the chip market, and trouble for Intel

113 points | hackathonguy | 9 years ago | economist.com | reply

55 comments

[+] friedman23|9 years ago|reply
> Instead of making ASICS or FPGAs, Intel focused in recent years on making its CPU processors ever more powerful

If only. Intel has been abusing their market position and pushing out "upgrades" that barely improve performance over the previous generation.

AI is not going to eat Intel's lunch; all those computers still require CPUs. AMD, on the other hand, may eat Intel's lunch by releasing powerful multicore processors for half the price, all because they don't waste die space on things like integrated graphics.

[+] coldtea|9 years ago|reply
>If only. Intel has been abusing their market position and pushing out "upgrades" that barely improve performance over the previous generation.

And why is that because of "abusing their market position", as opposed to it plainly and clearly being more difficult to get faster processors at 14 nm and smaller nodes (and with the low power requirements of today)?

Besides, the trend the article points to is a BS fad as I see it (and I've seen 5-6 of those play out in the last 30 years). ASICs and FPGAs won't even come close to bringing in as much cash as general-purpose CPUs do for Intel.

[+] petra|9 years ago|reply
ASICs, FPGAs, or even GPUs aren't the future of neural computing. The future is analog (orders of magnitude better perf/watt and perf/$). That requires older fabs optimized for analog, with good embedded flash, which TSMC has and Intel mostly hasn't got.

And that same future applies not only to neural computing, but also to a field called approximate computing, i.e. computing where results aren't exact. Some/many signal- and image-processing workloads work well with that. I've also seen some research on doing scientific computing on approximate hardware and correcting the errors.
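
Not tied to any specific paper, just a minimal sketch of that correct-the-errors idea: classic mixed-precision iterative refinement, with float32 standing in for the cheap approximate substrate and float64 for the exact check.

    import numpy as np

    def refine_solve(A, b, iters=5):
        """Solve Ax = b approximately in float32 (standing in for cheap/approximate
        hardware), then correct the result using float64 residuals."""
        A32 = A.astype(np.float32)
        x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
        for _ in range(iters):
            r = b - A @ x                                  # residual in full precision
            dx = np.linalg.solve(A32, r.astype(np.float32)).astype(np.float64)
            x += dx                                        # correction step
        return x

    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 200)) + 200 * np.eye(200)   # well-conditioned system
    b = rng.standard_normal(200)
    x = refine_solve(A, b)
    print(np.linalg.norm(A @ x - b))   # residual shrinks toward float64-level accuracy

The cheap hardware only has to get each solve roughly right; the expensive, exact part is just a residual and an update per iteration.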

[+] ganfortran|9 years ago|reply
> AI is not going to eat Intel's lunch; all those computers still require CPUs.

But they might not need that powerful a CPU anymore if the computationally heavy work happens elsewhere anyway. So as time goes on, the CPU will become less critical, meaning less money for Intel.

[+] cma|9 years ago|reply
>AMD, on the other hand, may eat Intel's lunch by releasing powerful multicore processors for half the price, all because they don't waste die space on things like integrated graphics.

If that's all it is, isn't that going to be pretty much a simple knob for Intel to turn in response? They already have Xeon, which doesn't spend die space on graphics.

The main reason for pushing the iGPU so hard on desktop was so the same chips could pretty much be reused on mobile, I guess?

[+] xbmcuser|9 years ago|reply
Intel would still lose even without AMD. People who were ordering 100 Intel chips yesterday will be ordering maybe just 10% of those, as the major processing load gets passed on to a different processor. They could even go for cheaper Intel CPUs, since the major processing-power requirement would be met by GPUs.
[+] ghaff|9 years ago|reply
AI/ML/etc. may be part of it. But the other factor is that you can no longer just wait 18 months for a new generation of x86 to be a lot faster. That was the big problem with specialized architectures historically: the volume architecture would catch up soon enough without you having to rewrite software to optimize for some different processor design.

That's no longer the case, so specialized designs for the compute-hungry workload du jour (which happens to be AI at the moment) are starting to look a lot more attractive.

[+] Zenst|9 years ago|reply
It is more a case of the most CPU-intensive demands being addressed by dedicated chips, as we have always had. Many areas of information technology move from general CPUs towards dedicated silicon. Even CPUs adapt, adding instructions and with them small areas of silicon for certain dedicated demands (think MMX, AVX, AES, ...).

This is no change at all from what we already have. It is only when we finally push every task down to dedicated silicon that the CPU's role as the processing glue will diminish.

But then, the CPUs of today are constantly adapting, and I'd say the C in CPU is better defined as Centralised rather than Central.

For me, I'm looking forward to an AI grammar and contextual spelling checker that will make all grammar nazis obsolete.

So my perspective on this is that I foresee no trouble for Intel, who already adapt to change and are not to be dismissed just yet.

[+] jdjebc82747|9 years ago|reply
>For me, I'm looking forward to an AI grammar and contextual spelling checker that will make all grammar nazis obsolete.

I'm not. I feel like human language is meant to be fluid and to evolve as we do. This would potentially lead to more of a global monoculture than we are already starting to get.

[+] jhj|9 years ago|reply
The "3,854 cores" versus "28 cores" is dubious as always. 3,854 I think counts just the individual fp32 ALUs; a true similar comparison would be number of warp schedulers or maximum number of warps resident at once, or even just SM count (which share a cache).

Apples to oranges (a super-hyperthreaded 1024/2048-bit-wide vector machine with minimal cache versus a minimally hyperthreaded 128/256-bit-wide vector machine with lots of cache).
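
To put numbers on it, a back-of-envelope sketch using illustrative (not official) figures: counting every fp32 ALU inflates the GPU side, while counting SMs, or counting the CPU's SIMD lanes the same generous way, gives a much less lopsided picture.

    # Illustrative, not official, figures for a big ~2016 GPU and a big Xeon.
    gpu_sms = 56                  # streaming multiprocessors (the closer analogue of a CPU core)
    fp32_lanes_per_sm = 64        # individual fp32 ALUs per SM
    gpu_marketing_cores = gpu_sms * fp32_lanes_per_sm

    cpu_cores = 28
    fp32_lanes_per_core = 16      # e.g. one 512-bit SIMD unit = 16 fp32 lanes
    cpu_fp32_lanes = cpu_cores * fp32_lanes_per_core

    print(gpu_marketing_cores)    # 3584 "cores" if every ALU counts
    print(gpu_sms)                # 56 if you count SM-level schedulers instead
    print(cpu_fp32_lanes)         # 448 "cores" if the CPU is counted the same generous way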

[+] astrodust|9 years ago|reply
Apples to oranges? It's more like how a hundred thousand squirrels can't write a novel no matter how long they're given but one person can given a few months.

Not all compute devices are equivalent and "core" vs. "core" is a totally absurd comparison.

[+] varelse|9 years ago|reply
Crazy idea: buy the rights to sell AMD's Vega GPUs fabbed by Intel, and use Intel's resources to build top-notch math and AI libraries for them.

Stupid idea: Keep insisting that x86-compatibility is the killer feature for winning the parallel processor wars.

Stupidest idea: CS professors continuing to tell their students that learning concurrent programming is too hard.

[+] kogepathic|9 years ago|reply
> Crazy idea: buy the rights to sell AMD's Vega GPUs fabbed out of Intel

1) I don't think Intel has a lot of spare fab capacity. Certainly not on the nodes AMD is looking to produce Vega on.

2) Intel only just announced a deal to start manufacturing ARM chips on their fabs. [0]

Honestly, I can't believe it took Intel so long to wake up and realize that their x86 business is okay, but that if they want to survive long term they need another business segment to bring in money once x86 stops being as relevant as it is today.

Just look at TSMC [1] if you want an example of why Intel is foolish to think they can keep being top dog with only x86. TSMC was a nobody in the '90s, and now their market cap is within ~10% of Intel's [2] (TSMC at ~$160B versus Intel at ~$175B).

TSMC doesn't even design their own chips. I'm not saying building semiconductors is easy, or that TSMC has no R&D costs, but you're talking about a company that specializes only in manufacturing some of the most advanced chips on the planet, and does it at volumes I doubt Intel can match. I predict that unless Intel does something major in the near future (<24 months), TSMC will surpass Intel's market cap.

Intel's former CEO, Paul Otellini, captured it best himself:

"It wasn't one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought." [3]

Intel still thinks they can kill it by selling expensive CPUs. TSMC is proving that thinking is outdated. You don't have to have a 60%+ margin on your chips; you just have to make it up in volume.

Where do you think the next billion chips are going to be sold? It's not going to be $500 x86 CPUs. It's going to be <$5 ARM chips in embedded devices, and that's exactly the market segment TSMC is appealing to.

[0] http://www.theverge.com/2016/8/16/12507568/intel-arm-mobile-...

[1] http://www.google.com/finance?q=NYSE%3ATSM

[2] http://www.google.com/finance?q=NASDAQ%3AINTC

[3] http://www.theinquirer.net/inquirer/news/2268985/outgoing-in...

[+] deepnotderp|9 years ago|reply
Founder of a similar startup here.

The strategy that Nervana is taking is to reduce precision to 16-bit fixed point and then accumulate in 48 bits (which appears to be unnecessary; 24 bits should be sufficient).
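
To illustrate the bit-growth arithmetic behind those accumulator widths (my own sketch, not Nervana's actual datapath): a 16x16-bit product occupies 32 bits, and summing N such products needs roughly 32 + log2(N) bits before overflow is even possible.

    import math

    def accumulator_bits(operand_bits, n_products):
        """Bits needed to sum n_products of operand_bits x operand_bits fixed-point
        multiplies with no possibility of overflow (worst-case magnitudes)."""
        product_bits = 2 * operand_bits                   # 16 x 16 -> 32-bit product
        return product_bits + math.ceil(math.log2(n_products))

    print(accumulator_bits(16, 2 ** 16))  # 48: a 48-bit accumulator absorbs ~65k worst-case products
    print(accumulator_bits(16, 1024))     # 42: a 1024-long dot product needs far less headroom

Whether 24 bits is enough in practice is a question of how much rounding or saturation a given network tolerates, not of worst-case overflow.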

I can answer any questions if anyone has any.

[+] lowglow|9 years ago|reply
Yeah, what's a good intro on understanding all of this? I've got an EE/Chem/Math background.
[+] Filligree|9 years ago|reply
The article is visible at first, but once the page loads entirely it disappears.
[+] SideburnsOfDoom|9 years ago|reply
It happens to me too. I think it's The Economist's paywall in action. It's pretty annoying that this article is linkable but not really readable on the web.
[+] fstephany|9 years ago|reply
Same in Firefox for me. I switched to "Reader View" (I love this feature) and the text appeared.
[+] phkahler|9 years ago|reply
Do most of these applications (machine learning, vision, etc.) rely on OpenCL? It seems to me that GPUs are better suited to OpenCL than a regular CPU is, but if that's what all the excitement is about, I suggest reading up on some of the work on adding vector extensions to RISC-V and the corresponding FLOPS/watt they may achieve. They are basing some of the work on results from here: http://hwacha.org although they make it clear that Hwacha will not be the standard vector instruction set.
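For a flavour of the data-parallel style OpenCL exposes, here is a minimal PyOpenCL sketch (assuming the pyopencl package and a working OpenCL runtime are installed; the kernel is just an element-wise add, the embarrassingly parallel pattern GPUs are built for):

    import numpy as np
    import pyopencl as cl

    a = np.random.rand(1 << 20).astype(np.float32)
    b = np.random.rand(1 << 20).astype(np.float32)

    ctx = cl.create_some_context()            # pick any available OpenCL device
    queue = cl.CommandQueue(ctx)
    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    # One work-item per element; the device runs them in wide SIMD batches.
    prg = cl.Program(ctx, """
    __kernel void vadd(__global const float *a,
                       __global const float *b,
                       __global float *out) {
        int gid = get_global_id(0);
        out[gid] = a[gid] + b[gid];
    }
    """).build()

    prg.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)
    out = np.empty_like(a)
    cl.enqueue_copy(queue, out, out_buf)
    print(np.allclose(out, a + b))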
[+] TazeTSchnitzel|9 years ago|reply
Intel must be regretting dropping their dedicated GPU project.
[+] modeless|9 years ago|reply
They didn't exactly drop it. It's the Knights series (Xeon Phi). They just removed the graphics bits.
[+] abhianet|9 years ago|reply
> But the GPUs also have new destinations: notably data centres where artificial-intelligence (AI) programmes gobble up the vast quantities of computing power that they generate.

Shouldn't it be "programs"? Or is "programmes" used in some dialect of English I'm not aware of?

[+] tomatsu|9 years ago|reply
"Programme" is used in British English for that broadcasting stuff etc. As far as I can tell, they do use "program" for computer programs, though.