What exactly is Intel trying to achieve with these integrated GPUs? It seems that performance-wise they're quite far below discrete graphics cards, so I'd guess that they're not really meant for gaming.
Are they intended for everyday computing purposes that might require graphics acceleration, such as high end display managers?
I'd certainly love to see them enter the graphics card arena and compete with ATI/Nvidia by having phenomenal open source drivers. I'd vote for that with my wallet.
Intel is only three generations into on-die GPUs, starting with the 32nm Westmere dual-core chips.
They don't even have DRAM on the chip yet - normal graphics cards use monstrously high bandwidth connections (10x higher than DDR3) to stream in textures. HD 4000 et al just access main memory, competing with the CPU for bandwidth.
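A rough way to see the main-memory bandwidth the integrated GPU has to share with the CPU is to time large array copies. This is only an illustrative microbenchmark (the function name and sizes are made up for this sketch), not how GPU texture traffic is actually measured:

```python
import time
import numpy as np

def copy_bandwidth_gbps(n_bytes=256 * 1024 * 1024, reps=5):
    """Estimate sustained main-memory copy bandwidth in GB/s.

    On a chip like Ivy Bridge, the HD 4000 fetches its textures over
    this same memory bus, so it contends with exactly this traffic.
    """
    src = np.ones(n_bytes // 8, dtype=np.float64)
    dst = np.empty_like(src)
    best = float("inf")
    for _ in range(reps):
        t0 = time.perf_counter()
        np.copyto(dst, src)
        best = min(best, time.perf_counter() - t0)
    # Each copy reads n_bytes and writes n_bytes, hence the factor of 2.
    return 2 * n_bytes / best / 1e9

print(f"~{copy_bandwidth_gbps():.1f} GB/s sustained copy bandwidth")
```

Whatever number this prints on your machine, a contemporary discrete card's GDDR5 is several times higher, which is the gap the comment is pointing at.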
They might be unimpressive now, but it's a focus of Intel to keep improving them, and that will happen significantly faster than Moore's Law.
Also, it's wrong to compare them to discrete graphics cards. They're cheap and low power, used in the MacBook Air. They replace the much inferior Intel integrated graphics and nVidia chipset graphics (used in the original Air).
They're now nipping at the heels of low end discrete graphics chips (especially on laptops). That's a great thing.
As a game developer, I'm excited by them. My games will run badly, but at least now they'll run on even the cheapest computers.
They're not meant to do anything other than exist. They're a low-end way to make a PC actually have a graphics card, you know, one that can display Microsoft Word prettily enough with Aero. That's what Intel graphics started out as.
Moving forward, Intel is giving these chips enough power to play movies, run games such as League of Legends casually at low resolutions (think 1366x768, not 1080p) and medium settings, and generally satisfy 90% of users' needs.
Maybe in a few years Intel will make their own graphics cards that can compare in power to current Nvidia and ATI cards. Until then, these chips serve as the cheap baseline graphics found in every computer.
For me it's an optimal solution. Apart from an occasional game of Quake, all my activity is in the terminal / browser / media player. Having 3D support is nice, but buying any serious graphics card would be a waste of money. And it's still better than the previous i915.
The idea is they come basically for free with your processor, bundled in, with the additional advantage of not requiring more space for a separate graphics unit and additional GPU memory plus associated cooling.
For some applications this is an enormous benefit. For an office computer, which doesn't require high-end 3D to start with, the HD4000 will be more than good enough.
They're meant for gaming, but not performance gaming. You can play something like Starcraft 2 on it without much trouble if you tune it down to "Low" settings. It just doesn't look anywhere near as amazing as it would on "Ultra".
They have open-source specs and drivers, so they are the best you can get under Linux! Most display managers use compositing now, so they need some sort of GPU.
They're not meant for gaming, but they're good enough for gaming. My '11 MBP (13", Intel HD 3000) runs Portal 2, Diablo 3 and, really, most of the new games I throw at it surprisingly well.
On notebooks they let you save battery life by not using the discrete graphics at all for a larger proportion of desktop graphics activity. On desktops and notebooks they make the graphics not completely suck without requiring a separate GPU, making low-end computers cheaper.
They're meant to differentiate their CPUs, and find a good use for the billions of transistors that their process technology allows them to include in their designs. So the question is, what's better: more cache, more cores (diminishing returns after 4), or a specialized accelerator for 3D, video and generic heavily parallel computations (GPGPU)?
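To make "heavily parallel computations" concrete: GPGPU targets workloads where one operation is applied independently across millions of elements. A hedged NumPy sketch of that shape (the polynomial itself is arbitrary):

```python
import numpy as np

# A data-parallel workload: the same arithmetic applied independently to
# a million elements. This is the shape of computation a GPU (including
# an on-die one driven via OpenCL) accelerates far better than extra
# cache or a couple more CPU cores would.
x = np.linspace(0.0, 1.0, 1_000_000)
y = 3.0 * x * x + 2.0 * x + 1.0  # one elementwise pass over all elements

print(y[0], y[-1])  # -> 1.0 6.0
```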
These GPUs are cheaper since they are integrated directly into the processor. Performance-wise, I think they are still better than Intel's old GMA line, and they support the latest DirectX out of the box.
With the HD 4000 I heard you can play some of the relatively recent games at lower resolution and low details.
Nahh, not everyone needs to play the latest games at the highest details on a Full HD screen.
I remember playing CS and UT years ago on a Centrino notebook with Intel graphics, and it was all I needed.
Now those chips can do a lot more, so I'm looking forward to getting rid of that horrible dual-graphics setup in my notebook in the future :)
As a matter of fact it is catching up quite nicely. Haswell is looking at 2.5x the current HD 4000 performance, and Broadwell will do another 2x over Haswell.
That performance would have been nice if the Retina display didn't manage to use 4x the pixels on screen. Things are just never fast enough.
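The 4x figure checks out: the Retina MacBook Pro panel doubles both dimensions of the previous 15" panel, and doubling each dimension quadruples the pixels the GPU has to fill.

```python
# Pre-Retina 15" MacBook Pro panel vs. the Retina panel: 2x in each
# dimension means 4x the total pixels to render every frame.
standard = 1440 * 900   # 1,296,000 pixels
retina = 2880 * 1800    # 5,184,000 pixels

print(retina / standard)  # -> 4.0
```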
As soon as WebGL goes mainstream, a certain amount of graphics power will be mandatory for every computer out there just to be able to surf the web. If it is not WebGL, some other advanced 3D technology will be used in future web applications.
>What exactly is Intel trying to achieve with these integrated GPUs?
A better performance/energy consumption ratio for lightweight laptops?
>It seems that performance-wise they're quite far below discrete graphics cards, so I'd guess that they're not really meant for gaming.
No, they are not. But then again, very few GPUs are actually used for gaming. Most people over 25 use their computers for other tasks (including all enterprise and work computers).
>Are they intended for everyday computing purposes that might require graphics acceleration, such as high end display managers?
Most computer use today requires graphics acceleration. From browsing the web (canvas), to watching an HD movie, to talking on Skype/Facetime.
Do I just fail at reading, or do these benchmarks lack any context?
The question I want answered is how these chipsets compare with other offerings. The context I would like to see is the same tests run on low/mid/high-end AMD and nVidia parts using both the open-source and proprietary drivers.
The number I've heard somewhere -- don't quote me on this! -- is that the Intel HD4000 GPU is roughly on par with a low-end ($150 to $200) discrete GPU from either ATI or nVidia. It's certainly nothing to write home about, but it's more than sufficient for most tasks, and even workable for some light gaming.
I was looking to buy a mobo + Ivy Bridge CPU, and I could not find a motherboard that could drive my display, which needs dual-link DVI or DisplayPort. Most of them have HDMI that can drive Full HD but not bigger monitors (my display is 24" 16:10, so it is not something extraordinary). So I am waiting for Ivy Bridge processors and chipsets without an integrated GPU (like the Sandy Bridge i7-3820).
What is the point of this? Obviously, 4000 > 3000 > 2000. Of more interest might be a head-to-head comparison of the performance of these processors under Linux versus Windows.