Intel: Integrated Graphics Are Catching Up With Discrete GPUs

HardOCP News

Did you hear that, guys? We can all ditch our SLI and CrossFire setups in favor of integrated graphics! Think of all the money we'll be saving! :D

The top-level graphics processors integrated in Intel’s chips, called Iris and Iris Pro, can outperform 80 percent of discrete graphics chips, Bryant said. “We have improved graphics 30 times what they were five years ago,” Bryant said during a speech at a J.P. Morgan forum last week at CES.
 
can outperform 80% of discrete graphics chips? Ok... so does that include things like the 8400GS and 6200 cards that are still out there on shelves?
 
Everyone knows the standard for discrete graphics is the Tseng Labs ET4000W32P. So that is obviously the class of device they are comparing to.
 
can outperform 80% of discrete graphics chips? Ok... so does that include things like the 8400GS and 6200 cards that are still out there on shelves?

Don't forget they've improved their integrated video by 30 times what it was 5 years ago... considering it was practically non-existent back then, when AMD had just released its APUs. It's not like they're really lying; they're just throwing large numbers out there to impress potential investors or business partners.
 
That's not really a good comparison. While GT4e can probably outperform 80% of the installed base of discrete GPUs *in use*, many of which are years old and/or never intended for gaming (business graphics like low-end Quadro NVS models), it's only available on pretty pricey processors. Plus, it's not like you can just add GT4e to your PC for $75, which is about what the dGPUs at its performance level actually cost.

Worse, Intel is using its flagship Iris Pro Graphics 580, which isn't even available in a shipping processor yet, as the basis of the comparison.
 
Well, to be honest, AMD has not updated its once class-leading iGPU in ages. So Intel saw an opportunity to steal its thunder.
 
can outperform 80% of discrete graphics chips? Ok... so does that include things like the 8400GS and 6200 cards that are still out there on shelves?

That was my first thought also. The second is that even in the newer generations, the low end outsells the high end by several orders of magnitude, especially through the big OEM inventory movers like Dell, HP, etc.
 
Intel is presumably talking about the Iris Pro 580, which apparently will be available in its Skull Canyon Skylake NUCs. That article states that it runs at "...a promising 1.15 TFlops. For good measure, that makes the Iris Pro 580 almost as potent as Nvidia's GeForce GTX 750."
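For what it's worth, that 1.15 TFlops figure is easy to sanity-check against the published Gen9 architecture numbers: GT4e has 72 execution units, and each EU has two 4-wide SIMD FPUs whose lanes can each issue a fused multiply-add (2 FLOPs) per clock. A minimal back-of-the-envelope sketch; the 1.0 GHz clock here is an assumption picked to match the quoted figure, not a confirmed shipping spec:

```python
# Rough peak FP32 throughput for Iris Pro 580 (Gen9 GT4e).
# Assumptions: 72 EUs (published GT4e config), 8 FP32 SIMD lanes per EU
# (two 4-wide FPUs), FMA = 2 FLOPs per lane per clock, ~1.0 GHz clock.
eus = 72              # execution units in GT4e
lanes_per_eu = 8      # 2 FPUs x SIMD-4 per EU
flops_per_lane = 2    # a fused multiply-add counts as 2 FLOPs
clock_hz = 1.0e9      # assumed ~1.0 GHz to match the quoted number

peak_flops = eus * lanes_per_eu * flops_per_lane * clock_hz
print(f"Peak FP32: {peak_flops / 1e12:.2f} TFLOPS")  # -> Peak FP32: 1.15 TFLOPS
```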

From this article, it appears the Iris Pro 580 is reserved for Intel's H series CPUs, while the desktop S series (which includes the 6600K and 6700K) has the HD Graphics 530. But the NUC lines use U series CPUs, so I remain a bit confused about the distinction between the U and H lines.

What I get from all that is that the really high-end Iris Pro stuff won't be offered for devices that can also run discrete video cards (without using an eGPU), so Intel's Iris Pro isn't competing directly with discrete video cards.
 
Show us benchmarks that prove whether what they say is true. New project for HardOCP.
 
Show us benchmarks that prove whether what they say is true. New project for HardOCP.

HardOCP doesn't review on-board graphics. There's no interest here in a platform that maxes out at less than a GTX 750.
 
I have a laptop with the new Skylake HD Graphics 530. The drivers are trash; I have to run an old release to prevent graphical glitches on the internal screen, and the HDMI out doesn't work properly when the Wi-Fi is enabled (it could be Broadcom's fault, I guess).

Performance (when the wind is in the right direction and the stars are aligned) is like a GeForce 620. And that's only when the drivers work. You can play StarCraft 2 reasonably on it if you have to. All in, I'd say it's faster than 0% of current dedicated GPUs. It's fast enough at rendering the desktop while stretching battery life, and that's all it's really there for anyway.
 
Although not meaningful for the more serious gamer, the Steam hardware survey indicates 18.66% of gamers are using Intel graphics, and many older cards remain in use. Getting more powerful baseline graphics out of chipsets or basic motherboard configurations is a good thing that benefits the majority of PC consumers.
 
If Intel wasn't making their integrated graphics better, the majority here would bitch. Since they are, once again: a pile of bitching.

Just enjoy the progress of each company and buy what makes you happy.
 
I would LOVE to see IGPs actually compete with discrete cards, but it's never going to happen. Larrabee's failure killed any chance of that happening with Intel.
 
With tech like HBM, I can see integrated graphics catching up to discrete quite quickly. Just stick an x86 core next to a huge GPU and some HBM stacks.
 
With articles like this, I can imagine some young person who wants to play games having an argument with their parents, as I once did, over the necessity of a graphics card. Only now it would be more of an argument about the ability to play new releases.

"You don't need a graphics card it's a PC, you use it for word processing and spreadsheets, you didn't go to school for CAD, therefore you don't...need...it."

I know the new chips have good enough graphics for many older and less demanding games, but there's no way Intel is going to convince me to buy a new CPU every time I feel I need better graphics performance. On-board graphics have come a long way, but I worry about PCs trying to have everything integrated. I love PCs because they aren't consoles with everything soldered onto one board.
 
As a dad with an 8-year-old daughter who loves Minecraft, hey, I don't mind. The GeForce 750 is amazing, and if you're telling me that in a few years, by the time she's 11, I can build her a machine with a nice little onboard GPU? Hell yeah, I'm game.
 
You guys do know that if Intel gets graphene/phosphorene chips working first (estimated 2020 at the earliest), then yeah, it would eat almost the whole market just through brute force?

And if the technology delta between Intel and the rest of the market is any indication, there is a high probability that will be the case at some point.
 
You guys do know that if Intel gets graphene/phosphorene chips working first (estimated 2020 at the earliest), then yeah, it would eat almost the whole market just through brute force?

And if the technology delta between Intel and the rest of the market is any indication, there is a high probability that will be the case at some point.

2020!? Eat MOST of the market? Get real, dude. And where will the development of dedicated graphics cards be by then? Seriously. This thread fails.
 
You guys do know that if Intel gets graphene/phosphorene chips working first (estimated 2020 at the earliest), then yeah, it would eat almost the whole market just through brute force?

And if the technology delta between Intel and the rest of the market is any indication, there is a high probability that will be the case at some point.

Maybe Intel can finally compete with ARM by that time ;)
 
2020!? Eat MOST of the market? Get real, dude. And where will the development of dedicated graphics cards be by then? Seriously. This thread fails.

Intel integrated graphics IS the market. Your fancy AMD and Nvidia GPUs are a drop in the bucket.
 
can outperform 80% of discrete graphics chips? Ok... so does that include things like the 8400GS and 6200 cards that are still out there on shelves?

Intel is no doubt comparing its top-performing integrated GPU in Broadwell against all the dated piece-of-shit graphics cards on store shelves for around $100, plus the installed base of laptops with "dedicated" graphics that barely outran integrated GPUs when they were new and are all still in service. On pure numbers, Intel is probably right: its most modern integrated GPU is probably better than the bulk of the actual video cards built to this point. Cards like the 980 Ti and Fury X make up the smallest portion of video cards sold. The only reason companies even bother developing them is that those designs can be scaled down to other markets, and despite their small sales numbers, the margins on them are rather high.
 
Sales-wise, yes; in use, also. That sums it up.

Which is why it could be a big deal if Intel gets some truly decent stuff out there. I mean, the GT3e is already much faster than any spare card I have, like the 9800 GT.
 
...and yet Intel released most (all?) of their new Skylake chips with integrated GPUs far slower than the Iris hardware...
 
...and yet Intel released most (all?) of their new Skylake chips with integrated GPUs far slower than the Iris hardware...

Iris Pro requires a special eDRAM buffer on the same chip substrate. This is much harder and more expensive to produce.
 
Intel's iGPU has gotten good enough, but it's still plagued by crappy drivers and a crappy QA process. I mean, do I really need to report simple YouTube playback issues, or should those have been caught by basic QA testing? Also, the Surface line has been tarnished by crappy Intel driver issues.
 
...and yet Intel released most (all?) of their new Skylake chips with integrated GPUs far slower than the Iris hardware...

Skylake was supposedly designed for the enthusiast. According to Intel, it was designed to be paired with a discrete graphics solution rather than rely on integrated graphics.
 
The top-level graphics processors integrated in Intel’s chips, called Iris and Iris Pro, can outperform 80 percent of discrete graphics chips, Bryant said. “We have improved graphics 30 times what they were five years ago,” Bryant said during a speech at a J.P. Morgan forum last week at CES.


That right there explains it. If we don't oversell, we will be undervalued. :p
 
Does anyone actually have comparative sales figures for the different tiers of discrete GPUs?

I mean, they are certainly not claiming to be competitive with my dual 980 Tis, but we tend to forget that Nvidia also sells the likes of the GeForce GT 705, GT 710, and GT 720 in each generation. I wouldn't be surprised if these low-end models make up a majority of their GPU sales, because after all, most PC component buyers don't use their rigs for games. We are a minority, after all, as shown by the fact that PC gaming is growing explosively while overall PC sales are tanking.
 