Intel HD 630 IGPs Can Run The Division 2 at 30 FPS

AlphaAtlas

Intel IGPs don't have the best reputation, but as we've mentioned before, Intel's graphics division is looking to change that. Intel told us that they expect their next-generation integrated GPUs to run twice as fast as the current generation, but in the meantime, they've been working to optimize games on Coffee Lake IGPs. The company's YouTube channel just uploaded a video of The Division 2 running at 30 FPS on a Core i5-7500 with 2400 MHz RAM.

Check out the video here.

All things considered, that's somewhat impressive, as the game seems to be running faster than the original Division on the exact same rig.
 
Close; they do show in the video that it's running at 1280 x 720, so that's 921,600 pixels, or about 0.9 Mpix. 1080p is a bit over 2x that resolution.

For reference:

1024 x 768 = 786,432
1280 x 720 = 921,600
1366 x 768 = 1,049,088
1280 x 1024 = 1,310,720
1600 x 1200 = 1,920,000
1920 x 1080 = 2,073,600

So 720p falls about halfway between 1024 x 768 and 1280 x 1024.

So it's running a modern game at a resolution we were using about 20 years ago, at roughly half the frame rate.

EDIT:

I can toss in a few more for reference, to show how much the demand goes up.

1920 x 1200 = 2,304,000
2048 x 1536 = 3,145,728
2560 x 1440 = 3,686,400
2560 x 1600 = 4,096,000
4096 x 2048 = 8,388,608


So 1440p is about 4x the resolution of 720p, and 4K is about 9x. I could include 8K for academic purposes, but just know that it's around 33 Mpix, so it's way off the chart.
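
For anyone who wants to sanity-check the math, here's a quick Python snippet that reproduces the pixel counts above and the ratios relative to 720p (the 8K row is 7680 x 4320, included just to show how far off the chart it is):

# Reproduces the pixel-count comparison above, using 1280 x 720 as the baseline.
resolutions = {
    "1024 x 768":  (1024, 768),
    "1280 x 720":  (1280, 720),
    "1920 x 1080": (1920, 1080),
    "2560 x 1440": (2560, 1440),
    "4096 x 2048": (4096, 2048),
    "7680 x 4320": (7680, 4320),  # 8K
}
base = 1280 * 720
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} px ({px / 1e6:.2f} Mpix, {px / base:.1f}x 720p)")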
 

They use 50% resolution scaling too, so it's not quite 720p either.
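
Assuming the 50% figure is applied per axis, which is how many games implement their resolution-scale slider (if it means 50% of total pixels instead, the numbers land differently), the internal render resolution works out to 640 x 360:

# Hypothetical: treating "50% resolution scaling" as 50% per axis.
w, h, scale = 1280, 720, 0.5
rw, rh = int(w * scale), int(h * scale)
print(f"{rw} x {rh} = {rw * rh:,} pixels")        # 640 x 360 = 230,400
print(f"{rw * rh / (1920 * 1080):.0%} of 1080p")  # ~11% of 1080p

So, if that assumption holds, the video is really rendering at roughly a ninth of 1080p and upscaling from there.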
 
It is an improvement. My take: Intel is showing they are serious about entering the GPU market.

A third viable contender with deep pockets... or back to two if they buy Nvidia, should NV stock keep going down.

I'd rather Intel come out swinging hard and have the market see some need to aggressively evolve. Intel should look at Matrox... all the awesome 2D multi-monitor goodness. I still remember my G450 then 450max fondly.
 
They use 50% resolution scaling too, so it's not quite 720p either.

Damn, I didn't even catch that. I saw everything was on Low settings. If that's the case, then I feel like I should be able to dust off my Nvidia 6800 GT and give this thing a run for its money. I know it's more complicated than just resolution, but it seems like we're going backwards in performance. The game probably requires like 2 GB of VRAM to deliver PS3-level graphics.
 
Hey, just think about how far the iGPU has come since the Sandy Bridge era.

I'd be interested in a chart on that, actually. One of the more saddening things I found out a while back was when I needed to replace a GPU in a computer. I figured throwing in a GT 710 would be good enough to be okay. After doing a bunch of comparisons, I determined that an 8800 GT would give a GT 710 a run for its money. The GT 710 has more VRAM, but it has 1/4 of the memory bandwidth. If it wasn't for the massive VRAM bloat these days, an 8800 GT would still provide better performance. So yes, I know the IGP was complete garbage compared to a dedicated GPU back in the day, but even if you quadrupled the performance of garbage, it's still slow. Going from 7 FPS to 30 FPS just means it's barely playable now.
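
For what it's worth, the bandwidth gap is easy to work out from the memory specs: bus width in bytes times the effective transfer rate. A rough sketch, assuming the common 64-bit DDR3-1800 GT 710 (there are GDDR5 variants that do a bit better):

# Peak memory bandwidth = (bus width / 8 bytes) * effective transfer rate.
def bandwidth_gbps(bus_bits, mt_per_s):
    return bus_bits / 8 * mt_per_s / 1000  # GB/s

print(f"8800 GT (256-bit @ 1800 MT/s): ~{bandwidth_gbps(256, 1800):.1f} GB/s")  # ~57.6
print(f"GT 710  (64-bit  @ 1800 MT/s): ~{bandwidth_gbps(64, 1800):.1f} GB/s")   # ~14.4

That's where the roughly 4x gap comes from.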
 
Great, so they can turn everything down and play at a resolution that has not been the standard for like 15 years now. AND it will only play at 30 FPS... which is bearable at best. Get too much going on in a large fight and watch it turn into a PowerPoint presentation.
 
Damn, I didn't even catch that. I saw everything was on Low settings. If that's the case, then I feel like I should be able to dust off my Nvidia 6800 GT and give this thing a run for its money. I know it's more complicated than just resolution, but it seems like we're going backwards in performance. The game probably requires like 2 GB of VRAM to deliver PS3-level graphics.

Yeah, I missed the scaling bit too.

The 6xxx series was the first with DX9, right?

Does this game run on DX9-only GPUs?
 
Yeah, I missed the scaling bit too.

The 6xxx series was the first with DX9, right?

Does this game run on DX9-only GPUs?

Most likely the game won't run on anything that isn't DX11 compatible with at least 2 GB of VRAM. The FX 5xxx series was the first to introduce DX9, though; that's part of the reason why the FX series tanked in benchmarks. It was a regression in DX7 and DX8 performance, but it did add DX9 support. The only issue is that cards like the FX 5200 weren't fast enough to handle DX9 games anyway.


The Division 2 PC specs
Minimum – 1080p/30 FPS
  • OS: Windows 7, 8, 10
  • Processor: AMD FX-6350, Intel Core i5-2500K
  • Memory: 8 GB RAM
  • Graphics: AMD Radeon R9 270, Nvidia GeForce GTX 670
  • VRAM: 2 GB
  • DirectX: DirectX 11, 12
https://www.vg247.com/2019/01/09/the-division-2-pc-specs/

Yes, 2 GB minimum and DX11. So the issue is that even a much newer card like the HD 5870 would be fast enough to run it, but it doesn't have the minimum VRAM.

So I'm sure that technically this IGP doesn't meet minimum specs either, but they wanted to showcase it either way.
 

Usually the VRAM requirements are very resolution and settings dependent though. Drop the resolution to 50% of 720p like Intel did, and maybe these would go way down?
 

DX11 has a fallback render path for DX10/DX9 cards built into the API (feature levels), so those "requirements" make no sense to me.
 

Sidenote: I thought of the first consoles when I saw the resolution/image quality...
 
Usually the VRAM requirements are very resolution and settings dependent though. Drop the resolution to 50% of 720p like Intel did, and maybe these would go way down?

Maybe? I honestly have no clue on that one. I vaguely seem to remember it not making a huge swing in GTA V; halving the resolution didn't halve the VRAM usage. There's probably a bunch of textures or something loaded into memory that don't scale down past a certain point. So if they aim for, say, 1080p minimum, their textures might take up the same amount of space even if the game was run at 800 x 600. On the flip side, they might have higher-resolution textures for 4K, which would double the memory usage going from 1080p to 4K. That's just a wild guess.
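
As a back-of-the-envelope illustration of that guess (the render-target count and the fixed texture budget below are made-up numbers for the example, not anything measured from GTA V or The Division 2), resolution-dependent buffers are only part of the VRAM footprint, so halving the resolution doesn't come close to halving memory use:

# Toy VRAM model: illustrative numbers only, not measured from any real game.
def vram_estimate_mb(width, height, render_targets=8, bytes_per_pixel=4,
                     fixed_texture_budget_mb=1200):
    # Render targets / G-buffer scale with pixel count...
    rt_mb = width * height * render_targets * bytes_per_pixel / (1024 ** 2)
    # ...but the streamed texture pool is largely resolution-independent.
    return rt_mb + fixed_texture_budget_mb

for res in [(640, 360), (1280, 720), (1920, 1080), (3840, 2160)]:
    print(res, f"~{vram_estimate_mb(*res):.0f} MB")

In a model like that, dropping from 1080p all the way down to 640 x 360 only saves a few tens of MB, because the texture pool dominates.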


Factum: Yeah, I would guess the memory bloat is related to consoles. I don't ever remember a time previously where VRAM was make-or-break for the ability to run a game. These games still look terrible, but use 4x the memory of a previous title with similar IQ. I don't know about the DX11 thing either, so maybe it does work? I don't know if anyone's tested anything like that. The trouble is that I'm quite positive there is no such thing as a DX10 card with 2 GB of VRAM. So it may work, but the game would be a slideshow while it swaps memory in and out. So it's more of an academic argument that something like a GTX 280 or a 6800 GT could be a faster GPU, because of the VRAM limitations.

*Side Note*
Because it's fun to make comparisons, when you start talking about IGPs using system memory versus dedicated GDDR, things become interesting.

6800 GT ~ 22 GB/s
8800 GT ~ 57 GB/s

DDR3-1600 ~ 22 GB/s
DDR4-3600 ~ 50 GB/s

DDR4-2400 ~ 35 GB/s

So without any type of cache, that's likely one of the big issues with IGPs still. A 6xxx card has similar bandwidth to standard DDR3, but even an 8800 GT still outpaces the fastest system memory available. It would be interesting to see what kind of effect memory speed has on this benchmark, and whether, if you bought the cheapest DDR4 for the system, it would actually run slower still.
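
For comparison, the theoretical dual-channel peaks (ignoring latency and the fact that the CPU shares the bus) come out a bit higher than the figures above, and are easy to compute as transfer rate times 8 bytes per channel times the channel count:

# Theoretical peak bandwidth for DDR system memory (one 64-bit channel = 8 bytes/transfer).
def ddr_peak_gbps(mt_per_s, channels=2):
    return mt_per_s * 8 * channels / 1000  # GB/s

for label, speed in [("DDR3-1600", 1600), ("DDR4-2400", 2400), ("DDR4-3600", 3600)]:
    print(f"{label} dual channel: ~{ddr_peak_gbps(speed):.1f} GB/s theoretical peak")
# -> 25.6, 38.4, and 57.6 GB/s respectively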
 
Not bad for a free GPU.

"FREE" as in the current price of one of these CPUs is still $200+. I wonder how much more free a CPU would be if they didn't have GPUs on them. The sad part however is the computer market is just bad these days. If these CPUs cost $50 less without the GPU I still couldn't hop on ebay and buy a 7 year old minimum required GPU with it. (GTX670)
 
Hey, just think about how far the iGPU has come since the Sandy Bridge era.


Yeah, it mostly hit a wall after Haswell, because they reached the scaling limit at 48 EUs.

They rode the free performance improvement from DDR4 with Skylake, and have done nothing since.

For comparison, the AMD 2400G is THREE TIMES faster than the Intel HD 630 (24 EUs), running the exact same dual-channel DDR4. The 48-EU Intel version with eDRAM only improves performance by 50% (AMD is still twice as fast).

https://www.techpowerup.com/reviews/AMD/Ryzen_5_2400G_Vega_11/11.html

There is a reason why they are redesigning their graphics architecture from the ground up: it hasn't seen a complete redesign since Sandy Bridge, just architectural tweaks.

And since they only started with 12 EUs, there was plenty of room for growth (with good scaling) in the original design. But Intel took way too long to get around to this redesign, and now they are forced to cover up their shit performance with articles like these.
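
A rough way to see why the gap is that big is peak FP32 throughput: a Gen9 EU does 16 FLOPS per clock while a GCN/Vega CU does 128, so with approximate boost clocks (assumed below; they vary by SKU) the theoretical numbers come out around a 4x gap, in the same ballpark as the ~3x real-world difference above:

# Rough peak-FP32 comparison; clocks are approximate boost values, assumed for illustration.
def gen9_gflops(eus, ghz):
    # Gen9 EU: two SIMD-4 FMA units -> 16 FP32 ops per clock
    return eus * 16 * ghz

def vega_gflops(cus, ghz):
    # GCN/Vega CU: 64 shaders, each doing an FMA -> 2 FP32 ops per clock
    return cus * 64 * 2 * ghz

print(f"HD 630  (24 EU @ ~1.15 GHz): ~{gen9_gflops(24, 1.15):.0f} GFLOPS")   # ~442
print(f"Vega 11 (11 CU @ ~1.25 GHz): ~{vega_gflops(11, 1.25):.0f} GFLOPS")   # ~1760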
 
"FREE" as in the current price of one of these CPUs is still $200+. I wonder how much more free a CPU would be if they didn't have GPUs on them. The sad part however is the computer market is just bad these days. If these CPUs cost $50 less without the GPU I still couldn't hop on ebay and buy a 7 year old minimum required GPU with it. (GTX670)

There was a HardOCP front-page story on how an Intel CPU with no GPU was the same price as one with.
 
According to that TPU article, the 2400G is quite close to a GT 1030. The 1030 doesn't look too bad other than the cut-down memory controller; both provide about the same amount of memory bandwidth. For comparison, it looks like both of those are almost identical in performance to an HD 7770. So IMO the 2400G is a pretty solid buy if it's as fast as a 7-year-old midrange card. But for a GT 1030 at $80, it's a bit disappointing that even a new low-end dedicated GPU performs like a 7-year-old midrange card (and the 7770 has about 50% more memory bandwidth). I still have one of those cards kicking around, and if I were making the choice, it would probably make more sense to just buy the AMD for less money than the i5 and not even bother with a dedicated GPU unless you stepped up to at least a GTX 1050.


there was a hardocp front page story on how an intel cpu with no gpu was the same price as with.

Gotcha. I can't seem to find it, but I'm curious whether they mean a CPU that had a GPU cut from it, or whether a CPU designed without a GPU at all costs as much as one designed with a GPU in it. It seems like a non-zero amount of R&D money goes into designing a chip with one, so is it that they're giving that design away for free, or just not lowering the price of a chip that wasn't designed with one?
 
Wow, and I thought I was slumming it playing it on my work laptop's 860M. I have to set it to 720p low @ 75% res scaling, but at least I get 60 FPS with the occasional dip to 45. lol
 
They shouldn't be too proud of that video; it looks like ass. 720p on super low settings with 50% resolution scaling. That texture pop-in was crazy.

So, good news for people with business-class Dell laptops with only an iGPU who want to get some (shitty-quality) gaming done on a work laptop. Better than nothing, I assume.

OTOH... the Ryzen 2200G kicks this thing so hard in the balls.

Meanwhile, back in 20-effing-14, AMD showed us this little demo... running on an A10-7850 iGPU. It's still 720p, but oh man, so much better. On a shitty Kaveri iGPU that cost like $75 five years ago. Everyone give Intel a slow golf clap for not beating a $75 bargain chip from three gens ago.

 
1280x720 at medium settings, you say? Not too bad. Perfect for people who don't have a lot. Show me someone trying to game at 4K on an IGP and I'll show you someone who doesn't deserve to have a computer.
 
I mean, it's great and all, but can we concentrate on laptops, where it's needed? If I'm gonna trade G-Sync for Optimus, it would be nice if the iGPU was strong enough to handle lower-end stuff with grace and a tidbit of glamour, rather than "either dedicated with crap battery life or iGPU with crap graphics" and no in-between with a graceful switchover. Maybe I'm asking them to be wizards here, but damn, guys, either make it work or give it up already, especially on higher-tier products. I'd personally rather take Nvidia's low-power-mode battery drain and actually have variable refresh than have Optimus on a gaming laptop.
 
1280x720 at medium setting you say? Not too bad. Perfect for people who don't have a lot. Show me someone trying to game at 4k on igp and I'll show you someone who doesn't deserve to have a computer.

I remember when Intel integrated mobo GPUs were great, or better yet when they made graphics cards. Seeing them struggle with this is just... well, like watching golf on TV, boring as hell. Put it on the cheaper chips where it's needed and let the die space go to something more valuable for the higher-tier customers.
 
there was a hardocp front page story on how an intel cpu with no gpu was the same price as with.

That is actually impossible. TINSTAAFL: there is no such thing as a free lunch. You pay for it; in one way, shape, or form, you pay for it. It's utterly Kafkaesque.
 
I remember when Intel integrated mobo GPUs were great, or better yet when they made graphics cards. Seeing them struggle with this is just... well, like watching golf on TV, boring as hell. Put it on the cheaper chips where it's needed and let the die space go to something more valuable for the higher-tier customers.

I do believe your memberberries have become quite fermented, good sir. Integrated graphics is a rather recent (20-ish years) occurrence, and Intel never had a mass-market discrete video card. Hell, it wasn't until the first-gen Core i series that the GPU moved from the northbridge onto the processor. Additionally, if you don't want the IGP, go to HEDT. There is no struggle; good enough is good enough.
 
I do believe your memberberries have become quite fermented, good sir. Integrated graphics is a rather recent (20-ish years) occurrence, and Intel never had a mass-market discrete video card. Hell, it wasn't until the first-gen Core i series that the GPU moved from the northbridge onto the processor. Additionally, if you don't want the IGP, go to HEDT. There is no struggle; good enough is good enough.

Sorry, you apparently didn't get the joke? They did make a graphics card; only engineering samples, which later got sold off. I believe Linus ended up getting his hands on one. They also, believe it or not, made integrated mobo graphics (the 752) that were really good considering how they struggle now. And no, not all of them were integrated into the northbridge... some were standalone chips on the motherboard.

So glad you understood the sarcasm...
 
Just running a modern AAA title is good enough; it means the drivers are coming along and just waiting for the hardware to back them up.
 
So without any type of cache, that's likely one of the big issues with IGPs still. A 6xxx card has similar bandwidth to standard DDR3, but even an 8800 GT still outpaces the fastest system memory available.

It isn't quite that flat, unfortunately; you have to consider latency, and also the fact that an iGPU has a tiny memory buffer which requires constant streaming of data, and the same goes for RAM. That data isn't directly available to the video card and has to be handled in software, which is much slower. System memory also isn't just for the GPU; it's used for everything else as well, so you really aren't going to get that kind of bandwidth in practice. It's much the same issue with AMD's APUs, where higher RAM speed makes a big improvement but isn't cost effective, since that money spent on a discrete card will provide better performance. I think even a GTX 280 versus an IGP in DX9 titles would give far better results. Sure, IGPs have improved a lot, but so have resolutions, and the simple lack of dedicated memory is essentially the limiting factor. An interesting approach would be a processor with fewer cores but a much bigger dedicated memory pool for the IGP; that could possibly be a solution.
 
According to that TPU article, the 2400G is quite close to a GT1030. The 1030 doesn't look too bad other than the cut down memory controller. Both provide about the same amount of memory bandwidth. For a comparison though it looks like both of those are almost identical in performance to an HD7770. So IMO the 2400G is a pretty solid buy if it's as fast as a 7 year old mid range card. But a GT1030 for $80, it's a bit disappointing that even a low end dedicated GPU performs like 7 year old mid range card. (And the 7770 has about 50% more memory bandwidth) I still have one of these cards kicking around and if I were making the choice it would probably make more sense to just buy the AMD for less money than the i5, and not even replace the GPU unless you stepped up to at least a GTX 1050.

I really like my 2400G ITX setup, and I plan on getting a 3300G or 3500G if they are released, but I wonder what they will do to increase the bandwidth. 50 GB/s is just not enough for 15-20 Vega CUs when it's not even enough for 11.
 
I really like my 2400g itx setup and I would plan on getting a 3300g or 3500g if they are released but I wonder what they will do to increase the bandwidth. 50 gb/s is just not enough for 15-20 Vega cores as it is not even enough for 11.

And they could release an APU that has 128-512 MB of dedicated cache, but the cost would be an issue.

I think AMD will have worked really hard to make sure we see solid DDR4-3600, if not higher, officially supported. That, combined with the massive cache changes, may make things much better. Time will tell.
 