GTX 1080 Ti Test on Ryzen and Intel Kaby Lake

FrgMstr · Just Plain Mean · Staff member · Joined: May 18, 1997 · Messages: 55,534
Nuggety Nathan Kirsch of Legit Reviews fame has put Ryzen up against Kaby Lake in four gaming-specific benchmarks with the brand new GTX 1080 Ti. (See the full HardOCP review of the 1080 Ti here.) He comes to a conclusion that will not be popular with some of the fanboy crowd.


The 1440P scores also showed that the average frame rate was 22% higher on the Intel platform, so when it comes to 1080P or 1440P gaming performance the Intel Core i7-7700K and NVIDIA GeForce GTX 1080 Ti pair together nicely.

We here at HardOCP are surely going to take up the mantle on this Ryzen vs. Kaby Lake gaming issue as well, but quite frankly, we want to give AMD, the motherboard builders, and Microsoft a week or so to get their ducks in a row on performance. If any quick fixes come about, we want to make sure we take those into consideration.
 
I'll definitely be following this closely, since it will undoubtedly be the deciding factor for what hardware is going under the hood when it's time for my next computer overhaul.
 
From that review it seems the higher the resolution the closer in performance the two become.
 
Damn nice review, damn nice card. Slap some badass cooling on this monkey and OC the hell out of it... The 4K numbers were not bad; looks like we are finally approaching the era of true single-card 4K gameplay at decent framerates.

Me and my 1060 are feeling quite inadequate...
 
Good job AMD. You messed it up.

I knew you wouldn't let us down.

This is what happens when you outsource your engineering to China.
 
 
A binary blob is a binary blob. You never know what is happening there.
What I see is that game developers made bad ports of their games from game consoles to PCs. Game consoles have the exact same multiprocessor architecture as Ryzen. They use Jaguar cores split into two 4 core complexes.
While doing the PC port they optimized for Intel and reviews show that. On benchmarks which are not skewed towards the monopoly, Ryzen is better.

But this should change as per AMD communications.
 
A binary blob is a binary blob. You never know what is happening there.
What I see is that game developers made bad ports of their games from game consoles to PCs. Game consoles have the exact same multiprocessor architecture as Ryzen. They use Jaguar cores split into two 4 core complexes.
While doing the PC port they optimized for Intel and reviews show that. On benchmarks which are not skewed towards the monopoly, Ryzen is better.

But this should change as per AMD communications.
That is utter nonsense. Newer consoles are x86 and everyone uses Intel compilers. Heck, if you are developing for the Xbone you are developing for Windows.
 
That is utter nonsense. Newer consoles are x86 and everyone uses Intel compilers. Heck, if you are developing for the Xbone you are developing for Windows.
Please, elaborate.


One thing we can see at Legit Reviews is that the framerate difference between the 1080 Ti and the 1080 on Ryzen is near zero, or much closer than with Intel. Somehow games have more overhead on the Ryzen architecture. Bad scheduling, maybe?
 
Please, elaborate.


One thing we can see at Legit Reviews is that framerates between the 1080 Ti and the 1080 on Ryzen are equal or much closer than with Intel. Somehow games have more overhead on the Ryzen architecture. Bad scheduling, maybe?
AMD has basically said the scheduler is not the issue; games have to be specially coded for Ryzen.
 
I honestly thought it would do a lot worse. Doesn't look too terrible at all for the price. For a general all purpose processor it's not bad.
 
I honestly thought it would do a lot worse. Doesn't look too terrible at all for the price. For a general all purpose processor it's not bad.
Maybe, but when you compare its price to the 7700K it only makes sense as a production CPU, not a gaming CPU.
 
AMD has basically said the scheduler is not the issue; games have to be specially coded for Ryzen.
AMD said that the scheduling issue was not due to bad OS scheduling but due to bad game scheduling. I do not agree with their wording, though, because ultimately it is the OS that decides where a thread goes. Of course, the application can say to the OS: "hey, put this thread on this core and don't move it to other cores", and if the scheduler supports that, the OS will obey. Otherwise, the OS will schedule with its own computed metrics (power consumption, load, NUMA distance, etc.).

It would be strange for game developers to manually set thread affinity unless they also targeted NUMA machines. I seriously doubt they took NUMA architectures into account in case the OS misbehaves.

So, instead of putting the burden on Microsoft, they will make developers compute the metrics needed to schedule threads correctly. Well, you know, Microsoft buys a lot of AMD CPUs for its Xbox One and Scorpio machines, which also use an NT kernel. Let me guess: game devs do the scheduling on their own there too.
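As a concrete illustration of the affinity escape hatch described above, here is a minimal sketch, assuming a Linux machine (Python's `os.sched_setaffinity` is Linux-only; on Windows a game would use the Win32 `SetThreadAffinityMask` API instead):

```python
# Sketch: an application pinning itself to specific cores instead of letting
# the OS scheduler migrate its threads wherever it likes. Linux-only.
import os

def pin_to_cores(cores):
    """Restrict the calling process to the given set of logical CPU indices."""
    os.sched_setaffinity(0, set(cores))  # pid 0 = the calling process

original = os.sched_getaffinity(0)  # remember the default mask
pin_to_cores({0})                   # e.g. keep all work on core 0
assert os.sched_getaffinity(0) == {0}
pin_to_cores(original)              # restore; the OS is free to schedule again
```

Absent a call like this, placement falls back to the OS's own heuristics, which is exactly the situation the post describes.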
 
Maybe, but when you compare its price to the 7700K it only makes sense as a production CPU, not a gaming CPU.

What's wrong with the gaming? It's not like you can't game on it. Yeah, the 7700K is faster at the same price, but it is also only 4 cores; you can do a lot more with the 1700X. The CPU/board combos have been out for two weeks and could see improvements. I'm hopeful that it will get better, but even if it doesn't, it's a decent value. The real disappointment I've seen with it is the overclocking: there's really no headroom past its turbo modes.
 
The results don't surprise me at all given how well the 1080 Ti does at 4K. The card is just too powerful for 1440p, which means CPU bottlenecks are still going to be an issue.


AMD said that the scheduling issue was not due to bad OS scheduling but due to bad game scheduling. I do not agree with their wording, though, because ultimately it is the OS that decides where a thread goes. Of course, the application can say to the OS: "hey, put this thread on this core and don't move it to other cores", and if the scheduler supports that, the OS will obey. Otherwise, the OS will schedule with its own computed metrics (power consumption, load, NUMA distance, etc.).

It would be strange for game developers to manually set thread affinity unless they also targeted NUMA machines. I seriously doubt they took NUMA architectures into account in case the OS misbehaves.

So, instead of putting the burden on Microsoft, they will make developers compute the metrics needed to schedule threads correctly. Well, you know, Microsoft buys a lot of AMD CPUs for its Xbox One and Scorpio machines, which also use an NT kernel. Let me guess: game devs do the scheduling on their own there too.

All depends on the game: some engines rely on Windows scheduling/load balancing, some engines use their own internal scheduling/load balancing (you tend to see it more in games that don't have a max thread limit, or games that have their own physics engine).
 
What's wrong with the gaming? It's not like you can't game on it. Yeah, the 7700K is faster at the same price, but it is also only 4 cores; you can do a lot more with the 1700X. The CPU/board combos have been out for two weeks and could see improvements. I'm hopeful that it will get better, but even if it doesn't, it's a decent value. The real disappointment I've seen with it is the overclocking: there's really no headroom past its turbo modes.
I am not sure how future-proof it will be. These tests already show it is becoming CPU-limited at 1440p. People don't tend to upgrade their CPU every year, and in two to three years this CPU could start dragging in games unless AMD can convince everyone to code Ryzen-specific optimizations.
 
Clickbait article, testing just like a 1080p test:

We took a look at GPU performance using the DX12 API setting paired with the ‘Medium’ image quality preset with MSAA and VSync turned off. We picked to run just ‘Medium’ image quality settings due to how tough this game title is to render and we feel that most gamers will use this setting.

Because I'm going to spend $1000 CAD to play at 1080p medium. I was hoping this would go into more detail on why some games on Ryzen had higher minimum frame rates, as eTeknix and Linus Tech Tips witnessed, rather than low-detail gaming.
 
Clickbait article, testing just like a 1080p test:

We took a look at GPU performance using the DX12 API setting paired with the ‘Medium’ image quality preset with MSAA and VSync turned off. We picked to run just ‘Medium’ image quality settings due to how tough this game title is to render and we feel that most gamers will use this setting.

Because I'm going to spend $1000 CAD to play at 1080p medium. I was hoping this would go into more detail on why some games on Ryzen had higher minimum frame rates, as eTeknix and Linus Tech Tips witnessed, rather than low-detail gaming.
We are seriously not going to reiterate why 1080p tests matter. This has been beaten to death; we don't need to change CPU testing methodology just to appease AMD fanboys. No one complained when Skylake or Kaby Lake chips were tested at 1080p.
 
I am not sure how future-proof it will be. These tests already show it is becoming CPU-limited at 1440p. People don't tend to upgrade their CPU every year, and in two to three years this CPU could start dragging in games unless AMD can convince everyone to code Ryzen-specific optimizations.

Totally agree with that. For the highest performance it's just not there, and some games show that CPU limitation, but it's still early. Maybe some things can be fixed in a second-generation chip, though. I'm trying to be hopeful and positive about it because we all know competition is always nice.
 
I'm not saying I don't want to know the eventual performance of a given product, but when did we start waiting weeks for patches on hardware that is already being sold at retail before making a call on whether it is good compared to the competition?
 
LOL! Enjoy your quad-core processor in 2022; I will be enjoying my 8-core, 16-thread one without having to spend more money until then. :) I am already experiencing higher performance in everything I do. The only thing that is not directly improving is average fps at 4K in some games, but I would imagine that the minimums have greatly increased. (I am running 2 x Furys.) These articles do not have to be popular, nor do they have to be entirely without bias, which this one clearly is not. It is entertaining nonetheless, however. Looks like the 7700K is already bottlenecking at 1080p in GTA5 with this 1080 Ti.
 
Maybe, but when you compare its price to the 7700K it only makes sense as a production CPU, not a gaming CPU.

Not really, it is a great CPU for doing both. I have never built a computer just to game on and cannot imagine I ever would. Also, you must add the cost of an expensive cooler into the 7700K price if you expect to overclock to 5 GHz, if it even can.
 
We are seriously not going to reiterate why 1080p tests matter. This has been beaten to death; we don't need to change CPU testing methodology just to appease AMD fanboys. No one complained when Skylake or Kaby Lake chips were tested at 1080p.

So? You want to test at 1080p, be my guest. I will game at the resolutions that work well for me, and I do have a killer 1700X. :) The instant you bring out the fanboy card, your argument loses all credibility.
 
Not really, it is a great cpu for doing both. I have never built a computer just to game on it and cannot imagine I ever would. Also, you must add the cost of a expensive cooler into the 7700k price if you expect to overclock to 5 Ghz, if it even can.
Numbers at stock speeds are pretty good, and from people I know who have it, you can get by with a sub-$100 AIO. That being said, yeah, if you have needs that can scale well over 4+ cores, Ryzen is probably fine for you.
 
So? You want to test at 1080p, be my guest. I will game at the resolutions that work well for me, and I do have a killer 1700X. :) The instant you bring out the fanboy card, your argument loses all credibility.
I think you misunderstood what I was saying. Testing at 1080p has to do with testing raw CPU performance in non-GPU-bottlenecked situations. Of course it is not going to be the most realistic, but that is not the point. I pulled the fanboy card because literally no one ever complained about 1080p CPU testing before Ryzen. I cannot find one pre-Ryzen CPU review filled with angry rants about 1080p testing.
 
LOL! Enjoy your quad-core processor in 2022; I will be enjoying my 8-core, 16-thread one without having to spend more money until then. :) I am already experiencing higher performance in everything I do. The only thing that is not directly improving is average fps at 4K in some games, but I would imagine that the minimums have greatly increased. (I am running 2 x Furys.) These articles do not have to be popular, nor do they have to be entirely without bias, which this one clearly is not. It is entertaining nonetheless, however. Looks like the 7700K is already bottlenecking at 1080p in GTA5 with this 1080 Ti.

If I want an 8-core CPU by 2022, I can 1. buy a bargain-bin Ryzen CPU then, or 2. buy whatever Intel is releasing in the next 5 years. If you need to continue justifying your purchase, though, then have at it!

Seriously though, just enjoy your CPU if it works for you. Let everyone else bicker and bitch :)
 
All I learned here is that regardless of the resolution I run at, I'm going to be at or above 100 FPS with a Ryzen, even on the most PITA games to render.

Being a brand-new motherboard/processor platform, Ryzen is only going to get better, especially as developers start taking advantage of 6-core and 8-core processors as the norm in the future, and it's already more than fast enough to meet my gaming needs even if I were to upgrade to the fastest GPU on the market. Outside of gaming, it's proven itself to be a powerhouse of a 16-thread monster processor that can render like nobody's business.

I ran into this type of issue before, where a faster dual-core processor was technically a better gaming processor than a slower quad-core, but the quad-core was just an overall more well-rounded processor, and soon enough more games started taking advantage of the quad.
 
All I learned here is that regardless of the resolution I run at, I'm going to be at or above 100 FPS with a Ryzen, even on the most PITA games to render.

Being a brand-new motherboard/processor platform, Ryzen is only going to get better, especially as developers start taking advantage of 6-core and 8-core processors as the norm in the future, and it's already more than fast enough to meet my gaming needs even if I were to upgrade to the fastest GPU on the market. Outside of gaming, it's proven itself to be a powerhouse of a 16-thread monster processor that can render like nobody's business.

I ran into this type of issue before, where a faster dual-core processor was technically a better gaming processor than a slower quad-core, but the quad-core was just an overall more well-rounded processor, and soon enough more games started taking advantage of the quad.

I think the Q6600 had a start to life that resembled that.
 
Numbers at stock speeds are pretty good, and from people I know who have it, you can get by with a sub-$100 AIO. That being said, yeah, if you have needs that can scale well over 4+ cores, Ryzen is probably fine for you.

That still makes the 7700K cost over $400 if you want to overclock to 5 GHz, if the particular processor even can. Now, I already had a Noctua NH-D15 from the past 2 years, so I already had that for my 1700X. It appears that getting a non-X 1700 would have been just as good though, since one is working quite well in my other computer with just the Wraith Spire cooler. :D Also, I received $50 + tax back, which made my 1700X cost me just $349 plus tax (Microcenter sale price). Why not post the minimums in that article? It seemed critical to show them back when Intel was being compared to the FX processors.
 
That still makes the 7700K cost over $400 if you want to overclock to 5 GHz, if the particular processor even can. Now, I already had a Noctua NH-D15 from the past 2 years, so I already had that for my 1700X. It appears that getting a non-X 1700 would have been just as good though, since one is working quite well in my other computer with just the Wraith Spire cooler. :D Also, I received $50 + tax back, which made my 1700X cost me just $349 plus tax (Microcenter sale price). Why not post the minimums in that article? It seemed critical to show them back when Intel was being compared to the FX processors.
Minimum FPS averages are useless without frame time analysis to see just how often it actually hits those minimums; plus, some built-in game benchmarks have built-in hiccups that contribute to minimums that just aren't accurate.
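That point can be made concrete with a toy sketch (all frame times below are made up): a single hiccup ruins the raw minimum, while a nearest-rank 1%-low figure barely moves.

```python
import math

def percentile(sorted_vals, p):
    # nearest-rank percentile on an already-sorted list
    rank = max(1, math.ceil(p / 100 * len(sorted_vals)))
    return sorted_vals[rank - 1]

def summarize(frame_times_ms):
    ft = sorted(frame_times_ms)
    return {
        "avg_fps": 1000 * len(ft) / sum(ft),
        "min_fps": 1000 / ft[-1],                 # dominated by the single worst frame
        "1%_low_fps": 1000 / percentile(ft, 99),  # robust to one spike
    }

# 99 smooth 10 ms frames plus one 50 ms hiccup:
stats = summarize([10.0] * 99 + [50.0])
# min_fps collapses to 20.0 because of the lone spike, while 1%_low_fps
# stays at 100.0 -- the same run, two very different stories.
```

This is why frame-time analysis (or at least percentile-based lows) tells you more than a single minimum number.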
 
I think the Q6600 had a start to life that resembled that.
That's actually what I was thinking about, as I ended up building both an E6850 and a Q6600 machine back in the day. Since I use my systems for more than just gaming, I was ultimately happier long-term with the overclocked Q6600 (which really was a bargain; I think I was running mine at 3.2 GHz), and ditched the E6850 sooner. I have a feeling I'd end up with the same kind of buyer's remorse had I gone with the faster but half-the-cores Intel, especially since I am excited to see AMD on the up and up and can run a full red machine for a change. :)
 
Minimum FPS averages are useless without frame time analysis to see just how often it actually hits those minimums; plus, some built-in game benchmarks have built-in hiccups that contribute to minimums that just aren't accurate.

So, then their testing can adjust for these problem benchmarks, and they can already do a frametime analysis. I am pretty certain [H] already does that here.
 
So, then their testing can adjust for these problem benchmarks, and they can already do a frametime analysis. I am pretty certain [H] already does that here.
[H] does. Unfortunately, not a lot of sites do, as it is a bit of a headache to set up FCAT, because it requires two PCs to do it correctly. More and more sites are beginning to do it though. Digital Foundry normally does.
 
A stock Intel Core i7-7700K processor was able to offer superior gaming performance over the overclocked AMD Ryzen 7 1700 processor. In fact, when it came to 1080P gaming performance the NVIDIA GeForce GTX 1080 Ti was found to be an average of 34% faster on the Intel platform on the three game titles we took a look at. The 1440P scores also showed that the average frame rate was 22% higher on the Intel platform, so when it comes to 1080P or 1440P gaming performance the Intel Core i7-7700K and NVIDIA GeForce GTX 1080 Ti pair together nicely.

Haha man this is terrible! Add another 5-10% to that if the 7700k was overclocked. I had hopes for AMD and the early benchmarks looked ok but damn these numbers don't lie.

Have fun with your spreadsheets Ryzen owners.
 
Haha man this is terrible! Add another 5-10% to that if the 7700k was overclocked. I had hopes for AMD and the early benchmarks looked ok but damn these numbers don't lie.

Have fun with your spreadsheets Ryzen owners.

I most certainly will have fun with all the things my computer can do with my Ryzen processors, including gaming. Have fun with buying a separate cooler, a new board every other year when you want a new processor, and other such stuff. Good luck, you will need it. :D
 
Yikes, Ryzen is holding back the Ti so much that the difference between the Ti and non-Ti was sometimes minimal or even identical. Still a good all-around chip, no doubt, but gaming, as we already knew, goes to Intel.

I wish they could compare more games though.
 
I most certainly will have fun with all the things my computer can do with my Ryzen processors, including gaming. Have fun with buying a separate cooler, a new board every other year when you want a new processor, and other such stuff. Good luck, you will need it. :D

Ryzen is certainly a LOT better than prior AMD CPUs but AMD just doesn't seem to be able to quite strike the blow it needed with this launch. Certainly a great value for the cores and threads and in time I'm guessing the rough spots will get smoothed out. But AMD really needed something like the old Athlon XPs that clearly beat out Pentiums. Not that that was the expectation with Ryzen but I don't think Ryzen is good enough to put price pressure on Intel except maybe on their top end consumer CPUs which are low volume products anyway.
 
From that review it seems the higher the resolution the closer in performance the two become.

Well, right, because high res is GPU-bound. Less FPS at 4K = less being asked of the CPU.

The way I read the article, it's both.

The 1080 Ti is definitely faster than the 1080, approximately 25% from [H]'s own tests. At the high end, when the overall FPS is dropping, much of that will be GPU-bound. Where the Intel beats the AMD, clearly the AMD CPU is also hampering the 1080/1080 Ti's maximum performance, which shows it is also CPU-bound at that point. And in the 1080p results where the Ti is no faster than the 1080, that's all on the CPU.

The AMD is definitely not a gaming CPU. It will do well in business PCs, though.
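The CPU-bound/GPU-bound reading above reduces to a simple model: a frame takes roughly max(CPU time, GPU time), so a faster CPU only shows once the GPU stops being the limit. A toy sketch with invented per-frame costs (illustrative numbers, not measurements):

```python
def fps(cpu_ms, gpu_ms):
    # A frame is ready only once both the CPU and GPU work for it is done.
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical per-frame costs:
cpu_fast, cpu_slow = 6.0, 8.0   # two different CPUs
gpu_1080p, gpu_4k = 5.0, 16.0   # the same GPU at two resolutions

# At 1080p the CPU is the bottleneck, so the CPU gap shows up in full:
gap_1080p = fps(cpu_fast, gpu_1080p) / fps(cpu_slow, gpu_1080p)  # ~1.33x
# At 4K the GPU dominates and the CPU difference vanishes entirely:
gap_4k = fps(cpu_fast, gpu_4k) / fps(cpu_slow, gpu_4k)           # exactly 1.0
```

This is also why low-resolution testing isolates CPU performance: it shrinks the GPU term until the CPU term is the one setting the frame time.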
 