Intel challenges AMD: “Come beat us in Real World gaming”

I'm pretty sure these aren't the most popular titles among PC Gamers right now.

I play all of these games, and even late game with lots of stuff going on, my 2700X does just fine at 4K resolution.

I don't see Intel's point here. Yes, they are technically correct, but it's laughable that they're talking about 'real world' usage when, even if AMD's chips are marginally slower in single-core work, everything is already so fast that the difference doesn't matter.

So with AMD you spend less to get more cores, and generally more motherboard features for the money as well. Not to mention AMD's platforms last longer between CPU updates.
 
And for all you people who might try to argue that you need a new motherboard to get the gains from Zen 2: you have your head in the sand. This is not the case, and it has been verified. You get PCIe 4.0 from X570, but that is pretty much it, and it has a minuscule impact on almost all real-world workloads.

And to boot, you can still get PCIe 4.0 on X470, though only in the first slot, apparently because of clocking and trace length. That's part of why X570 boards are more expensive; re-timers, I'd imagine.

AVX-512 is like the SSE2 of old, back when the A64 and Athlon were giving Intel asspain. There's very little software support (earlier this year it was one, maybe three apps at most).
AMD will have an AVX-512 implementation in Zen 2, IIRC, as they doubled the registers for it.
 
"I challenge you to challenge them, because them challenging us is challenging after being unchallenged for so long."
 

Yep, I plan on upgrading off my 2700x, and will get four more cores (Or double my core count if I just wait for the 16 core Zen 2), an IPC gain, a frequency gain, along with PCIE 4.0 on my GPU slot.

You'll never get this type of upgrade on an Intel platform these days.
 
There's more to overall performance than just IPC.

Fantastic. I didn't quote his statement about overall performance, I quoted his statement about IPC.

But this AMD presentation is for Zen 2, where they are expected to close that IPC gap. Intel is just preempting that with BS tactics.

Intel has nothing except Ice Lake to exceed AMD's IPC, and that won't be available in more than 4 cores until the end of next year. This is the reason Skylake is getting yet-another-refresh, this time with TEN CORES (Comet Lake).

We don't yet know what Zen 2's average IPC will be. It will certainly be closer, but it remains to be seen whether it will match or exceed that of the *lake architecture. I expect it will, but again, nobody knows yet.

Nope.

Even Techspot's 2018 testing, done before the Intel security-hole patches took away its speed advantage, shows they are neck and neck outside of a few games.
Throw the Intel compiler advantage in most applications into the mix and you will have higher IPC on Zen in most scenarios. There's a Cinebench run from earlier this year with a 5GHz Zen+ vs a 5GHz whateverlake, and the Zen+ is faster by about 5% in IPC alone...
https://www.techspot.com/article/1616-4ghz-ryzen-2nd-gen-vs-core-8th-gen/page4.html

Memory timings/latency also impact this a lot, so it depends on who does the benchmarking. Techspot was using XMP, which is typically optimised for Intel use, as well.

Now take that 5-15% away with the recent patches and Zen+ is definitely higher IPC, let alone Zen 2 with an added 15% IPC gain...
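Back-of-the-envelope, the matched-clock "IPC" comparison above can be sanity-checked like this (a minimal sketch; the benchmark scores are made-up placeholders, not real Cinebench results):

```python
# Normalize a single-thread benchmark score by clock speed to get a crude
# "points per GHz" figure, a stand-in for IPC when both chips run at 5 GHz.
# The scores below are hypothetical, for illustration only.

def points_per_ghz(score: float, ghz: float) -> float:
    """Benchmark points per GHz of clock speed."""
    return score / ghz

zen_plus = points_per_ghz(score=205.0, ghz=5.0)  # hypothetical 1T score
lake = points_per_ghz(score=195.0, ghz=5.0)      # hypothetical 1T score

advantage = (zen_plus - lake) / lake * 100.0
print(f"Zen+ per-clock advantage: {advantage:.1f}%")  # ~5.1%
```

Same idea as the claimed 5% result: at identical clocks, any score difference is a per-clock (IPC) difference.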

Did you even read the article?

From the conclusion:
In applications such as Cinebench R15 we see that the single core performance is down just 3%
We found that AMD was 3% slower in the Corona benchmark but much the same for our Excel, V-Ray and video editing tests.
Then while it was 15% slower in HandBrake it was also 8% faster for the PCMark 10 gaming physics test.

And from the gaming results:
So while the 2600X improves on the 1600X by an 8% margin in Ashes of the Singularity, it's still a whopping 11% slower than the 8700K.
Moving to Assassin's Creed Origins, we see a mere 2% increase for the 2600X over the 1600X while the 8700K is a further 14% faster.
The margin is slightly reduced with the high quality preset but still the 8700K is 12% faster than the 2600X when comparing the average frame rate.
When testing with Battlefield 1 using the ultra quality preset we see that the 2600X is 9% faster than the 1600X but still 7% slower than the 8700K.
Here the 2600X again offered a 9% performance increase over the 1600X but it's now 10% slower than the 8700K which still appears GPU limited.
It's a similar story when testing with Far Cry where the 2600X is 10% faster than the 1600X, which is a huge improvement, but even so it's still 8% slower than the 8700K.

As for using XMP, it is simply an Intel spec for defining timings beyond the JEDEC spec. The timings themselves do not favor one platform over another, and without manual tuning, the end user will end up using the same timings no matter the platform they choose by enabling XMP/DOCP/whatever their motherboard of choice chooses to call those settings.

Finally, for the performance loss from the most recent security patches, that 5-15% is not typical for most users. Impact on gaming, for example, is a few percent on average and not enough to make up the gap between *lake and Zen 2.

Don't get me wrong here; I'm rooting for AMD. I buy AMD products when they make sense for me. I'm currently running a 1700X that I pre-ordered and an RX480. I have no axe to grind here beyond pointing out when people make idiotic statements that have no backing in reality.
 

Those are @ 1080p. At any modern resolution like 4K, it's the GPU holding you back, and at that point there is no difference between a 2700X and an 8700K in these games.

Not to mention, even if you're still on a 1080p monitor, that 14% difference is the difference between, like, 100 FPS and 90 FPS. Yes, that's a measurable difference, but it doesn't matter much. You aren't getting 144+ FPS either way.

Now, if running benchmarks is your thing, Intel is still the king. However, Intel suggesting they are better in the 'real world' is just wrong. In the 'real world' there is almost no scenario where buying their platform/CPU will give you a better playing experience.
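To put a rough number on that, here's a quick sketch of what a 14% CPU-bound gap means in frames and per-frame latency (the 90 FPS baseline is an arbitrary assumption for illustration):

```python
# Translate a percentage CPU advantage into FPS and per-frame latency.
# The 90 FPS baseline and 14% gap are illustrative assumptions.

baseline_fps = 90.0
faster_fps = baseline_fps * 1.14          # 14% faster -> 102.6 FPS

frametime_base = 1000.0 / baseline_fps    # ~11.1 ms per frame
frametime_fast = 1000.0 / faster_fps      # ~9.7 ms per frame
saved = frametime_base - frametime_fast   # ~1.4 ms per frame

print(f"{baseline_fps:.0f} vs {faster_fps:.1f} FPS: "
      f"{saved:.2f} ms shorter frametime")
```

A double-digit percentage sounds big, but in a GPU-limited or already-fast scenario it amounts to a millisecond or so per frame.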
 
Uh sorry but without them, internet forums and Reddit etc would not even exist.

Please don't take that away.

Half the fun is watching the people making those statements flail around and move the goalposts every time you point out where they're wrong.

Speak of the devil! This was posted just as I was typing the above reply:
Those are @ 1080p. At any modern resolution like 4K, it's the GPU holding you back, and at that point there is no difference between a 2700X and an 8700K in these games.

We're talking about IPC. The above statement has nothing to do with IPC. It's not wrong, it just has exactly ZERO RELATION to the thing we're discussing. Which, again, is IPC.

IPC.
 

The original discussion is based on Intel's statement about the 'real world'. I don't particularly care about IPC if it isn't actually letting me run a game better in the real world, to the extent that I can do something I couldn't on AMD. In every case, even at low resolutions, it isn't so much faster that this happens.
 
The original discussion is based on Intel's statement about the 'real world'. I don't particularly care about IPC if it isn't actually letting me run a game better in the real world, to the extent that I can do something I couldn't on AMD. In every case, even at low resolutions, it isn't so much faster that this happens.

So, for your situation you won't notice a difference. Intel is saying that in gaming, they are the fastest. Which is related to IPC and clockspeed. So everything they are saying is correct.
 
The original discussion is based on Intel's statement about the 'real world'. I don't particularly care about IPC if it isn't actually letting me run a game better in the real world, to the extent that I can do something I couldn't on AMD. In every case, even at low resolutions, it isn't so much faster that this happens.

Yes, and I'm being a bad forum citizen by getting a little off-topic (as I often do), but reading is essential, people!
 
This isn't necessarily accurate. On the OS side, the patches for vulnerability mitigation are likely in place, but not necessarily on the firmware side.
That's true, I meant the OS side. Just wasn't clear.
 
I am for competition, not loyal to any corporation; they are damned sure not loyal to anything but a profit margin. It would be awesome if AMD could take the gaming performance crown. They haven't yet. Not with CPUs or video cards. In video cards they don't even have the price/performance crown. It's disingenuous for anyone to come in here and say that 14% doesn't really matter because most games are GPU limited at 4K. If AMD ever comes out with a chip that beats Intel by 1%, you will never hear any of that "GPU limited" noise in here, guaranteed.
 
I am tired of Intel's posturing and AMD's canned "benchmarks". Give us actual benchmarks with price tags attached or STFU. I need parts I can buy. I have a lot of new equipment to purchase and deploy this year for staff (laptops, workstations, Citrix servers), and since I have a deadline and budgets in place, I will be going with whoever gets me what I need on time and within my ceiling.
 
AMD should challenge intel to a real world test

Intel gets its CPU, chipset, and the fastest Intel graphics.

AMD gets its CPU, chipset, and the fastest AMD graphics.

We can see who comes out on top?

You said real world gaming. No one uses integrated GPUs for that unless they only play titles that are a decade old.
 
AMD should challenge intel to a real world test

Intel gets its CPU, chipset, and the fastest Intel graphics.

AMD gets its CPU, chipset, and the fastest AMD graphics.

We can see who comes out on top?

And how exactly would that be a "real world test", given that in the "real world" no one cares about matching their CPU and GPU brand? Otherwise Nvidia would be in trouble, I guess?
 
Anyone dropping 500 on a 9900k probably isn't gaming at 1080p.

This isn't totally true anymore. You have hundreds of thousands of people emulating Twitch heroes like shroud, who is most likely running a 9900K paired with a 240Hz 1080p monitor. I play competitive games at 1080p 144Hz as well, and I'm about to get a 1440p 144Hz panel for everything else. I can't stand gaming on my 4K monitor unless it's something I'm playing with a controller (GTA V, TW3, Dragon Quest XI, etc.)... I can't stand anything under 100 FPS when I'm controlling the game with a mouse anymore.

I realize the difference between even a 4790k and 9900k in many games is marginal, but people want those extra frames... I may try out a 240hz panel myself soon.

so yeah I'll be spending $500 on my next CPU and still gaming at 1080P :p
 
I guess, never having used a 240Hz panel at 1080p, I don't know what I'm missing, but 1080p on a 4K panel is just so... blurry.
 
I guess, never having used a 240Hz panel at 1080p, I don't know what I'm missing, but 1080p on a 4K panel is just so... blurry.

It certainly is. It's definitely not eye candy. The trade-off is purely smooth, blur-free motion versus a beautiful picture. This is why most people are going for 1440p 144Hz (or at least 100Hz ultrawide 1440).

My 1080P 144hz panel is 27" too, so a side by side comparison with my 27" 4k IPS panel makes you want to vomit :p
 
I own a 9900K but even I can see Intel is desperate af. If I were to build a brand new system in the next few months, there's no way I'd choose Intel. But if there's something Intel is exceedingly good at like NVIDIA, it's marketing.


This isn't totally true anymore. You have hundreds of thousands of people emulating Twitch heroes like shroud, who is most likely running a 9900K paired with a 240Hz 1080p monitor. I play competitive games at 1080p 144Hz as well, and I'm about to get a 1440p 144Hz panel for everything else. I can't stand gaming on my 4K monitor unless it's something I'm playing with a controller (GTA V, TW3, Dragon Quest XI, etc.)... I can't stand anything under 100 FPS when I'm controlling the game with a mouse anymore.

I realize the difference between even a 4790k and 9900k in many games is marginal, but people want those extra frames... I may try out a 240hz panel myself soon.

so yeah I'll be spending $500 on my next CPU and still gaming at 1080P :p

So very true. Casual gamers don't understand that no hardcore gamer will choose 4K. The best setup is 240Hz + a 9900K@5+ GHz at 1080p with everything set on low. I run a hybrid: 1440p 144Hz, 9900K@5GHz, everything on low for FPS games (like Apex). Tons of Gen Z kids idolize streamers and watch these guys rather than TV; they're not like Gen X or Millennials, they were raised on iPads and have been plugged into Twitch all their lives. They are the future of gaming and who Intel is correctly targeting. AMD has a winner with Ryzen, but they should start influencing streamers with free builds to get the word out to Gen Z.
 

Slight veer from the topic, but how would you rate 240Hz vs 144Hz in real-world experience? My go-to game is still CS:GO, so I could pump out those frames, but I'm not too sure I'd really notice it.

I have a feeling a 9900K all-core @ 5GHz is still going to outgun a Ryzen 3800X @ 4.6GHz (I assume that's around where most water-cooled 3800Xs will land) in the gaming department, but very marginally, and not worth the extra cost over a 3800X as of July.

3900x on the other hand.... gaming and streaming at the same time with ease.
 
So I guess Intel loses then

You still never explained why, in your fictional, convoluted scenario, any gamer would feel the need to use an Intel IGP just because they have an Intel CPU, or how that has anything to do with "real world gaming". I guess it's just fun to make random stuff up so you can say "Intel loses" ;)
 
Link?

I did my google. Didn't see a direct response but you know Intel has to be feeling the heat now.

It was in the AMD live stream at E3. AMD basically showed its Ryzen 3000 series trading blows with everything from Intel with between 6 and 12 cores. The numbers were close enough that the difference was basically meaningless at 1080P and 1440P. They also showed a 12c/24t Ryzen 3900X utterly spanking Intel's 8 core 9900K while streaming and playing the Division 2.
 
It was in the AMD live stream at E3. AMD basically showed its Ryzen 3000 series trading blows with everything from Intel with between 6 and 12 cores. The numbers were close enough that the difference was basically meaningless at 1080P and 1440P. They also showed a 12c/24t Ryzen 3900X utterly spanking Intel's 8 core 9900K while streaming and playing the Division 2.

That was hilarious, I actually lost it at that point and had to laugh at how bad it was. Then they showed that 1.6 or 1.8 FPS and I laughed even [H]arder.
 
That was hilarious, I actually lost it at that point and had to laugh at how bad it was. Then they showed that 1.6 or 1.8 FPS and I laughed even [H]arder.

Yea, that was the best part: they are basically within 5% of each other in games across various market segments, and they mop the floor in multithreaded workloads and streaming.
 