R5 1600 vs i7-7800X 30 Game Showdown

Values such as 245W and 253W are total platform consumption, not the power consumed by the CPU alone, so the official TDP ratings aren't being violated. Toms measured 250W on the CPU when it was overclocked to 4.5GHz with the AVX-512 offset disabled. Correcting for clocks, we obtain for stock settings

250W × (3.3 / 4.5)^2 ≈ 134W

which agrees with the TDP. In fact, AT measured 149W for the same CPU on stock settings, with the ~15W difference easily explained by measurement errors (including losses in circuitry). However, the difference between the 129W measured for the 1800X on x264 and the official 95W is not the result of small uncertainties in the measurement apparatus or small losses from ~95%-efficient circuits. This 36% gap between measured power and marketing TDP is the result of the chip violating the marketing value.
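
As a rough sanity check (a sketch only, assuming power scales roughly with the square of the clock, as in the estimate above, and taking the quoted review figures at face value), the arithmetic looks like this:

```python
# Rough sanity check of the figures quoted above, assuming CPU package power
# scales roughly with clock^2 at a fixed workload.

oc_power_w      = 250.0  # Toms' figure for the i9 overclocked to 4.5GHz
oc_clock_ghz    = 4.5
stock_clock_ghz = 3.3    # i9 base clock

est_stock_w = oc_power_w * (stock_clock_ghz / oc_clock_ghz) ** 2
print(f"Estimated i9 stock power: {est_stock_w:.0f} W")          # ~134 W

measured_1800x_w = 129.0  # 1800X under x264
marketing_tdp_w  = 95.0
gap = (measured_1800x_w - marketing_tdp_w) / marketing_tdp_w
print(f"1800X measured power vs 95W marketing TDP: {gap:.0%} above")  # ~36%
```
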
AMD's TDP does not equal WATTS used, but rather the average expected heat that needs to be removed to maintain clocks. DOES that 1800X throttle during those tests when it hits that 129W? Like I have explained to you, when you can prove that a 95W cooler cannot maintain clocks over normal desktop usage, then you may have an argument. Till then, statements like yours above are just ignorant rants of flustered posters grasping at straws.
 
In the same test the 125W-rated FX-8350 consumed 125W, whereas the 95W-rated R7-1800X consumed 129W. This is happening because the TDP values that AMD gives for RyZen don't correspond to the usual concept of TDP. This has been noticed by many:


Indeed, with consumption measured at 128.9 watts on the ATX12V connector, it is clear that the power draw of the Ryzen 7 1800X exceeds the 95 watts announced as its TDP (Thermal Design Power).

So what is the TDP of the Ryzen chips in the sense of a consumption limit, i.e. the maximum number of watts to be dissipated? AMD also communicates this value, though less prominently: 128 watts for the 1800X/1700X, and 90 watts for the 1700. These are the values most comparable with the TDP communicated by Intel.

Under a full load, in this case Cinebench R15, the 1800X requires 33 watts more power to get the job done when compared to the 7700K system. In fact, despite the 95 watt TDP, the Ryzen CPU uses about the same power as the 140 watt Broadwell-E processors.
 
So instead of posting proof of the TDP/wattage causing throttling, as I pointed out, you instead parrot the same tripe you have posted multiple times.

Let me spell it out for you with an example you can take to the bank.

My 125W (later changed to 140W) FX-8350 was using an H55 from Corsair. This is a 120mm rad which is pretty thin. Even at 4.6GHz, pulling watts well into the mid-to-high 200s running Prime or IBT, it never throttled nor did it exceed 65C. So it seems to me that if my huge nuclear-powered, big-TDP 8350 wasn't running into issues using a sub-ideal cooler, then any of the Ryzens will have no issues running 95W TDP-rated coolers, with no throttling, melting, or setting the owner's house ablaze.

So again, the TDP rating used by AMD for years is only in reference to the cooling necessary to maintain the rated clocks, NOT the total power usage seen in a MAX thread-burning program. Also keep in mind this is the desktop segment; Excel documents and content creation ARE NOT the intended audience of that segment, but rather of the HEDT and professional lines.

So in conclusion, for TDP to be an issue you MUST first prove that it in some way causes a performance loss such that the stated clocks CANNOT be maintained. For one reason or another you have somehow been misinformed that AMD's stated TDP is somehow a statement of its max power usage at said stock clocks.
 
Just gonna leave this here:

[image: power consumption chart from a Core i9-7900X Skylake-X review]


Notice how a certain 10-core is adhering to its rated TDP value and the other 10-core is not.
 

Those numbers look very incorrect. Toms measured ~250W only when the 10-core CPU was overclocked to 4.5GHz, which implies about 140W at stock clocks. AT measured 149W at stock. HFR measured 150W at stock...

Precisely: HFR noted how the 10-core matches the official TDP, whereas a certain 8-core from another company does not:

It should be noted, however, that the 7900X consumes quite a lot under load. The use of the AVX units in x264 plays a role here. In spite of everything, the 7900X stays within its announced TDP, which is appreciable (the 1800X, as a reminder, needlessly mistreats the notion of TDP by not respecting it in practice according to our criteria).

Indeed, the 1800X is a huge 36% above the marketing TDP. Note, however, that the measured 129W agrees closely with the non-marketing TDP of 128W.

 
You know folks, this is all fun and games, but in regard to the OP, I will just repeat my note:

A 4GHz 1600 and a 4GHz 7800X perform similarly (a few outliers notwithstanding) and consume similar amounts of power. The real question is what happens with OCs and outliers.
 

Overclocking the 7800X by 34% only brings a 3% framerate gain. The CPU is being bottlenecked and not showing its true performance. Therefore the conclusions are incorrect.
 

It is a funny thing that I did not say a word about Ryzen's TDP in my post, but you are still bringing it up all the time. As someone has pointed out already: TDP does not equal WATTS.

You are also saying that Toms measured 250 watts after overclocking, but when I look at this chart, I don't see a mention of overclocking anywhere on the page. I do, however, see this remark: "These numbers are generated using stock motherboard settings."

[chart: Tom's Hardware power consumption figures]


And before you start rambling about unfair measurements, as in using Prime95, which does not stress Ryzen properly: this isn't about AMD.

And Toms really should switch to FIRESTARTER as a torture benchmark because it stresses Ryzen more than Prime95 atm.
 

Of course, "TDP does not equal to WATTS" because one is the physical quantity and the other is the unit.

The 250W is from Toms' later revision of power consumption on SKL-X:

Let’s start with the massive 250W, which Intel's Core i7-5960X reached at 4.8 GHz back when we reviewed it. Core i9-7900X gets there at 4.5 GHz under Prime95, and at 4.6 GHz with LuxRender in a console loop.

It is evident that if the i9 draws 250W at 4.5GHz, then it cannot be drawing 229.6W at stock settings, not to mention the other reviews that got 149-150W at stock settings.
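
As a quick sketch of that point, assuming the same rough clock-squared power scaling used earlier in the thread and taking the 250W @ 4.5GHz figure as the reference:

```python
# Sketch: under a rough clock^2 power scaling, what clock would a 229.6W
# reading correspond to, if 250W corresponds to a 4.5GHz overclock?
implied_clock_ghz = 4.5 * (229.6 / 250.0) ** 0.5
print(f"229.6W implies roughly {implied_clock_ghz:.1f} GHz")  # ~4.3 GHz, nowhere near the 3.3 GHz base clock
```
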
 

You mean earlier revision... :whistle:

The chart I posted is from the 7820X review... which was published two days ago.

But I don't really wanna continue this back-and-forth bickering. The biggest problem comes from the mobo makers, as there's a huge skeleton in the closet in the form of "stock settings", where said stock settings have all kinds of turbo enhancements enabled by default.
I'm pretty sure there's not a single high-end mobo which is actually running with stock Intel turbo settings out of the box. Even Asus, which is at least trying to adhere to Intel's 140W TDP guideline, has its "Asus MultiCore Enhancement" setting enabled by default.

And this also applies to AMD mobos, but I guess to a lesser degree.
 

I mean "later" because the data for the i9 in the graph that you did bring is from their early review of the i9, before they revisited the issue of power consumption on the i9. Who knows what they did for the 7820X review? Maybe they will also a revision for the 7820X in future.
 
About the reason why the 30-game showdown shows only a 3% increase for the 7800X overclocked to 4.7GHz, the author now claims that

[screenshot of the author's explanation]


but if the problem were on the SKL-X chip, the Kaby Lake chip wouldn't have the same problem. However, he overclocked the Kaby Lake chip by 16% and performance only increased by 2%. The problem is not in SKL-X; the problem is somewhere else. He tested in frame-limited and GPU-bound situations, and that is why overclocking the Intel chips didn't bring any noticeable performance gain.
 
Wait, hold the presses... I am sure there are more excuses... That can't be the only one.

Seriously, Intel has some SERIOUS competition now, and by general consensus one can choose whatever the hell they want and expect great performance, for now with current releases. However, the rest has yet to unfold: TR 12C/16C and SKL-X 12C/14C/16C/18C. I can't wait to see what TR brings, and the aftermath if it happens to be good. And by the Alienware pre-testing it looks real good.
 

The Kaby Lake truly was hitting a ceiling in most of the games.

" This was another instance where the 7800X fell way behind the 7700K and the same is also true for the R5 1600. That said, whereas the 1600 was 15% slower than the 7800X at the stock clock speeds, overclocking both processors reduced the margin to 0, as both allowed no less than 61fps to be rendered." in regards to Hitman

So yeah, not always CPU limited.

The Ryzen was able to achieve over twice the gains when overclocked, despite a lower OC percentage...
 
lol, looks like Skylake-X may be Intel's Bulldozer :LOL:. How in the hell does a 6-core Skylake at 4.7GHz perform worse in every game than a 4-core Intel chip at 4.9GHz while consuming so much more power too? This is so funny. Who would even consider a 7800X on the X299 platform anyway? Looks like the best gaming CPU for the money, including the platform, is the RyZen 1600 and AM4.
 
Hey, watch it, I loved my FX builds which worked very well. :) ;)
 
lol, I have two motherboards and two FX CPUs in my closet with RAM onboard. The FX days are over, my friend - time for RyZen.
 

FX are still quite adequate depending on your needs. I'm using one right now.

Don't get me wrong though, the R5 1500X has me drooling, but I can't justify the purchase right now.
 
Agreed, I had no issue with the FX 9590 and a 1070, and then 1070 SLI - only seldom in gaming was it limited. For normal stuff it makes virtually zero difference; rendering, video, and stuff like that is when it really comes into play.
 
I was and still am a big supporter of the FX series, but after seeing my wife's 1600 (non-X) run effortlessly at stock, I have to say that if you can afford a Ryzen then go ahead and get it; it's worth every penny. My FX 8350 will have to hold out a bit longer so I can afford it more reasonably, anyway.
 

I did mean the whole 30-game average, but thanks for mentioning this specific game, which highlights my point perfectly.

  • Overclocking the 1600 by 25% increased performance by 24%.
  • Overclocking the 7700k by 17% increased performance by only 2%.
  • Overclocking the 7800X by 34% increased performance by only 6%.

It is evident that both Intel chips are being bottlenecked by huge amounts, and that is why the RyZen was able to get close to the SKL-X when overclocked. On stock settings the 7800X was 22% faster than the 1600. The 15% he mentions refers to taking only minimum FPS (which reduces the performance gap) and then using the 7800X as the baseline to shrink the percentage even more. "15% slower than" (7800X baseline) is the same as "19% faster than" (1600 baseline).
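
To put some quick numbers on the bottleneck argument, here is the same arithmetic as a small script (a sketch only: the clock and FPS percentages are the ones quoted above, and "scaling efficiency" is just shorthand for how much of the clock increase shows up as extra frames):

```python
# (clock_gain, fps_gain) pairs are the percentages quoted above for this game.
results = {
    "R5 1600":  (0.25, 0.24),
    "i7-7700K": (0.17, 0.02),
    "i7-7800X": (0.34, 0.06),
}

for cpu, (clock_gain, fps_gain) in results.items():
    efficiency = fps_gain / clock_gain  # fraction of the OC that became frames
    print(f"{cpu}: +{clock_gain:.0%} clocks -> +{fps_gain:.0%} fps "
          f"(scaling efficiency {efficiency:.0%})")

# Baseline conversion: "15% slower than the 7800X" is the same result as the
# 7800X being about 18% faster than the 1600 (the ~19% quoted above presumably
# comes from the unrounded frame rates).
slower = 0.15
print(f"15% slower (7800X baseline) ~ {slower / (1 - slower):.0%} faster (1600 baseline)")
```
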

And let us not forget he didn't even test a SKL-X retail chip, but a sample. This review detail is easily forgotten...
 
The bottleneck was the 7800X and not the game. It's like overclocking does nothing but add power and heat.
Even overclocked, it could not match a stock 7700K, so stop pretending all is fine.

The overclock on the 7700K in the same game did nothing either, so stop pretending that the problem was on the 7800X side. And let me emphasize once again that the 7800X he used wasn't a retail chip, but an engineering or qualification sample.
 
As great as Ryzen is for AMD and consumers, let's not pretend this review isn't faulty. Here's another 7800X review where the 1800X trails behind the 7800X, just like in 99.9% of the reviews out there, except for this one that the red team keeps using as ammo. You're telling me the 1800X is slower than a 1600? What?
 
You have to be very careful. A lot of sites reuse previous data and don't actually re-run the chip. Most 1800X data is from the original pre-BIOS-fix runs and is therefore not accurate for current users.
 
RyZen RAM settings and speed make a huge difference, especially for the minimums in games. We are talking 20%-30% here. Memory sub-timings can give a dramatic increase in game performance of 5%-10% at the same DDR4-3200 speed. So yes, data can be all over the place. The 30-game review appears to be legit for the RyZen machine; no idea if there is some issue with the X299/7800X. So far anyone with an X99 platform has zero reason to purchase an X299 system. Gamers also have zero reason at this point. The 7800X is basically crap at its price point, with limited PCIe lanes castrating the ability of the platform; it costs more and performs worse than a 4-core in games.

I believe the review's Ryzen setup could be improved further with better memory timings and speed as well, so the full potential of the system can be shown. Also, talking about a 1080 Ti at 1080p, the 7800X was the limiting factor - why? Because the 4-core 7700K pushed faster frame rates, indicating the 1080 Ti was being limited by the 7800X in these benchmarks.
 


Digital Foundry did a comparison between the 1600 and 1600X and their Intel price counterpart, the 7600K. As you can tell by the title, DF endorsed AMD, the Ryzen 1600 to be specific. Looking at ultimate gaming performance, the 7700K is top of the board, but DF is focusing more on mainstream gaming and budget, and here the 1600 shines. Mainstream usually means a GTX 1060 and 1080p, and the GPU-bound performance erodes Intel's advantage in FPS. If you're waiting some time before building something, the video notes Coffee Lake is likely to overtake Ryzen across the board, though. Still, speaking as a budget-conscious gamer, DF offered plenty to consider between Intel and AMD.
 


Yep, nothing wrong with Ryzen for the budget gamer that doesn't care for absolute top performance. My Ryzen rig in my sig games just fine, but I like to have a rig that can push FPS a bit more. Hence having both a Ryzen and Skylake rig.
 
The 7700K is competitive with the R7 since it usually beats it in gaming.
However, the 7600K gets completely slapped around by the R5 in almost all benchmarks. A 6-core i5 was a MUST for Intel.
 


Not much talking; DF simply presents 3 benchmark runs comparing Ryzen 5 and 7 against each other and their Intel counterparts, the 7600K, 7700K, and i5-6500. Games tested were: Rise of the Tomb Raider (DX12), Far Cry Primal, Crysis 3, The Witcher 3, Assassin's Creed Unity, and The Division. Timestamps in the description.
 