Zen 3 is rumored to be flaunting monumental IPC gains in early testing

It's likely a combination of lower cache latency and higher clock speeds. Zen 3 should have improvements in both.
 

Your link is broken.


https://wccftech.com/amd-ryzen-7-5800x-vermeer-8-core-16-thread-zen-3-cpu-benchmark-leaks-out/

What's more important to look at are the CPU framerates and here you can see the Ryzen 7 5800X completely crushing the Intel Core i9-10900K in terms of max framerate. In the Normal Batch run, the AMD Ryzen 7 5800X delivers up to 22% higher framerate than the Intel Core i9-10900K and 16% average in all batch results. We do not know the final clocks for the Ryzen 7 5800X yet but the Intel Core i9-10900K does feature more cores and threads and even runs at clock speeds up to 5.3 GHz.

It's a promising start, but Ashes... we really need something else to better gauge the gains.
 
Definitely intriguing, looks like AMD may have closed the gap on Intel finally. Looking forward to the reveal and reviews. I’ll be buying one as that was my plan since I built my current setup.
 
Hopeful this thing keeps beating cheeks in other titles. If these things are tits, I'm putting this 6700k out to pasture, I think.
 
I really hope October 8th is not just a paper launch...rumors are that the Oct 28th Big Navi announcement will also be a soft launch
 
I really hope October 8th is not just a paper launch...rumors are that the Oct 28th Big Navi announcement will also be a soft launch

I will be really disappointed if oct 8th launch is like nvidia's. I really really want a new system. Did the 3950x come out on launch or was it after the rest? If the new zen 3 chips are as rumored I want the 5950x.
 
I just hope the 5xxx APU series is still compatible with my B450 Asrock board, I couldn't care less if they aren't compatible with the higher tier CPUs.

But yeah. Zen 3 sounds promising and exciting, though that high TDP on the higher tier CPUs is a bit unexpected for a mainstream CPU. But as long as it can deliver the performance, nobody really cares.
150w is now considered high for a top end CPU? Lol, when did this happen? Someone should notify Intel about their 10600k, 10700k and 10900k.
10600k = 125w, PL2 = 182w
10700k = 125w, PL2 = 229w
10900k = 125w, PL2 = 250w

AMD 3700x = 65w, PPT = 88w
AMD 3900x = 105w, PPT = 142w
An AMD 150w TDP will end up right around ~202w PPT.

So, a 5900x 12/24 core/thread @ 150w is ~30w less than a 10700k and ~50w less than a 10900k. I don't think this is excessive or unexpected for a "mainstream" CPU (it's not really mainstream; the 5600x will be mainstream).

Also, there's a chance it was a typo and was supposed to say 105w ;).
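For context on the PPT numbers above: AM4 boards typically set the package power limit at about 1.35x the rated TDP, which is where the ~88w/~142w/~202w figures come from. A quick sanity check in Python:

```python
# AM4 convention: package power tracking (PPT) limit ~= 1.35x rated TDP
def ppt(tdp_watts: float) -> int:
    """Approximate AM4 PPT limit from a rated TDP."""
    return round(tdp_watts * 1.35)

for name, tdp in [("3700x", 65), ("3900x", 105), ("rumored 150w part", 150)]:
    print(f"{name}: {tdp}w TDP -> ~{ppt(tdp)}w PPT")
# 3700x: 65w TDP -> ~88w PPT
# 3900x: 105w TDP -> ~142w PPT
# rumored 150w part: 150w TDP -> ~202w PPT
```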
 
I will be really disappointed if oct 8th launch is like nvidia's. I really really want a new system. Did the 3950x come out on launch or was it after the rest? If the new zen 3 chips are as rumored I want the 5950x.
AMD has had limited quantities in recent launches, so I assume this will be similar. Not as bad as the 3080 launch, but still not a chip available for everyone that wants one. It also depends on which CPU, as some may sell out faster than others.
 
anyone know if Walmart sells new AMD/Intel CPUs at launch?...I know they have them available but was wondering if they sell them on Day 1...I have some Walmart credit so I was hoping to use that
 
AMD has had limited quantities in recent launches, so I assume this will be similar. Not as bad as the 3080 launch, but still not a chip available for everyone that wants one. It also depends on which CPU, as some may sell out faster than others.

Well that sucks, it has been a while since I bothered buying any computer hardware on launch. With the 3080 launch being so disappointing, I don't think I'll be able to get a new system before Cyberpunk comes out.
 
Well that sucks, it has been a while since I bothered buying any computer hardware on launch. With the 3080 launch being so disappointing, I don't think I'll be able to get a new system before Cyberpunk comes out.
Like I said, if the past is any indication, NVIDIA should have had more quantity available as well (based on their past launches). But if AMD is true to form, people WILL be able to buy stuff and not see it sold out in 3 seconds. Of course, these aren't normal times either, so nobody really knows.

Rumor is AMD's release was pushed back from the originally planned date just to make sure they had some quantity (which may be why NVIDIA was in such a rush, thinking AMD would release earlier). Hard to say for sure as they don't really share a lot of this information.

We'll never know how low NVIDIA's quantities were this launch compared to last. They gave out stats like "4 times more unique visitors to the site", but bots are known to use VPNs with unique IPs to keep from being caught, so the 4 times more visitors could have simply been the bots. It also could have been that the FE was cheaper and built better than most other AIB cards at MSRP. In the past, the FE cost more, so maybe more people were interested in the FE this time around for those reasons. AMD also won't disclose how much supply they'll have, so we won't really find out how short supply is until release happens.
 
AMD has had limited quantities in recent launches, so I assume this will be similar. Not as bad as the 3080 launch, but still not a chip available for everyone that wants one. It also depends on which CPU, as some may sell out faster than others.
It was mostly the 3900X and 3950X that were limited IIRC, the 3600 and 3700X seemed to be OK around launch.
 
It was mostly the 3900X and 3950X that were limited IIRC, the 3600 and 3700X seemed to be OK around launch.
Yeah, that's why I said it also depends on which CPU ;). 3800x was not the most sought after since the 3700x had the same cores and was not far off in frequency. It's been a while, so I don't recall which sold out quickest, but I remember them being in stock here and there and a few having decent stock from the get go. I wouldn't swear to which ones though, lol.
 
If the 5700X or 5800X, whichever is the 8-core single-CCX CPU, really shows massive gains in gaming, then I would expect that to be in short supply. It depends on yields too I guess, but those should be "OK" at this point. I really hope it's not a paper launch, that would really suck.
 
If the 5700X or 5800X, whichever is the 8-core single-CCX CPU, really shows massive gains in gaming, then I would expect that to be in short supply. It depends on yields too I guess, but those should be "OK" at this point. I really hope it's not a paper launch, that would really suck.
If it's a single CCX die, it may be a bit more difficult to come by. Keep in mind, the 5600x can have up to 2 borked cores in a CCX and still make the cut; a single-die 5700x/5800x would require all cores to work at whatever frequency they end up at, so higher binning for sure. I'm not positive if both will be single CCX; it's possible the 5700x could be dual CCX and the 5800x a single. I really haven't heard too much, but I'm really hoping they're both single-CCX full 8 cores, would REALLY put some pressure on Intel if they can close that last gap. Leaked benchmarks are showing well, but I'll wait for real benchmarks before I put too much stake in them.
 
If one is primarily using this new CPU for gaming and plans to use it for 4 years - does it make any sense to go with 12 cores vs 8? In other words, is there any evidence of games being able to leverage more than 8 cores within next 4 years?
 
I think Civilization V scales to 10 threads, but I'm unable to confirm this since I don't have the game. You should be safe with an 8c/16t CPU for some time, as game devs are slow to adopt new hardware capabilities to avoid losing sales from gamers on not-so-outdated hardware like 4c/8t. With consoles moving to Zen 2 CPUs at 8c/16t, I'm guessing devs will have more reason to target higher thread counts a bit faster in the near future.
 
If one is primarily using this new CPU for gaming and plans to use it for 4 years - does it make any sense to go with 12 cores vs 8? In other words, is there any evidence of games being able to leverage more than 8 cores within next 4 years?
The new consoles might necessitate this, with 8 cores including 2-way SMT for 16 threads.
 
Yea, I feel that too. AMD jumping up to 150w tdp seems like a serious regression.

Might have been needed for the clock speeds they targeted. However, 150 watts is not horrible either; I can live with 45 more watts if it makes it substantially better in performance.
 
Might have been needed for the clock speeds they targeted. However, 150 watts is not horrible either; I can live with 45 more watts if it makes it substantially better in performance.
I could live with it too, it's still less draw than a 10700k pulls, but I would think ~43% more power to get 100-200 MHz is possibly a typo. The new process should allow slightly better clocks at the same voltage, so a ~43% increase in TDP for a ~4% increase in clocks seems a little off, but who really knows until it's revealed. Rumors are just that, rumors. Until the chips are in hand and benchmarked by independent reviewers, I won't be making a decision whether or not to upgrade anyway. At that point I'll take everything into account (performance where I need it, power draw, price) and decide whether I want to upgrade or not.
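To put numbers on that: the percentage math is simple, though note the ~4.6 GHz boost baseline below is my assumption (the 3900x's rated boost), not a confirmed Zen 3 spec:

```python
def pct_increase(old: float, new: float) -> float:
    """Percent increase going from old to new."""
    return (new - old) / old * 100

# 105w -> 150w rumored TDP bump
print(f"TDP: +{pct_increase(105, 150):.0f}%")        # TDP: +43%
# +200 MHz on an assumed ~4.6 GHz boost baseline
print(f"Clocks: +{pct_increase(4600, 4800):.1f}%")   # Clocks: +4.3%
```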
 
AMD Ryzen 9 5900X 12 Core & Ryzen 7 5800X 8 Core Zen 3 CPUs Could Potentially Launch As Early As October 20th

The potential launch date of AMD's next-generation Ryzen 9 5900X and Ryzen 7 5800X Vermeer Zen 3 CPUs may have been unveiled and the Ryzen 5000 CPU series could hit the market even before the introduction of AMD's RDNA 2 based Radeon RX 6000 series graphics cards...as for the launch date, both sources reported at least one day that matches and that's the 20th of October...

https://twitter.com/1usmus/status/1311196944811921410
 
I could live with it too, it's still less draw than a 10700k pulls, but I would think ~43% more power to get 100-200 MHz is possibly a typo. The new process should allow slightly better clocks at the same voltage, so a ~43% increase in TDP for a ~4% increase in clocks seems a little off, but who really knows until it's revealed. Rumors are just that, rumors. Until the chips are in hand and benchmarked by independent reviewers, I won't be making a decision whether or not to upgrade anyway. At that point I'll take everything into account (performance where I need it, power draw, price) and decide whether I want to upgrade or not.

I think gaining 4 more cores that can clock that high might be why the bump in TDP.
 
I think gaining 4 more cores that can clock that high might be why the bump in TDP.
I guess that is a possibility, but it still doesn't really make sense... the 3950x is a 105w part and is 16/32 (and clocks higher than a 3900x)... so if they bump the 5900x to 16/32, it still doesn't make sense that they'd need that much more power for minimal increases in frequency. Not saying it can't be true (maybe it runs much lower most of the time but can draw up to 150w when pushed hard), it just seems like a very large jump in power for almost no change in frequencies.
 
It's a promising start, but Ashes... we really need something else to better gauge the gains.

I've been saying the same thing... it's a nice start, but it's also a benchmark that would showcase Zen 3 in its best light, a best-case scenario. I'd like to see some 1080p/1440p gameplay benchmarks in games like RDR2, etc...
 
I've been saying the same thing... it's a nice start, but it's also a benchmark that would showcase Zen 3 in its best light, a best-case scenario. I'd like to see some 1080p/1440p gameplay benchmarks in games like RDR2, etc...
Nah, picking a game that has always favored AMD would be best case. The 3950X at min quality was around 130fps... the 10900K at min quality was around 155... so it was about 20% faster than Zen 2. Not exactly showing something in its best light when it was getting thoroughly beaten. Anyways, it's a single data point, and we can't make too much of it, but I certainly wouldn't call it best case: it was losing by 20% and is now shown ahead by ~15%... that's a pretty drastic jump in a single game (and that was top-end Zen 2 against not-top-end Zen 3). The 3800X by comparison (since the new benchmark shows the 5800X) was hitting 120fps, which makes it an even larger increase. I don't expect all games to gain >20% performance, but if it's ~10% on average, that pretty much closes the gap to Intel for the most part. I'm sure there will still be games that favor one vs the other, so finding benchmarks for things you use/play is always recommended. For sure gives me some hope though.
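For what it's worth, the quick math on those framerates (using the rough numbers above):

```python
def gain(base_fps: float, other_fps: float) -> float:
    """Percent advantage of other_fps over base_fps."""
    return (other_fps / base_fps - 1) * 100

# Rough Ashes min-quality numbers quoted above
print(f"10900K over 3950X: +{gain(130, 155):.0f}%")  # ~19%, i.e. 'about 20%'
print(f"10900K over 3800X: +{gain(120, 155):.0f}%")
```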
 
good to see the rumors of a new X670 chipset for Zen 3 were false...so I can rest easy with the MSI X570 Tomahawk board I bought in August

it never made any sense, as X570 is more than enough...save X670 for the next socket and DDR5
 
AMD has a bigger long-term question: will TSMC continue to be able to deliver?

What's the plan for when they inevitably do not?

What's the plan when TSMC decides that someone else is more deserving of their fab capacity?

This is why Intel stayed in the fab business; it's biting them in the ass today, but it has delivered long-term and will likely continue to deliver long-term.

At the very least, TSMC's current process family has been fairly successful, and thankfully, while Intel experiences their rare stumble, AMD has something actually worth producing!
Fairly successful!?! They are absolutely the most dominant fab in the world and have been for years. The reports on 5nm and beyond paint a stunning picture for the next few years at least. Maybe Samsung can catch up? Intel's roadmap for next year looks tough; another round of 14nm is ridiculous.
 
150w is now considered high for a top end CPU? Lol, when did this happen? Someone should notify Intel about their 10600k, 10700k and 10900k.
10600k = 125w, PL2 = 182w
10700k = 125w, PL2 = 229w
10900k = 125w, PL2 = 250w

AMD 3700x = 65w, PPT = 88w
AMD 3900x = 105w, PPT = 142w
An AMD 150w TDP will end up right around ~202w PPT.

So, a 5900x 12/24 core/thread @ 150w is ~30w less than a 10700k and ~50w less than a 10900k. I don't think this is excessive or unexpected for a "mainstream" CPU (it's not really mainstream; the 5600x will be mainstream).

Also, there's a chance it was a typo and was supposed to say 105w ;).
And quoting myself, it was a typo and it was announced @ 105w, so you are safe to not even have to come close to Intel level power draws ;).
 
And quoting myself, it was a typo and it was announced @ 105w, so you are safe to not even have to come close to Intel level power draws ;).
Yup, the 5950X with 16c/32t will draw 40W less than the 6c/12t 10600k. That's mind-boggling to me. Intel has to figure out how to get more than 4c from their 10nm process in large volumes, or AMD will go unchallenged for the foreseeable future, which is as bad as the opposite. AMD may hit volume 5nm before Intel even ramps 10nm in volume... And I don't buy them jumping right to 7nm. Having worked in the industry, going to a smaller node never solves yield issues; it makes them much worse. The only way that's possible is if solving whatever issue they're having at 10nm also fixes it for 7nm, and they'd have to solve it completely and not incrementally.
 
Yup, the 5950X with 16c/32t will draw 40W less than the 6c/12t 10600k. That's mind-boggling to me. Intel has to figure out how to get more than 4c from their 10nm process in large volumes, or AMD will go unchallenged for the foreseeable future, which is as bad as the opposite. AMD may hit volume 5nm before Intel even ramps 10nm in volume... And I don't buy them jumping right to 7nm. Having worked in the industry, going to a smaller node never solves yield issues; it makes them much worse. The only way that's possible is if solving whatever issue they're having at 10nm also fixes it for 7nm, and they'd have to solve it completely and not incrementally.
Yeah, I mean, I'm not overly concerned with power draw, but this is a pretty large gap. Now AMD just needs to figure out how to get the heat out of the chip faster, lol. Even my 65w chip stays pretty warm :).

Considering Intel already announced 7nm delays that "wouldn't affect the schedule" and then announced a delay to at least some of their contracts... I'm not really sure why anyone would trust anything they say about schedules and process nodes. I'm not even sure how that's legal, honestly; I'm surprised they haven't had a lawsuit for lying to investors for that long. I mean, they've been saying 10nm is on schedule since what, 2015? Then it was a slight delay, but it'll be out by 2016... then 2017, then 2018. At some point they knew it wasn't working and were knowingly misleading investors and regulators.

It's crazy because in the past you would have bet on TSMC missing their targets and Intel forging ahead; amazing to see how quickly tides can turn. I mean, TSMC is already on 5nm... AMD will be switching next year, give or take, once AM5, DDR5 and PCIe 5 come out. Heck, Intel is still struggling to get PCIe 4.0 working, which is pretty sad. It is funny to watch them benchmark their own PCIe products on AMD systems because they don't have anything to test them with. Hard to sell Optane drives with PCIe 4.0 to data centers when that means the data center would have to run AMD EPYCs.

Anyways, the IPC increases *sound* great from AMD. Excited to see what happens in the hands of reviewers and to see what other models eventually come out; I'm probably going to hold out for a 5700x to replace my 3700x. I don't think it's worth it to step up to the 5800x, at least not at this time for me. If I still had my R5 1600, it would have made more sense.
 
Waiting for Flight Simulator 2020 tests. That game seems to be particularly sensitive to IPC performance.
 