Zen 3 is rumored to be flaunting monumental IPC gains in early testing

kamikazi

Gawd
Joined
Jan 19, 2006
Messages
659
as long as it's a decent bin of B-die, 4133 shouldn't be an issue @ say something like 18-18-18-38 for starters @ 1.4+ Vdimm.. I'd keep what you have


Good luck. My B-Die G.Skill Trident Z, rated XMP 4000 19-19-19-39 at 1.35v, needs 1.4v for stability at 3600 16-16-16-36, with SOC set one notch above 1.1v in BIOS so that it actually gets the full 1.1v. Latency in AIDA is 67.2ns and I can't get it lower at 3600. I took my RAM settings from someone's settings on a Gigabyte board; I couldn't get anything from the 1usmus DRAM calculator to work. At 3733 with similar timings, it needs 1.47v. It's not on my ASUS X570 Crosshair VIII Hero's QVL though, and I think that's the issue. It's this RAM: G.SKILL TridentZ RGB Series 32GB (2 x 16GB) 288-Pin DDR4 SDRAM DDR4 4000 (PC4 32000) Intel XMP 2.0 Desktop Memory Model F4-4000C19D-32GTZR. I'll bet it would be fine on an Intel board.

I'm actually replacing it with 64 GB of Hynix C Die G.Skill memory that's on the QVL for my motherboard. It's this RAM: G.SKILL Ripjaws V Series 64GB (4 x 16GB) 288-Pin DDR4 SDRAM DDR4 3600 (PC4 28800) Desktop Memory Model F4-3600C16Q-64GVKC and it was 10% off on Monday. The 32 GB kit of the same RAM works well on my Asus x570 TUF Gaming Plus and I could actually use the 1usmus calculator. AIDA latency is 68.4ns.

Of course, your kit is a Trident Z Neo kit, so it should be on your motherboard's QVL when it comes out, and it should work for you at least at XMP settings. Good luck.
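For anyone curious why a 3600 CL16 config can beat an XMP 4000 CL19 one on raw latency, the first-word CAS arithmetic is easy to check. Note this is only the CAS component; AIDA's full figure also includes memory-controller and fabric latency, so it won't match the 67.2ns number exactly:

```python
def cas_ns(mt_s: float, cl: int) -> float:
    """First-word CAS latency in ns: CL memory-clock cycles,
    where the memory clock is half the DDR transfer rate."""
    return cl * 2000.0 / mt_s

# The three operating points discussed above
for mt_s, cl in [(4000, 19), (3600, 16), (3733, 16)]:
    print(f"DDR4-{mt_s} CL{cl}: {cas_ns(mt_s, cl):.2f} ns")
```

DDR4-4000 CL19 works out to 9.50 ns versus 8.89 ns for DDR4-3600 CL16, which lines up with the tighter kit feeling faster despite the lower clock.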
 

newls1

Supreme [H]ardness
Joined
Sep 8, 2003
Messages
4,337
My last set of Trident Z Neos used DJR, but it was the CL19 kit, so I had 4x16GB sticks @ 3735 @ CL19 @ 1.45v.
 

thesmokingman

Supreme [H]ardness
Joined
Nov 22, 2008
Messages
6,229
Good luck. My B-Die G.Skill Trident Z, rated XMP 4000 19-19-19-39 at 1.35v, needs 1.4v for stability at 3600 16-16-16-36, with SOC set one notch above 1.1v in BIOS so that it actually gets the full 1.1v. Latency in AIDA is 67.2ns and I can't get it lower at 3600. I took my RAM settings from someone's settings on a Gigabyte board; I couldn't get anything from the 1usmus DRAM calculator to work. At 3733 with similar timings, it needs 1.47v. It's not on my ASUS X570 Crosshair VIII Hero's QVL though, and I think that's the issue. It's this RAM: G.SKILL TridentZ RGB Series 32GB (2 x 16GB) 288-Pin DDR4 SDRAM DDR4 4000 (PC4 32000) Intel XMP 2.0 Desktop Memory Model F4-4000C19D-32GTZR. I'll bet it would be fine on an Intel board.

I'm actually replacing it with 64 GB of Hynix C Die G.Skill memory that's on the QVL for my motherboard. It's this RAM: G.SKILL Ripjaws V Series 64GB (4 x 16GB) 288-Pin DDR4 SDRAM DDR4 3600 (PC4 28800) Desktop Memory Model F4-3600C16Q-64GVKC and it was 10% off on Monday. The 32 GB kit of the same RAM works well on my Asus x570 TUF Gaming Plus and I could actually use the 1usmus calculator. AIDA latency is 68.4ns.

Of course, your kit is a Trident Z Neo kit, so it should be on your motherboard's QVL when it comes out, and it should work for you at least at XMP settings. Good luck.

Using high densities will obviously limit what latencies you can run, and your overclocks in general. You should know this?
 

kamikazi

Gawd
Joined
Jan 19, 2006
Messages
659
Using high densities will obviously limit what latencies you can run, and your overclocks in general. You should know this?


I'm aware. I'm currently running 32GB on the Trident Z 4000 B-Die kit, not 64. I guess I didn't actually state that in my post, sorry for the confusion. I'm less than impressed with the results compared to the Ripjaws Hynix 32GB kit at half the cost. The AIDA latency difference is 1.2ns. I realize that my new 64 GB Ripjaws Hynix kit will probably not run at the same timings and voltage as my current 32 GB kit, but I'll bet it will be close enough. If it sucks too bad, I'll pull half of it out and sell it. It will still be a matched set.

Also, I forgot to mention that the Hynix 32 GB kit and the 3900X seem to be happy with SOC voltage set to 1.05v in BIOS. That actually saves a little bit of power budget. I'm hoping to be able to do the same with my 3950X system.
 

KarVi71

n00b
Joined
Dec 14, 2016
Messages
31
I heard Zen 3 will have between 10-20% IPC gains in single-threaded tasks, and as much as 10% when all cores are loaded.
Supposedly this is in large part due to the fact that Zen 3 has 8 cores per CCX now, where Zen 2 maxed out at 4.

So what I don't fully understand is how the majority of the IPC increase comes from having 8 cores per CCX vs 4 on Zen 2. Total L3 cache has apparently remained the same, so are the improvements in IPC (mostly) due to the fact that the L3 isn't split between 2x4-core CCXs anymore, but unified now?

It's hard to say where exactly the improvements will come from.

Zen 3 is supposed to be even more of a ground-up rebuild of the architecture than Zen 2 was.

Zen 2 was their first chiplet design. There must be some low-hanging fruit they have discovered that can improve performance / IPC.

So on one side they have probably optimized things in the execution pipeline of the individual cores.
But they have also looked at the whole design and removed / reduced bottlenecks.

Personally I think they have optimized the cores, optimized and unified the caches, optimized the CCD and IOD communication (faster or lower latency Infinity Fabric, perhaps even direct communication between CCDs instead of going through the IOD), and improved the IO die itself for faster memory and lower latencies.

The L3 alone could be a significant upgrade, as all cores in a CCD now see one big cache pool. That means data won't have to go through the internal CCD IF when it travels between CCXs, and there will be fewer cache misses. A downside seems to be a slightly higher latency to the L3 cache. Perhaps this can be mitigated somewhat by higher clock speeds?

These are just my best guesses of what they could have done.
 

Ready4Dis

2[H]4U
Joined
Nov 4, 2015
Messages
2,366
I heard Zen 3 will have between 10-20% IPC gains in single-threaded tasks, and as much as 10% when all cores are loaded.
Supposedly this is in large part due to the fact that Zen 3 has 8 cores per CCX now, where Zen 2 maxed out at 4.

So what I don't fully understand is how the majority of the IPC increase comes from having 8 cores per CCX vs 4 on Zen 2. Total L3 cache has apparently remained the same, so are the improvements in IPC (mostly) due to the fact that the L3 isn't split between 2x4-core CCXs anymore, but unified now?

A few things, really. A single CCX means less contention for the I/O. The L3 is no longer split and all cores have access to all of it, so this should help a bit. The biggest thing is reduced latency to the I/O subsystem (due to less contention on the system bus / Infinity Fabric). If you compare the 3100 (2 CCXs with 2 cores each) to the 3300X (1 CCX with 4 cores), you can see that even at the exact same frequency with the exact same memory, the 3300X holds a noticeable lead.

The L3 can help in a few ways. First, if a thread with data in cache is moved to another core, without the shared L3 it may or may not lose its cached data on the move. With a shared cache, it can move between cores and won't have to rebuild its working set (regardless of which core it moves to). Second, if more than one core is working on similar data (i.e., both need access to the same lines), they have access to each other's cached data. Third, if you have a single-threaded app, the one core doing the work now has double the cache it had before.
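That "double the cache" point can be made concrete with a toy model. The sizes are the published 32 MB-per-CCD figures for Zen 2 and Zen 3 desktop chiplets; the model itself is just an illustration:

```python
# Toy model of L3 visible to a single core. On Zen 2, a core only hits
# its own CCX's slice of L3 directly; Zen 3 unifies the CCD's L3.
zen2_ccd = {"ccx_count": 2, "l3_per_ccx_mb": 16}
zen3_ccd = {"ccx_count": 1, "l3_per_ccx_mb": 32}

def l3_total(ccd):
    return ccd["ccx_count"] * ccd["l3_per_ccx_mb"]

def l3_per_core(ccd):
    # A core can only allocate into its own CCX's L3.
    return ccd["l3_per_ccx_mb"]

for name, ccd in [("Zen 2", zen2_ccd), ("Zen 3", zen3_ccd)]:
    print(f"{name}: {l3_total(ccd)} MB total, {l3_per_core(ccd)} MB visible to one core")
```

Total L3 per CCD stays at 32 MB either way; what changes is that a single thread goes from seeing 16 MB to seeing all 32 MB, with no cross-CCX hop for data shared inside the CCD.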

Also, with a new architecture typically come some efficiency and performance improvements. So, hopefully with all of this going, it will finally catch up to Intel (or at least close the gap further) in single thread and pull further ahead in multi-thread. Of course, benchmarks will tell the true story, so until then we're all just guessing and speculating, but we will see improvements; I'm just not sure how much really. Could be almost none, could be 20%. My guess is we'll see around 10%-15% in most workloads, with some scaling much higher and a few barely scaling. Again, this is just my pure speculation so, salt shaker ;).
 

Ready4Dis

2[H]4U
Joined
Nov 4, 2015
Messages
2,366
It will. I don't think there will be enough CPUs. We shouldn't forget that we still have a pandemic going on, and it seems that manufacturing capacity is not at 100%. We all like to forget.
I second this... have you seen the 3300x? Yeah, me neither, lol. I think manufacturing is back to full capacity for the most part, and Taiwan was not hit that hard by covid. The rest of the supply chain is crap though. Also, AMD has a lot going on right now, so I think supply may be a bit thin in general (RDNA2, Zen 3, Xbox, PS5). Hopefully it's not as bad as the Nvidia launch, but I wouldn't expect too much normalized stock (from AMD or Nvidia) until Q1 2021 (normalized as in easy to find at or around MSRP). Hopefully the more cut-down versions come into stock a bit quicker, as the binned parts are harder to come by.
 

BrotherMichigan

Limp Gawd
Joined
Apr 8, 2016
Messages
286
I second this... have you seen the 3300x? Yeah, me neither, lol. I think manufacturing is back to full capacity for the most part, and Taiwan was not hit that hard by covid. The rest of the supply chain is crap though. Also, AMD has a lot going on right now, so I think supply may be a bit thin in general (RDNA2, Zen 3, Xbox, PS5). Hopefully it's not as bad as the Nvidia launch, but I wouldn't expect too much normalized stock (from AMD or Nvidia) until Q1 2021 (normalized as in easy to find at or around MSRP). Hopefully the more cut-down versions come into stock a bit quicker, as the binned parts are harder to come by.

You don't see a 3300X because the yields are good enough that they'd rather sell a perfectly good 3600 for more money at the same cost to them.
 

Ready4Dis

2[H]4U
Joined
Nov 4, 2015
Messages
2,366
You don't see a 3300X because the yields are good enough that they'd rather sell a perfectly good 3600 for more money at the same cost to them.
If the yields were that good, it'd be the 3100 that you couldn't buy, because they'd be using all the perfect dies in higher end parts ;). They would much prefer to put the perfect dies into higher end parts as long as they're selling at higher profits. I understand how it works (the 3600 is a bad example because that's only 6 cores, not 8). More likely the 3300X dies were being pulled for the 3800XT or even Threadrippers and such. Hopefully they have some stock with this coming release, but I doubt it'll be great. I'm patient though; I just bought two 3700Xs recently to hold me over, so if Zen 3 takes until Q2 next year to be available, it's not as if my computer will stop working. I was actually going to purchase a 3300X for my son but ended up buying the 3700X instead due to limited stock of the 3300X, 3600 and 3600X at the time. I liked it so much I bought a second for my desktop. If Zen 3 is a nice bump in performance that I think would be useful, I've got no issues upgrading (there's always a PC that can use newer/spare parts with 4 kids and my wife at the house!)
 

learners permit

Limp Gawd
Joined
Jun 15, 2005
Messages
330
Yeah, I'm trying to get another Ryzen build together for a friend and need the CPU ASAP so he can have my 3800 for his new PC. He saw the performance of my system and decided to go AMD this time and spend the rest on a new vid card. Ready4, admit it bro, you're a PC junkie man. :)
My kids are grown and moved out, so I don't have to juggle PC parts and constantly troubleshoot their PC issues anymore, TG. So many hours doing that over the years turned my hair grey.
 

c3k

2[H]4U
Joined
Sep 8, 2007
Messages
2,215
I've got two Ryzen builds (check sig, the 3700x's) specifically made to be ready for the Zen 3 (5xxx series is what they're naming them?) to drop in. One of the 3700x's will then go into my 2700x build. I'll stagger the Zen3 drop in replacements. This is why I like AMD. Well, that, and the price/performance ratio.

That 2700x? Sigh. It'll join the rest of the parts on the Island of Misfit Parts.
 

Ready4Dis

2[H]4U
Joined
Nov 4, 2015
Messages
2,366
Yeah, I'm trying to get another Ryzen build together for a friend and need the CPU ASAP so he can have my 3800 for his new PC. He saw the performance of my system and decided to go AMD this time and spend the rest on a new vid card. Ready4, admit it bro, you're a PC junkie man. :)
My kids are grown and moved out, so I don't have to juggle PC parts and constantly troubleshoot their PC issues anymore, TG. So many hours doing that over the years turned my hair grey.
Junkie? I'm pretty sure most everyone has 6 desktops + 4 laptops and a home server in their house... ;). If only I could get some decent internet it'd be so much easier, lol. My son starts college shortly, he's my oldest. My nephew was adopted and has moved out now (he was the handful), my daughters are 6 and 14, and right now a neighbors kid is staying at the house due to some issues he was having at home. Playing with my computers is the only thing that keeps me sane, haha.
 

Ready4Dis

2[H]4U
Joined
Nov 4, 2015
Messages
2,366
I've got two Ryzen builds (check sig, the 3700x's) specifically made to be ready for the Zen 3 (5xxx series is what they're naming them?) to drop in. One of the 3700x's will then go into my 2700x build. I'll stagger the Zen3 drop in replacements. This is why I like AMD. Well, that, and the price/performance ratio.

That 2700x? Sigh. It'll join the rest of the parts on the Island of Misfit Parts.
I have a 1600 I have no clue what to do with, that's what my 3700x replaced. Debating to just buy a cheap B350/450 board or something and finally retire my old a$$ dell server with something a bit more recent.
 

BrotherMichigan

Limp Gawd
Joined
Apr 8, 2016
Messages
286
If the yields were that good, it'd be the 3100 that you couldn't buy, because they'd be using all the perfect dies in higher end parts ;). They would much prefer to put the perfect dies into higher end parts as long as they're selling at higher profits. I understand how it works (the 3600 is a bad example because that's only 6 cores, not 8). More likely the 3300X dies were being pulled for the 3800XT or even Threadrippers and such. Hopefully they have some stock with this coming release, but I doubt it'll be great. I'm patient though; I just bought two 3700Xs recently to hold me over, so if Zen 3 takes until Q2 next year to be available, it's not as if my computer will stop working. I was actually going to purchase a 3300X for my son but ended up buying the 3700X instead due to limited stock of the 3300X, 3600 and 3600X at the time. I liked it so much I bought a second for my desktop. If Zen 3 is a nice bump in performance that I think would be useful, I've got no issues upgrading (there's always a PC that can use newer/spare parts with 4 kids and my wife at the house!)

You're assuming "not bad enough for a 3300X" means "good enough for a full-CCD part", and no matter HOW good the yields, that's almost certainly not true a good part of the time. A 3100 is actually less likely to be the result of manufacturing defects when yields are good because it requires defects in two CCXs instead of just one. Either way, we can be sure that AMD is throwing as many good chiplets into its higher-end parts as it can because it's currently selling everything it can make.
 

Ready4Dis

2[H]4U
Joined
Nov 4, 2015
Messages
2,366
You're assuming "not bad enough for a 3300X" means "good enough for a full-CCD part", and no matter HOW good the yields, that's almost certainly not true a good part of the time. A 3100 is actually less likely to be the result of manufacturing defects when yields are good because it requires defects in two CCXs instead of just one. Either way, we can be sure that AMD is throwing as many good chiplets into its higher-end parts as it can because it's currently selling everything it can make.
The 3300X uses a full 4-core CCX, meaning it can't have any MAJOR defects. That said, you are correct that it can still be slightly less power efficient or frequency limited, so it is possible to end up with something in between, even if unlikely. If it has a full 4-core CCX, they would check whether it works well enough for a 3950X, 3800XT, then 3700X before even thinking about putting it into a 3300X (assuming it's not needed for any server or Threadripper parts). I wasn't assuming anything, it's just the way their chips are. A 3100 doesn't necessarily mean they aren't using a good 3/4 die and disabling one core (or mixing a 1-core with a 3-core), but a 3300X means it's a full undamaged die, which is better off selling for more money in a higher-end, in-demand part.
Either way, the point was simply that they will build higher end parts if there is demand before they stuff dies into lower end parts that don't have as good margins. This happened when they were selling their server, workstation and desktop/consumer chips, and it'll continue to happen (not just them, everyone; Intel did the same thing, prioritizing their server chips when their supply was super low). Anyways, my point was there are a lot of products vying for the better binned parts, while the 3100 is like the catch-all :).
 

kamikazi

Gawd
Joined
Jan 19, 2006
Messages
659
The 3300X uses a full 4-core CCX, meaning it can't have any MAJOR defects. That said, you are correct that it can still be slightly less power efficient or frequency limited, so it is possible to end up with something in between, even if unlikely. If it has a full 4-core CCX, they would check whether it works well enough for a 3950X, 3800XT, then 3700X before even thinking about putting it into a 3300X (assuming it's not needed for any server or Threadripper parts). I wasn't assuming anything, it's just the way their chips are. A 3100 doesn't necessarily mean they aren't using a good 3/4 die and disabling one core (or mixing a 1-core with a 3-core), but a 3300X means it's a full undamaged die, which is better off selling for more money in a higher-end, in-demand part.
Either way, the point was simply that they will build higher end parts if there is demand before they stuff dies into lower end parts that don't have as good margins. This happened when they were selling their server, workstation and desktop/consumer chips, and it'll continue to happen (not just them, everyone; Intel did the same thing, prioritizing their server chips when their supply was super low). Anyways, my point was there are a lot of products vying for the better binned parts, while the 3100 is like the catch-all :).

How could a 3950X use a 4-core CCX? It has two 8-core chiplets.
 

BrotherMichigan

Limp Gawd
Joined
Apr 8, 2016
Messages
286
The 3300X uses a full 4-core CCX, meaning it can't have any MAJOR defects. That said, you are correct that it can still be slightly less power efficient or frequency limited, so it is possible to end up with something in between, even if unlikely. If it has a full 4-core CCX, they would check whether it works well enough for a 3950X, 3800XT, then 3700X before even thinking about putting it into a 3300X (assuming it's not needed for any server or Threadripper parts). I wasn't assuming anything, it's just the way their chips are. A 3100 doesn't necessarily mean they aren't using a good 3/4 die and disabling one core (or mixing a 1-core with a 3-core), but a 3300X means it's a full undamaged die, which is better off selling for more money in a higher-end, in-demand part.
How could a 3950X use a 4-core CCX? It has two 8-core chiplets.

Kamikazi has the right of it. The 3300X is a single, intact CCX, not a CCD, which is two 4-core CCXs. A defect in a single CCX rules the CCD out for use in the 3700X, 3800X/T, and the 3950X, but it could still be perfectly serviceable as a 3600, 3600X, or a 3900X/T by disabling a single core in the other CCX.

Either way, the point was simply that they will build higher end parts if there is demand before they stuff dies into lower end parts that don't have as good margins. This happened when they were selling their server, workstation and desktop/consumer chips, and it'll continue to happen (not just them, everyone; Intel did the same thing, prioritizing their server chips when their supply was super low). Anyways, my point was there are a lot of products vying for the better binned parts, while the 3100 is like the catch-all :).

This part I fully agree with and is what I meant above in my first response.
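The salvage logic being debated can be sketched as a toy function, assuming the Zen 2 layout of two 4-core CCXs per CCD. The SKU names are real, but the selection rules here are my simplification; real binning also weighs frequency, leakage, and demand from Epyc/Threadripper:

```python
# Toy Zen 2 CCD binning by number of good cores in each CCX.
# A CCD is two 4-core CCXs; symmetric core counts are required per SKU.
def ccd_skus(ccx0_good: int, ccx1_good: int) -> list:
    good = sorted((ccx0_good, ccx1_good), reverse=True)
    if good == [4, 4]:
        return ["3700X", "3800X", "half of a 3950X"]  # fully intact CCD
    if good[0] >= 3 and good[1] >= 3:
        return ["3600", "3600X", "half of a 3900X"]   # run as 3+3
    if good[0] == 4:
        return ["3300X"]   # one intact CCX, the other too damaged
    if good[0] >= 2 and good[1] >= 2:
        return ["3100"]    # run as 2+2 across both CCXs
    return ["salvage"]

print(ccd_skus(4, 4))  # intact die -> 8-core parts
print(ccd_skus(4, 3))  # one defect  -> 6-core parts, per the post above
print(ccd_skus(4, 0))  # one intact CCX -> 3300X
print(ccd_skus(2, 2))  # -> 3100
```

This encodes the correction: a single defective core rules out the 8-core SKUs but still leaves the 6-core bins, and only a die with one intact CCX and a badly damaged one lands in the 3300X pile.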
 

Ready4Dis

2[H]4U
Joined
Nov 4, 2015
Messages
2,366
Kamikazi has the right of it. The 3300X is a single, intact CCX, not a CCD, which is two 4-core CCXs. A defect in a single CCX rules the CCD out for use in the 3700X, 3800X/T, and the 3950X, but it could still be perfectly serviceable as a 3600, 3600X, or a 3900X/T by disabling a single core in the other CCX.
This part I fully agree with and is what I meant above in my first response.
Sorry, you're right, I was getting myself confused; it didn't sound right when I was typing it either, lol. Anyways, as we agree, they are going to use the parts for the higher margins first if there is demand.
I think my brain is already on Zen 3 with a single CCX being the full die :p. My mistake, I'll move along now, nothing to see here.
 

Azrak

Gawd
Joined
Oct 4, 2015
Messages
968
If people were already worried about 105W TDP CPUs in their B450s and X470s, I don't think 150W TDP is going to alleviate their concerns.
In order for AMD to guarantee Zen3 CPUs will work in older MBs (older than X570 and B550) like they promised after the huge uproar, then they have to adhere to the maximum power ratings of the original designs of the MB VRMs, lest VRMs go up in smoke.
AMD could tune AGESA to clock Zen3 lower on older chipsets or maybe allow MB manufacturers tune the maximums per board via BIOS tweaks. Better VRM boards would get higher power limits. Wouldn't that piss people off! heh "I bought a 5900X expecting 5GHz on my B450, but it's only boosting to 4.7GHz!" Yeah, and your VRM is hot enough to cook an egg.
Or maybe they can ship a VRM cooler in the box instead of a CPU cooler.
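Some back-of-the-envelope VRM math makes the worry concrete, assuming AMD's usual stock PPT of 1.35 × TDP and a roughly 1.3 V core rail; the 4-phase count is a hypothetical budget-board example, not any specific product:

```python
# Rough VRM-load arithmetic. PPT = 1.35 * TDP is AMD's usual stock ratio;
# Vcore and phase count here are illustrative assumptions.
def vrm_load(tdp_w: float, vcore: float = 1.3, phases: int = 4):
    ppt_w = tdp_w * 1.35        # package power tracking limit
    amps = ppt_w / vcore        # total output current the VRM must deliver
    return amps, amps / phases  # (total A, per-phase A)

for tdp in (105, 150):
    total, per_phase = vrm_load(tdp)
    print(f"{tdp} W TDP -> ~{total:.0f} A total, ~{per_phase:.0f} A per phase on 4 phases")
```

A jump from 105 W to a rumored 150 W TDP takes this hypothetical 4-phase board from roughly 27 A to 39 A per phase, which is why the cheaper B450 VRMs are the concern rather than the socket itself.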
 

KazeoHin

[H]F Junkie
Joined
Sep 7, 2011
Messages
8,197
If people were already worried about 105W TDP CPUs in their B450s and X470s, I don't think 150W TDP is going to alleviate their concerns.
In order for AMD to guarantee Zen3 CPUs will work in older MBs (older than X570 and B550) like they promised after the huge uproar, then they have to adhere to the maximum power ratings of the original designs of the MB VRMs, lest VRMs go up in smoke.
AMD could tune AGESA to clock Zen3 lower on older chipsets or maybe allow MB manufacturers tune the maximums per board via BIOS tweaks. Better VRM boards would get higher power limits. Wouldn't that piss people off! heh "I bought a 5900X expecting 5GHz on my B450, but it's only boosting to 4.7GHz!" Yeah, and your VRM is hot enough to cook an egg.
Or maybe they can ship a VRM cooler in the box instead of a CPU cooler.

They can call that extra 300MHz "Precision Boost 3.0" and say it's only a feature on 600 series boards.
 

Ready4Dis

2[H]4U
Joined
Nov 4, 2015
Messages
2,366
If people were already worried about 105W TDP CPUs in their B450s and X470s, I don't think 150W TDP is going to alleviate their concerns.
In order for AMD to guarantee Zen3 CPUs will work in older MBs (older than X570 and B550) like they promised after the huge uproar, then they have to adhere to the maximum power ratings of the original designs of the MB VRMs, lest VRMs go up in smoke.
AMD could tune AGESA to clock Zen3 lower on older chipsets or maybe allow MB manufacturers tune the maximums per board via BIOS tweaks. Better VRM boards would get higher power limits. Wouldn't that piss people off! heh "I bought a 5900X expecting 5GHz on my B450, but it's only boosting to 4.7GHz!" Yeah, and your VRM is hot enough to cook an egg.
Or maybe they can ship a VRM cooler in the box instead of a CPU cooler.
Why? They aren't making any guarantees about X470 or B450 boards, just that they will allow the manufacturers to support them if they want to. Also, not all motherboards can support all TDPs; heck, they even struggled with the 3950X. If you bought a cheap board for a cheap CPU, why would you expect it to support the highest end CPU two cycles later? Would you drop a 10900K into a B460 board? The Gigabyte B460 gets to 103°C with a 10900K installed :). I wouldn't consider that running properly. The MSI Mortar *only* gets to 85°C if you have decent cooling in your case. Seriously, who are these people who buy a $100 board and think it should run 3-4 generations of chips and support a crap ton of thermal power? I have a cheap B450; it runs my 3700X fine. If it can run a 5700X, awesome; if not, I for sure got my money out of it by now. Had I bought an expensive X470 with awesome VRMs to support a 3950X, it would no doubt be up to the task. I didn't though; I bought a cheap B450 that was in the yellow for the 3950X (i.e., it's being pushed very close to the edge and only recommended with good VRM cooling). I'm under no delusion that it will handle a 125W CPU, but if the 5700X stays at 65W or even bumps up to 85W, I'm still fine and could upgrade if I feel the need.
 

Ready4Dis

2[H]4U
Joined
Nov 4, 2015
Messages
2,366
AMD Ryzen 7 5800X Vermeer 8 Core & 16 Thread Zen 3 CPU Benchmark Leaks Out – Faster Than Intel’s Fastest Core i9-10900K 10 Core Gaming Chip

get hyped!...

https://wccftech.com/amd-ryzen-7-5800x-vermeer-8-core-16-thread-zen-3-cpu-benchmark-leaks-out/
I'll (impatiently) wait on real benchmarks, but we all knew there was supposed to be a decent uptick in single thread/gaming workloads, so hopefully this is indicative of what we can expect. That said, if this holds true, it's going to really hurt to be Intel if they don't get a new chip out soon.

Edit: In case you guys don't read the article, here is a quote about the leaked benchmark:
"What's more important to look at are the CPU framerates and here you can see the Ryzen 7 5800X completely crushing the Intel Core i9-10900K in terms of max framerate. In the Normal Batch run, the AMD Ryzen 7 5800X delivers up to 22% higher framerate than the Intel Core i9-10900K and 16% average in all batch results."

This would be a sad day for Intel if the 5800X can beat the 10900K in gaming while only hitting 4.6-4.7GHz ;). Like I said though, who knows, this could have been on LN2 with 5000MHz DDR4... but if it's indeed a stock 5800X, I have to think Intel needs to get their next Lake chips released and hope they don't have to push 300W to keep up.
 
Joined
Apr 1, 2008
Messages
2,652
I just hope the 5xxx APU series is still compatible with my B450 Asrock board, I couldn't care less if they aren't compatible with the higher tier CPU's.

But yeah, Zen 3 sounds promising and exciting, though the high TDP on the higher tier CPUs is a bit unexpected for a mainstream CPU. Then again, as long as it delivers performance, nobody really cares.
 

-Strelok-

[H]F Junkie
Joined
Dec 2, 2010
Messages
10,353
Didn’t someone run a 3950X in an A320 board and it performed very close to maximum performance? People freaking out over VRMs is the biggest joke of AM4.
 

sirmonkey1985

[H]ard|DCer of the Month - July 2010
Joined
Sep 13, 2008
Messages
22,226
If people were already worried about 105W TDP CPUs in their B450s and X470s, I don't think 150W TDP is going to alleviate their concerns.
In order for AMD to guarantee Zen3 CPUs will work in older MBs (older than X570 and B550) like they promised after the huge uproar, then they have to adhere to the maximum power ratings of the original designs of the MB VRMs, lest VRMs go up in smoke.
AMD could tune AGESA to clock Zen3 lower on older chipsets or maybe allow MB manufacturers tune the maximums per board via BIOS tweaks. Better VRM boards would get higher power limits. Wouldn't that piss people off! heh "I bought a 5900X expecting 5GHz on my B450, but it's only boosting to 4.7GHz!" Yeah, and your VRM is hot enough to cook an egg.
Or maybe they can ship a VRM cooler in the box instead of a CPU cooler.

they don't have to adhere to anything. the board partners have to make sure they adhere to it by putting limitations in their bios to keep certain cpu's from running on their boards if the VRM's can't handle a certain cpu. this is how AMD has always done it.

at the end of the day though the only one that is at fault is the consumer and i don't feel bad for any of them if they thought they were going to take a 500+ dollar cpu and put it on a 100 dollar board from 3 years ago.
 

amd7674

[H]ard|Gawd
Joined
Jan 1, 2007
Messages
1,629
Looks very promising. As long as it trades punches with the 10700K, I'm going to buy it :) The question is when these 5800Xs will be out. Is October 8 just an announcement? I'm sure there would already be leaks of CPUs being manufactured, right? I have all my parts ready, except the CPU...
 

Bman123

Supreme [H]ardness
Joined
Dec 19, 2007
Messages
4,872
I don't think the bot thing will be anywhere near as crazy with the CPUs as it was with the video cards. People with Ryzen 3000 CPUs won't need to upgrade; hell, if you're at 1440p or higher with a Ryzen 2000 you really don't need to upgrade either.
 

jamesv

Weaksauce
Joined
Mar 12, 2016
Messages
106
I have 2 x 4790k’s, i7 5775C, i7 8086k and a 3700X. All built for audio.

Cache is king on the new Tiger Lake 8 Cores, and Cezanne.
I’ll take one of each as long as 4GHz is the base.

Sure glad Matisse put Intel on their heels.
The 8086K was only a small step better than my 4790Ks.
Matisse too much latency for real time workloads, but a great recording rig.

Tiger Lake and Cezanne look to be exciting next summer.
 

/dev/null

[H]F Junkie
Joined
Mar 31, 2001
Messages
14,976
I don't think the bot thing will be anywhere near as crazy with the CPUs as it was with the video cards. People with Ryzen 3000 CPUs won't need to upgrade; hell, if you're at 1440p or higher with a Ryzen 2000 you really don't need to upgrade either.
Just depends what games you play & videocard.

I'm CPU limited in Far Cry 5/ND on my Ryzen 5 3600 with 3600 CAS16 RAM & PBO on. I know because when I'm driving, my GPU usage falls as low as 65-75%...
 

FlawleZ

[H]ard|Gawd
Joined
Oct 20, 2010
Messages
1,105
Just depends what games you play & videocard.

I'm CPU limited in Far Cry 5/ND on my Ryzen 5 3600 with 3600 CAS16 RAM & PBO on. I know because when I'm driving, my GPU usage falls as low as 65-75%...
That doesn't always mean you're CPU limited.
 