Ryzen 4 Reviews Are Out.

Ended up ordering this with the slowest shipping, lmao. I can wait till next Thursday. Newegg was happy with my wife's email and account; I guess she didn't get banned, lmao. Looks cleaner than the Steel Legend, and no stickers.


Check this out on Newegg: ASUS ROG STRIX X670E-A GAMING WIFI 6E Socket AM5 (LGA 1718) Ryzen 7000 gaming motherboard (16 + 2 power stages, PCIe 5.0, DDR5 support, four M.2 slots with heatsinks, USB 3.2 Gen 2x2, WiFi 6E, AI Cooling II, and Aura Sync)
https://www.newegg.com/asus-rog-str...m_mmc=snc-social-_-sr-_-13-119-585-_-09302022
 
Some ASRock boards may have issues with stickers on the RAM slots. Too much adhesive, and the stickers are not coming off cleanly:

https://www.techpowerup.com/299450/asrocks-x670-motherboards-have-numerous-issues-with-dram-stickers

"This one is likely to go down ASRock's internal history as a failure of sticking proportions. Namely, it seems that some ASRock motherboards in the newly-released AM5 X670 / X670E family carry stickers overlaid on the DDR5 slots. The idea was to provide users with a handy, visually informative guide on DDR5 memory stick installations and a warning on abnormally long boot times that were to be expected, according to RAM stick capacity. But it seems that these low-quality stickers are being torn apart as users attempt to remove them, leaving behind remnants that are extremely difficult to clean up and which can block DRAM installation entirely or partially."
 
MicroCenter has free DDR5 now if you get a 7xxx-series CPU. I'm guessing sales have been lower than expected. My store (MN) has shown 25+ 7950Xs on the site since launch day.


Well I almost managed to not upgrade my 5900X ... just grabbed a 7900X along with this deal. Cambridge MA has had 25+ of everything as well since launch. The checkout person told me they opened an hour early for the launch and sold 3 that day.
 
Well I almost managed to not upgrade my 5900X ... just grabbed a 7900X along with this deal. Cambridge MA has had 25+ of everything as well since launch. The checkout person told me they opened an hour early for the launch and sold 3 that day.
Between this and what's happening with Nvidia, the market is going to have to capitulate at some point.
 
Between this and what's happening with Nvidia, the market is going to have to capitulate at some point.
As it should; these motherboard prices are just cash grabs. $1,300 for a motherboard is simply ludicrous. Total build cost of these boards is likely less than $350, so I'll just watch 'em flounder on sales until reality sets in, even if it takes years.
 
As it should; these motherboard prices are just cash grabs. $1,300 for a motherboard is simply ludicrous. Total build cost of these boards is likely less than $350, so I'll just watch 'em flounder on sales until reality sets in, even if it takes years.

It will be rare. Those boards are going to be loss eaters. Soon. These guys need to realize that insane pricing can only last so long.
 
Ended up ordering this with the slowest shipping, lmao. I can wait till next Thursday. Newegg was happy with my wife's email and account; I guess she didn't get banned, lmao. Looks cleaner than the Steel Legend, and no stickers.


Check this out on Newegg: ASUS ROG STRIX X670E-A GAMING WIFI 6E Socket AM5 (LGA 1718) Ryzen 7000 gaming motherboard (16 + 2 power stages, PCIe 5.0, DDR5 support, four M.2 slots with heatsinks, USB 3.2 Gen 2x2, WiFi 6E, AI Cooling II, and Aura Sync)
https://www.newegg.com/asus-rog-str...m_mmc=snc-social-_-sr-_-13-119-585-_-09302022
Nice board. Just reserved the only one left at my local MicroCenter. Lots of bang there for $420!
 
First day working with my 7950X went well. SolidWorks is single-threaded most of the time due to how parametric modeling works, though it does have bursts that slam a bunch of cores. I can tell a difference though: far fewer hitches and pauses between commands, and the frame rate in the viewport is a lot higher on more demanding models.
I spun up Visualize to test out rendering. I had CPU assist disabled because it pounded my 9900K's ass so hard that it would heat-soak my AIO if I spent a long time setting up a scene with ray tracing on. With the cores locked at 5 GHz, the water temp would creep up into the low 40s, and after that it would reboot.
Turned it on with the 7950X, and spinning the model is super smooth. Stop moving it and it denoises in a couple of seconds. Pretty amazing.
 
As it should; these motherboard prices are just cash grabs. $1,300 for a motherboard is simply ludicrous. Total build cost of these boards is likely less than $350, so I'll just watch 'em flounder on sales until reality sets in, even if it takes years.
They could be such low volume that once you count R&D and every other cost per board, they still end up money losers even with a very large gross margin per unit; they're halo publicity products subsidized by the Intels of the world.

For some of those models, do they even sell 10,000, or even 5,000, units worldwide? It's almost some form of artisanal work.
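
For what it's worth, here's a rough back-of-the-envelope sketch of that point; all the numbers are invented, it just shows how fixed R&D/validation costs spread over a tiny production run can swallow even a huge per-unit margin:

```python
# Back-of-the-envelope sketch (all numbers hypothetical): fixed costs spread
# over a small halo-board production run can erase a large per-unit margin.

def per_board_profit(msrp, bom_cost, fixed_costs, units_sold):
    """Profit per board after amortizing fixed costs (R&D, validation, tooling)."""
    gross_margin = msrp - bom_cost              # margin before fixed costs
    fixed_per_board = fixed_costs / units_sold  # fixed costs spread over the run
    return gross_margin - fixed_per_board

# Hypothetical $1,300 halo board, $350 build cost, $3M in fixed costs:
for units in (2_000, 5_000, 10_000):
    profit = per_board_profit(1300, 350, 3_000_000, units)
    print(f"{units:>6} units sold -> {profit:+,.0f} per board")
# 2,000 units  -> -550 per board (a loss despite the $950 gross margin)
# 5,000 units  -> +350 per board
# 10,000 units -> +650 per board
```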
 
Honestly, I'll probably buy an AM5 system, but if the recent past has taught me anything, it's to wait until the 2nd gen. X470/B450 were (are) still great systems that handled every CPU the platform had throughout its lifespan with the least amount of complications.

I'll probably upgrade my DDR4-based Z690 to a Raptor Lake in October and then build new again with 2nd-gen AM5.
 
CPUs in high-res gaming don't really make much of a difference; it's mostly the video card doing the work then.

I recall reading a review a few years ago comparing a brand-new highest-end $999 CPU vs. a two-year-old budget CPU at $199, and the benchmarks showed maybe a 5% to 10% difference at most. The higher the resolution got, the smaller the performance difference was; at 4K it was like 2% to 3%, and 95% was the GPU.
 
Honestly, I'll probably buy an AM5 system, but if the recent past has taught me anything, it's to wait until the 2nd gen. X470/B450 were (are) still great systems that handled every CPU the platform had throughout its lifespan with the least amount of complications.

I'll probably upgrade my DDR4-based Z690 to a Raptor Lake in October and then build new again with 2nd-gen AM5.
After seeing what 3D V-Cache can do, I would honestly wait for those reviews. Imagine if the rumors are true and it really gets 20-30% more performance (in gaming at least). That's a way better reason to invest in AM5, since Raptor Lake is basically EOL; there will be no more CPU upgrades for Z690/Z790.
 
CPUs in high-res gaming don't really make much of a difference; it's mostly the video card doing the work then.

I recall reading a review a few years ago comparing a brand-new highest-end $999 CPU vs. a two-year-old budget CPU at $199, and the benchmarks showed maybe a 5% to 10% difference at most. The higher the resolution got, the smaller the performance difference was; at 4K it was like 2% to 3%, and 95% was the GPU.
This, for almost all of them.
After seeing what 3D V-Cache can do, I would honestly wait for those reviews. Imagine if the rumors are true and it really gets 20-30% more performance (in gaming at least). That's a way better reason to invest in AM5, since Raptor Lake is basically EOL; there will be no more CPU upgrades for Z690/Z790.
This seems to be the exception - there are places the extra cache can make a difference, if your main goal is gaming only (which I get - I've got one dedicated gaming/entertainment system). I'll be watching the V-cache release to see if it justifies a replacement of the Z490 box(es) or not. But it won't work for my other use cases, so that does make it a one-off (not enough cores, unless they do a 7950X3D).
 
This is pretty impressive - feels more or less like parity with my TR3 3960x at a fraction of the cost and a fraction of the power consumption. Hmm.
 
Please post your data. Because I've been searching, and at 1440p and 1440p ultrawide, which is my resolution, it's a wash between several processors due to GPU limitation. Sure you're not making
No one is claiming any improvements at 4K. I didn't. No one is claiming it for 1440p either. All the reviewers are saying is that the CPUs can drive the 3090 Ti harder. At 1080p the CPUs are getting the 3090 Ti to produce more frames. That is an indisputable fact. Hardware Unboxed showed more frames at 1080p in 7 out of 12 games. They haven't posted their 1440p graphs to Patreon yet. Anandtech showed 50% of the games they tested produced more frames at 1080p, and they didn't test at 1440p.
 
Honestly, I'll probably buy an AM5 system, but if the recent past has taught me anything, it's to wait until the 2nd gen. X470/B450 were (are) still great systems that handled every CPU the platform had throughout its lifespan with the least amount of complications.

I'll probably upgrade my DDR4-based Z690 to a Raptor Lake in October and then build new again with 2nd-gen AM5.

I honestly went totally against this myself. I was going Raptor Lake, but I'm not changing out the motherboard for the next 3-4 years; that is the advantage I saw in X670E. They seem stable and have pretty much all the bells and whistles. IDK if AMD will release another chipset for a few years given how much seems to have gone into this one. Going Raptor Lake now, on its last gen, seemed like a bad move to me and brought me back to the table to think it through. Don't want to rip and replace for a while. Probably will upgrade to X3D chips once they're released.
 
No one is claiming any improvements at 4K. I didn't. No one is claiming it for 1440p either. All the reviewers are saying is that the CPUs can drive the 3090 Ti harder. At 1080p the CPUs are getting the 3090 Ti to produce more frames. That is an indisputable fact. Hardware Unboxed showed more frames at 1080p in 7 out of 12 games. They haven't posted their 1440p graphs to Patreon yet. Anandtech showed 50% of the games they tested produced more frames at 1080p, and they didn't test at 1440p.
Why would you use a 3090 Ti on a 1080p screen?
 
Nice board. Just reserved the only one left at my local MicroCenter. Lots of bang there for $420!
Yeah, solid board indeed. Looks like Newegg is out too. A lot of people were buying it when I pulled the trigger; it said over 30 people had it in their cart this morning when I was buying. Looks like a lot of people liked it.
 
I mean I guess, but god is it ugly to me at least. I’m ready for the next gen of 4K screens to leave 1440P behind.
I’m waiting for the 4xxx series to maybe consider 4K, lol. Why would I want to rock barely 60 FPS on 4K??

There’s a reason I don’t play Xbox or PlayStation…
 
I’m waiting for the 4xxx series to maybe consider 4K, lol. Why would I want to rock barely 60 FPS on 4K??

There’s a reason I don’t play Xbox or PlayStation…
I’m easily pushing well above that on the titles I do play in 4K; and that’s on my older 2080TI. The 3090 will easily be in the 80-90s for everything I’ve tried to run. Heck, even on my G9 (3080) I’m well over 60fps. All depends on the games you’re playing.
 
I’m easily pushing well above that on the titles I do play in 4K; and that’s on my older 2080TI. The 3090 will easily be in the 80-90s for everything I’ve tried to run. Heck, even on my G9 (3080) I’m well over 60fps. All depends on the games you’re playing.
That’s crazy.

I’ll go the other way - why do you have a G9 if you don’t want to push max frames? Wouldn’t it be better to have a less expensive monitor if you don’t have the hardware to push it?
 
That's not the point.
It is. If you want to eliminate all GPU constraints by running a very low resolution, then yes, you can discern a difference between CPUs. Now show me real-world examples like [H] used to - at 1440P, 5120x1440 (Neo G9), or 4K. Let me know if there's a difference there. I'm moving up from 1440P. I want to know if there are notable minimum frame time differences at real resolutions - not that the X3D beats Intel by 15% at 1080P, because I'm not going to game at that resolution, especially not with a 3090!
 
That’s crazy.

I’ll go the other way - why do you have a G9 if you don’t want to push max frames? Wouldn’t it be better to have a less expensive monitor if you don’t have the hardware to push it?
The G9 is on a 10980XE with 128G of RAM and 10T of NVMe. It does a LOT of things - and was supposed to originally get a 6900XT (but Covid - ended up with a 3080 FTW for it instead). I can easily push enough frames with the eye-candy turned up for it to look great and play great, when I do use it for gaming - and when I find a game that works well with a 21:9 aspect ratio (few and far between, sadly).

My main gaming box is a 3090 + 10900K (all on custom water) with a 1440P G-Sync screen (one of the older, first gen high-end IPS+Gsync ones, can't remember the exact model). I REALLY want to jump this to a 32" 4k screen, but they're all a little... flawed right now, and I won't invest in "close but not quite". My 4k box is a x399 + 64G + 2080TI, because it mostly runs as a server. I play things like Tomb Raider or HorizonZD (think console-style games that you really want a controller for) on it, and the OLED screen it has is pre-VRR, so as long as it can maintain 60FPS, I'm golden. Disabling v-sync and I can easily push 80+ FPS on those titles - so I've got the headroom for 60Hz. That's all I need it to do.

(Did end up with a 6800XT - that's on the 3960X TR box - another server/workstation/gaming machine).

All of those systems (minus the 4k one - waiting for the true follow-ons to the CX series of OLED screens to justify upgrading the system) have Gsync or Freesync - as long as it's over 60FPS, it's smooth. I'm not playing competitive games, and I rarely play online (with one exception which isn't really PVP) - make it pretty, make it smooth. I'll take the eye candy over the extreme frame rates. My first 3d card was a Verite V1000 - I'm totally used to being happy when we finally hit 30FPS stable.
 
That’s crazy.

I’ll go the other way - why do you have a G9 if you don’t want to push max frames? Wouldn’t it be better to have a less expensive monitor if you don’t have the hardware to push it?
Oh, and flip it the other way again - why have a G9/etc if you're not going to crank all the visual goodies up? No card out there can do 140fps+ on those types of screens, so might as well take 80-90 and "pretty" instead.
 
It is. If you want to eliminate all GPU constraints by running a very low resolution, then yes, you can discern a difference between CPUs. Now show me real-world examples like [H] used to - at 1440P, 5120x1440 (Neo G9), or 4K. Let me know if there's a difference there. I'm moving up from 1440P. I want to know if there are notable minimum frame time differences at real resolutions - not that the X3D beats Intel by 15% at 1080P, because I'm not going to game at that resolution, especially not with a 3090!
Fortnite

https://www.techspot.com/review/2451-ryzen-5800x3D-vs-ryzen-5800x/
 
Like I said - there are definitely titles where X3D makes a difference. I thought we were talking Zen 4 vs Zen 3 (V-Cache or not), though? And I'd have called out more modern titles like Warzone from that article; it also showed a difference at all 3 resolutions.

Don’t get me wrong - if you asked me to build the best gaming system today, I’d put a 5800X3D in it without even considering anything else. Damned fine chip that. Came out too late for me and my considerations though, at the time.
 
No one is claiming any improvements at 4K. I didn't. No one is claiming it for 1440p either. All the reviewers are saying is that the CPUs can drive the 3090 Ti harder. At 1080p the CPUs are getting the 3090 Ti to produce more frames. That is an indisputable fact. Hardware Unboxed showed more frames at 1080p in 7 out of 12 games. They haven't posted their 1440p graphs to Patreon yet. Anandtech showed 50% of the games they tested produced more frames at 1080p, and they didn't test at 1440p.
OK, so you're saying that if you happen to be CPU-limited, a more powerful CPU will help...

Thanks Captain Obvious for that wonderful insight.
 
CPUs in high-res gaming don't really make much of a difference; it's mostly the video card doing the work then.

I recall reading a review a few years ago comparing a brand-new highest-end $999 CPU vs. a two-year-old budget CPU at $199, and the benchmarks showed maybe a 5% to 10% difference at most. The higher the resolution got, the smaller the performance difference was; at 4K it was like 2% to 3%, and 95% was the GPU.
It does. 4K isn't a magical resolution where you can stick in an old, low-budget CPU and call it a day for everyone. 4K just pushes the GPU harder than resolutions below 4K. Whether you get CPU-limited at 4K depends on the game and on your framerate target. Before this, 1080p and 1440p were what some considered the "magical resolutions where the CPU doesn't matter".

Sure, if you run the GPU to the max at 4K and the CPU gets breathing room, you will not feel that much difference with a more powerful CPU. However, if you start dialing down the graphical settings to gain a higher framerate, let's say you want to reach a solid 120 FPS, your cheap CPU will crap on itself and not deliver. If you watch Spider-Man Remastered shown in Nvidia's keynote for the 4090 launch, you'll notice the CPU crap on itself even when the GPU is pushed to the max (and DLSS 3 comes to the rescue).

My point is, the CPU doesn't make a difference in some games and scenarios at higher resolutions, while in other games and scenarios it makes a big difference. Some prefer 120 fps @ 4K, and not all CPUs can reach that. You might be golden with 60 fps @ 4K on ultra, or 30 fps on ultra, but you might instead want 120 FPS @ 4K and rather dial down to high or medium. Suddenly you are out of 4K-at-ultra reviewer benchmark territory and into a real-world user scenario. Then the CPU benchmarks at 1080p start to become useful.

There's nothing magical about resolution; games, settings, and your target refresh rate matter too. Not to mention when dips occur in game. You can cruise along at a high refresh rate in some games, but when a firefight shows up and explosions are everywhere and you need that high refresh rate the most, it tanks because you cheaped out too much on the CPU.
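
As a toy illustration of that point (all numbers invented), delivered FPS is roughly whichever of the CPU-bound and GPU-bound limits is lower, and resolution mainly lowers the GPU-bound one:

```python
# Toy sketch (hypothetical numbers): delivered FPS is roughly the lower of the
# CPU-bound and GPU-bound frame rates; resolution mainly lowers the GPU side.

GPU_LIMIT = {"1080p": 240, "1440p": 160, "4K": 90}   # GPU-bound FPS per resolution
CPU_LIMIT = {"budget CPU": 110, "fast CPU": 200}     # CPU-bound FPS per CPU

for cpu, cpu_fps in CPU_LIMIT.items():
    for res, gpu_fps in GPU_LIMIT.items():
        delivered = min(cpu_fps, gpu_fps)  # whichever side bottlenecks first
        print(f"{cpu:>10} @ {res:>5}: ~{delivered} fps")

# At 4K/ultra both CPUs sit at the same ~90 fps (GPU-bound), but chasing a
# 120+ fps target at lower settings exposes the budget CPU's ~110 fps ceiling.
```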

Whether the CPU makes a difference in high-res gaming all depends on you, your use case, and what you find good enough. For some it does, for others it doesn't. :)

Look at this and remember that you're not only thinking about resolution, but also about your framerate target. You might dial in settings to reach your targets; the CPU matters then in some of the games:

 
Like I said - there are definitely titles where X3D makes a difference. I thought we were talking Zen 4 vs Zen 3 (V-Cache or not), though? And I'd have called out more modern titles like Warzone from that article; it also showed a difference at all 3 resolutions.

Don’t get me wrong - if you asked me to build the best gaming system today, I’d put a 5800X3D in it without even considering anything else. Damned fine chip that. Came out too late for me and my considerations though, at the time.
Not as many results for the 7950X and Fortnite - but the gains are there - https://www.cgmagonline.com/review/hardware/amd-ryzen-9-7950x-cpu-review/. Fortnite is actually a very modern title, as the engine scales like no other (it's by Epic Games and uses Unreal Engine). Plus, it's my use case and why I have a high-Hz panel and an OLED one. :)
 
It is. If you want to eliminate all GPU constraints by running a very low resolution, then yes, you can discern a difference between CPUs. Now show me real-world examples like [H] used to - at 1440P, 5120x1440 (Neo G9), or 4K. Let me know if there's a difference there. I'm moving up from 1440P. I want to know if there are notable minimum frame time differences at real resolutions - not that the X3D beats Intel by 15% at 1080P, because I'm not going to game at that resolution, especially not with a 3090!
You aren't. But many are. Everyone's situation, needs, and desires are different.
 
Not as many results for the 7950X and Fortnite - but the gains are there - https://www.cgmagonline.com/review/hardware/amd-ryzen-9-7950x-cpu-review/. Fortnite is actually a very modern title, as the engine scales like no other (it's by Epic Games and uses Unreal Engine). Plus, it's my use case and why I have a high-Hz panel and an OLED one. :)
It’s also a multiplayer game that I have zero interest in. Hence why I’m not running a 1080p screen with a 3090.

You do you, but I've never understood that - but I'm not doing competitive or online games anymore. Don't have time for that these days. Unlike my friend's 10-year-old, if I played Fortnite 10 hours a day, the wife would kill me and then I'd get fired 😂. And I don't want my ass getting kicked by said 10-year-old either.
 
It’s also a multiplayer game that I have zero interest in. Hence why I’m not running a 1080p screen with a 3090.

You do you, but I've never understood that - but I'm not doing competitive or online games anymore. Don't have time for that these days. Unlike my friend's 10-year-old, if I played Fortnite 10 hours a day, the wife would kill me and then I'd get fired 😂. And I don't want my ass getting kicked by said 10-year-old either.
Great! The 1080p/3090 Ti inquisition is closed. :)
 
You aren't. But many are. Everyone's situation, needs, and desires are different.
True, but your original statement that started this debate was a blanket, massively over-generalized one and made no mention of any specific use cases.
 