People jump all over the 13900KS, after all.
> To be honest, if you have enough disposable income to buy an extremely overpriced 4090, I don't think people will stop at the small price delta of the X3D.

Wasn't the 4090 only a $100 increase over the 3090? It's the 4080, with its increase from $699 (3080) to $1199 (4080) MSRP, that shows what the truly overpriced card is.
> Wasn't the 4090 only a $100 increase over the 3090? It's the 4080, with its increase from $699 (3080) to $1199 (4080) MSRP, that shows what the truly overpriced card is.

I paid the same amount of money for my 4090 as I did for my 3090... about 5-6 months after release. So, no increase in cost for me.
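For what it's worth, the launch MSRPs bear that out. A quick back-of-the-envelope check (the $1,499/$1,599 figures for the 3090/4090 are from memory, so verify before quoting me):

```python
# Back-of-the-envelope MSRP comparison (launch prices, USD).
# 3080/4080 figures are from the post above; 3090/4090 from memory.
msrps = {
    "3090 -> 4090": (1499, 1599),
    "3080 -> 4080": (699, 1199),
}
for label, (old, new) in msrps.items():
    pct = (new - old) / old * 100
    print(f"{label}: +${new - old} ({pct:.0f}% increase)")
# 3090 -> 4090: +$100 (7% increase)
# 3080 -> 4080: +$500 (72% increase)
```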
> Who buys a 5800X3D for productivity? Also, who buys a 13900K for gaming? That's like buying the better 7950X just for gaming. The 13700K should be the one you'd go for on that dead-end platform - save some money for when Intel releases a new socket.

You mean the 13600K.
> How much money did you resell your 3090 for?

Nothing, it's in my other gaming PC now that's hooked up to my entertainment center... lol
> Having used a 5800X and a 5800X3D, and now a 7700X and soon a 7800X3D, I just can't agree with you here, but as always, time will tell. Yes, it may again depend on the game's engine how much improvement we'll see, but all the DX11 VR flight sims I run will surely get a very positive result from the larger cache, just as they did with the 5800X3D, especially in the 1% lows, which make or break immersion in VR.

Curious what your experience is with the 5800X3D vs. the 7700X in VR?
> I'm also going to have a ton of radiator surface area for it in my custom loop, which should help with the thermal constraints somewhat.

Ryzen 7000 actually isn't thermally constrained: due to its high TDP, it will always try to maximize its boost clock using any headroom left by your cooling capacity. It will boost until it reaches 95°C.
> Ryzen 7000 actually isn't thermally constrained: due to its high TDP, it will always try to maximize its boost clock using any headroom left by your cooling capacity. It will boost until it reaches 95°C.

More cooling means it takes more boosting to hit that max temp, no?
Exception: the Ryzen 7000 non-X parts, because their default TDP is only 65 W, and they boost a bit more conservatively compared to the X variants.
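A toy sketch of the boost behavior being described, if it helps (not AMD's actual algorithm; the 95°C ceiling and power limit are from the posts above, while the step size and max clock are made-up examples):

```python
# Toy model of Ryzen 7000 opportunistic boost (not AMD's actual algorithm).
# The chip keeps raising clocks until it hits a limit: temperature,
# package power (PPT), or its maximum rated boost clock.

TJ_MAX_C = 95          # thermal ceiling the X parts boost into
MAX_BOOST_MHZ = 5700   # example ceiling; varies by SKU

def boost_step(temp_c: float, ppt_used_w: float, ppt_limit_w: float,
               clock_mhz: float, step_mhz: float = 25) -> float:
    """Raise the clock one step if there is thermal and power headroom."""
    if temp_c < TJ_MAX_C and ppt_used_w < ppt_limit_w and clock_mhz < MAX_BOOST_MHZ:
        return clock_mhz + step_mhz
    return clock_mhz  # pinned at whichever limit was reached first

# Better cooling lowers temp_c at any given clock, so the loop settles at a
# higher clock; under a heavy all-core load it still ends up at ~95 C, which
# is why "more cooling means it will always boost to hit that max temp".
```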
Slight correction: "More cooling means it will always boosts to hit that max temp."
You made a grammatically correct statement grammatically incorrect, while still meaning the same thing.
Sorry for that, I'm not a native speaker.
This old argument... 2023 and we're still basing the performance of $700 CPUs on 1080p benchmarks?
If the X3D chips were crushing Raptor Lake, AMD would be yelling it from the rooftops by now.
I'll wait to see the 4K benchmarks, especially the lows, but I have a feeling the entire package may be a bit underwhelming.
> I've been holding out on the 7950X3D for my upgrade. My Pro Tools sessions, which are surround mixes for post production, utilize all 32 threads of my 5950X, and I could even use a bit more juice in that department; I also do gaming on the same machine. Plus, the 5950X apparently is a bottleneck when paired with a 4090, even at 4K. I also feel like my 5950X's memory controller has not been doing so well with my 64GB 3600MHz CL14 DDR4 kit. So for my use case, it's perfect. I'm also going to have a ton of radiator surface area for it in my custom loop, which should help with the thermal constraints somewhat.

I somewhat feel you there, although the bottleneck is really not noticeable outside benchmarks. At least with the games I play, I'd never know on my 4090 at 4K without staring at the FPS counter and 1% lows the entire game. Everything still runs smooth as butter.
> https://www.thefpsreview.com/2023/02/27/amd-ryzen-9-7950x3d-gaming-performance-cpu-review/
> Another review to take a gander at.

Here is the main point from that review:

"The intent of the AMD Ryzen 9 7950X3D is a CPU to bridge the gap between wanting to game with the best performance and having 16 cores/32 threads for productivity. Literally, this CPU is a 'have your cake and eat it too' CPU. While it certainly fulfills this goal perfectly, we did not experience any advantages in using the Ryzen 9 7950X3D over the Ryzen 9 7950X for gaming. With a GeForce RTX 4090, we experienced no gameplay advantage at 4K, or with Ray Tracing. If you game at a high resolution or have a less powerful graphics card than the most expensive one possible, the advantages are not there."
> Even in situations where the 7950X3D is outperforming a 13900K for average FPS, it's oftentimes behind in the 1% and 0.1% lows, meaning Intel would have the more consistent frame times more often than not.

Where are you getting that info?
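For anyone unfamiliar with the metric, the 1%/0.1% lows are derived from the frame-time trace. A minimal sketch of one common convention (reviewers differ; some average the slowest 1% of frames instead of taking the percentile):

```python
# One common way to compute 1% / 0.1% lows from a frame-time capture:
# take the frame time at the slow percentile and convert it to FPS.

def low_fps(frame_times_ms: list[float], percentile: float) -> float:
    """FPS at the given slow-frame percentile (e.g. 1.0 for '1% lows')."""
    ordered = sorted(frame_times_ms)                  # fastest to slowest
    idx = min(int(len(ordered) * (100 - percentile) / 100), len(ordered) - 1)
    return 1000.0 / ordered[idx]                      # ms per frame -> FPS

# Tiny made-up trace: mostly ~7 ms frames with two stutters.
frame_times = [6.9, 7.1, 7.0, 6.8, 14.2, 7.2, 7.0, 6.9, 7.1, 22.5]
print(f"1% lows: {low_fps(frame_times, 1.0):.1f} FPS")  # dominated by the 22.5 ms stutter
```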
> Hard pass on these latest X3D CPUs. The non-homogeneous CCD design is simply too complex for thread scheduling to have any semblance of reliability and consistency. In order to get the best gaming performance, half your cores have to be "parked", but what if you're gaming + streaming?

I have a different take. I'm actually quite pleased, because I actively do want a "best of both worlds" CPU. I do use my personal machine for work occasionally (software development), but even when gaming I have an inordinate number of other apps, browser tabs, and videos going on another monitor. That, in conjunction with the efficiency win, makes this a big win for me. Bear in mind I am talking from the perspective of purchasing an entire new system, not a piecemeal upgrade from an existing system. YMMV if you are coming more from a partial-upgrade perspective.
In terms of CCD allocation, it appears that Windows and the AMD drivers generally do a pretty good job of getting it right, but for those cases where they don't, AMD has clearly provided a brute-force mechanism to manually direct the application. Perhaps not super elegant, but at least an available option.
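That brute-force option is essentially just CPU affinity, which you can also apply by hand. A rough sketch with the third-party psutil package, assuming the 7950X3D's V-cache CCD is CCD0 (logical CPUs 0-15 with SMT) and a hypothetical game.exe:

```python
# Pin a running game to the V-cache CCD by hand (Windows; needs the
# third-party psutil package: pip install psutil).
# Assumption: on the 7950X3D, CCD0 = logical CPUs 0-15 (with SMT) and
# carries the extra 3D V-cache.
import psutil

CACHE_CCD_CPUS = list(range(16))  # logical processors 0-15 = CCD0

def pin_to_cache_ccd(process_name: str) -> None:
    for proc in psutil.process_iter(attrs=["name"]):
        if proc.info["name"] == process_name:
            proc.cpu_affinity(CACHE_CCD_CPUS)  # restrict to CCD0 only
            print(f"Pinned PID {proc.pid} to CPUs {CACHE_CCD_CPUS}")

pin_to_cache_ccd("game.exe")  # hypothetical executable name
```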
> I've been waiting patiently for the 7800X3D to force me into the AM5 hierarchy. Been running a 1800X since release and I'm finally ready to upgrade. New PSU and GPU already in; just need RAM, mobo, and CPU, and then I'm probably jumping over to M.2 drives and I'll be good to go for another 6 or 7 years.

Personally, I would save the money and go buy a 5800X3D.
> I have a different take. I'm actually quite pleased, because I actively do want a "best of both worlds" CPU. ...

It's not a best-of-both-worlds CPU, though. If you enable both CCDs, your gaming performance suffers significantly. If you enable half of them, your productivity/multitasking suffers. That's not the best of anything; that's a compromise.
In terms of CCD allocation, it appears that Windows and the AMD drivers generally do a pretty good job of getting it right, but for those cases where they don't, AMD has clearly provided a brute-force mechanism to manually direct the application. By default this is exposed via the Windows Xbox Game Bar. Perhaps not super elegant, but at least an available option, and the hooks are obviously there that could be taken advantage of by other software (it appears to just be some kind of whitelisting mechanism, as the community guessed).
Regarding the "parking"... I think there is some confusion here. As far as I am aware, you don't literally have 8 cores disabled and essentially non-functional when gaming; that would be silly. I believe it just means your gaming process is walled off and won't access any of the cores on the other CCD. This seems to be corroborated by the Hardware Unboxed video, where actually physically disabling the non-cache CCD *did* show a small but measurable benefit for some games as well. If the non-cache CCD were already disabled by "parking", this wouldn't be the case.
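One way to sanity-check the "parked, not disabled" reading is to watch per-core load while a game runs. A rough psutil sketch (assuming the usual 7950X3D layout of 32 logical CPUs: CCD0 = 0-15, CCD1 = 16-31):

```python
# Sample per-CCD load while a game is running. If CCD1 is merely parked,
# its cores should sit near 0% during gameplay but still light up when a
# background task (encoding, streaming, etc.) lands on them.
# Assumes 32 logical CPUs: CCD0 = 0-15, CCD1 = 16-31.
import psutil

for _ in range(10):                                   # ten 1-second samples
    per_cpu = psutil.cpu_percent(interval=1, percpu=True)
    ccd0 = sum(per_cpu[:16]) / 16
    ccd1 = sum(per_cpu[16:]) / 16
    print(f"CCD0 avg: {ccd0:5.1f}%   CCD1 avg: {ccd1:5.1f}%")
```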
> Is there actually a good review worth reading?

The FPS Review ones are always good.
> https://www.thefpsreview.com/2023/02/27/amd-ryzen-9-7950x3d-gaming-performance-cpu-review/
> Another review to take a gander at.

Thank God I found this site and Brent's reviews again. After reading it, even my 5800X3D seems good enough, lol.
> For my needs the 5800X3D would be the better upgrade... I also wouldn't need to upgrade my motherboard and memory.

Dude, I'm at a steady 200+ FPS in World of Warcraft at 1440p/144Hz with mine. I've never experienced an upgrade like this since I went from onboard graphics to a 256MB card for Half-Life 1. It's unreal.