9900K or 3800X

Yeah, and 0.2% have a 2080 Ti. I would game at 1080p on a 1060 too.

No disagreement. I don't find 1080p results useful for my personal gaming, but since it's a common resolution, it does provide a good indication of potential CPU scaling.
 
When you're talking over 144 fps, it's largely irrelevant... even if you're really keen.

Frankly, the single best test I've seen for performance to date is the Puget After Effects test. It tests single- and multi-threaded performance in one hit and gives a good general productivity "score".

From https://www.thefpsreview.com/2019/07/07/amd-ryzen-9-3900x-cpu-review/7/


https://www.thefpsreview.com/2019/07/15/amd-ryzen-9-3900x-cpu-review-new-bios-performance-tested/5/


I game at 4K/60Hz. I'm an unusual outlier there, but then again, I don't really get into FPSes much anymore, and when I do, that is sufficient.
 
I game at 4K / 60Hz too. All I play are shooters for the most part. The key is consistency in frame rates, input lag, and so on. I don't believe you're a better player at 144Hz vs. 60Hz. I think there is a slight potential advantage to using a faster display, but I think this is often overstated.
 
While the X570 boards offer nothing of value yet and are expensive, going the Intel route is not cheaper: to run a 9900K at 5 GHz you need a top-end board, which is also very expensive, and you also need a good cooling solution.

If you are just gaming then the 9900K should work for you there.
You don't need a $600 motherboard to get 5 GHz all-core with a 9900K. Plenty of $130-$150 boards on the market will do it. My board was $179 and it does it in addition to running my 4 sticks of B-die memory at 4000 MT/s.
 
People on the internet said:
No one games at 1080p so these benchmarks are irrelevant

This argument isn't an argument, it's a weak defence at best. In this case the 9900K is most definitely and very measurably faster in gaming. Benching at a 'lower' res helps expose this difference. For the record I have a 4K screen, but I play Apex Legends at 1080p @ 240Hz for the frame rate, and I'm sure many competitive gamers choose this sort of compromise, so it's hardly a niche resolution.

The OP has said he is 70% gaming.

In my mind, the path is very clear here.
 
I game at 4K / 60Hz too. All I play are shooters for the most part. The key is consistency in frame rates, input lag, and so on. I don't believe you're a better player at 144Hz vs. 60Hz. I think there is a slight potential advantage to using a faster display, but I think this is often overstated.

Having played a bit of DGL, the biggest difference in online shooters is your mouse and keyboard.

In the space of 5 years I went from a 23" LED 60Hz monitor to a 1080p 24" 144Hz TN, to a 1440p 120Hz OLED, and back down to a 1440p 120Hz TN panel. I prefer the TN panels as an FPS gamer: anything over 1 ms gives me eye fatigue, and while OLED is pretty, the 5-8 ms response times are just gross.
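Taking those quoted response times at face value (a rough sanity check, not a measurement), the arithmetic shows why a slow panel feels smeary at high refresh rates:

```python
# At 120 Hz each frame is on screen for ~8.3 ms; a 5-8 ms pixel response
# (the figure quoted above) eats most of that window, while 1 ms does not.
frame_ms = 1000 / 120
for response_ms in (1, 5, 8):
    print(f"{response_ms} ms response = {response_ms / frame_ms:.0%} of a 120 Hz frame")
```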
 
You don't need a $600 motherboard to get 5 GHz all-core with a 9900K. Plenty of $130-$150 boards on the market will do it. My board was $179 and it does it in addition to running my 4 sticks of B-die memory at 4000 MT/s.
You don't need a $600 motherboard for either system. There are affordable boards for both.
In the AMD camp there are X570/X470 boards that cover every feature (some exclusive) and price point. To say otherwise is disingenuous.
 
This argument isn't an argument, it's a weak defence at best. In this case the 9900K is most definitely and very measurably faster in gaming. Benching at a 'lower' res helps expose this difference. For the record I have a 4K screen, but I play Apex Legends at 1080p @ 240Hz for the frame rate, and I'm sure many competitive gamers choose this sort of compromise, so it's hardly a niche resolution.

The OP has said he is 70% gaming.

In my mind, the path is very clear here.

Some 60% of gamers are still on 1920x1080 according to AMD.
 
When you're talking over 144 fps, it's largely irrelevant... even if you're really keen.

Frankly, the single best test I've seen for performance to date is the Puget After Effects test. It tests single- and multi-threaded performance in one hit and gives a good general productivity "score".
I don't think there are really any good one-size-fits-all tests. Ultimately you need to decide what you want. Mixing single- and multi-threaded results, where we don't even know how much each one affects the score, is not very helpful.
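To illustrate that point, here is a toy sketch (the weights and sub-scores are invented for illustration, not Puget's actual scoring formula) of how a blended score can hide which component drives it:

```python
# Toy example: a weighted composite score can hide which sub-score drives it.
# These weights and numbers are made up; they are NOT Puget's real formula.

def composite(single_thread: float, multi_thread: float,
              st_weight: float = 0.5) -> float:
    """Weighted blend of single- and multi-threaded sub-scores."""
    return st_weight * single_thread + (1 - st_weight) * multi_thread

# CPU A: strong single-thread, weak multi-thread. CPU B: the reverse.
print(composite(single_thread=120, multi_thread=80))  # 100.0
print(composite(single_thread=80, multi_thread=120))  # 100.0
```

Identical composites, opposite strengths: without the sub-scores you can't tell which chip suits your workload.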

It is more important that a frame takes 9 minutes instead of 10 to render than differences measured in the sub-second range when performing various tasks in Photoshop, for example.
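To put numbers on that (the shot length below is an assumption, purely for illustration), a minute saved per frame compounds quickly:

```python
# One minute saved per frame adds up fast across a project.
# A 30-second shot at 24 fps is an assumed project size for illustration.
frames = 30 * 24                   # 720 frames
saved_minutes = frames * (10 - 9)  # 9 min/frame instead of 10
print(saved_minutes / 60)          # 12.0 -> half a day saved on one shot
```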
 
In my country the 3900X is 9200 and the 9900K is 11500. Given that the 3900X works out of the box, there is almost no value case for the 9900K now, which is why they have gone with huge retailer sales on it; writing off 2500 in losses per chip is crazy.

I am kinda pissed that a couple of days later a custom combo of an ASUS Strix X470-F, 16GB Trident X 3200 RGB, and a 3600 came out for 7000, which is dirt cheap.
 
You don't need a $600 motherboard for either system. There are affordable boards for both.
In the AMD camp there are X570/X470 boards that cover every feature (some exclusive) and price point. To say otherwise is disingenuous.
True, but I didn't say anything about AMD. The person I was responding to said you needed a "top-end board" to run the 9900K at 5 GHz, when that is patently false.
 
It is more important that a frame takes 9 minutes instead of 10 to render than differences measured in the sub-second range when performing various tasks in Photoshop, for example.


After Effects does this... I digress. It is more important to have a stable platform that doesn't have the potential to crash (and thus lose work/productivity) when you try to use an esoteric filter than to take an extra minute to render a frame.

As mentioned, I have faith AMD will get there, but a lot of people use Nvidia cards for rendering (3D and video), and the recent WHEA stuff wouldn't just affect games... the same goes for Linux in the programming field.
 
For your usage I would go for the i9-9900K at this given time. That is, if you can get at least some deal on the i9-9900K.
I love the new AMD CPUs; the performance and price are incredible. But it's a new architecture and the first huge step forward from AMD. I would give them 2-3 years to mature, driver- and application-support-wise.

If AMD keeps up the good work, AMD is definitely gonna be my next build in a couple of years.
 
For me it was simple. The most demanding thing I do with my computer is game. So I skipped to the gaming benchmarks. After reading and watching many reviews, it was clear to go Intel this round for gaming. I was hoping to build an AMD computer this time, but it didn't make sense. If I build a new computer, it had better last 4+ years, so the CPU upgrade path didn't matter to me. Heck, if you're upgrading your CPU every 1-2 years, you probably should have spent that money upfront and got a better CPU to begin with. Or upgrade your GPU.
 
For me it was simple. The most demanding thing I do with my computer is game. So I skipped to the gaming benchmarks. After reading and watching many reviews, it was clear to go Intel this round for gaming. I was hoping to build an AMD computer this time, but it didn't make sense. If I build a new computer, it had better last 4+ years, so the CPU upgrade path didn't matter to me. Heck, if you're upgrading your CPU every 1-2 years, you probably should have spent that money upfront and got a better CPU to begin with. Or upgrade your GPU.

Just nitpicking but:

1) Why can't AMD last 4 years plus?
2) X570 supports next-generation PCIe 4.0 graphics cards, thus lasting longer.
 
Same situation: running a 5820K @ 4.3 GHz for ~5 years, wanting to upgrade, and not sure where to go. Mostly gaming with a 2080 Ti @ 4K/60Hz. Daily-use stuff: surfing, word processing, etc.

I realize gaming @ 4K with anything is going to be GPU-limited. Should I just wait for Intel's next gen to see how it all pans out? I'm thinking this gen of AMD's CPUs will be mature by then.
 
Same situation: running a 5820K @ 4.3 GHz for ~5 years, wanting to upgrade, and not sure where to go. Mostly gaming with a 2080 Ti @ 4K/60Hz. Daily-use stuff: surfing, word processing, etc.

I realize gaming @ 4K with anything is going to be GPU-limited. Should I just wait for Intel's next gen to see how it all pans out? I'm thinking this gen of AMD's CPUs will be mature by then.

I ran a Core i7-5960X @ 4.5 GHz for four and a half years. I don't know that you should bother with anything, really. I plan on testing that out. I didn't get anything out of the move to a Threadripper 2920X as far as gaming goes. I've been meaning to hook up one of the test bench rigs with the 3900X and the 9900K to the same monitor and see what kind of performance I get.
 
After Effects does this... I digress. It is more important to have a stable platform that doesn't have the potential to crash (and thus lose work/productivity) when you try to use an esoteric filter than to take an extra minute to render a frame.

As mentioned, I have faith AMD will get there, but a lot of people use Nvidia cards for rendering (3D and video), and the recent WHEA stuff wouldn't just affect games... the same goes for Linux in the programming field.
I use Nvidia for rendering, what does that have to do with AMD?
 
Just nitpicking but:

1) Why can't AMD last 4 years plus?
2) X570 supports next-generation PCIe 4.0 graphics cards, thus lasting longer.

That wasn't the point I was trying to make. AMD should easily last 4+ years. My point was that the upgrade-path advantage of AM4 didn't matter to me. PCIe 4.0 could very well be an advantage later; it depends on whether future graphics cards actually use more bandwidth than PCIe 3.0 x16 can handle. From what I read, a 2080 Ti doesn't. I see it helping if running multiple GPUs.
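For reference, the theoretical numbers behind that: PCIe 4.0 doubles 3.0's per-lane transfer rate, so an x16 slot goes from roughly 15.75 GB/s to 31.5 GB/s:

```python
# Theoretical PCIe x16 bandwidth. Both PCIe 3.0 and 4.0 use 128b/130b
# encoding; 4.0 simply doubles the per-lane transfer rate.
def x16_bandwidth_gbps(transfer_rate_gt: float) -> float:
    """GB/s for an x16 link at the given per-lane rate in GT/s."""
    return transfer_rate_gt * (128 / 130) / 8 * 16  # bits -> bytes, 16 lanes

print(x16_bandwidth_gbps(8))   # PCIe 3.0: ~15.75 GB/s
print(x16_bandwidth_gbps(16))  # PCIe 4.0: ~31.51 GB/s
```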
 
I ran a Core i7-5960X @ 4.5 GHz for four and a half years. I don't know that you should bother with anything, really. I plan on testing that out. I didn't get anything out of the move to a Threadripper 2920X as far as gaming goes. I've been meaning to hook up one of the test bench rigs with the 3900X and the 9900K to the same monitor and see what kind of performance I get.
I guess I’ll hold out until there’s a demonstrable difference with CPUs for 4K gaming.
 
A 9900K OC'd to 5 GHz is still better in games. For everything else I'd go 3800X, or hell, even the 3900X.
 
Behind in per-core gaming performance today, and will be behind in four years when AM4 is an afterthought.



This is meaningless. By the time PCIe 4.0 matters for gaming at all, AM4 CPUs will be obsolete for the purpose.


Same with Intel's current lineup too. A single-digit performance gain is not a game stopper. Why do you keep ragging on the performance of these SKUs? I feel this is the P4 era all over again, and look how well those aged. I'll be sure to let you know how playable my Ryzen 3700X is in games in 4 years.


EDIT: If PC games are still only using 2-4 cores in 4 years' time, there will be issues elsewhere, or PC gaming will be close to dead.
 
A single-digit performance gain is not a game stopper.

It's usually more.

Why do you keep ragging on the performance of these SKUs?

I prefer to deal in facts. I'll recommend AMD or whatever else when it makes sense for the application.

I feel this is the P4 era all over again, and look how well those aged.

Some are still running... not terribly different from Bulldozer really, though the P4 was closer to its competition.

Neither is really comparable to the current situation.

I'll be sure to let you know how playable my Ryzen 3700X is in games in 4 years.

I'd expect it to still play well in four years, unless AMD messed up something serious that affects longevity, and to still be slower than a 9700K in games.
 
Urr, 1080p? Right... few people game at 1080p these days...

This is simply not true. AMD even admits that some 60% of people still game at 1920x1080. Keep in mind, this is the resolution where AMD is weakest. If few people played games at such a CPU-limited resolution, you'd better believe AMD would call attention to that. It would only make Ryzen 3000 series CPUs look like an even better value if that were the case.
 
This is simply not true. AMD even admits that some 60% of people still game at 1920x1080. Keep in mind, this is the resolution where AMD is weakest. If few people played games at such a CPU-limited resolution, you'd better believe AMD would call attention to that. It would only make Ryzen 3000 series CPUs look like an even better value if that were the case.


And what percentage are above 144Hz? Yeah, exactly the point... pointless argument...


Here's your own article if you need a refresher:

https://www.thefpsreview.com/2019/07/15/amd-ryzen-9-3900x-cpu-review-new-bios-performance-tested/6/
 
If the OP is not gaming on a 2080 or 2080 Ti class GPU, and primary use is 80% gaming (as stated), any CPU above $300 is an absolute waste.
Don't even consider anything over a 9700K/3700X unless you have one of the above GPUs.
 
If the OP is not gaming on a 2080 or 2080 Ti class GPU, and primary use is 80% gaming (as stated), any CPU above $300 is an absolute waste.
Don't even consider anything over a 9700K/3700X unless you have one of the above GPUs.

This assumes that they'd upgrade their CPU when they upgrade their GPU, and that's a very poor assumption.

Beyond that, OP hasn't responded since the first page...
 
That doesn't seem to be related to rendering. And they weren't even able to reproduce the issues themselves. I was worried for a moment, but this just seems like hearsay at this point. I'll find out next week if there are any actual problems. Oh, and F AMD for holding out on the EU. I wanted a 3900X on release day, and they still say the earliest it will ship is the 23rd.



Yeah... they were.

Update: Manuel and the Nvidia driver team notified us that they were able to reproduce the problem and are looking into it, so no need to send any more information their way.
 
I prefer to deal in facts. I'll recommend AMD or whatever else when it makes sense for the application.

I always invest in the "smart money", and if you can name a better $200 value than the 3600 in the past DECADE - I'm all ears.

It's basically a $200 8700K, which is crazy when you think about it...
 
This depends entirely on the rest of the system and the application, because I'd be comparing motherboard and memory cost too. Value is relative: what's the benefit, and what's the total cost delta?

Used MSI X470 Gaming Pro Carbon for $80 and 16GB 3200 for $70.

Giving my brother my old backbone (2600K/mobo/DDR3) for $120.

A total investment for me of around $230 is pretty freakin' sweet.
 
I always invest in the "smart money", and if you can name a better $200 value than the 3600 in the past DECADE - I'm all ears.

It's basically a $200 8700K, which is crazy when you think about it...

No it's not. This is the point I'm trying to get across, and have been trying to for about 2-3 pages.

The 8700K is a more mature platform; the bugs have generally been worked out of the system. The 3600 is still effectively in the "teething" stage. There are issues, and they are being addressed, but they're not there yet.

It may take Nvidia a month or two to sort out their issues, and that may mean more "random WHEA errors".

AGESA fixes for Destiny and Linux may come out soon, or be a few weeks/months out; no one in the public knows.

On that note, I pity the Linux users affected by that bug who migrated in week 1 of release.

I cannot deal with unknowns about when things will be fixed; time is money for me, and time spent fixing means less money for me. Whether this matters to you is another thing.
 
I don't know the percentage above 144Hz. I've seen absolutely no statistics on this. What's your point?


That yes, most people game at 1080p. 60% is an accurate number, but most people still game at 60 fps as well, and every processor above $200 should be well exceeding that number in today's world. Competitive 240Hz gaming... I can see the need for that extra single-threaded IPC there. Maybe I missed your ultimate point?
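For context, the frame-time arithmetic behind the refresh-rate argument (straightforward math, nothing assumed):

```python
# Frame time shrinks as refresh rate climbs; each step up buys a smaller
# absolute window, which is why the CPU matters most at 240 Hz.
for hz in (60, 144, 240):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
# 60 Hz -> 16.67 ms, 144 Hz -> 6.94 ms, 240 Hz -> 4.17 ms
```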
 
That yes, most people game at 1080p. 60% is an accurate number, but most people still game at 60 fps as well, and every processor above $200 should be well exceeding that number in today's world. Competitive 240Hz gaming... I can see the need for that extra single-threaded IPC there. Maybe I missed your ultimate point?

Nope. You didn't miss it. I didn't quite understand what you were getting at.
 

Yeah... they were.
I was talking about the authors of the article. And no, it still doesn't say it is related to rendering on the GPU. Just some mysterious blue screens with no information on the circumstances or even the exact configurations. I'm not going to board up my windows just because someone says there is a tornado somewhere around the world.

I'm not saying those BSODs didn't happen; I just doubt they're common if we get this little clarity on how and when they happened.
 