The World's Best of the Best 16-core gaming CPU.

Discussion in 'AMD Processors' started by Archaea, Jun 10, 2019.

  1. Archaea

    Archaea [H]ardForum Junkie

    Messages:
    9,377
    Joined:
    Oct 19, 2004
    https://www.engadget.com/2019/06/10/amd-16-core-ryzen-3950x/

    Introducing the 3950X
    $750. 16 cores, 32 threads, 72MB total cache, 4.7GHz boost.

    Heralded by AMD as the best of the best: the first 16-core gaming CPU.

    Ummm

    Show me a game that benefits significantly from 16 cores and 32 threads.

    I'll wager the 9900K is still faster in true dedicated gaming use the vast majority of the time: 8 cores/16 threads and 5GHz boost speeds, plus Intel's IPC advantage, will still win out in current games when six cores is about the most any mainstream AAA title currently uses (and many games still only need two or four).

    This AMD chip is fantastic, but why try to brand it as something it's not? A gaming CPU?
     
  2. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,352
    Joined:
    Jun 13, 2003
    Going to be slower than their twelve core part, in benchmarks at least.

    Will game better than a 2950X though, for a lower overall cost, so if you need both, run with it.
     
    Manny Calavera, Flexion and KazeoHin like this.
  3. jmilcher

    jmilcher [H]ardness Supreme

    Messages:
    4,268
    Joined:
    Feb 3, 2008
    You could maybe have snagged me at $600.
     
  4. N4CR

    N4CR 2[H]4U

    Messages:
    3,835
    Joined:
    Oct 17, 2011
    A game "benefits" when you're doing more than just running the game at the same time.
    Performance is so close (or the same) now that the 9900K furnace is no longer very compelling. No need to cry about it though; we all win.
     
  5. PliotronX

    PliotronX 2[H]4U

    Messages:
    2,062
    Joined:
    Aug 8, 2000
    Now we cry tears of joy because we have choices again besides the same old 4C/8T plus or minus a pin.
     
  6. Gideon

    Gideon 2[H]4U

    Messages:
    2,285
    Joined:
    Apr 13, 2006
    Yes, but you could game and do other things without one affecting the other. I doubt the 9900K stays faster, seeing as Intel had to rush out the emergency 9900KS.
     
    Manny Calavera likes this.
  7. Digital Viper-X-

    Digital Viper-X- [H]ardForum Junkie

    Messages:
    13,728
    Joined:
    Dec 9, 2000
    Why complain? A high-speed 16-core chip is a good thing. The 9900K only has a slightly higher boost, with half the cores, and IPC is pretty close now as well, so we'll see how it actually games. AM4 is the "gaming" platform.
     
    Manny Calavera and dragonstongue like this.
  8. KonaKona040

    KonaKona040 n00b

    Messages:
    19
    Joined:
    Feb 6, 2019
    Plus or minus a mandatory motherboard update :)
     
  9. Verado

    Verado Limp Gawd

    Messages:
    188
    Joined:
    May 16, 2017
    All the "GAMING!!!!" hype is because it was presented at E3. Aka a gaming show.
    Both $500 and $750 is outside what I'll spend on a cpu, but really, the only ones this is bad for is intel and juan.
     
  10. KazeoHin

    KazeoHin [H]ardness Supreme

    Messages:
    7,843
    Joined:
    Sep 7, 2011
    I mean, anything that will push the bottleneck onto a GPU that costs significantly more is a great gaming CPU.

    Thus, if you're buying a mid-range system, a decent six-core that pushes the bottleneck to the 1660Ti is a good gaming CPU.

    If you're buying the best of the best, and you want to do some rendering, streaming or video editing, chances are this 16 core beast will push the bottleneck onto the 2080 Ti.
     
    dragonstongue and N4CR like this.
  11. Calavaro

    Calavaro Whiskey & Honey

    Messages:
    8,103
    Joined:
    Apr 11, 2001
    A 16-core Godzilla chip is literally all I have been asking for. Granted, a little pricier than I want, but FINALLY I can build one PC, and one PC only, for all my needs. No more Intel BS with crippled cores at ridiculously high prices. I'll toss all three of my PCs with Intel chips in the bin the day the Ryzen 9 3950X is released.
    Having two PCs to game and stream at the same time is just dumb. Having three PCs to game, stream, and encode video at the same time is ludicrous. I am so very happy I can now just assign cores on a single machine and do it all... and still have a core or two left over for background tasks. And for a very reasonable price.

    Edit: typo.
     
    Last edited: Jun 11, 2019
  12. Pieter3dnow

    Pieter3dnow [H]ardness Supreme

    Messages:
    6,790
    Joined:
    Jul 29, 2009
    You'd have to thank Intel for their great support over the years in keeping everything single-threaded :)
    Now that we have 16C/32T and the market is changing, maybe we can get better support in the long run.

    The only thing I wish is that the clock speeds on the higher-end products were better and the prices slightly cheaper, so that you'd feel you were robbing yourself if you didn't buy one.

    Maybe that is a good reason for having 16 cores and 32 threads. :)
     
    PliotronX likes this.
  13. Verado

    Verado Limp Gawd

    Messages:
    188
    Joined:
    May 16, 2017
    To the people saying we don't have anything that needs 16 cores:
    the reason nearly nothing needs them is that they were never available on a desktop platform.
    Uses for them will come, eventually.
     
  14. Calavaro

    Calavaro Whiskey & Honey

    Messages:
    8,103
    Joined:
    Apr 11, 2001
    Uses are already here. We have just been using several PCs to do the job of one machine for the past many years, thanks to Intel. No more. Now all of our workloads can be moved to one PC and done more quickly, more efficiently, and without having to use KVMs, networked screen connections, or similarly unoptimized solutions. Intel really did screw over the consumer this past decade. They can go pound silicon.
     
  15. thebufenator

    thebufenator [H]ard|Gawd

    Messages:
    1,205
    Joined:
    Dec 8, 2004
    So for this thing, can I tell HandBrake to encode with a set number of threads while I play games? :)
     
  16. Revenant_Knight

    Revenant_Knight Limp Gawd

    Messages:
    264
    Joined:
    Nov 18, 2011
    To the OP: the same thing was said about the 2600K vs. the 2500K back when they came out. Look at how that worked out. The 2600K can still put up decent numbers in modern games. The 2500K... not so much.
     
  17. Archaea

    Archaea [H]ardForum Junkie

    Messages:
    9,377
    Joined:
    Oct 19, 2004
    Console game development is the major thing that moves the needle. Next gen will be an 8-core Zen 2 part on Xbox, and an 8-core Ryzen part for Sony too. That should define and fulfill the next five years of game engine progression/needs. By the time 16 cores are needed in gaming, this thing will be outdated.

    Calavaro might have a use case if he's gaming, encoding, and streaming at the same time, but that's a really niche use case.

    Also, I'm not arguing that we won't eventually get to needing 16 cores. I'm just saying we certainly aren't there now. I'm as excited about this chip as the next guy, and I probably would have bought one at $500.

    However, they are billing this as the ultimate gaming CPU in the marketing, if you read my original link. That seems to be pure fluff marketing for several reasons.
     
  18. tangoseal

    tangoseal [H]ardness Supreme

    Messages:
    7,512
    Joined:
    Dec 18, 2010
    I'm not going to defend unreleased reviews, but AMD shows some pretty significant ass-kicking in their slides.

    I wouldn't make such a bold claim. If you actually read how AMD rewired the cores and changed the cache layout and sizes, etc...

    Any enthusiast worth their salt can see how Intel is sweating bullets right now. If a 16-core shows higher IPC at a lower clock than an Intel 8-core with higher clocks, don't you think that's something worth worrying about if you're Intel?

    The preliminary marketing jazz shows AMD with a lead with the 8-core, 12-core, and 16-core chips, and even the 6-core chips, in gaming and productivity. Even Intel's own HEDT products are being bested by the 12-core in what was shown so far. The 8-core AMD was shown beating the 9900K. In fact, if the slides are right, the AMD parts support RAM speeds in excess of the Intel parts with less fuss.

    I'm not telling you that AMD is beating the Intel 9900K, but it sure damn looks that way given what we know now.

    So if you're mad that your 9900K is going to be beaten, don't be; it's still going to be in the top 1%. Who the hell cares? Innovation is great for everyone.

    I fully understand your stance on calling it a gaming CPU. Since when was the 9900K a gaming CPU? What is a gaming CPU? CPUs are just CPUs; some are faster than others. There is no such thing as a gaming CPU. It's just marketing bullshit that the masses inhale like dope.
     
  19. tangoseal

    tangoseal [H]ardness Supreme

    Messages:
    7,512
    Joined:
    Dec 18, 2010
    Not without managing affinity in Windows or using Process Lasso.
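    To make that concrete, here's a minimal sketch (assuming the psutil package is installed; the process name and the core split are made-up values, not anything HandBrake or AMD ships) of what pinning an encoder away from your game's cores can look like:

    ```python
    import psutil

    # Hypothetical split: keep cores 0-7 for the game, push every HandBrake
    # worker onto cores 8-15. Core numbers and process name are assumptions.
    ENCODE_CORES = list(range(8, 16))

    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and "HandBrake" in proc.info["name"]:
            proc.cpu_affinity(ENCODE_CORES)  # restrict the encoder to those cores
            print(f"Pinned PID {proc.pid} to cores {ENCODE_CORES}")
    ```

    Process Lasso essentially does this for you, and persists the rule, which is why it keeps getting recommended.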

    16 cores is not what you folks think it is. I can run a single HandBrake instance, or four at a time, but you can't game while encoding very well. The problem is the massive cache pressure video encoding puts on the CPU. It makes feeding the GPU harder, games stutter more, etc.

    Everyone thinks that because you have 16 cores you can do X and Y at once, but in reality, running four HandBrake instances is nice as long as you're not gaming.

    Trust me, I have a 2950X and a 2080 Ti. Gaming while running heavy encodes works, but it's not glass smooth.

    And it's going to be a rude awakening when people shell out $750 expecting gaming while doing heavy batch encodes to be a good experience.
     
    Manny Calavera, AlexisRO and mikeo like this.
  20. Nobu

    Nobu 2[H]4U

    Messages:
    3,193
    Joined:
    Jun 7, 2007
    Anyone who will do this is either dumb or knows what they're doing...
     
  21. Revenant_Knight

    Revenant_Knight Limp Gawd

    Messages:
    264
    Joined:
    Nov 18, 2011

    I've been saying that consoles form the baseline for a long time now. However, you forgot one key aspect: threads. The new consoles are both 16-thread parts. Being able to actually run those 16 threads on 16 real cores will be useful. In addition, the OS and other programs are increasingly being designed to be multi-threaded, and that will only expand in the future. This CPU gives tremendous headroom for that.

    That said, I've got no plans to upgrade my 8086K any time soon. Yes, it's marketing hype to call it the ultimate gaming CPU; any time you say that, it's marketing hype. However, when it launches, it could very well be the fastest CPU for games.
     
    dragonstongue and Fuzzy_3D like this.
  22. Lepardi

    Lepardi Limp Gawd

    Messages:
    205
    Joined:
    Nov 8, 2017
    Won't they have 3-thread SMT? That would mean 24 threads.

    The 24-thread 3900X seems a good option in this regard, and of course four more real cores thrown into the mix is even better.
     
  23. Revenant_Knight

    Revenant_Knight Limp Gawd

    Messages:
    264
    Joined:
    Nov 18, 2011

    3-thread SMT would be neat, but neither Intel nor AMD has announced it. It's still two threads per core.
     
  24. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,352
    Joined:
    Jun 13, 2003
    They are eight core, sixteen thread parts.
     
  25. Boil

    Boil [H]ard|Gawd

    Messages:
    1,381
    Joined:
    Sep 19, 2015
    Three- or four-way SMT may be for Zen 3...

    Pretty sure Revenant_Knight was implying that, with both the PlayStation & Xbox moving to 8C/16T and games on them being developed to utilize the available 16 threads, we may see PC ports of those games retain their multi-threading...

    Thereby making the 16C/32T Ryzen 9 3950X the World's Best of the Best 16-core Gaming CPU...! ;^p
     
  26. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,352
    Joined:
    Jun 13, 2003
    I'd agree that the potential is there, absolutely, but with the massive increase in IPC as well as clockspeed over the Jaguar cores, developers also have the opportunity to slack off.

    Where they don't slack off, we should see the benefits on the desktop.
     
  27. Boil

    Boil [H]ard|Gawd

    Messages:
    1,381
    Joined:
    Sep 19, 2015
    The 'age old' problem, lazy devs who refuse to code for multi-core / SMT...

    With Moore's Law at an end, and the 5GHz ceiling, this is when devs need to actually start thinking about multi-threading the shitte out of code going forward...

    The future of CPUs from here is not faster clocks, but multi-threading; 2-way SMT is the norm, 3 & 4 way SMT is the next step; but there is no reason to take that step if devs don't start coding for heavy SMT usage...
     
    dragonstongue likes this.
  28. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,352
    Joined:
    Jun 13, 2003
    Wider, yes, but smarter too. A big part is going to be applying machine-learning techniques to compilers and to run-times.
     
  29. idiomatic

    idiomatic [H]Lite

    Messages:
    72
    Joined:
    Jan 12, 2018
    The problem is, if you have a true use case for 16 cores, you probably have a case (and the ROI) for 32 or 64.
     
  30. Rockenrooster

    Rockenrooster Limp Gawd

    Messages:
    309
    Joined:
    Apr 11, 2017
    Over $1,000 is a lot of money...
    Mainstream = for home users. (Power users, that is, lol!)
    I would love to get one of these and pop it in my X370! (Would need to keep it at stock, though.)
     
  31. Lepardi

    Lepardi Limp Gawd

    Messages:
    205
    Joined:
    Nov 8, 2017
    I remember back in 2003 there was talk that 3GHz was the ceiling and we weren't going above it, lol.

    But looking at how modern titles like BF5 absolutely need those 8 threads, or you take a ~50% performance hit, it's looking good. We should be seeing those 16 threads utilized.
     
  32. chameleoneel

    chameleoneel 2[H]4U

    Messages:
    3,017
    Joined:
    Aug 15, 2005
    The use case is already here: streaming.

    I recently tried streaming for the first time. Most streaming happens at a 3,000kbps bitrate or less, because that's realistic bandwidth for most people to be able to watch. As such, you have to turn on a bunch of quality settings to get the most out of that bitrate. But that takes a lot of hardware power, which is why people build a second 6- or 8-core machine to get a high-quality stream.

    I have a 7600K which sits at 5GHz daily, and I tried streaming at 2,000kbps. After hours of tweaking, the best-quality video I could muster looked worse than Diablo 2's FMV cinematics: tons of blocking and artifacts, with x264's "faster" preset and a few custom settings added, giving a little here, taking a little there.

    AMD showed off their 12-core playing a game and streaming on the "slow" preset for x264. Being able to use the "slow" preset for a real-time stream, on one system, is nuts.
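    For anyone who hasn't played with those knobs, here's a rough sketch (the file name, bitrate, and presets are assumed values for illustration, not AMD's demo settings) of the x264 preset-versus-bitrate tradeoff driven through ffmpeg. "slow" buys noticeably better quality at the same 3,000kbps budget, at the cost of far more CPU time per frame, which is exactly why it normally takes a second machine (or a lot of cores) to use it live:

    ```python
    import subprocess, time

    # Hypothetical offline re-encode of a local capture at a Twitch-like 3,000 kbps budget.
    def encode(preset: str) -> float:
        start = time.time()
        subprocess.run([
            "ffmpeg", "-y", "-i", "capture.mkv",   # assumed input file
            "-c:v", "libx264",
            "-preset", preset,                     # veryfast = cheap on CPU, slow = better quality per bit
            "-b:v", "3000k", "-maxrate", "3000k", "-bufsize", "6000k",
            "-c:a", "aac", "-b:a", "160k",
            f"out_{preset}.mp4",
        ], check=True)
        return time.time() - start

    for preset in ("veryfast", "slow"):
        # To stream with a preset in real time, the encode has to keep up with the game's frame rate.
        print(preset, f"{encode(preset):.1f}s")
    ```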
     
    Manny Calavera and N4CR like this.
  33. Rockenrooster

    Rockenrooster Limp Gawd

    Messages:
    309
    Joined:
    Apr 11, 2017
    Finally someone pointed this out!
    I wonder how well it would do with x265....
     
    Manny Calavera likes this.
  34. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,352
    Joined:
    Jun 13, 2003
    It should be pointed out that the hardware transcoding blocks on the CPUs, and potentially the GPUs, could be used to defray the load.
     
  35. Randall Stephens

    Randall Stephens Limp Gawd

    Messages:
    462
    Joined:
    Mar 3, 2017
    When your 9900K falls off the throne, what do you do?

    You throw more coal on it and make the fire burn faster.

    For those of us living in the ghetto, the KS will likely cause the lights to dim.
     
    Manny Calavera and Gideon like this.
  36. Rockenrooster

    Rockenrooster Limp Gawd

    Messages:
    309
    Joined:
    Apr 11, 2017
    GPU encoding works (Quick Sync, NVENC, AMD VCE), but the quality is crap compared to the slow preset (CPU) at the same bitrate...
     
  37. daphatgrant

    daphatgrant Moderator Staff Member

    Messages:
    17,919
    Joined:
    Jun 15, 2003
    I'm still running a 2600K, I'll be watching the 3950X vs 9900K comparisons closely.
     
    N4CR likes this.
  38. chameleoneel

    chameleoneel 2[H]4U

    Messages:
    3,017
    Joined:
    Aug 15, 2005
    At this time, GPU encoding quality is roughly equivalent to the "very fast" preset in x264 (in actual testing, my AMD card is a touch better at handling fast motion but less good with slower motion). That's totally fine when you can throw a bunch of bitrate at a local recording and then slowly compress it later for upload.

    But as I said, basically all streaming is under 6,000kbps, and the majority of it is under 3,000kbps. To keep that looking good, you have to start stacking the quality features from the higher presets, features which aren't present in GPU encoding. I wish AMD and Nvidia would put more R&D into their GPU encoders, but either they aren't interested or maybe it's just not possible; I dunno. It seems like all that shading power could be made to work for it. AMD did briefly mention that the 5700 cards have a new encoding engine, but they didn't focus on it much, so I'm assuming it's still not great for the low bitrates of streaming.
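    If you want to sanity-check that "roughly very fast" comparison yourself, one way (sketched below with assumed file names) is to encode the same clip with NVENC and with x264 slow at the same bitrate, then score both against the source using ffmpeg's built-in SSIM filter:

    ```python
    import subprocess

    SOURCE = "gameplay.mkv"  # hypothetical high-bitrate capture used as the reference

    def encode(codec: str, preset: str, outfile: str) -> None:
        # Same 3,000 kbps budget for both encoders so only quality differs.
        subprocess.run([
            "ffmpeg", "-y", "-i", SOURCE,
            "-c:v", codec, "-preset", preset,
            "-b:v", "3000k", "-maxrate", "3000k", "-bufsize", "6000k",
            "-an", outfile,
        ], check=True)

    def ssim(outfile: str) -> None:
        # Compare the encode against the source; ffmpeg prints the SSIM score to stderr.
        subprocess.run([
            "ffmpeg", "-i", outfile, "-i", SOURCE,
            "-lavfi", "ssim", "-f", "null", "-",
        ], check=True)

    encode("h264_nvenc", "medium", "nvenc.mp4")   # GPU encoder (NVENC)
    encode("libx264", "slow", "x264_slow.mp4")    # CPU, slow preset
    ssim("nvenc.mp4")
    ssim("x264_slow.mp4")
    ```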
     
    N4CR and IdiotInCharge like this.
  39. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,352
    Joined:
    Jun 13, 2003
    I'm wondering more about Intel's transcoder (Quick Sync), though NVENC is in play as well. Intel's is leveraged quite a bit by software like Plex, and it's used in NAS devices for that purpose by Plex and others.
     
  40. chameleoneel

    chameleoneel 2[H]4U

    Messages:
    3,017
    Joined:
    Aug 15, 2005
    Plex, or that type of streaming, is a different situation from game streaming. Game streaming is real-time; Plex is not.

    Plex can therefore buffer your stream several seconds ahead and take the extra time to transcode the file with better quality before it sends you the next bit. The "slow" preset should be fine for Plex on my 7600K, because it can take five seconds to do it before sending it to me. With game streaming, there is no pre-existing file: you are recording the game as it happens, and there is no five-second buffer in which to transcode before the stream goes out.

    Even with Plex, it remains true that CPU transcoding should give greater quality at lower bitrates; the image quality considerations do not change. I would say the big advantages of GPU encoding for Plex are possibly lower power usage, and the server being better able to multi-task, because the video streams run on Quick Sync while other server work is free to use the main CPU cores. Also, GPU-encoded streams on Plex seem to have faster seek times.

    It also seems like a GPU doesn't necessarily offer more simultaneous streams from Plex than a quad-core CPU, so more CPU cores would probably enable more overall streams from Plex than a GPU would.
     
    Last edited: Jun 13, 2019