Are there any reasons you WOULDN'T buy a high-core-count i9 (12c+), discounting money?

Discussion in 'Intel Processors' started by LaCuNa, Jan 2, 2019.

  1. LaCuNa

    LaCuNa [H]Lite

    Messages:
    111
    Joined:
    Jun 3, 2014
    What do U guys think!! I wanna hear it ALL!! all your reasons <3
     
  2. N4CR

    N4CR 2[H]4U

    Messages:
    3,429
    Joined:
    Oct 17, 2011
    Zen2
    edit to add: 14nm++++++++ and shitty thermals to boot.
     
    Last edited: Jan 2, 2019
  3. DrLobotomy

    DrLobotomy [H]ardness Supreme

    Messages:
    5,360
    Joined:
    May 19, 2016
  4. ryan_975

    ryan_975 [H]ardForum Junkie

    Messages:
    13,844
    Joined:
    Feb 6, 2006
    Higher core counts usually mean lower clocks and more cost. If the applications you're running don't scale well with more cores, then you're just spending more money for less realized performance.
     
  5. kirbyrj

    kirbyrj [H]ard as it Gets

    Messages:
    23,723
    Joined:
    Feb 1, 2005
    The thing is, you can't ever ignore the cost, and very few people are going to pony up $1000+ for stuff they don't need.

    Unless you really need a HEDT platform, the average user is much better served by a mainstream platform from the cost/performance perspective. Even high core count mainstream platforms (8C/16T) are pushing it for the average user.
     
    Armenius and LaCuNa like this.
  6. LaCuNa

    LaCuNa [H]Lite

    Messages:
    111
    Joined:
    Jun 3, 2014
    Nobody can deny what's been said. For me, coming from an X58 + Xeon and having a really competitive asshole 'acquaintance' is making me want to adopt an X299/i9 combo. Srsly.
     
  7. kirbyrj

    kirbyrj [H]ard as it Gets

    Messages:
    23,723
    Joined:
    Feb 1, 2005
    At the very least, I wouldn't buy anything until you see what AMD has to say at CES.

    The X58 with a 6C/12T Xeon is still pretty good bang for the buck. Mine does 4.2 GHz all day ;).
     
  8. bwang

    bwang Gawd

    Messages:
    948
    Joined:
    Aug 6, 2011
    The Windows scheduler has problems with high core count platforms - even the i9 and its not-quite-uniform cache architecture is enough to cause issues. Windows likes to bounce threads between cores to balance loads (on average, this leads to more uniform core temperatures and therefore more aggressive turbo), but migrating an entire thread leads to dips in FPS in games.
    Getting consistent performance is also trickier - i9's nominally have incredibly high turbo ratios, but in practice the maximum turbo ratios are rarely achieved because of background processes. You really need good knowledge of your motherboard's BIOS tweaks and power limit behavior to dial in a usable overclock - merely typing in 49x and 1.3V won't do it on the larger parts, because you end up looking at 500+W under heavy loads, a point at which things like ambient temperature become an issue for stability.
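The power-limit point above can be roughed out with the usual dynamic-power approximation P ∝ C·V²·f. A minimal sketch, where the baseline numbers (a 3.4 GHz all-core load at 0.95 V drawing ~190 W of package power) are illustrative assumptions, not measurements of any specific chip:

```python
# Rough dynamic-power scaling estimate: P is roughly proportional to V^2 * f.
# All baseline numbers below are illustrative assumptions, not measurements.

def scaled_power(p_base_w, v_base, f_base_ghz, v_new, f_new_ghz):
    """Estimate package power after an overclock using P ~ C * V^2 * f."""
    return p_base_w * (v_new / v_base) ** 2 * (f_new_ghz / f_base_ghz)

# Assumed stock heavy-load baseline: 3.4 GHz all-core at 0.95 V, ~190 W.
# The naive "type in 49x and 1.3 V" overclock from the post:
p = scaled_power(190, 0.95, 3.4, 1.3, 4.9)
print(f"Estimated package power: {p:.0f} W")  # ~513 W under these assumptions
```

The V² term is what bites: the voltage bump alone nearly doubles power before the frequency increase is even counted, which is how a big die lands in 500+W territory.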
     
  9. Orddie

    Orddie 2[H]4U

    Messages:
    2,356
    Joined:
    Dec 20, 2010
    Just get your dick out & show them it’s bigger. No need to spend a grand to prove it

    My workloads are different than most. My daily drivers are all quad core (overkill, but eh...). Gaming is quad as well in the 4+ GHz range. 3x VMware hosts are 6-core HT.

    We just put together a streaming box. The quad core struggles to put out 2x 1080p@60 with two webcams when the source is 4K. But it barely struggles, in that we only see a 10% frame drop every 30-45 mins for less than 30 seconds. Since we are not Ninja Rn yet... that little pig will do for now.

    Unless you have friends who share the need for bigger is better, I agree with the others... it's not needed or wanted (largely due to price)
     
  10. TheFlayedMan

    TheFlayedMan Limp Gawd

    Messages:
    249
    Joined:
    May 29, 2015
    I doubt I'll ever have use for more than an 8C/16T CPU. I may buy a 12C/24T one if the AMD leaks are true, though, just 'coz why not? Well, mainly for the higher clock speed of the 3700X
     
    kirbyrj likes this.
  11. Kardonxt

    Kardonxt 2[H]4U

    Messages:
    2,778
    Joined:
    Apr 13, 2009
    Because I'm apparently the only person on the planet that can't notice fps differences beyond a stable 60fps and my i5 6600k still does that with no problems.
     
  12. TheFlayedMan

    TheFlayedMan Limp Gawd

    Messages:
    249
    Joined:
    May 29, 2015
    I really need a new rig for gaming. A stable 60fps sounds great. The last time I was playing an FPS game (I don't game often) it was Shadow Warrior 2, and the fps tanked to 0 for around 4 minutes lol
     
  13. Brian_B

    Brian_B 2[H]4U

    Messages:
    2,247
    Joined:
    Mar 23, 2012
    I gotta admit I'm still put off by Intel's shifting definition of TDP here.

    I mean, I can understand it - but it definitely breaks with tradition and wasn't what anyone expected.

    Now, is that reason enough to not buy a chip, especially when I was probably going to overclock it just like Boost is doing now automatically? No, probably not. But that, coupled with the chipset bullshit, certainly makes me pause before throwing my money at a company that practices like this.

    And I'm a sucker for the underdog...
     
  14. UnknownSouljer

    UnknownSouljer [H]ardness Supreme

    Messages:
    5,780
    Joined:
    Sep 24, 2001
    Not trying to call you out, but that's a really short-sighted way to view computers and technology.
    You really couldn't see ANY reason at all to get more than 8 cores... even in an incredibly long time like say 25 years? NEVER?
    You might just be speaking hyperbolically, but I think a lot of people don't actually think about this stuff. Modern computing is barely 25 years old (the first generation Pentium came out in 1993, just barely over 25 years at this point). There has been a massive quantum leap in computing power and capacity on multiple levels since then.

    I might be cynical, and obviously programming has been slow as all get out to utilize cores and get more applications to be multi-threaded. But I'm still 100% certain that eventually it will happen as adding more cores becomes more and more viable over frequency (at least for the time being). Certainly far past the point of using 8c/16t.

    ===

    As for the OP: no. Because as others have stated, cost is always a factor. Until Intel is giving me processors for free, the hypothetical doesn't matter. It doesn't matter if Intel makes an 18-core desktop "non-HEDT" part if I can't afford it. Desire and viability are always linked to cost. Intrinsically.

    If cost really wasn't an issue, then obviously we'd all just get the HEDT processors now and skip desktop processors. Skylake-X might be "slower" in frequency-bound apps, but if you actually use multi-threaded applications then it's the clear winner.
     
    Last edited: Jan 4, 2019
  15. TheFlayedMan

    TheFlayedMan Limp Gawd

    Messages:
    249
    Joined:
    May 29, 2015
    I think there would need to be some new compelling use case for me to use more than 8 cores at home. My first home pc was a 486dx2 so I'm aware of the increases in computational power. The only use I have now for a faster PC is gaming which I find myself not as interested in these days. I dabble in programming a little C# at the moment but visual studio seems to run alright on my i5-3570k. Other than that the usual web browsing and netflix streaming is all I really do on my PC. I could probably make a case for me not needing 8C16T at all lol.
     
  16. ZodaEX

    ZodaEX 2[H]4U

    Messages:
    3,633
    Joined:
    Sep 17, 2004
    There's a difference between not needing something and never needing something. You were shown to be wrong and now you're backtracking.
     
    {NG}Fidel and IdiotInCharge like this.
  17. TheFlayedMan

    TheFlayedMan Limp Gawd

    Messages:
    249
    Joined:
    May 29, 2015
    I have my opinion and you have yours. This reminds me of the debates when the Q6600 came out. It wasn't very interesting then and it isn't now.
     
  18. LaCuNa

    LaCuNa [H]Lite

    Messages:
    111
    Joined:
    Jun 3, 2014
    Q6600 was the bomb what U talkin' bout!

    EPEEN chip <3
     
    criccio, Solhokuten, Mav451 and 2 others like this.
  19. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,388
    Joined:
    Jun 13, 2003
    As long as I can keep single-thread performance up, sure.
     
  20. calikool

    calikool [H]ard|Gawd

    Messages:
    1,158
    Joined:
    Aug 9, 2006
    Long live my Ivy Bridge i7. I haven't had to upgrade since 2012.
     
    DLGenesis likes this.
  21. M76

    M76 [H]ardForum Junkie

    Messages:
    8,742
    Joined:
    Jun 12, 2012
    The best way to do it is get a mainstream computer, and show them in benchmarks that yours runs things faster and you paid half for it.
     
  22. mikeo

    mikeo Limp Gawd

    Messages:
    286
    Joined:
    May 17, 2006
    Or the AMD X2 4400+. Did most people need two cores back then? Probably not, but it was pretty awesome to have. I think that was my last AMD build; Zen 2 is long overdue.
     
  23. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    16,182
    Joined:
    Jan 28, 2014
    This is the correct answer. We're deep into the core count wars, but what out there actually scales past 4 or even 2 threads? We are just starting to get games that utilize 6 or more, and they're still few and far between. From a productivity perspective I don't use anything currently that scales past 8.
     
    juanrga and mikeo like this.
  24. XoR_

    XoR_ Gawd

    Messages:
    614
    Joined:
    Jan 18, 2016
    It is always better to have more cores than the software you use can utilize.
     
  25. ZodaEX

    ZodaEX 2[H]4U

    Messages:
    3,633
    Joined:
    Sep 17, 2004
    What if the software only uses 20-50% of one core?
     
    juanrga and Armenius like this.
  26. jamesv

    jamesv [H]Lite

    Messages:
    101
    Joined:
    Mar 12, 2016
    Like my apps.
    I need the low latency of a Dual Core but a Quad is what I use.
    6 Cores is where latency starts to take its toll in real time usage.

    This is why I just bought 2 more ASRock Z97m WS/i7 4790k combos and Modded them for Supermicro CSE 500f-441B 1U Chassis.

    Once I see "new" cores instead of these incremental moves requiring new chips and boards, I'm 100% stable with the lowest latency possible and ZERO driver issues.

    I really think by the time this happens I will need an i3 Quad, because AMD doesn’t look like it needs to “stoop” like that.
    The latest i3 9350k has a base of 4GHz and 65 watts. That’s pretty nice, but I want 4.5GHz base ( all cores ) and 65 watts.
    Guess 10nm can do this.

    If not I’m making plenty of cash on my 4790k rigs.
    $550 for the CPU, 32 GB of DRAM, and the motherboard.
    More money for bigger Samsung Pro SSDs...
     
  27. bigbluefe

    bigbluefe Gawd

    Messages:
    550
    Joined:
    Aug 23, 2014
    Intel can't even keep 8 core CPUs cool. CPU temperatures are absolutely absurd. Really, a lot of computer technology is basically in a place where it isn't even acceptable as a consumer product. It's basically prototype hardware that gets sold like it's actually ready for prime time.
     
  28. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,388
    Joined:
    Jun 13, 2003
    Intel has been keeping eight-core CPUs cool for years :ROFLMAO:
     
    juanrga and Armenius like this.
  29. darrpara

    darrpara Gawd

    Messages:
    687
    Joined:
    Apr 26, 2011
    Because the biggest difference in system performance is the GPU. It is a waste of money, like going to 64 GB of RAM in a gaming system.
     
  30. XoR_

    XoR_ Gawd

    Messages:
    614
    Joined:
    Jan 18, 2016
    AMD Faildozer's "Eight-Core Processor"s ran much hotter, especially those that had 5GHz turbo XD
     
  31. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,388
    Joined:
    Jun 13, 2003
    Saying 'biggest' throws off the perspective; having not enough memory or CPU makes a huge difference. It's just that once you do have enough (capacity and cores), the GPU starts to scale.

    The other side of this is how it is tested: we should be looking at maximum frametimes, which correlate with minimum framerates but allow us to identify real issues that will be 'felt' by a player.
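The maximum-frametime point can be made concrete by looking at high-percentile frametimes instead of the average. A minimal sketch in Python, with a made-up frame log (numbers chosen purely for illustration):

```python
# Percentile-based frametime analysis: average FPS can look fine while
# high-percentile frametimes reveal stutter. The frame log is made-up data.

def percentile(values, pct):
    """Nearest-rank percentile of a list of numbers."""
    s = sorted(values)
    k = round(pct / 100 * (len(s) - 1))
    return s[k]

# 97 smooth frames at ~60 FPS plus three stutter spikes (milliseconds).
frametimes_ms = [16.7] * 97 + [33.4, 50.1, 70.0]

avg_fps = 1000 / (sum(frametimes_ms) / len(frametimes_ms))
p99_ms = percentile(frametimes_ms, 99)

print(f"average FPS: {avg_fps:.1f}")              # ~56.4: looks close to 60
print(f"99th-percentile frametime: {p99_ms} ms")  # 50.1 ms, i.e. a ~20 FPS dip
```

Average FPS here hides the problem; the 99th-percentile frametime is exactly the "felt" hitch that thread migration on a big chip can introduce.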
     
  32. kllrnohj

    kllrnohj [H]ardness Supreme

    Messages:
    6,863
    Joined:
    Apr 1, 2003
    If it's just a dick measuring contest then get the 32c/64t Threadripper? x299/i9 can't match that, and you can brag about all 64 PCI-E lanes you have as well. Fuck it, run 12 NVME drives in raid 0 because you can. That's your untouchable bragging rights there if you don't care about price at all and don't have any particular workload in mind.

    Otherwise money & usage definitely matters, and just to hit 12c+ you may be compromising on the things you actually care about. Intel's X-series don't clock as high, primarily, meaning things like gaming will be slower on the more expensive chip. As will many other things that don't scale out as well.
     
  33. tangoseal

    tangoseal [H]ardness Supreme

    Messages:
    6,760
    Joined:
    Dec 18, 2010
    At this point, unless sheer desperation plagues your veins, I would wait until Zen 2 is released with its beloved 7nm process. It claims to make massive gains in performance, etc...

    I have a 2950x and love my 16 cores. I can crunch videos at max while playing Oculus games and it doesn't even feel like my processor is being worked.

    I would wait and get as many cores as you can for what you can afford. Unless you're purely gaming - then I would go with a 6- or 8-core, really high clock speed chip rather than a lower clock speed, high core count monster like a Threadripper or the coming $SOUL-priced 28-core Intel

    I also have a 2600x 6 core and it feels lovely. I can't imagine what a 3600x is going to feel like with a brand new process, lower power usage, high clocks, higher IPC, etc... but I find that 6 cores is the minimum I would get in a gaming-centric processor moving forward.
     
  34. kirbyrj

    kirbyrj [H]ard as it Gets

    Messages:
    23,723
    Joined:
    Feb 1, 2005
    At least they listed them as a 225W processor unlike Intel's "95W" :p
     
    Master_shake_ likes this.
  35. XoR_

    XoR_ Gawd

    Messages:
    614
    Joined:
    Jan 18, 2016
    You still need at least a 2-core processor, otherwise overall system performance will be terrible.
    Modern OSes (all Windows NT) have terrible performance on single-core processors in general imho
     
  36. PhaseNoise

    PhaseNoise [H]ard|Gawd

    Messages:
    1,255
    Joined:
    May 11, 2005
    Completely agree, and that will actually continue to be the case. As we've discussed before, there are a few main categories:

    1) Single thread
    2) Lightly parallel
    3) Embarrassingly parallel

    Once you truly have a workload which scales up well past 8-wide, you're realistically heading for GPU territory. Batch processing of huge data sets, no data dependencies between workloads, and usually very little branching. Perfect for a GPU.

    For other things, Amdahl's Law hits you very hard, very fast. Yes, some software is getting better at using more cores, so it isn't like more cores have no value. But for most home users, 4 is decent, 6 is good, 8 is great. Beyond that, you're getting sharply diminishing returns, and should consider strongly what your workloads are and where that money is best spent.
    For example - maybe an ultra-low latency SSD, or a beefier GPU. Or heck, RGB lights on your fan - that makes things faster I Read Somewhere. :)
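The diminishing returns above are easy to put in numbers with Amdahl's Law, S(n) = 1 / ((1 - p) + p/n), where p is the parallelizable fraction of the work. The 80% figure below is just an illustrative assumption:

```python
# Amdahl's Law: ideal speedup from n cores when only a fraction p of the
# work parallelizes. The 80% figure is an illustrative assumption.

def amdahl_speedup(p, n):
    """Ideal speedup on n cores with parallel fraction p (0..1)."""
    return 1 / ((1 - p) + p / n)

p = 0.80  # assume 80% of the workload scales with cores
for n in (2, 4, 8, 16, 32):
    print(f"{n:2d} cores -> {amdahl_speedup(p, n):.2f}x")
# 2 -> 1.67x, 4 -> 2.50x, 8 -> 3.33x, 16 -> 4.00x, 32 -> 4.44x
# Even with infinite cores the speedup is capped at 1/(1-p) = 5x.
```

Doubling from 8 to 16 cores buys only 3.33x → 4.00x here, which is exactly the "sharply diminishing returns" regime described above.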
     
  37. Master_shake_

    Master_shake_ Little Bitch

    Messages:
    7,787
    Joined:
    Apr 9, 2012
    massive security holes?

    unknown performance degradation from fixing them?
     
  38. juanrga

    juanrga Pro-Intel / Anti-AMD Just FYI

    Messages:
    2,412
    Joined:
    Feb 22, 2017
    Fixed it for you.
     
  39. PhaseNoise

    PhaseNoise [H]ard|Gawd

    Messages:
    1,255
    Joined:
    May 11, 2005
    <nod> Fair enough, that's a good generalization.
     
    Last edited: Jan 21, 2019
  40. Raghar

    Raghar Limp Gawd

    Messages:
    215
    Joined:
    Jun 23, 2012
    You can use it to play Witcher 3. (Of course, the RAM controller was on the chipset, and it only supported DDR2, which had bad write rates.) But it was great for emulation. Those on a budget overclocked E5xxx or E7200/E7300 chips; those with money for custom water cooling overclocked Qxxxx chips. PS2 emulation required CPU power and an Intel CPU allowing non-standards-compliant handling of denormals (a.k.a. DAZ, denormals-are-zero).