The Desktop CPU Isn’t Dead, It Just Needs A Swift Kick In The Butt

Discussion in '[H]ard|OCP Front Page News' started by Megalith, Jan 8, 2017.

  1. Megalith

    Megalith 24-bit/48kHz Staff Member

    Messages:
    8,816
    Joined:
    Aug 20, 2006
    This author thinks that the answer to making desktop processors exciting again is to add more cores, like Intel has done for their enthusiast line. The trouble with that is cost, but he hopes that potential competition from AMD will ultimately help with that.

    …adding more cores is probably the easiest and best way to boost performance at the high-end and convince consumers to replace that three-to-five-year-old PC. Enough apps are built to take advantage of multiple cores that users would see benefits immediately. Intel’s Turbo Boost tech (particularly Turbo Boost 3.0, introduced in the aforementioned Broadwell-E CPUs but not available anywhere else yet) can maintain good performance for single-threaded or lightly threaded tasks. If the leaked roadmaps we cited before are to be believed, Intel may be planning to do this when the “Coffee Lake” processors are released in 2018.
     
  2. Exavior

    Exavior [H]ardness Supreme

    Messages:
    8,148
    Joined:
    Dec 13, 2005
    The real problem with this is: why? I can understand it for the enthusiast line. But if you exclude people doing high-end gaming or running specialized programs with high CPU needs, why does anyone else really need this? Your normal office drone is fine running some spreadsheets and surfing the web on today's computers; memory is probably an issue long before they max out the CPU. Most home users don't need 32 cores, as again they are surfing the internet or doing basic things. It used to be that every 2-3 years a computer needed to be replaced, because there was something much better out there that made the change worthwhile for everyone. Now, unless it is dead, most people probably have no need to change out their computer, and when they do, they don't need to buy anything extreme CPU-wise.
     
    Vader1975 likes this.
  3. T_A

    T_A Limp Gawd

    Messages:
    238
    Joined:
    Aug 4, 2005
    Desktop PCs today are overkill for pretty much anything thrown at them. Start making applications/games that use ALL cores.
     
    Vader1975, Armenius, John721 and 3 others like this.
  4. Azphira

    Azphira Gawd

    Messages:
    792
    Joined:
    Aug 18, 2003
    What the desktop CPU needs is fewer lazy devs who don't use all cores.
     
  5. Probleminfected

    Probleminfected Gawd

    Messages:
    858
    Joined:
    Dec 20, 2013
    CPUs these days are usually more powerful than the software demands. So more cores???

    I'm running 6c/12t @ 4.5GHz and I feel like it's overkill.
     
    Vader1975 likes this.
  6. Zarathustra[H]

    Zarathustra[H] [H]ard|News Staff Member

    Messages:
    21,322
    Joined:
    Oct 29, 2000
    I wouldn't mind faster CPUs, but more cores is not the solution.

    We've already hit a wall in multithreading. Most of what can be multithreaded already has been. More cores is never going to be the solution unless you do rendering, encoding, or scientific-computation type work.

    Increases in per-core performance are what matter.

    There is no point in just adding cores that 99.999% of people will never use.
     
    Vader1975, Armenius, Shintai and 2 others like this.
  7. Hallucinator

    Hallucinator Limp Gawd

    Messages:
    359
    Joined:
    Nov 1, 2006
    I'm fine with only one 2.5GHz 8c Opteron processor on a dual-socket mobo.

    But my ex-roomie had a lust for more speed and more cores. I helped him build a brand new PC with a 4GHz 8c AM3 chip on a full-sized MSI ATX board and a 256GB M.2 SSD, and it installed Win 10 LTSB in 5 minutes.

    Yet my little cube (another PC), sporting only a 3.1GHz 2c/4t Pentium on an H67M-ITX with 2 Crucial 64GB drives in RAID 0, installed Win 10 LTSB in 4 minutes!

    And the install? Both from USB3.

    I told him that more cores and higher GHz are overkill just for surfing the net. His answer? Doesn't like waiting; wanted it right there, just like flicking a light switch ON. :rolleyes:
     
    blkt likes this.
  8. Compwiz

    Compwiz n00bie

    Messages:
    56
    Joined:
    Feb 14, 2014
    Always ready for more power! But honestly, I don't do much video encoding these days, and games are for the most part GPU-limited, so meh. Not to say I'm not excited about a more competitive market in the near future. I'm kicking myself for not getting the i7-4790K when I spec'd out my current system a few years back. I don't think the price has changed at all; it's still going for ~$300 used on eBay. Maybe if enough people jump to AMD with Ryzen, the price will come down enough to justify it.
     
  9. Sovereign

    Sovereign 2[H]4U

    Messages:
    3,029
    Joined:
    Mar 21, 2005
    I agree with the author that competition is important, but my concern's price, not performance.

    When Intel and NVIDIA can set new records for rent-seeking/price gouging (Founder's Edition, I'm looking at you), it means the market's dysfunctional.

    Hopefully Ryzen/Vega will fix that.

    Remember the 939 days when Intel was running scared?
     
    John721 likes this.
  10. ManofGod

    ManofGod [H]ardness Supreme

    Messages:
    6,705
    Joined:
    Oct 4, 2007
    That is not the way it will always be, however. Things will advance, including basic functions, which means more resources will be needed. Otherwise, we could still be on a single-core CPU from the 1990s. This is not a put-down of what you are saying so much as a different point of view on the same point.
     
  11. Zarathustra[H]

    Zarathustra[H] [H]ard|News Staff Member

    Messages:
    21,322
    Joined:
    Oct 29, 2000
    Yep, back then the roles were reversed. AMD was the one charging $1,000 for CPUs...
     
    Bandalo, Armenius, Shintai and 3 others like this.
  12. cyberguyz

    cyberguyz n00bie

    Messages:
    25
    Joined:
    Aug 28, 2014
    By the time I am ready to update my Skylake-based system Ryzen just might be the AMD architecture that can finally lure me away from my last 15 years of expensive Intel builds.
     
  13. SvenBent

    SvenBent [H]ard|Gawd

    Messages:
    1,056
    Joined:
    Sep 13, 2008
    I love how so many people who have probably never done any kind of low-level code optimization can suddenly call out all devs for being lazy, as if multithreading were just some magic wand you wave and then BOOM, performance.

    Some tasks need information from the previous calculation before they can do the next; such tasks can't be multithreaded. Not every task can easily be split up and calculated out of order.
    Another example of this is dictionary compression. A program like 7-Zip has to "cheat" by dividing the data, at a cost in efficiency, to be able to multithread it. This increases memory usage, since you are basically running the program twice.
    It's also only possible because we don't care about the order of the results.

    In games we very much care about the order of the results, and there are a lot of tasks that need the results of previous operations, so multithreading becomes harder than it would be with, e.g., a video encoder that can easily divide its work into multiple tasks.
    So please, if you haven't touched deep-level optimization or multithreaded coding, don't insult the devs.

    But hey, then you can't look all smug on a forum...

    Also, stating CPUs are fast enough only counts for you. You are not the entire world, and there is plenty of software that can use and needs more cores and more performance. Games are not the only thing computers are for.
     
    Last edited: Jan 8, 2017
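    SvenBent's 7-Zip point can be sketched in a few lines: compressing independent chunks lets the work run in parallel, but each chunk restarts its dictionary, so the combined output is larger than one sequential stream. A toy illustration using Python's zlib (the chunk count and sample data are arbitrary choices, not anything 7-Zip actually uses):

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

data = b"the quick brown fox jumps over the lazy dog " * 20000  # ~880 KB, compressible

# Sequential: one stream, one dictionary, best ratio -- but inherently serial,
# because each output block depends on the dictionary built from all prior input.
whole = zlib.compress(data, 9)

# Chunked: split the input and compress each piece independently, the way 7-Zip
# "cheats" for multithreading. Each chunk restarts its dictionary, so the ratio
# suffers, and each worker holds its own buffers, so memory usage goes up too.
n = 4
size = len(data) // n
chunks = [data[i * size:(i + 1) * size] for i in range(n - 1)] + [data[(n - 1) * size:]]
with ThreadPoolExecutor(max_workers=n) as pool:
    parts = list(pool.map(lambda c: zlib.compress(c, 9), chunks))

chunked_total = sum(len(p) for p in parts)
print(len(whole), chunked_total)

# The parallel-friendly version always costs some compression efficiency...
assert chunked_total > len(whole)
# ...but still round-trips, because we don't care that chunks were done out of order.
assert b"".join(zlib.decompress(p) for p in parts) == data
```

    The same trade-off shows up in 7-Zip's own multithreaded LZMA: more threads means more independent streams, which means a slightly worse ratio and more memory.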
  14. Verge

    Verge [H]ardness Supreme

    Messages:
    4,728
    Joined:
    May 27, 2001
    Bring back NetBurst IMHO.


    Who needs cores, bleh!
     
    blkt and Compwiz like this.
  15. Zarathustra[H]

    Zarathustra[H] [H]ard|News Staff Member

    Messages:
    21,322
    Joined:
    Oct 29, 2000

    Exactly. The whole "lazy devs" argument is only made by people who don't have a freaking clue about software development.

    There are some kinds of tasks (maybe even most kinds) that can NEVER be properly multithreaded, because the data depends on the order of operations.

    Game engines are notorious for this. Apart from the rendering portion, which threads pretty well, if you try to multithread them you will just get thread locking and all kinds of other problems, slowing things down rather than speeding them up.

    Most games "cheat" and split the game into multiple threads (one for game engine logic, one for physics, one for sound, etc.) and do get the game to utilize multiple cores, but this isn't even true multithreading; it's just many separate single-threaded parts.

    The main game engine thread is usually still the beefiest, and the least able to be threaded, which is why you usually have one core maxed and some smaller stuff going on on the other cores.

    I guess, long story short: in 2017, if something isn't threaded, more likely than not it's because it CAN'T be threaded. It's against the computer-science equivalent of the laws of physics, NOT because inadequate time and effort has been spent on the problem.
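    The coarse "one thread per subsystem" split described above can be sketched in a few lines. This is a toy model, not any real engine; the names (physics_step, audio_step) are purely illustrative. Note how the main loop still serializes the frame, which is exactly why one core stays maxed:

```python
import threading
import queue

def physics_step(state):
    state["pos"] += state["vel"]        # stands in for real simulation work

def audio_step(state):
    state["sounds_played"] += 1         # stands in for real mixing work

def subsystem(step, inbox, done):
    """A long-lived subsystem thread: wait for work, do one step, report back."""
    while True:
        state = inbox.get()
        if state is None:               # shutdown sentinel
            return
        step(state)
        done.put(state)

state = {"pos": 0, "vel": 2, "sounds_played": 0}
physics_in, audio_in, done = queue.Queue(), queue.Queue(), queue.Queue()
threads = [
    threading.Thread(target=subsystem, args=(physics_step, physics_in, done)),
    threading.Thread(target=subsystem, args=(audio_step, audio_in, done)),
]
for t in threads:
    t.start()

# The "main game loop": the subsystems live on their own cores, but each frame
# still hands the state off in order -- physics must finish before audio sees it.
for frame in range(3):
    physics_in.put(state)
    done.get()
    audio_in.put(state)
    done.get()

for q in (physics_in, audio_in):
    q.put(None)
for t in threads:
    t.join()

print(state)  # {'pos': 6, 'vel': 2, 'sounds_played': 3}
```

    The queues enforce the ordering the engine logic needs, which is also why adding more cores to this design buys nothing: the dependency chain, not the core count, is the limit.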
     
  16. Compwiz

    Compwiz n00bie

    Messages:
    56
    Joined:
    Feb 14, 2014
    I was only commenting on the average desktop cpu user. Obviously there will always be a market for more performance for professionals and enthusiasts.
     
  17. Cyraxx

    Cyraxx 2[H]4U

    Messages:
    4,091
    Joined:
    Feb 21, 2005
    We need to make Prescott great again!
     
  18. blkt

    blkt Limp Gawd

    Messages:
    285
    Joined:
    Oct 9, 2009
    Only so much code can be multithreaded. Though we should always have the mentality of doing more with less, it is far easier said than done.
     
    Last edited: Jan 8, 2017
  19. SvenBent

    SvenBent [H]ard|Gawd

    Messages:
    1,056
    Joined:
    Sep 13, 2008
    I totally agree. Basic office/internet/e-mail use hasn't had any reason to upgrade for many, many years.
    Gamers' need for CPU power has also dwindled to a slower pace than in the past.

    But there is still a bunch of CPU-hungry software out there. I have tasks that still run for weeks at 96-100% CPU usage non-stop. I have to split them over multiple computers to get somewhat decent performance.
     
    Compwiz likes this.
  20. cyberguyz

    cyberguyz n00bie

    Messages:
    25
    Joined:
    Aug 28, 2014
    Over the last 30 years of software development, code has only been getting more and more complex. Multithreading becomes more and more difficult, while at the same time more and more necessary. Modern web servers really could not function without at least one thread handling each socket connection. Same goes for database servers; both are heavily multithreaded. In the world of applications, there is always a need for multithreading. Any time a lengthy task needs to be done, common programming practice is to spawn a thread for it and notify the main thread when it is done. When data needs to be accessed across threads, more often than not the shared data is protected by semaphores or locks (at least by any programmer worthy of the title).

    Sure, there are a bazillion things devs can do to write tighter code, but when you have worked in the modern software development world, the onus is on the developer to get their code out there fast, no matter how, because development cycles are getting shorter and shorter. When that happens, the bosses don't care if the code is efficient or tight. They only care that it works and that you are moving on to your next line item.
     
    Brian_B and blkt like this.
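    The pattern cyberguyz describes (spawn a thread for a lengthy task, guard shared data with a lock, notify the main thread on completion) is a few lines in any language with threads. A minimal Python sketch; the task itself is a stand-in:

```python
import threading

results = []
results_lock = threading.Lock()   # protects the shared list across threads
finished = threading.Event()      # lets the worker notify the main thread

def lengthy_task(n):
    total = sum(i * i for i in range(n))   # stands in for the slow work
    with results_lock:                     # any access to shared data is guarded
        results.append(total)
    finished.set()                         # signal completion

worker = threading.Thread(target=lengthy_task, args=(1000,))
worker.start()
# ... the main thread stays responsive here instead of blocking on the work ...
finished.wait()                            # block only once the result is needed
worker.join()
print(results[0])  # 332833500
```

    The lock looks redundant with a single worker, but the moment a second task is spawned (the web-server and database cases above), unguarded access to `results` becomes a data race.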
  21. Riptide_NVN

    Riptide_NVN [H]ard|Gawd

    Messages:
    1,410
    Joined:
    Mar 1, 2005
    I like how he thinks. :)

    Truly, until it is as fast as flipping a switch, there is room for improvement.
     
  22. Zarathustra[H]

    Zarathustra[H] [H]ard|News Staff Member

    Messages:
    21,322
    Joined:
    Oct 29, 2000
    Poor comparison, though.

    NetBurst failed not because it was a single-core design (they could have easily added more cores) but because the long execution pipeline made branch prediction errors very costly, dropping the effective IPC into the toilet, to the point where even the relatively high clocks it achieved were unable to overcome the deficit.
     
  23. pxc

    pxc Pick your own.....you deserve it.

    Messages:
    35,300
    Joined:
    Oct 22, 2000
    That's not a problem with laziness, but with how much things can be made to run in parallel (see Amdahl's Law), and also how easy it is to make code closer to bug-free. Making things needlessly complex to extract another 5% in performance isn't worth it from a development standpoint. The bigger problem is that even if you optimize the hell out of most software, it's still not going to keep all cores busy.
     
    jwcalla likes this.
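    pxc's "how much can be made to run in parallel" point is exactly what Amdahl's Law quantifies: if a fraction p of a program parallelizes perfectly across n cores, the best-case speedup is 1 / ((1 - p) + p / n). A quick sketch of why piling on cores hits a ceiling:

```python
def amdahl_speedup(p, n):
    """Best-case speedup with parallel fraction p on n cores (Amdahl's Law)."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a program that is 95% parallel tops out at 20x: the serial 5%
# dominates once the parallel part has been divided down to nothing.
for n in (2, 4, 8, 1_000_000):
    print(n, round(amdahl_speedup(0.95, n), 2))
# 2 1.9
# 4 3.48
# 8 5.93
# 1000000 20.0
```

    This is why "optimize the hell out of it" still leaves cores idle: only the parallel fraction p benefits from more cores at all, and very little desktop software has p anywhere near 0.95.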
  24. Exavior

    Exavior [H]ardness Supreme

    Messages:
    8,148
    Joined:
    Dec 13, 2005
    Do you also argue against virtualization because one day you might need all the power of one physical machine for a single server?

    Don't get me wrong, I am not saying that they should stop working on new technology. I am simply saying that we don't need to find a way to make the baseline CPUs outperform the current top-level CPUs. At the low end, 4 cores is fine. We don't need 16 cores to be the standard for a low-end computer in 2017 while trying to get 32-64 cores to gamers. You are looking more at there needing to be a split like there is for desktop vs. laptop vs. server, with the average PC added in as a separate line of CPUs.

    The reason people are at 3-5 years between upgrades has nothing to do with CPU speeds slowing down. People go 3-5 years because they aren't maxing out their PCs anymore, so they aren't pushed to upgrade every 1-2 years.
     
  25. nutzo

    nutzo [H]ardness Supreme

    Messages:
    5,357
    Joined:
    Feb 15, 2004
    No, the reason I still have people at the office using 5-year-old laptops is that the new laptops were only 5-10% faster.
    There is no reason to spend $1,500 for a CPU that is only 5% faster.
     
  26. DigitalGriffin

    DigitalGriffin 2[H]4U

    Messages:
    2,303
    Joined:
    Oct 14, 2004
    Try opening 20 tabs in Chrome, each running some type of active animation.

    Or more specifically in my case, try compiling 500,000 lines of source code in release mode, which takes 2+ hours on i7s.

    Then there's ripping DVDs or Blu-rays at the highest quality settings. Let's not forget re-encodes of Plex server streams.

    Then there's rendering software.

    Database servers, etc. Entity Framework can be slow. I've seen queries through Entity take over a second per query sometimes. (They are complex queries on extremely large datasets, but when you're running web apps and getting concurrent hits....)

    Go [H]ard or go home.
     
    Ranulfo, NixZiZ and Red Falcon like this.
  27. DigitalGriffin

    DigitalGriffin 2[H]4U

    Messages:
    2,303
    Joined:
    Oct 14, 2004

    Try paying $600 for a PII-450Mhz back in the day. And you think $330 is expensive?
     
  28. CacaSapo

    CacaSapo Limp Gawd

    Messages:
    327
    Joined:
    Feb 22, 2010
    What CPUs need is higher clocks. As others have mentioned, not every operation can be multithreaded, but all software benefits from higher clocks, multithreaded or not.
     
  29. SvenBent

    SvenBent [H]ard|Gawd

    Messages:
    1,056
    Joined:
    Sep 13, 2008
    But reversed: back then you could get a Celeron 300A and overclock it to better performance for a lot cheaper. I have no clear recollection of the prices, but I remember running two of those for around the price of one P2-450MHz. I might be wrong, though.
     
  30. WhoBeDaPlaya

    WhoBeDaPlaya [H]ard|Gawd

    Messages:
    1,454
    Joined:
    Dec 16, 2002
    I paid closer to $900 for a PII-400 back in 1998.
    Throw in a Canopus Spectra 2500, Canopus Pure3D II, miro DC30, and 3x 9GB Seagate Cheetahs and I could buy a sweet X99 dual TitanXP setup now.
     
  31. westrock2000

    westrock2000 [H]ardness Supreme

    Messages:
    7,394
    Joined:
    Jun 3, 2005
    I thought this is why it failed:

    [IMG]
     
    NixZiZ likes this.
  32. westrock2000

    westrock2000 [H]ardness Supreme

    Messages:
    7,394
    Joined:
    Jun 3, 2005
    Not dead. Just out of reach.

    [IMG]
     
  33. Ducman69

    Ducman69 [H]ardForum Junkie

    Messages:
    9,916
    Joined:
    Jul 12, 2007
    If more people had CAT6 in every room, I think a single powerful desktop that integrates with the smarthome, with "dummy" devices everywhere else linked to it, would make a powerful processor worthwhile. That way you don't need multiple consoles or cable boxes or home camera hubs and so forth; just one powerful desktop running the TV in one room and the music server for another, transcoding for tablets, while streaming a video game to the living room.

    But otherwise, who needs that much processing power?

    What are they using it for?
     
  34. ShuttleLuv

    ShuttleLuv [H]ardness Supreme

    Messages:
    6,757
    Joined:
    Apr 12, 2003
    Bah bring back MMX, 3dnow, and GLIDE!
     
  35. ShuttleLuv

    ShuttleLuv [H]ardness Supreme

    Messages:
    6,757
    Joined:
    Apr 12, 2003
    This is today's thinking with millennials... basically, enough to get the job done is enough. The guys at extremesystems are rolling around in their virtual graves. :(
     
  36. ShuttleLuv

    ShuttleLuv [H]ardness Supreme

    Messages:
    6,757
    Joined:
    Apr 12, 2003
    The 300A @ 464-504 was great for gaming, even with its smaller 128k cache vs. the P2's 512k. Just like today with cores, CPU speed outweighs threading/cores most of the time. Things never change, lol.
     
    qb4ever likes this.
  37. Mong00se

    Mong00se [H]Lite

    Messages:
    65
    Joined:
    Nov 23, 2013
    I 'think' the secret sauce to making "desktop processors exciting again" is making people feel like they are getting something for nothing. AMD is on the right page making their entire line of Ryzen CPUs overclockable. My first unshared CPU that was all my own was a K6-266 that I managed to OC all the way to 400 for almost 6 years, until a power surge killed it.
     
  38. ShuttleLuv

    ShuttleLuv [H]ardness Supreme

    Messages:
    6,757
    Joined:
    Apr 12, 2003
    Agreed. I think Intel really dropped the ball with too much focus on cores and silly stuff, along with too many variants (some not overclockable) and the pricing. The old Intel was just better for the enthusiast, if you ask me. Maybe not so much for business, though.
     
  39. westrock2000

    westrock2000 [H]ardness Supreme

    Messages:
    7,394
    Joined:
    Jun 3, 2005

    Maybe you are on to something. They could make CPUs that have working but un-utilized features that get turned on by moving some small resistors. Remember how excited people were to do that stuff back in the day? It would have to be undocumented, of course, so that people think they are making out on the deal. But it could be a multiplier bump or a frequency bump, or maybe unlocking additional pipelines to increase IPC, or even a whole 'nother core.

    Or maybe shorting a pin or something on the motherboard. You know, something that people "shouldn't" be doing, that is easy to do, but has just enough fear of damage that you feel risky doing it.
     
    NixZiZ likes this.
  40. ryan_975

    ryan_975 [H]ardForum Junkie

    Messages:
    13,705
    Joined:
    Feb 6, 2006
    When companies strive for higher clocks it tends to come at the expense of IPC, and we get things like NetBurst and Bulldozer, with promises of super-high clock speeds (10GHz in 10 years, anyone?) to make up the difference in IPC.

    What we need is more instructions, each doing an equivalent amount of work, executed per unit of time. That's getting harder to achieve with how processors are currently designed and manufactured. That's why Intel abandoned the tick-tock strategy.
     
    CacaSapo and Armenius like this.