Is computer hardware finally plateauing?

Drawmonster

Maybe it's just me, but I remember when keeping up with gaming hardware was a constant battle. I can remember building a "high end" box, only to have a new game obliterate it a year or so later.

But it really seems that things have slowed down a lot in the last couple of years. My gaming rig, in my sig, can still play anything I throw at it without any issue at all. I keep waiting for something to come along that will make me upgrade, but it's just not happening like it used to. I keep browsing the hardware sections on Newegg pondering a super high end build, only to read comments on how nothing is that much of an upgrade over 2 year old hardware.

Are we finally getting to the plateau where hardware is so far ahead of software that we can get 5+ years out of each build? 10 years maybe?
 
You've got it backwards.

Software is plateauing, hardware is still getting faster and more efficient.
 
I guess Skillz is just rewording officer's observation--hardware specs seem to be going up faster than people can program applications to take advantage of them.

officer, I don't think your rig would hold up that well on max settings at 1440p, let alone 4K.

I don't think anything is plateauing. I think the hardware needed to provide enough power for 99% of the average user's tasks is already there. Because of this, a lot of the money in hardware innovation is going into making things cooler, quieter, and more efficient--providing that same level of hardware power in a less obtrusive way.

This trend might create the illusion of plateauing in a few different ways. One might say hardware is plateauing because raw clock speeds aren't really going up the way they were 10 years ago, but I think that is largely because of the market-demand considerations I mentioned above. The observations you make in your post give the illusion that software is plateauing--i.e. your couple-of-year-old hardware can still handle current-day software--but I would respond that the needs of the average user (i.e. not a serious overclocker, gamer, etc.) haven't changed much over the past few years, so industry is shifting to support the average user, which is where most of its market is. 10 years ago, neither hardware nor software was powerful enough to serve the average user seamlessly enough to make an average Joe want to use a computer, and that forced industry to push the limits of both.

I think you also have to put it in perspective. Consumer hardware/software is only a small fraction of the market. I think there is actually more money in business-to-business hardware/software, and there you see loads far beyond what current hardware or software can support. The problems that come up in those scenarios are sparking a lot of innovation which may not be fully visible (or obvious) to consumers yet. If you use Facebook or Google on a daily basis, you probably have no idea of the level of software/hardware innovation needed to keep those services working and improving as they scale to even bigger markets.

I will admit there is at least one sense in which software is plateauing--we still aren't good at writing multithreaded software. That is why things like games aren't good at taking advantage of those 8 core systems out there. It's a hard problem. People are working on it, but even more people are just working around it. Eventually either the workarounds will get good enough that no one cares, or someone will finally show the world how to write good, painless parallel code. We'll have to see what happens there.
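To make that concrete, here's a minimal C++ sketch of the *easy* case: work that is fully independent, split across however many cores the machine reports. The names here are just illustrative, not from any particular engine.

    #include <algorithm>
    #include <numeric>
    #include <thread>
    #include <vector>

    // Sum a large buffer by splitting it across hardware threads.
    // Each worker writes only its own slot, so there are no data races.
    double parallel_sum(const std::vector<double>& data) {
        unsigned n = std::max(1u, std::thread::hardware_concurrency());
        std::vector<double> partial(n, 0.0);
        std::vector<std::thread> workers;
        std::size_t chunk = (data.size() + n - 1) / n;
        for (unsigned i = 0; i < n; ++i) {
            workers.emplace_back([&, i] {
                std::size_t lo = std::min(data.size(), std::size_t(i) * chunk);
                std::size_t hi = std::min(data.size(), lo + chunk);
                partial[i] = std::accumulate(data.begin() + lo,
                                             data.begin() + hi, 0.0);
            });
        }
        for (auto& w : workers) w.join();
        return std::accumulate(partial.begin(), partial.end(), 0.0);
    }

This parallelizes trivially precisely because nothing is shared. The hard part alluded to above is everything this example dodges: game-style work items that touch each other's state, which is where locks, ordering, and all the painful bugs come in.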

tl;dr: I don't think anything is plateauing. The market has just shifted a bit and so I think the forms of innovation going on are less visible to consumers.
 
cinohpa has some excellent points. Computers are fast enough for most people right now, so you don't see the monumental increases we used to. From the business/enterprise computing perspective, however, there is still a lot going on. The average person is logging into Facebook, Google+, etc.; they are not researching what keeps those massive operations going, and that is where there continues to be a fast-moving market for improvements.
 
This trend might create the illusion of plateauing in a few different ways. One might say hardware is plateauing because raw clock speeds aren't really going up the way they were 10 years ago, but I think that is largely because of the market-demand considerations I mentioned above.

This is not quite the whole story. Market demands are not going down for every class of user.

There are always market demands for more processing power (just look at the workstation and server markets, which are not going anywhere). The reason clock increases have slowed is entirely power limits.

The power limit used to be a non-issue, because dies were fairly large for a given power consumption: most chips hit physical timing limits before they hit power limits.

Each time we got a new process node, we could reduce the chip's voltage along with shrinking the transistors. That let designers fill the new, denser die with fancy new features and still clock it higher while keeping power in check.

But now dies are so dense that you can't actually power on the entire chip at once (sustained), and voltages have fallen from their heights of 5v down to sub-1v operation, where they're not going to get much lower. This means the ONLY source of power reduction for the last five years has been the PROCESS NODE shrink.
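To put a rough equation on that (not from the post above, just the standard CMOS dynamic-power relation):

    P_{\mathrm{dyn}} \approx \alpha \cdot C \cdot V^2 \cdot f

where \alpha is switching activity, C is switched capacitance, V is supply voltage, and f is clock frequency. While V fell every node, the V^2 term paid for higher f; with V pinned near 1v, the only term still shrinking is C, which is exactly the process-node effect described above.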

So no, it's not just 'demand' that explains why we have hit a very real clock speed wall. Intel, for one, believes it can get better performance/watt and performance/sqmm by making its chips more parallel internally (HT, AVX/AVX2), and AMD has attempted similar improvements by adding more cores. So it really is software that is outdated.
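As a rough illustration of what 'more parallel internally' asks of software (a hypothetical sketch, not Intel's code or any real engine's), here is a SAXPY loop written with AVX intrinsics. The vector version touches 8 floats per instruction; compilers only sometimes get there on their own.

    #include <immintrin.h>  // AVX intrinsics; compile with -mavx
    #include <cstddef>

    // y[i] = a * x[i] + y[i], processed 8 floats at a time.
    void saxpy_avx(float a, const float* x, float* y, std::size_t n) {
        __m256 va = _mm256_set1_ps(a);           // broadcast a to all 8 lanes
        std::size_t i = 0;
        for (; i + 8 <= n; i += 8) {
            __m256 vx = _mm256_loadu_ps(x + i);  // load 8 floats (unaligned ok)
            __m256 vy = _mm256_loadu_ps(y + i);
            vy = _mm256_add_ps(_mm256_mul_ps(va, vx), vy);
            _mm256_storeu_ps(y + i, vy);
        }
        for (; i < n; ++i)                       // scalar tail for leftovers
            y[i] = a * x[i] + y[i];
    }

Code that stays scalar leaves seven of those eight lanes idle, which is one concrete sense in which 'outdated software' wastes modern silicon.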
 
@default: I'm in software, not hardware (or low-level software for hardware), so I wasn't aware of that. Looks like a good read, thanks.
 