When did the golden age of computing end?

Nerva72

I built my first computer in 1991 -- it was a 486DX/25, which really blew the doors off the 386/33's the other computer nerds in the dorm had at the time. I've built countless systems for myself since then -- back then, it was assumed that clock speed would just keep increasing, just like transistor count -- but of course Intel eventually hit the GHz wall with the Pentium 4. But multiple cores took over and kept the performance increases going, at a reduced rate.

Then about a year ago Intel announced they were changing to having feature sizes shrink once every three years instead of every two years, because they are nearing the quantum limit for silicon. I assumed that meant we were approaching the end of the golden age of computing -- but now that I'm building another system, I'm starting to wonder if it actually ended some years ago -- and I'm not sure why.

I paid $315 for an i7-2600K almost six years ago. Today, the highest-performing CPU for about the same money ($324) is the Ryzen 1700. The PassMark scores for the 2600K and Ryzen 1700 are 8486 and 13797, respectively -- only a 63% improvement in SIX YEARS, which works out to a whopping 8.4% annualized increase for roughly the same money. It's even worse for Intel -- for roughly that money the best you can get is either an i7-6700K or an i7-7700, both of which score around 11,000 -- only a 30% improvement, or 4.4% annualized. These days I think muscle car horsepower is increasing faster than that.
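For anyone who wants to check that arithmetic, here's a quick sketch of the compound-annual math (the prices and scores are just the figures quoted above; folding in the small price difference lands a bit under the 8.4% raw-score figure):

```python
# Rough sketch: annualized gain in PassMark score per dollar.
# Prices and scores are the figures quoted in the post above.
old_score, old_price = 8486, 315    # i7-2600K, ~2011
new_score, new_price = 13797, 324   # Ryzen 7 1700, ~2017
years = 6

total_gain = (new_score / new_price) / (old_score / old_price)

# Compound annual growth rate over the six-year span.
annualized = total_gain ** (1 / years) - 1
print(f"Total gain: {total_gain:.2f}x, annualized: {annualized:.1%}")
# -> roughly 1.58x total, about 8% per year
```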

I was talking to a former Intel employee a few months ago that was at retirement age, and I mentioned that I'd heard the boom years were the 70's and 80's. She said the 90's were good as well, but things changed after 2000. By an interesting coincidence, that's when the Pentium 4 came out. I'm wondering if anyone has a chart of desktop processor performance vs. year -- I have to think the rate slowed after 2000, and then more recently it slowed to a crawl.
 
The last big IPC jump was the Pentium Pro. After that, the gains have been tiny and have come mostly from clock increases and ever-increasing power consumption.

Everyone has been asking for perf/watt for the last 10 years or more. And for anyone who may still be left out: the desktop is dying. It's all about mobility and servers.

PassMark is fairly pointless, since the average consumer can't use it for anything. You'd hate to have a 20+ core system at 2.5GHz for everyday use, for example. An FX-8350 scores ~9000 -- would you trade your 2600K for it? And the 7700K scores 12200. And no, muscle cars are nowhere near that rate of improvement; besides, that muscle car would also have to sip fuel like a scooter.

Then there are challenges like Amdahl's Law, Gunther's Law, and Gustafson's Law: serial code, negative scaling with more cores, and so on.
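For anyone who hasn't met those laws: Amdahl's Law caps the speedup from adding cores as long as any serial fraction remains, and Gunther's Universal Scalability Law adds contention and coherency terms that can make scaling go negative. A quick illustrative sketch -- the parallel fraction and coefficients below are made-up example values, not measurements:

```python
# Illustrative only: Amdahl's Law vs. Gunther's Universal Scalability Law.
# The parallel fraction and USL coefficients are made-up example values.

def amdahl_speedup(n, parallel_fraction=0.90):
    """Amdahl: speedup is capped by the serial fraction of the work."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n)

def usl_speedup(n, contention=0.05, coherency=0.01):
    """Gunther's USL: contention/coherency terms can turn scaling negative."""
    return n / (1 + contention * (n - 1) + coherency * n * (n - 1))

for cores in (1, 2, 4, 8, 16, 32):
    print(f"{cores:>2} cores: Amdahl {amdahl_speedup(cores):5.2f}x, "
          f"USL {usl_speedup(cores):5.2f}x")
```

With these example coefficients the USL curve peaks around 8-16 cores and then declines -- the "negative scaling with more cores" mentioned above.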

Laptops in the same period have increased performance quite significantly, and we have gone from a standard 35W laptop class CPU to something between 7.5 and 15W.

Servers have gone from 8 and 10 cores to 28 cores, with significantly higher clocks and IPC.

So to answer your question as you see it: it died around 2003 and the patient was declared dead in 2006.

But in reality, the golden age is still here. The focus has just shifted from raw speed to perf/watt, integration, mobility and so on.
 
It seems like up until the mid-2000's there was real competition between AMD and Intel. At the same time, the PC was still the mainstream device for most households. With the use of tablets and phones these days (particularly phones), the PC is far less utilized as a primary device.

I would say the golden age of computing began its decline in 2007, when the iPhone was introduced. I remember my friends with their new phones saying they never used their PCs anymore -- they "did everything on their phone".
 
Graphics card performance is still scaling quite well, and since GPUs are still more of a bottleneck for gaming than CPUs, there is still a lot of performance to be gained there.

GPUs are still roughly in the 1300MHz to 1600MHz range, so over time core clocks could easily go another 3x before we start hitting the current practical silicon limit of 4-5GHz.
 
GPUs have already reached 2GHz, and I doubt you'll see more than a 3500MHz peak on current technology.

But GPUs have an inherent benefit: thanks to their parallel nature they scale out almost completely, with close to no penalty. However, as with CPUs, perf/watt for GPUs is more important than ever, and they are shifting radically toward mobile.

The biggest danger to progress is the lack of transistor cost reduction below 28nm for anyone other than Intel. And for Intel to achieve it, IC designs pretty much cost a billion dollars.

NXP, Avago, AMD, and MediaTek are examples of companies barely hanging on, while others are already off the train. Not many will be able to afford leading-edge nodes for consumer devices in the near future; they can most likely be counted on one hand: Intel, Apple, Qualcomm (if they don't destroy themselves), Samsung, and maybe a wildcard like Nvidia if they can keep the revenue flowing.
 
I would say the golden age of computing began its decline in 2007, when the iPhone was introduced. I remember my friends with their new phones saying they never used their PCs anymore -- they "did everything on their phone".
They obviously don't have a job :)
 
LOL indeed.

Yes, I'm aware of the shift to mobile after the iPhone was introduced -- I was certainly conscious of it at the time. My question was restricted to desktop computing, since that can be evaluated apples-to-apples. In 1981 my dad bought the 6th IBM PC sold in our metro area and I used it to play Adventure, I got my first computer in 1991 and used it to play Civilization, and I still play games on my desktop (if far less often).

As people point out, the GPU sector still seems alive and well -- I love that all of the most powerful supercomputers for national security code-breaking are based on GPU technology developed for nerds playing first-person-shooters.
 
To me the golden age of computers died alongside the LAN PARTY!

Once consoles went mainstream and everything was internet-based, people stopped needing to build big, bad gaming PCs. And contrary to modern society, where social media is about as anti-social as humanly possible, LAN parties were a very social event where people still met in person. I have friends on Facebook and other media that I haven't seen in a decade, and because so-called social media lets me keep constant tabs on their statuses, I may never develop the urge to physically visit them again. Social media has really killed the social in humanity, and the console and the smartphone have killed much of what the PC used to be.

This is just a gamer's perspective.

I think the term PC is dead but not the concept.

I honestly miss watching graphics progressively load over my 28.8kbps dial-up and hearing "You've got mail!"

To me that was the golden age, when one could use a phone line and take a break from socializing. Now you have to take a break from constant technology and count yourself lucky if you can socialize with another human like in the old days.

It was almost as if it were designed that way. The PC had its place in starting an addiction to connected technology; then along came the smartphone.

So yeah, is the PC dead? Sure it is, as it has served its intended purpose. It won't be long before the smartphone is dead too, as everything shifts to biological augmentations that allow augmented reality everywhere you go. A true dystopia, in my opinion. And the F'ing PC started it all.
 
Once COD Modern Warfare 2 was released, PCs died.

Let me clarify: they removed dedicated servers and LAN support. That let them make more money without having to dedicate any resources to server features. In turn, they were able to release shitty games every 2 years for the console boom, leaving PC gamers in the dust and killing LAN parties.
 
I completely agree about LAN parties... went to a lot of those. Back then you never built a steel full-tower PC because it was too big and heavy to lug to LAN parties -- aluminum mid-tower ATX was the way to go, with a 15" CRT. Doom, Quake, Warcraft, good times.
 
They obviously don't have a job :)

Ha, maybe not. I was referring to home use more than business use. But that being said, lots of people I know now use their phone more than a desktop for business purposes as well.
 
I completely agree about LAN parties... went to a lot of those. Back then you never built a steel full-tower PC because it was too big and heavy to lug to LAN parties -- aluminum mid-tower ATX was the way to go, with a 15" CRT. Doom, Quake, Warcraft, good times.

Alright, Mr. Moneybags... I couldn't get an aluminum case until 2001; before that it was all steel, baby, with a 17" CRT :)
 
Now that I think about it more, I'm pretty sure I got my first aluminum case in 2001 as well, although I had been doing short towers prior to that. I got a 15" CRT in 1991 and used it until 2001, when I got a 21" CRT (IBM with a Trinitron tube) that was my main monitor until 2008, but it still got a lot of use until 2015. In 2008 I didn't want to go LCD so I got a slightly used 24" Sony FW-900 -- the last, best CRT monitor ever made. I'm still using it, although it is very burnt-in -- I am holding out for an OLED monitor. Glad to see Dell is coming out with one, although I'll wait for prices to drop and be more gaming-oriented.
 
I'm with the old guys. To me it was the end of the lan party era. My "gaming" machine is a 2500K and GTX 580. I'd rather play around with old 8 bit machines these days.
 
The last major upgrades were C2D and then SSDs. After that, there haven't been any major advances because there really isn't any need.
 
Yeah I remember the aluminum case cost a fortune -- I'm not sure when I got it, but it must have been the mid or late 90's. In 2008 I didn't want to go LCD so I got a slightly used 24" Sony FW-900 -- the last, best CRT monitor ever made.

Yeah, I spent a shit-ton of money back then getting odds and ends. I actually had an FW-900 for a while too, but I've been on LCDs since 2003 -- I just kept the FW-900 for gaming. My first aluminum case was a Lian Li back in the day, but as mentioned that was 2001; I remember putting my Thunderbird 1.33 in it.

Concerning the post about the golden age of PCs... I still think we're kinda in it, just making advances in different regards. Just like people tend to be nostalgic about music (by weeding out the shit that obviously also came out when we were younger), I think there's some nostalgia about PCs now, mostly around basic things like Moore's Law and such. Some of the stuff now is incredible, but it's not fancy knock-your-socks-off home-run stuff -- more like base hits.

Example: the GeForce 1050 Ti. On paper it doesn't seem like much, but consider that it runs off bus power alone and matches the performance of a high-end card that cost $500 five years ago. That's crazy to think about.
 
Yes, there's still progress being made with GPU's and even more so with SSD's -- we're in a golden age of storage throughput like nothing we ever had with hard drives, and it makes a huge impact on system performance.

Monitors, on the other hand, are stuck in a dark age. Resolutions flatlined around the time when LCD's took over -- my FW900 from 2002 can do 2304x1440... that's pretty much what most high-end LCD's could do until 4k arrived just recently. And despite LCD's dominating the market for over a decade, they are still CRAP -- backlight bleed, lousy color accuracy, poor playback of 24p, limited contrast, and sometimes poor viewing angles, response time, and input lag. OLED monitors can't get here fast enough.
 
Sony Trinitron 19" or a KDS (can't remember the model). Booyah! Lugged that 50lb beast to every LAN party.

I remember when we had a major multi-hundred-person LAN party at a university. I was maybe 17 years old... anyway, they had a T1 line @ 1.544Mbps and we were just BLOWN away by the speed and latency.
 
Times changed. Games used to be shareable (maybe not for playing online, but at a LAN). Now everything is locked up, download-only -- some sort of cancerous hell-hole.

I reached out to my old LAN buddies -- it's been 10 years, but hey, why not... After a quick discussion, most people didn't want to spend $50 on games for one night since they had no interest in them later down the road. They've isolated the PC crowd while keeping things open on the consoles. That's just downright anti-consumerism at its finest.

I'd say the end of the golden era for me was when AMD went through its troubles. My first Intel rig was Sandy Bridge. I'd love to go back over to team RED. Time will tell.
 
I was talking to a former Intel employee a few months ago that was at retirement age, and I mentioned that I'd heard the boom years were the 70's and 80's. She said the 90's were good as well, but things changed after 2000. By an interesting coincidence, that's when the Pentium 4 came out. I'm wondering if anyone has a chart of desktop processor performance vs. year -- I have to think the rate slowed after 2000, and then more recently it slowed to a crawl.

[Chart: CPU scaling over the years -- CPU-Scaling.jpg]


Somewhere between 1995 and 2003. First we hit the IPC wall, then the clock and power walls. The IPC wall was predicted well before it happened, in a number of academic papers analyzing the amount of ILP (Instruction Level Parallelism) in code; they predicted that chips would soon hit an IPC wall due to the increasing technical difficulty of finding and executing ILP in serial code. Those predictions were the reason HP and Intel developed EPIC (Explicitly Parallel Instruction Computing), where the compiler finds the ILP and optimizes the code for faster execution. Unfortunately it was too experimental an approach, it required a smarter compiler than was available, and it failed miserably. Some recent experimental designs are reconsidering this -- check out, for instance, the development of the Mill architecture. Their goal is to execute 20 or 30 instructions per cycle. It is based on some recent advances in compilers, and there is serious debate among experts: some claim Mill CPUs will work fine, others claim they will follow the same route as the HP/Intel Itanium CPUs.

The IPC wall was predicted and expected. The clock wall wasn't -- engineers didn't see it coming. Using the old silicon scaling laws, they predicted 10GHz in the near future. It didn't happen, because those laws ceased to apply: shrinking silicon changed the behavior of the material, until we finally hit a clock wall. The above graphic isn't exclusive to Intel; every company hits the same three walls: clock, IPC, and power. I don't have the same graphic for other companies, but here is one with clocks for everyone:

[Chart: processor clock frequencies by year across vendors -- horowitz_fig7.png]
 
The golden age of computers died when people stopped caring about Computer Shopper magazine. That was the great Wild West and Oklahoma Rush mashup times for computers. Ordering from computer stores located in the backs of trucks, slow as hell shipping for outrageous prices, flipping actual pages to compare PC specs, reading Jerry Pournelle's Chaos Manor articles, attending computer swaps in the parking lot behind a grocery store, figuring out which V-LB video card to get, cleaning out the mouse's ball rollers, games that actually blew your mind with how fun they were to play... The memories....
 
I think that Intel has been the least significant of any issues causing the "golden age of computing" to have passed. Stuff like still being able to use a 6+ year old 2500K and have it still be a viable gaming CPU is pretty amazing. I don't consider that a bad thing at all.

I really miss the era where you could spend ~$300 on something like a 9700 Pro AGP and basically have the fastest videocard available. To do the same these days would involve spending several grand on overpriced cards like the Titan XP.

The biggest issue by far is the smartphone. I don't have any issues with the smartphone itself or what a smartphone does, but rather the way the phone companies have usurped the smartphone for their benefit ($$$), and the way people have so willingly given up their computing freedom. Phones with tons of apps that you can't uninstall even if you want to. Apps that collect tons of info that have nothing to do with the purpose of the app - and people becoming accustomed to the idea that that is normal and okay. Only being able to install apps that your phone company approves of and puts in the app store. Your phone only being updated when your phone company feels like it, and older phones rarely getting updates, even if the hardware is still quite capable. More and more phones and tablets not even having replaceable batteries. Artificial product segmentation by disabling basic features that shouldn't be disabled. The list goes on. People owned their PCs. For most people a phone is more of a disposable device that they are simply renting/leasing. That is not a good thing overall.
 
Wow, lots of great points being made by the people here, much of it I agree with word-for-word.

Thanks for posting those plots -- they really illustrate the evolution of CPUs. I hadn't realized IPC flat-lined so long ago. Then Intel tried to make up for it with higher clock speeds, but that didn't get them very far.

The one plot I'd like to see that isn't there is MFLOPS, or some other metric of processing power, vs. year and processor family.

Am I right in thinking multi-core processors are sort of a simplified version of EPIC? They depend on compilers written to take advantage of multiple cores and hyperthreading, so I'm wondering if some of the work that's gone into that could be applied to the EPIC problem that compilers weren't up to solving previously.
 
Monitors, on the other hand, are stuck in a dark age. Resolutions flatlined around the time when LCD's took over -- my FW900 from 2002 can do 2304x1440... that's pretty much what most high-end LCD's could do until 4k arrived just recently. And despite LCD's dominating the market for over a decade, they are still CRAP -- backlight bleed, lousy color accuracy, poor playback of 24p, limited contrast, and sometimes poor viewing angles, response time, and input lag. OLED monitors can't get here fast enough.

I hope OLED can truly deliver what it's promising but so far every offering has been very expensive and suffers from burn-in problems (a major problem particularly as a computer monitor where a lot of the UI is static for long periods of time). If you end up having to constantly replace very expensive monitors then OLED isn't really much of a solution either.

I agree right now we just have a bunch of trade-offs to consider depending on the technology though. TN colors usually aren't as good and viewing angles aren't great, IPS has backlight glow problems and generally slightly less responsive pixels. Both IPS and TN have pretty limited contrast. VA has good black levels but usually slow pixel response and input lag. High refresh rates have pretty much solved the 24p problem at least, although not if you go 4K since those aren't available at high refresh rates yet. Gsync/Freesync can theoretically fix the issue too even without high refresh rates, but there seems to be very little effort to either lower the minimum limit below 30hz or double the framerate of 24p content on the software side.

I actually think too much of the "progress" in displays has focused on MOAR PIXELS though. Basically none of those problems you listed are solved by adding more pixels, and adding more pixels makes it harder or impossible (depending on your budget) to get decent framerates in games.
 
Speaking of runaway pixel count... can anyone explain to me why phones now have higher resolution than most monitors?

The problem is, 24p isn't "solved" with higher refresh rates... displays need backlight strobing or screen blanking to properly show 24p as the cinema gods intended.

Burn-in is certainly undesirable, but it can be worked around, even on computers -- my HTPC is connected to a plasma HDTV, and to avoid burn-in I just keep the taskbar hidden and have a big library of wallpapers that change every couple of minutes.
 
In terms of OLED, if you watch the TV reviews it's really not pretty compared to what the hype had promised.

OLED does give great blacks and great pixel response, but beyond that it more or less falls apart: worse colours, fast degradation, burn-in, bad uniformity, etc.
 
Am I right in thinking multi-core processors are sort of a simplified version of EPIC? They depend on compilers written to take advantage of multiple cores and hyperthreading, so I'm wondering if some of the work that's gone into that could be applied to the EPIC problem that compilers weren't up to solving previously.

There is a relation, in the sense that multicore is about exploiting TLP (Thread Level Parallelism), whereas EPIC and similar VLIW microarchitectures, including the Mill, are about exploiting ILP (Instruction Level Parallelism); and there is some overlap between the techniques compilers use to identify and extract ILP and those used to identify and extract TLP.
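To make the distinction concrete, here's a toy sketch (illustrative only, not real compiler output): ILP is parallelism among independent instructions inside a single thread, which the hardware or an EPIC/VLIW compiler has to discover on its own, while TLP is parallelism the programmer expresses explicitly as separate threads.

```python
# Toy illustration of ILP vs. TLP -- illustrative only, not real compiler output.
from concurrent.futures import ThreadPoolExecutor

def ilp_example(a, b, c, d):
    # These two multiplies are independent of each other, so an out-of-order
    # core (or an EPIC/VLIW compiler) can issue them in the same cycle --
    # that's instruction-level parallelism within one thread.
    x = a * b
    y = c * d
    # The final add depends on both results, so it has to wait; dependency
    # chains like this are what limit how much ILP can be extracted.
    return x + y

def tlp_example(chunks):
    # Thread-level parallelism: the work is split into explicit threads.
    # (CPython's GIL limits true parallelism for pure-Python code, but the
    # structure -- independent threads scheduled onto cores -- is the point.)
    with ThreadPoolExecutor() as pool:
        return list(pool.map(sum, chunks))

if __name__ == "__main__":
    print(ilp_example(2, 3, 4, 5))              # 26
    print(tlp_example([[1, 2, 3], [4, 5, 6]]))  # [6, 15]
```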
 
I'm with the old guys. To me it was the end of the lan party era. My "gaming" machine is a 2500K and GTX 580. I'd rather play around with old 8 bit machines these days.

Lol, I'm not quite there just yet, but I hear you. :)

I think what killed it for me was the lack of crushingly demanding software and games. When we burned Crysis for being too demanding, we sent game publishers the message that "hardware must lead," burning ourselves in the process; combined with the success of consoles, the market for faster hardware for its own sake has become critically small. The 7700K says it all. The 580 (Fermi) was the last massive, crazy-ass chip from Nvidia, and the 2600K was Intel's hilarious over-reaction to Bulldozer, the legacy of which has spanned the better part of a decade. Though I've heard Intel does have plans to re-invent itself again like AMD just has -- the problem is they're in no rush.

I've no doubt in my mind that if Intel wanted to make a truly ground-breaking gaming CPU, purely as one, it'd run at 6GHz stock and boost to 7GHz with current gaming IPC. We see the iGPU as an add-on, but really it's more that that part of the CPU is now being used for graphics.

Kudos to AMD for saying F that noise with Ryzen! :D
 
I've no doubt in my mind that if Intel wanted to make a truly ground-breaking gaming CPU, purely as one, it'd run at 6GHz stock and boost to 7GHz with current gaming IPC.

Laws of physics say "no".
 
I do miss the golden age. Before my old 2600k build, I'd never had a single build for longer than 2 years. Performance doubled every couple years, and made it worth the upgrade. Today, we still haven't doubled the 2600k, except by cheating with more cores. And even then, it's not consistently doubled, just in some things.

GPUs have carried the torch since then, though. We've seen a lot of movement in that space, which is nice. Though eventually we'll hit a wall there, too.

As others have stated, there are a lot of problems with trying to push performance much higher than it is, save for slapping on more cores (which, again, is of limited utility). And I just don't think there's enough profit in what we call the PC market to drive the revolutionary (not evolutionary) changes that would be needed to get around the current wall. At least, not yet.

The funny thing that Ryzen proved to me is that today, overall CPU performance is so tightly clustered that it's more about market positioning than about absolute performance. Ryzen found itself a niche by offering a lot of cheap cores. 7700k has its niche by offering fastest single-thread and lightly-threaded performance. Hell, even the rogue Pentium 2c/4t CPUs found a niche by offering bargain basement respectable performance. It's all about respinning existing products into different markets and price points. That, and efficiency gains can still be had for a while, I think.

But we need revolutionary changes, I suspect, to get out of the rut we're in. There aren't many process shrinks left.
 
Did someone say..... G-O-L-D ???



Ordering shit from the back of a Computer Shopper magazine... Man, those were the days.

I feel old.
 
People are asking when the golden age of computing ended, 2 weeks before Intel is about to announce its response to the biggest competition in a decade. 2 weeks before AMD is set to announce an even bigger threat to Intel's higher end systems. 2 weeks before some of the most powerful consumer processors ever released are announced. 2 weeks before one of AMD's biggest GPU launches ever. Not to mention all the great small form factor stuff that is going to come out, and maybe a bunch of Nokia phones and other things.

If this isn't a golden age, then I imagine the Computer golden age was when a computer would suck your........stuff. Don't remember any of my PC's doing that so must have been before my time.
 
I think we're in the golden age now, with PC building becoming a mainstream hobby, complete with celebrities and YouTube channels and stuff.
 
The golden age died when AMD couldn't keep up with the Core 2 CPUs. With no competition, Intel has sat on its lazy ass.
 
I think we're in the golden age now, with PC building becoming a mainstream hobby, complete with celebrities and YouTube channels and stuff.

  • And Steam bringing game publishers back to PC.
  • New form factors making old ATX behemoths seem like huge dinosaurs.
  • Boutique cases for both new and old form factors.
  • Computers are actually cool now.
I remember 10 years ago when it looked like gaming on PC was dead. This is a newer, better age. Whether or not you want to call it golden...I leave to you.
 
For me, the golden age of PCs ended with manufacturers locking down their processors and forcing buyers to pay for the segment-leading models in order to be able to OC. I have fond memories of OC'ing my cheaper/lower-tier Celerons, various Athlon XPs, an Athlon64, Athlon64 X2s, and an Opteron X2. ...The very reason this site came into existence: getting the most out of our PC hardware, especially making the lower-end stuff perform like (or better than) the higher-end stuff.
 
The golden age died when AMD couldn't keep up with the Core 2 CPUs. With no competition, Intel had to sit on its ass so as not to erase AMD as a "competitor".

FTFY

For me, the golden age of PCs ended with manufacturers locking down their processors and forcing buyers to pay for the segment-leading models in order to be able to OC. I have fond memories of OC'ing my cheaper/lower-tier Celerons, various Athlon XPs, an Athlon64, Athlon64 X2s, and an Opteron X2. ...The very reason this site came into existence: getting the most out of our PC hardware, especially making the lower-end stuff perform like (or better than) the higher-end stuff.

CPUs have almost always been locked down. A lot of overclocking in the early days was done with mods and hacks to bypass rudimentary locks and enable higher clocks -- for example, the pencil trick, or using a thin, short wire between two socket pins to create a short.
 