When did the golden age of computing end?

The golden age died when AMD couldn't keep up with the Core 2 CPUs. With no competition, Intel has sat on its lazy ass.

Rubbish; mobility and the datacenter have progressed amazingly. Even the desktop has seen lots of improvement since Core 2.

The real issue is software; just adding more cores isn't innovation.

ARM is without "competition" due to being a monopoly in its instruction set, and their next step is this:
https://liliputing.com/2017/05/arm-cortex-a75-cortex-a55-and-mali-g72-details-leaked.html
 
I don't think the golden age is done, not by a long shot. The 'golden age' = the Wild West, and we are still firmly there.
My first computer was a Timex-Sinclair 1000.
Yes, a lot has changed, but we are a long way from computers being seamless in every aspect.
----
How do I define the golden age?
Everything before borg.

:D
 
Just my 2c: while the past was great, having basically a compelling reason to upgrade every 9-12 months (talking about the race to 1GHz through the dual-core stuff), we need to consider what we can get for our dollars these days. I currently have a dual-core Haswell G3260 ($44.95 retail from MC two years ago), 16GB of DDR3 that I think I paid about $60 for from VisionTek, an ECS ITX B85 mobo for $30 AR, and an AMD RX 460 for $60 AR a few months ago. Basically a $200 HTPC that can play most stuff at 1080p... probably 60-80% as capable as a high-end i7 / GTX 1080 monster for gaming, and it can rip through any productivity kind of app. The CPU destroys anything AMD made pre-Ryzen, and the GPU I feel is better than my old 7970 / GTX 680. I think this is the golden 20%-cost-for-80%-functionality rule at work.

Perhaps the high end has hit a wall where it takes tons of capital to make iterative gains, but the trickle-down stuff is pretty amazing, and I would consider this more of a golden age for consumers in the mid-range. I am kind of pissed that Intel has monetized these gains to the max, requiring essentially a new CPU, mobo, and even drive tech (SATA --> mSATA --> M.2 SATA --> M.2 NVMe) for minimal platform gains. I mean USB 3 is cool, and 4K60p over HDMI is cool, and a one-off tech is cool, but do we really need a new platform/chipset at each step of the way? Intel looks to be doing it again with LGA 2011-v3 to X299 ;)
 
It was around 2005-2006 when single-threaded performance improvement took a nosedive, and we've seen only marginal CPU improvements each year since then.

I know Nvidia and Jen-Hsun say a lot of things, but I mostly agree with this crude graph (CPU is blue):

[Attached image: "AI Revolution / Moore's Law" graph]


For me, the "Golden Age" of computing and computer games was in the mid 90s to early 00's.

My first family PC was a Pentium 166MHz, which was close to top-of-the-line in early 1997 when we got it. In mid 2001 we upgraded to a computer with a 2.0GHz CPU. In only 4 years, that jump was incredible.

I also miss going into just about any gaming store and there being shelves upon shelves of big-box PC games in them.
 
For me it was the launch of the X2 procs from AMD and no longer having to buy server boards for light multitasking. I never cared much then about relative single-core performance. When I bought my first dual-socket Skulltrail board, which I still use, I was in heaven. I think Zen is calling my name if I can get a 16-core with enough RAM support. I might even be willing to fire up my water-to-water deep-well chilling loop again.
 
It died when we could no longer use our lead pencils to unlock faster processor speeds :peeking:

Ah, back then I had cathode tubes underlighting my case ...
 
It died when the US Government decided not to break up Microshaft under the Clinton administration, and we got stuck with Windows 4ever Edition. Software has always led microcode, which has always led hardware development. If better microcode and software were developed, more advanced hardware would become mainstream. Computers need a "killer app" to be relevant in mainstream or home computing; now that "killer app" is becoming home mobile integration and automation, which does not require much root hardware advancement, only peripheral advancement.

As far as the performance plateau of existing manufacturing tech, it was only natural considering the molecular limitations of the materials involved.
 
To be fair, it was the Clinton administration that went after Microsoft, and it was the Bush administration that dropped the effort, although the obvious anti-MS bias of the judge's statements in the case was an unforced error that gave MS an easy appeal.

As a former OS/2 fan, I hate MS and monopolies in general. In hindsight, though, there is an unfortunate and strong parallel between IBM in the 70's and MS in the 90's: both were threatened with being broken up for anti-trust reasons, and both were ultimately able to resist, but the effort preoccupied them enough that they "lost their mojo", which caused IBM to lose the personal computer and operating system wars, and MS to lose the smartphone, music-player, and internet browser wars.

I'm too young to have seen IBM's dominant era, but I'm old enough to have seen Microsoft's inferior products controlling the industry.
 
Well, in some ways, with the internet. I loved my Commodore 64; I originally had a tape drive, then the 1541 disk drive. Typed many thousands of lines from magazines; thank jebus for checksums.

Long story short: From 1983 to 1995, except for BBS stuff, I was happy to play on my various computers.

Now, if the internet is down, my computer is somewhat useless, and actually angers me.
 
Obviously, 2011 when AMD released Bulldozer and didn't provide competition to Intel for the next 6 years.
 
To be fair, it was the Clinton administration that went after Microsoft, and it was the Bush administration that dropped the effort, although the obvious anti-MS bias of the judge's statements in the case was an unforced error that gave MS an easy appeal.

As a former OS/2 fan, I hate MS and monopolies in general. In hindsight, though, there is an unfortunate and strong parallel between IBM in the 70's and MS in the 90's: both were threatened with being broken up for anti-trust reasons, and both were ultimately able to resist, but the effort preoccupied them enough that they "lost their mojo", which caused IBM to lose the personal computer and operating system wars, and MS to lose the smartphone, music-player, and internet browser wars.

I'm too young to have seen IBM's dominant era, but I'm old enough to have seen Microsoft's inferior products controlling the industry.

While you are not incorrect about monopolies, imagine a world with a dozen or so popular OSes and programmers having to optimize their stuff for all of those, or having to run a handful of them to be able to use all the software/games you want. Thinking about what would happen to driver support alone almost gives me nightmares.
 
Once machine learning takes off, it will be really interesting and scary.
 
While you are not incorrect about monopolies, imagine a world with a dozen or so popular OSes and programmers having to optimize their stuff for all of those, or having to run a handful of them to be able to use all the software/games you want. Thinking about what would happen to driver support alone almost gives me nightmares.
It would have been easier than having to write for different operating systems AND hardware, which was the situation in the 1980s -- applications were written for the Apple II, C64, IBM PC (DOS), TRS-80, and then later the Amiga and Macintosh -- and the market survived all that diversity just fine.

But what really bothered me about MS wasn't the market dominance they had (built more or less legitimately), but the way they used it in anti-competitive ways -- the undocumented APIs, the license deals based on the total number of systems shipped regardless of OS, the "giving away for free everything Netscape is trying to sell" so they couldn't be a viable business competitor, etc.

And I agree with Steve Jobs' critique of MS that "they didn't bring any culture to their products -- and I mean that in a big way, not a small way". By "culture" he meant their products didn't change the way people did their work or lived their lives or saw the world, and that's very true -- most MS products were "good enough" copies of their competitors that only succeeded because of MS's control of the OS -- they made a lot of money but didn't revolutionize anything the way the Macintosh and Amiga did (and I never owned either of those).
 
If you work in datacenters...the golden age of computing has just started.

.oO(Silly private consumers....thinking they are the important market...and they matter...if only they knew...)
 
I think the golden age of the 'internet' has passed. Going forward, there will be faster speeds, more content and access to materials, but with more limitations as well.
 
The real Golden age has just begun (Quantum computing). GPU speeds outstripped CPU capability a long time ago and are on a continued increase that is faster than any CPU increase in the past. More and more stuff, heavy stuff, is being done on GPUs. Computing now is way more interesting and versatile, at an ever-increasing rate. My thoughts.
 
The Golden Age of Computers died with the birth of modern cellphones, social media, and gaming consoles. Before the internet there was the sneakernet. Our computer user group would rent several hotel rooms near computer shows, and geeks would arrive from all parts of the country schlepping 15" CRTs, computer systems, and cases and cases of floppy disks up several flights of stairs... a geek's version of the swap meet. Everyone left with lots of stuff they didn't have before. LOL
 
The real Golden age has just begun (Quantum computing).

https://phys.org/news/2017-08-hype-cash-muddying-quantum.html

GPU speeds outstripped CPU capability a long time ago and are on a continued increase that is faster than any CPU increase in the past. More and more stuff, heavy stuff, is being done on GPUs. Computing now is way more interesting and versatile, at an ever-increasing rate. My thoughts.

GPUs are throughput machines, so performance rises almost linearly with the number of transistors. CPUs aren't throughput machines, so performance isn't linear with the number of transistors.

Scaling up GPUs is much easier than scaling up CPUs; however, GPUs are only good for a very limited subset of workloads, and they suck at everything else. There are no alternatives to CPUs, but there are several alternatives to GPUs. GPU computing is overhyped.
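
A rough way to see that throughput-vs-latency distinction in code, as a minimal Python sketch (the array sizes and the NumPy dependency are my own choices, purely illustrative): independent element-wise work finishes faster as you add lanes or cores, while a serial dependency chain only benefits from faster single-thread execution.

```python
import time
import numpy as np

N = 10_000_000
a = np.random.rand(N)

# Throughput-style work: every element is independent, so wider hardware
# (more SIMD lanes, more GPU cores) finishes it roughly proportionally faster.
t0 = time.perf_counter()
b = np.sqrt(a) * 2.0 + 1.0
throughput_time = time.perf_counter() - t0

# Latency-bound work: each step depends on the previous result, so extra
# parallel hardware can't help; only a faster single thread does.
t0 = time.perf_counter()
x = 0.0
for v in a[:100_000]:      # smaller slice; pure-Python loops are slow
    x = x * 0.999 + v      # serial dependency chain
latency_time = time.perf_counter() - t0

print(f"independent pass over {N} elements: {throughput_time:.3f}s")
print(f"dependent chain over 100k elements: {latency_time:.3f}s")
```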
 
What a giant load of hogwash!
Golden age?
Boom times?
All in the past?

Nonsense. Today is golden. More capability. More power. More functionality. Lower cost.

What is it about ancient 80s, 90s, 00s computers that you find so attractive? Those days were not golden at all.

Let's see you tolerate a computer that can run ONLY ONE app at a time ... and CRASH ten times a day.
Golden indeed!
 
I think the last wow performance upgrade on a processor was the i7-920 generation with X58 motherboards. It definitely seems more incremental since then.
 
What a giant load of hogwash!
Golden age?
Boom times?
All in the past?

Nonsense. Today is golden. More capability. More power. More functionality. Lower cost.

Did you pay attention to the graphs in #20 or even to the OP?

Golden was when someone in the group of best friends purchased a next-gen machine and everyone was impressed because it was 2-4x faster, with more colors and better sound. Now none of us purchases a next-gen machine, because each new core is only 5-10% faster than the cores in our current machines. We have to wait many, many gens to get a decent upgrade.
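
To put rough numbers on that, a quick back-of-the-envelope sketch in Python (the 5% and 10% figures are the ones above; the 40% is just an illustrative old-days rate I picked for contrast):

```python
import math

# Generations needed to double single-thread performance at a given per-gen gain.
for gain in (0.05, 0.10, 0.40):
    gens = math.log(2) / math.log(1 + gain)
    print(f"{gain:.0%} per generation -> ~{gens:.1f} generations to double")
```

At 5-10% per generation you wait roughly 7 to 14 generations to double, versus about two generations at a 40%-per-gen pace.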

What is it about ancient 80s, 90s, 00s computers that you find so attractive? Those days were not golden at all.

Lots of things, including that it was all much more DIY. I have had much more fun playing some ancient games than many modern ones, which are visually appealing but terribly boring. This golden epoch also applies to movies or songs; classic movies are 1000 times better than most modern 'blockbusters'.

Let's see you tolerate a computer that can run ONLY ONE app at a time ... and CRASH ten times a day.
Golden indeed!

Funny indeed, when I never had a crash on any older machine, but I did learn what a crash is when I suffered my first blue screen on a modern machine.
 
Well said juanrga.

Yeah, I'm dumbfounded when I read people saying that we're in a golden age now because the Ryzen Threadripper has been launched, etc. Step back and look at the trends, and the story is obvious -- there was a golden age when Moore's Law and the upward march of clock speeds meant your computer's hardware was only able to run the latest software (which was evolving at the same blistering pace as the hardware) for a couple of years at most. But the effect of upgrading to a new system was jaw-dropping. I was one of the few people in my dorm who could run Falcon 3.0 at a playable framerate, because I had a 486 with the required FPU coprocessor. Once 3D-accelerated graphics were the norm, the rule of thumb for me and my friends was to upgrade the video card every 12 to 18 months and the CPU/motherboard every 2 to 3 years.

I also think there are two separate issues here -- the golden age of computing vs. the golden age of computer gaming. The former is more easily understood, as I described above -- all you have to do is look at performance vs. year, and the story is obvious. The latter is more debatable. I feel bad for kids these days playing Destiny on their PS4s, knowing that in 20 years, when they're feeling nostalgic, they won't be able to "go home again" and play the game, because the Destiny servers and even the PS4 network will be shut down.
 
The golden age ended when computer cases and peripherals were offered in colors other than beige.
That's the end of the age of the classic nerd with the glasses, when computers were considered a fad and most people had no interest in them. Now everything is a computer, including your thermostat, your fridge, perhaps your damn toilet bowl as well. So the golden age is very much here.
 
https://phys.org/news/2017-08-hype-cash-muddying-quantum.html



GPUs are throughput machines, so performance rises almost linearly with the number of transistors. CPUs aren't throughput machines, so performance isn't linear with the number of transistors.

Scaling up GPUs is much easier than scaling up CPUs; however, GPUs are only good for a very limited subset of workloads, and they suck at everything else. There are no alternatives to CPUs, but there are several alternatives to GPUs. GPU computing is overhyped.
GPU computing is relegating many CPU computing tasks to menial-labor status. For example:
  • https://www.kdnuggets.com/2017/08/rise-gpu-databases.html
    • Something that was totally CPU-based in the past; today CPUs can't even come close and are irrelevant. The world is moving to more data-driven solutions - the age of data, so to speak
  • Rendering
    • Another area where GPUs are putting CPUs out to pasture
  • Video editing
  • AI
    • Good luck getting any CPU, or a cluster of them, to drive a car
  • My favorite - Mining :D
    • All the golden-age CPUs put together would not equal one medium-size mining farm
For those who think the golden age was so golden: go get a vintage PC, or several of them years apart. The gains may have been great at the time, but in the scheme of things, going from black and white to 16 colors and then 256 colors really was not much at all.

Quantum computers, kind of like GPUs, will allow processing of things that were not feasible or even remotely considered before. Today's CPUs are more general laborers than specialists.
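
For a concrete sense of the throughput work being offloaded in the list above, here is a minimal sketch comparing a dense matrix multiply on the CPU (NumPy) against a GPU via CuPy. It assumes an NVIDIA GPU and the optional cupy package; the matrix size is arbitrary.

```python
import time
import numpy as np

n = 4096
a_cpu = np.random.rand(n, n).astype(np.float32)
b_cpu = np.random.rand(n, n).astype(np.float32)

# CPU baseline: dense matrix multiply with NumPy.
t0 = time.perf_counter()
c_cpu = a_cpu @ b_cpu
print(f"CPU (NumPy): {time.perf_counter() - t0:.2f}s")

# Same operation offloaded through CuPy, if a GPU and the package are available.
try:
    import cupy as cp
    a_gpu = cp.asarray(a_cpu)
    b_gpu = cp.asarray(b_cpu)
    t0 = time.perf_counter()
    c_gpu = a_gpu @ b_gpu
    cp.cuda.Device(0).synchronize()   # wait for the GPU kernel to finish
    print(f"GPU (CuPy): {time.perf_counter() - t0:.2f}s")
except ImportError:
    print("cupy not installed; skipping the GPU run")
```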
 
My favorite - Mining :D All the golden-age CPUs put together would not equal one medium-size mining farm
You weren't referencing "golden age" but GPU vs. CPU, so I'm not sure why you dropped that in.
Anyways, there are cryptos that have specific mining techniques. Take Burst, for example: not CPU or GPU.
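
For anyone curious how "not CPU or GPU" mining can work, here is a toy Python illustration of the proof-of-capacity idea behind coins like Burst (heavily simplified; the account bytes, nonce count, and hashing scheme are made up for the sketch and are not Burst's real plot/deadline format). Hashes are precomputed once and stored, so each mining round is mostly lookups, and disk capacity matters more than compute speed.

```python
import hashlib
import os

def plot(account: bytes, nonces: int) -> dict:
    """Precompute ('plot') hashes for a range of nonces. Done once, kept on disk."""
    return {n: hashlib.sha256(account + n.to_bytes(8, "big")).digest()
            for n in range(nonces)}

def best_deadline(plots: dict, challenge: bytes) -> tuple:
    """For a new block's challenge, scan the stored hashes for the lowest 'deadline'."""
    best = None
    for nonce, h in plots.items():
        deadline = int.from_bytes(hashlib.sha256(challenge + h).digest()[:8], "big")
        if best is None or deadline < best[1]:
            best = (nonce, deadline)
    return best

plots = plot(b"my-account", nonces=100_000)              # slow, but done only once
nonce, deadline = best_deadline(plots, os.urandom(32))   # cheap per-round scan
print(f"best nonce {nonce} with deadline {deadline}")
```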
 
You weren't referencing "golden age" but GPU vs. CPU, so I'm not sure why you dropped that in.
Anyways, there are cryptos that have specific mining techniques. Take Burst, for example: not CPU or GPU.
That's true; storage mining is on the up, so we may start seeing much higher prices for SSDs.
 
That's true; storage mining is on the up, so we may start seeing much higher prices for SSDs.
Mining is not the golden age of computing; it's the damn Middle Ages. It's medieval as hell: using electricity to complete virtual transactions in a virtual currency. It creates zero value. It's a pointless waste of resources for the short-term gains of a few ignorant/indifferent individuals.
 
In some ways the "golden age" died with Windows 95 and the Internet.

With that said, Windows 95 created what was needed for Linux to succeed (cheap memory). So maybe the Platinum Age began then?
 
Just my opinion: it happened about the time when overclocking moved fully into the BIOS... no DIP switches, jumpers, hard modding, etc.
 
I feel like ranting about this.

Maybe I'm just old. But for me, the golden age ended when my ridiculous overclocking schemes ended. I remember the Gold Finger Device trick with the Athlon 650. The pencil trick with my Tbird. I fried a bunch of cheap-ass Durons, too, in the pursuit of OC stupidity (hence my name). I remember my PII 400 that I OC'd with a box fan and a stack of, I shit you not, 7-11 big gulp cups filled with ice. Then there was the freezer incident with the Tbird, and sticking my computer outside in the sub-zero Oregon air just to break an OC record for no good reason whatsoever. Hell, I even OC'd the stupid 486 SX-33 I had back in the day.

I'd get excited about builds back then. They were always far faster every couple of years. The performance increase was something you could feel. There were crazy-ass LAN parties, at one of which I fell asleep in my chair at like 3AM. When I woke up, there was a slice of pizza dangling from my graphics card, a TNT2 Ultra. My plate had fallen into the box somehow. The thing was dried out and glued to it because of the heat. Fortunately, the card still worked. LAN parties were so nostalgic for me that, when I got married back in 2014, my bachelor party was an old-school LAN party. No need for the stripper thing - I DJ strip clubs sometimes, and know most of the strippers around here. It'd have been awkward. But the LAN party was fun - scrounging machines together from parts I had sitting in the garage for years. Quake 3, Starcraft, Unreal Tournament... I miss that shit.

It started going downhill for me right about when I built the Q6600 box. The build was smooth. No issues. The OC was simple. No fun in it, really. The performance compared to my previous Athlon X2 was better, but not MUCH better, really (shit was poorly threaded back then - it was hard to saturate even a dual core, much less a quad). Games were moving away from LAN and private servers and, whether because of my age or because shit was changing, they were less fun to me. I spent less and less time gaming. The 2600k build came about 4 years later. The Q6600 build still lives, powering my arcade machine. The 2600k build was likewise smooth. I didn't really need the upgrade at the time - but I had the itch to build, and it at least offered justifiable performance upgrades over the Q6600.

I ran that 2600k box into the ground, because every time I looked to see if it was worth it to upgrade, the answer was no - except for the GPU. That box lasted 6 fucking years, and in the end, the motherboard started to get seriously screwed up. With some repairs, that box still lives as an HTPC - but I wouldn't trust it for any truly demanding load anymore. And half the SATA ports are fried. I literally wore a machine out for the first time.

The new box is great. First a 1700X, then a 2700X drop-in. A 1080 Ti. The most stable, easy build I've ever done. But I hardly game anymore and, aside from work (compiling, rendering, etc...), I really don't use this thing's full power. It's almost a waste (especially the poor 1080 Ti, which sees far less use than I expected when I bought it). And now... I don't even need to hit the automatic overclock BIOS or tinker at all with overclocking, period. With Zen+, you get max performance out of the box. No box fans. No freezers. No LAN parties. Few games... And in single thread... it's hardly faster than a 2600k (to be fair, the same can be said for Intel CPUs). Most of the gains just come from throwing more cores at my work. Good, but boring.

It's almost depressing. It must feel like what happened to gearheads when the muscle car era died in the 70s. Hell, maybe we can compare AMD's Bulldozer to the Ford Pinto or the Mustang II, or something.
 
Frankly, computers now excite me way more than the anemic ones of the past. We've seen more advancement in the last 5 years than in the previous 20. If your only goalpost is single-thread IPC from one CPU generation to the next and nothing else, then yeah, your golden age is over.

AMD Bulldozer to Ryzen - a hell of a jump in performance. Talking overall, not some single-thread-limited software.
 
AMD Bulldozer to Ryzen - a hell of a jump in performance.

I'd say that, like the P4-to-Core transition, a good part of the jump was because the previous generation was a low-IPC design that was very inefficient.
 
Mining is not the golden age of computing; it's the damn Middle Ages. It's medieval as hell: using electricity to complete virtual transactions in a virtual currency. It creates zero value. It's a pointless waste of resources for the short-term gains of a few ignorant/indifferent individuals.
Much like video gaming, or OC'ing for OC'ing's sake. Both ultimately worthless and power-consuming.
Think of crypto as the greatest video game of all time. It sold shit tons of hardware. It appeals to the OC'er/tweaker, since performance can be greatly enhanced by tuning.
And the kicker is you win real $$, not just epeen!
Enjoy your hobby, and let others enjoy theirs.
 
That's true; storage mining is on the up, so we may start seeing much higher prices for SSDs.
Speed in storage mining isn't a factor, so no. SSDs, at least with the current forms of mining, are a non-factor.
 
Well, over the years, since I am a custom PC builder, I can recall several distinct years when my sales took huge nosedives that were fully attributable to Microsoft screw-ups.
1. Release of Windows Vista (2006). I had to stretch the life of Windows XP for another 5 years until Windows 7 came out as my de facto replacement.
2. Release of Windows 8. I had to stretch Windows 7 out another 3 years until the Windows 10 update was released.

When Microsoft screws up an OS release, bad news travels fast and really hurts the industry. It takes me a lot more work to convince a customer to upgrade (or downgrade, as the case may be), and some customers will not even take the chance and continue to use the older, reliable OS for a couple more years, which hurts my ability to custom-build computers quite a bit. I am not talking a lot of PCs, mind you, but in my best year as a consultant I built 9 computers, and there were 3 straight years I only bought/built 2.

I'm not even going to get into how much damage US government policies have done to computer growth over the years, but I know they are culpable for some of it too, along with the console and smartphone explosion.
 