Has "future proofing" ever paid off for you?

biggles

Conventional wisdom with computers and technology is that making purchases with the goal of "future proofing" is unwise. The logic is that technology keeps getting cheaper over time, so buying components or devices now to satisfy a future need is not economical.

But we now have at least two exceptions to the rule: memory and graphics cards. Both have increased in price by a lot in the past 12 months.

When I bought an MSI laptop a couple of years ago, they threw in 16 GB of extra DDR4 RAM for free (I guess there was a glut of RAM at the time). So the machine has 32 GB, which is great. If I had to add RAM now it would be a problem due to price (I'm currently dealing with exactly that on my desktop PC). So the extra RAM "future proofing" paid off on the laptop.

Has this happened to any of you? I am guessing yes, especially on the GPU side.
 
I typically only pay about $400 every year or two in upgrades to stay current, rather than $1,200-1,500 every 5-6 years like some of my friends. It's the same amount of money over time, but I get to stay "top of the line" the whole time, whereas they like the feel of "leapfrogging".
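As a quick sanity check on that claim, here's a rough amortized-cost comparison (a sketch; the dollar figures are the ones quoted above, while the 1.5- and 5.5-year cadences are midpoint assumptions of mine):

Code:
# Rough amortized cost of the two upgrade cadences described above.
# 1.5 and 5.5 years are midpoint assumptions, not exact figures.
incremental = 400 / 1.5        # ~$400 every year or two
leapfrog_low = 1200 / 5.5      # $1,200-1,500 every 5-6 years
leapfrog_high = 1500 / 5.5

print(f"incremental: ~${incremental:.0f}/year")                        # ~$267/year
print(f"leapfrog:    ~${leapfrog_low:.0f}-${leapfrog_high:.0f}/year")  # ~$218-273/year

# Both land in roughly the same $220-270/year band, which is the point:
# same spend over time, different upgrade experience.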

I think there are benefits to both; the big leap has a larger impact and all that.
 
"Future proofing" is kind of a nebulous term with goalposts that easily shift. If we all had crystal balls a year ago to know what the RAM and GPU market would be like now, we'd say that if you're buying to get 3+ years use, then you'll want to spend a bit more now to save you from spending even more later.

Future proofing can make sense if, say, you play games at 1080p now but you know for sure you'll be upgrading to 1440p/1600p/4K within the next 6 months, so you spring for the 1080 instead of the 1060, etc.

Future proofing doesn't make sense if, in the above example, you only "may" go to higher-resolution gaming and you're spending the extra cash just because.
 
"Future proofing" is kind of a nebulous term with goalposts that easily shift. [...]
Bingo. I bought 32 GB of DDR4 RAM on a whim about two years ago at $130 (specifically for an HTPC build that didn't work out), and boy was I lucky to have done so.
 
My buy cycles are usually:

5-6 years on CPU/Mobo/RAM/Monitor with a GPU refresh in the middle. So I'm getting a GPU every 2-3 years and a brand new system every 5-6.
Same. I expect my platform and display to last 5-7 years, so I build it as such. I don't expect my video card to last much more than 2 years.
 
Other than buying extra RAM, nope.

I'd actually say one of my worst purchases was CrossFire HD 6950s for future proofing. God, what a pain in the dick that was to get working...
 
I guess it depends on how you look at it. I never really future-proof; I generally buy high-end stuff because I want it and can afford it, but I guess it has worked out a couple of times. The most recent and most noticeable would be the 1080 Ti. I wanted it and bought it straight from NVIDIA day one. Normally that'd be a waste of money: just wait a bit and they'd be available from VARs for less. However, what with the crypto bullshit, it looks like a cheap purchase by comparison :p.

In general, though, no, I wouldn't "future-proof"; I'd advise more frequent upgrades rather than bigger ones.
 
I don't really consider "future proof" when I look to buy new hardware. That being said, I always buy top-of-the-line stuff when I do upgrade. Take this 7820X I bought last year: I know it's going to last for years, with super fast single-core performance and tons of cores for when games start pushing it (they are now). Graphics cards I tend to be on the bleeding edge with, and I've been upgrading whether I need the performance boost or not. However, I don't see myself upgrading these two Titan Xps any time this year or next, to be honest.
 
Future proofing works great for me... I bought an Intel Extreme CPU 8+ years ago because I wanted a 6-core early and didn't want to upgrade my CPU for a long time... got my $$ worth... same with my monitors... I just recently bought my first G-Sync 1440p monitor after 8+ years of using a 1080p one... bought my first OLED TV this past July... am hoping it lasts 5+ years... big-ticket items like that I expect a very long shelf life from... other computer components like the GPU, memory, etc. I know will require more frequent upgrades.

You need to pick and choose, and determine which items will stay on top for a longer period of time.
 
I tend to buy bleeding edge on everything, because I find it's overall the cheapest solution. Unlike cars, which lose 80% of their value the second you drive them off the lot, computer parts tend to keep near their original value for the next generation (or two) before the price plummets. On average, since I started doing this in the early 2000s, I've spent about $1000/year and have practically all bleeding-edge hardware.
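To put rough numbers on that resale logic, here's a hypothetical sketch; every figure in it is an invented assumption of mine, not something from the post:

Code:
# Hypothetical buy-bleeding-edge / sell-before-the-drop math.
buy_price = 1500        # assumed cost of a bleeding-edge parts refresh
resale_fraction = 0.6   # assumed resale value one generation later
net_per_year = buy_price * (1 - resale_fraction)  # assuming a yearly refresh

print(f"net cost: ~${net_per_year:.0f}/year")  # -> ~$600/year

# This only works while parts "keep near their original value"; once the
# price plummets (two+ generations out), resale_fraction collapses and
# holding the hardware longer becomes the cheaper play.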

Now, lately, that's changed. Since 2012, I've only upgraded my PC roughly every other year, because technology just isn't improving fast enough. But it's still not future proofing in my opinion, as I've always had software that could max the machine out (I do hobbyist video work, which eats up all the processors and RAM you can throw at it). Really, the only future proofing I can envision is buying a video card with a ton of RAM, and by the time any game uses anywhere near that much, the card itself will be too slow to use it. So I'd say no, future proofing has never paid off for me.
 
The only thing I've future-proofed is my sound card, by getting an ODAC+amp (which I've had since 2012). I upgrade whenever it makes sense. I usually buy a GPU every year, but the last year has been lackluster.
 
It never pays off in the long term; if I had my FX-6300 chip instead of my 8700K, I wouldn't be able to play half of the games released today. When I was still a member of the Intel Retail Edge program, I switched out motherboards four times starting in 2014. Now I don't plan on upgrading for a while, because full installs and teardowns are a pain and sometimes just a waste; with plug-and-play graphics cards it's brainless.

The things I replace least are the PSU and the case; I'd say getting a future-proof case is one of the most important things you could do. Mice usually only last a few years and keyboards come and go. Monitors, on the other hand, are really important if you like one style in particular, so maybe keep 1-2 backup gaming monitors plus whatever you use for general use.
 
The only "future proofing" I've experienced, for the most part, is entirely lucky and more related to market and industry trends. For example:

- ATI 9700 - who knew this card would have such long legs and that nV would have nothing to compete with it for almost two years.

- Athlon XP - Similar story to the 9700, except Intel was on the other side, and its competing chips were hot, slower, and buggy.

- 8800 GTX - A massive step up in performance from the prior gen pretty much ensured a long life.

- 16 GB DDR4 - who knew there would be a DRAM shortage?

- GTX 1080 Ti - the only one I sort of sensed beforehand. A combination of factors, foremost mining and a lack of competition, had rocketed prices and stifled progress.
 
Kinda worked for me with my GPU. I spent a lot more than I usually do on my 290X almost four years ago, and here we are with GPUs either all out of stock or 300% over MSRP. As much as I'd love a new video card, I can wait, because this 290X is still getting the job done and my games are not really hurting.

I've gotta think that if I'd bought a cheaper GPU, my rig would be really struggling.
 
For a very limited definition of the word, yes. There are two ways I've found this unfolding in the last decade or so.

The first and more commonplace is spending a bit extra on higher-end items that have a current benefit and grant more staying power. RAM is the best example here: I found that always buying one increment ahead of "good enough for even higher-end gaming at the time" is a benefit. 4 GB when most used 2, 8 when most used 4, 16 during common use of 8. The additional RAM was always a benefit in general use, as well as ensuring smooth gaming and other demanding usage, while "lasting" into the future. GPUs are a little different, but when a mid-grade GPU was usually good enough for most titles at High/Ultra settings at common resolutions, I would often buy a higher-end card and stick with it. This would vary based upon what was known as the best "bang for buck" high-end card of its time, and also depends on the second phenomenon.

The second is trying to select a particularly enduring chip/refresh. There are certain products that end up ahead of their time in terms of performance (and sometimes price) and "staying power". For instance, the Intel Nehalem/Westmere Socket 1366 CPUs and the X58 chipset were legendary in that they allowed competitive, satisfactory performance alongside Intel's later models for years and years (5+ into the future; even longer if you replaced the CPU with a Xeon). I used a Nehalem-powered rig as my main gaming PC until I swapped to X99 Haswell-E, for instance! On the mainstream side, many have used certain Sandy Bridge i5/i7 2000-series chips years later, and AMD's Phenom II X4 and X6 Black Editions were really strong, long-lasting chips in their era too. On the GPU side things could get wonky, but some models in recent years, particularly the AMD 290/290X, had a really long lifespan. Sometimes it's a bit of luck, sometimes the stars just align so to speak, but sometimes you can pick a major update (i.e., a significantly different process or chip design) that gives you a major leap forward. Conversely, sometimes a refresh or later version kind of falls flat and isn't worth it, making the previous generation, with similar performance, the better buy and worth sticking with for a while!

There are of course aberrations: the pricing of RAM and GPUs going on now is highly unusual for the last decade or two. Most price-jacking shortages were relatively limited (e.g., when the flooding in Thailand caused HDD prices to rise due to the inability to manufacture there), but things are a bit different now.

So ultimately, some future-proofing can be done if you're smart about it, but there are also many situations where buying way beyond your needs won't end up worthwhile, especially if you're picking out ultra-expensive items in markets that show a historical trend of being outdone easily by the next generation at a much lower price. Of course, things are always evolving, and we're finding that for a variety of reasons hardware is lasting longer than ever. I personally think this is a GREAT thing; I don't long for the '90s or circa 2000, when you could build a $3000 rig and not be able to play games on high settings a year or two later. We must also try to vote with our wallets when companies do ethical things with regard to both pricing and design, making good products in an open manner and allowing them to age gracefully with good support.

PS: Peripherals ARE one place you can often future-proof to some degree, and buying higher-end items is certainly worthwhile, especially for niches like arcade-style controllers and flight/space/mech sim sticks or HOTAS setups, but also right down to using a quality gamepad like the Steam Controller, Xbox One (Elite) controller, or DualShock 4.
 
My mantra has always been to buy the best pieces at the price points within your budget and sell old pieces to reduce the burden of the newer ones. Let me say, glad I bought my 1080 Ti before all this mining crap, lol.
 
I can't recall the name of the card to save my life, but I blew all my money on a Radeon card (9800 Pro maybe?) back in 2003-ish that lasted me something like six years without need for anything new.
Normally I'm on a 2-year cycle.
 
I can't recall the name of the card to save my life, but I blew all my money on a Radeon card (9800 Pro maybe?) back in 2003-ish that lasted me something like six years without need for anything new.
Normally I'm on a 2-year cycle.

Could have been the 9800XT. It was something absurd like $500 MSRP in 2003.
 
My buy cycles are usually:

5-6 years on CPU/Mobo/RAM/Monitor with a GPU refresh in the middle. So I'm getting a GPU every 2-3 years and a brand new system every 5-6.

I do about the same. I see no need to upgrade my i5-3570k, but I'm looking to replace my GTX 970 with a GTX 1080 Ti.
 
For the last decade, I've basically been riding the wave of previous generations. If I build, the new stuff is on the horizon or just released; if I buy, it's off-lease and 2+ years old. Either way, I never spend a lot.
 
I do about the same. I see no need to upgrade my i5-3570k, but I'm looking to replace my GTX 970 with a GTX 1080 Ti.

I normally don't even make that level of upgrade, even though that's a solid one. The last systems I've had were:

P4 2.8 GHz w/ NVIDIA 5600 and 512 MB RAM
Swapped main parts for an Athlon 64 3000+ with a 7800 GT and 2 GB RAM

then:

Asus P5Q w/ QX6850, built with an AMD 4870
"Upgraded" to a 5770 because I hated the overheating and crazy power draw on the 4870
Finally settled on a 7870 for a little while

Then I upgraded to my current system:
i5-4670K w/ Z87-G45
R9 280X
Upgraded to an NVIDIA 1070 Quick Silver Christmas of 2016
Got a 27" 1440p 144 Hz monitor Christmas of 2017
 
Yes, it has. I am still using an X58 chipset with a Xeon X5670 clocked at 4.4 GHz, and it can keep up easily with today's games. Video cards, IMO, usually do not age well unless you buy the top-tier card and hold on to it for a few years.
 
Best initial purchases, IMO, over the years:

i7-920 (came out in 2008, would still be fine today)

1000 W power supply (came out long ago, would still run any SLI system today)

High-end PC case (ATX never seems to change, except for USB ports)

Nice Corsair AIO radiator (they give existing customers free CPU socket bracket adapters every gen)

DDR4 (prices skyrocketed)
 
I have had different points where "future proofing" did pay off for me. I purchased a 6 MB Voodoo 1, the Canopus Pure3D. I used this card until the tail end of the Voodoo 3 era, when they were being discounted out, which let me game just fine without buying a Voodoo 2 or 3 while they were new and expensive. During the GeForce 400 series I bought four cards, and then NVIDIA suddenly changed their driver to drop support for quad SLI; it took a while with hacked BIOSes and such to get working again, but it avoided the need for the 500/600/700 series, as I gamed just fine for a long while. I suppose I got lucky this last time: I fully upgraded when Ryzen launched and got 64 GB of DDR4-3200, and later got two Vega 64 LEs for $559 each. I also bought a Vega Frontier Edition for $750 for my Mac Pro. All of which seems like a good idea at this point.
 
You could have future-proofed with just about any CPU from the last 10-ish years if you're willing to overclock. Not even much, just enough to give 'em a 10% boost. First-gen i7s are still absolutely fine for anything but serious gaming or video transcoding.
 
Every time I have bought something and said to myself, "this will last ages, it's futureproof," usually within a year or two that shit is broken or superseded by something much more "futureproof," and here we go again, back on the merry-go-round of buying shit that just does not last.
 
I'm still rocking a Socket 1155 2600K and the original 1155 motherboard (the ones they recalled). Slapped a 1080 in there and it still works just fine for me. I don't future-proof.
 
The RIVE with PCIe 3.0 and the 3930K appears to be holding up nicely, rivaled only by my old BP6 in terms of system longevity. Moore's Law is pretty dead, so CPU advancement has stagnated. However, other things, such as storage, have really come a long way in the past couple of years. Buy the best you can afford at the time and hope for the best. Incremental upgrades nowadays; no need to build a whole new system every six months like the old days.
 
Still running my Sandy Bridge i7-2700K from 2012 with the same motherboard, RAM, PSU, and case... so yes? It easily handles everything I throw at it on a 3440x1440 monitor at maximum details, coupled with a 1080 Ti.
 
As someone who has upgraded their CPU a couple of times without really wanting to (my wife gets my hand-me-downs), the gaming performance increases you get from a new CPU are minimal at best. Hell, some older setups actually do better because you can get a higher overall clock speed across a couple of cores; emulators love that. As long as your CPU isn't a bottleneck, it's an upgrade I'd save for last.

Another mostly future-proof upgrade is an SSD. Outside of pure benchmarks, an SSD from five years ago works absolutely fine right now. So well that I bet 99% of people wouldn't know the difference between an old one and a brand-new one.
 
I think there is a middle ground to future proofing. You've gotta figure out what you think you'll do with your computer over the next few years. Playing CS:GO or games like that doesn't require any future proofing, IMO.
 
When someone asks me for advice on computers, my first question is "What's your budget? Now add 10-20%." The second thing I say now is "buy a Mac," but when I was building, or advising on a build, I would always get everything that maxed or over-maxed the budget, making sure it was rock-solid, proven tech from proven manufacturers, with the best storage, double the recommended RAM, and the best video card possible under the budget. Future-proofing by buying the best hardware under the budget.

And before someone says "buying a Mac isn't future-proofing": yer wrong, because the average Mac user can go 10 years or better on the same hardware, barring failures. They build rock-solid machines, and the BSD-based OS is rock-solid, too.

That all said: I built my X58 i7-920 system in 2008 with everything I could afford at the time: great PSU, nice case, HD 4870, WD Black HD, and I only skimped on the RAM at 6 GB (which I regret). I have never OC'd it. It games well at 1080p and has handled most of what I've thrown at it, except that torrenting a lot while having a bunch of FF tabs open would bog down the memory (regerts). Basically, though, for 10 years it has been a rock-solid machine, and the recent upgrades (SSD, Xeon X5670) along with planned upgrades (more RAM, a better video card) should let me run for at least another three years easily, especially after I OC the Xeon. ;)

I.e., future-proofing DOES work, if you plan it properly.
 
I sort of lucked out with my future-proofing. When I made my purchase in 2013, I ended up going with an i7-3960X and four AMD R9 290X video cards (along with 32 GB of DDR3). That system lasted me until a couple of months ago, when I switched my monitor to a 4K setup. I still have the system (it's in the guest room to LAN on). Prior to that I used to switch every year.
 
Never, or at least almost never. Only past proofing worked for me :D
Let me explain. When Core 2 came out, I didn't have the money to replace the motherboard, CPU, VGA, and RAM all at once. So I got a motherboard that supported both AGP and PCIe video cards and could run both DDR and DDR2 RAM. That way I only had to buy the CPU and motherboard at the same time.

The only thing that you can successfully future proof nowadays is a PSU.

When I got my X79, I thought I could upgrade from a 3820 to a 3930 or 4930 later. But those CPUs never got cheaper; they simply disappeared, and it was "cheaper" to get an X99 board with a 6800K than to simply get a 4930K. The same is going on now: I'd be better off getting an X299 board with a 7820X than trying to get a 6900K or 5960X that fits my current motherboard.

They make sure you can't future proof anything.
 
I think you absolutely can, at least in the CPU/mobo/RAM department. There hasn't been much change in IPC since Sandy Bridge.
So really, if your goal was to save money, you could've just been using the same setup for the past 7-8 years (I actually have...). The only reason to upgrade to a new CPU right now is better performance per watt and efficiency, with things like native H.265 decoding. But if you have a 4 GHz+ machine anyway, software decoding is still more than sufficient despite all the improvements (especially considering how many things are still single-threaded rather than multi-threaded). And although there have been many cumulative gains, one could still argue that none of them are important enough to justify spending $600+ on (when you also consider the cost of the mobo and RAM), unless you just want to.
Kyle here on HardOCP proved that point by testing Kaby Lake vs. Sandy Bridge and found that, clock for clock, there basically wasn't much if any IPC gain. So I guess future proofing in this case "worked" for me?
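The clock-for-clock comparison is easy to express as a calculation. Here's a minimal sketch of the idea, using invented placeholder scores rather than HardOCP's actual numbers:

Code:
# Normalize a benchmark score by clock speed to get a crude IPC estimate.
# The scores and clocks below are hypothetical placeholders.
def ipc_estimate(score, clock_ghz):
    """Benchmark score per GHz -- a rough stand-in for IPC."""
    return score / clock_ghz

sandy = ipc_estimate(100.0, 4.5)  # hypothetical 2600K result at 4.5 GHz
kaby = ipc_estimate(105.0, 4.5)   # hypothetical 7700K result at the same clock

print(f"IPC gain: {(kaby / sandy - 1) * 100:.1f}%")  # ~5%: small, per the post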

GPUs, on the other hand, can't really be future-proofed at all. If you want to be "top of the line," you basically must upgrade every other release. Even if you buy the best of the best, by the third generation you'll be begging for an upgrade.
 