Has "future proofing" ever paid off for you?

I have found that future proofing is hard, as over time I upgrade my computers based on how I use them. Still, about 3 years between upgrades has been a happy place for the most part.
 
Not quite relevant to the thread, but I believe the reason for the recent change in Intel's consumer processors towards more cores is AMD getting their act together and finally providing real competition.

Future-proofing: it's all a gamble.

Recently, the 2600k was probably the best example. I got the 2500k even though I could have picked up the 2600k, and regretted it years later. The CPU really was topped out and holding things back, and at the same time it simply wasn't as smooth due to lacking the extra threaded resources.

Beyond that, good enclosures, good power supplies (even lower-wattage, my X650 is approaching a decade old and has seen multiple multi-GPU setups!), good peripherals and good audio stuff last.

CPUs (and accompanying motherboards and RAM) will be subject to the whims of the industry. Intel has had a hard time migrating processes, for example, which is one of the reasons that the 7- and 8-series were still on the Skylake arch, and why Intel popped out a six-core (and now potentially an eight-core) at 14nm when they'd planned to move to eight-core on a smaller process before now. Memory prices have been a shit-show all along, and memory technology shifts seem to have sped up a little. DDR5 will be interesting.

GPU progress is a bit steadier, though; unfortunately, that means they're hard to 'future proof'. For those who don't have moving performance targets, i.e. are happy with 1080p60, it means the price of entry for high-setting AAA-game performance has steadily dropped, while those of us interested in higher detail and higher framerate/higher motion resolution will have to keep up.
 

Given how quickly the 8700k came out, generally speaking, it had to have been in development before Ryzen performance (or anything, really) was known. Intel may have kept it in reserve, but it was also clear that the 6700/7700 were not driving sales from a performance perspective. Without their new node and new arch, six cores was about all Intel could do.
 
Well, the server processors had quite a few cores compared to the consumer versions, so I think it was obvious that Intel knew how to make such chips; they just didn't, probably to avoid undercutting the server chips, which are quite expensive. Of course I could be wrong - they could have been planning the 8700k release for years.
 

Sure! But those have all been quad-channel HEDT parts with no GPU, and all have run at lower clockspeeds. The six-core 8-series CPUs look a whole lot more like Sky/KabyLake CPUs with two more cores tacked on than a stepped-down Xeon.

Another perspective: yes, Intel has 'known' how to make higher-core-count parts for quite some time. The challenge on the desktop is TDP; despite our enthusiast outlook at the [H], the customers Intel most has to please are Dell and HP. Getting six cores into a desktop package acceptable to the OEMs was their challenge, and while we did not see significant performance improvements on the desktop until the 8700k, there were significant improvements in server and mobile products. The six-core CoffeeLakes are most likely the product of Intel back-porting to the desktop the efficiency work that, for example, enabled their quad-core ultrabook CPUs.
 
I bought a 1000-watt power supply back in the 6970 days, and I still have it running after five platform changes and countless SLI and CrossFire setups.

Actually, I think I bought it for 3-way GTX 260s.
 
My buying habits usually come with a 'feel like it' sort of attitude. I just enjoy the upgrade and building process and the knowledge I continue to gain from it.
 
The 1TB 740 Evo SSD I got YEARS ago was by far the best single PC component I ever bought. I think I am on my third if not fourth build with that drive.
 
The Q6600 and the 2600k each lasted me through multiple GPU upgrades. Both offered extremely strong value over time when overclocked. It's always worth it to buy a high-quality power supply that you can use for 10-15 years.
 
I don't try to future proof often.
Some purchases have lasted a very long time by luck, although some need manual work to keep their value.

The 2500K lasted me 5 years; I didn't expect that.
I was upgrading every 1 to 3 years before that.

My 1080p Optoma projector is 5 years old, gets well over 8 hours' use per day on average, and the bulbs are only £50 each from Amazon at 4,500+ hours apiece.
The only downside is that I had to replace a fan after just over 2 years, which involved almost complete disassembly. Going through Optoma would have cost £160; my cost was £20.
It needs taking apart every 2 years to clean out dust because there are no filters.
Otherwise it still operates like new.
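
For a rough sense of what that works out to, here's a quick back-of-the-envelope sketch in Python, assuming the ~8 hours/day and £50 per 4,500-hour bulb figures above (my own averages, not official numbers):

```python
# Rough projector bulb running-cost estimate, based on the figures above
# (assumed averages, not measurements).
hours_per_day = 8        # average daily use
bulb_life_hours = 4500   # rated bulb life in hours
bulb_price_gbp = 50      # replacement bulb price

days_per_bulb = bulb_life_hours / hours_per_day   # ~562 days
years_per_bulb = days_per_bulb / 365              # ~1.5 years
cost_per_year = bulb_price_gbp / years_per_bulb   # ~£32 per year

print(f"A bulb lasts about {years_per_bulb:.1f} years")
print(f"Bulbs cost roughly £{cost_per_year:.0f} per year")
```

So even with heavy daily use, the bulbs work out to roughly £30-odd a year.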

My Samsung laser printer, only £53 new, is nearly 10 years old and still works great, although the toner powder now needs evening out/levelling in its cartridge before use or it complains.
This is one of my few intentionally future-proof purchases, and it worked: better quality, way cheaper, and an easier experience to boot.
My previous inkjet printers cost a bomb in ink and refused to work without half an hour or more of troubleshooting whenever I needed a quick print job. Sod that.

My Corsair AX750 PSU is now over 7 years old and is still perfect for my needs.
The 7-year warranty and Seasonic internals sold it.
I intended to make this one last and it hasn't disappointed.

LG 2780D 27" 3D monitor is almost 6 years old and is a great basic PC display when not using the projector.
I didnt plan on keeping it this long.
I have no need for 4K browsing so its a keeper.


I don't consider graphics cards future-proof because I will only get 1.5 to 3 years from each.
Not a lot of future there.
 
Yeah, for posterity: my Windows 2500k lasted 7 years and my Linux 2500k is still ticking. Technically my Windows 2500k is still ticking too, but it was moved to a spare computer I only use once every 2 or 3 months. I try to get 2 or 3 years per Windows GPU, but I suspect the 1070 might last 4 or 5 as I'm not in a hurry for 4K. Monitors are random - sometimes they last 10 years and sometimes 3. I've had 2 Dells just outright fail after 4 or 5 years (the 20-incher turned out to have a known flawed component - well known to those who kept it 4.5 years); another of my Dells is too dim to be of much use after 5.5 years. Not sure how long my AOC Q2770PQU 27-inchers will last, but so far they've made it 4+ years.
-
I think I already said future proofing is usually not cost effective (imho).
 
Don't know; I don't think I've kept any particular build long enough to find out...
 