Has "future proofing" ever paid off for you?

I think you absolutely can, at least in the CPU/mobo/RAM department. There hasn't been much change in IPC since Sandy Bridge.
So really, if your goal was to save money, you could've just been using the same setup for the past 7-8 years (I have, actually...). The only reason to upgrade to a new CPU right now is better performance per watt and efficiency with things like native H.265 decoding. But if you have a 4GHz+ machine anyway, software decoding is still more than sufficient (especially considering how many things are still single-threaded rather than multi-threaded). And although there have been many cumulative gains, one could still argue that none of them are important enough to justify spending $600+ (when you also factor in the cost of a mobo and RAM), unless you just want to.
Kyle here on HardOCP proved that point by testing Kaby Lake vs Sandy Bridge, and found that, clock for clock, there was very little if any IPC gain. So I guess future proofing in this case "worked" for me?
That might be true for gaming, but as someone who does a lot of rendering and video editing, I'm always craving more computing power. Whoever said competition isn't driving prices down... just look at what happened to Intel CPU prices after Ryzen and Threadripper. I bet if those hadn't come out, or were significantly worse than they are, Intel would still be trying to sell us 6 cores for $700 and 8 cores for $1500.
 
I've gotten lucky a few times, that's about it....

The trusty ole' 9800 Pro lasted wayyyy too long.

Only other noteworthy component would have to be my i5-750.
 
16GB DDR4 sticks

64GB DDR4 for $300 :giggle:

Of course the GTX 1080 Ti was, no reason to buy a non-Ti or the 70-series :D
 
That might be true for gaming, but as someone who does a lot of rendering and video editing, I'm always craving more computing power...

The proof is in the pudding. IPC is IPC. Insert third "truism" statement here.
The long and the short of it: the numbers speak for themselves. There have been far bigger gains from going beyond 4 physical cores and from GPGPU than there have been from straight IPC. Meaning the point I made in my other post is still correct.

Just so you know my use case: I don't game at all. I've been spending far more time in Final Cut these days than Photoshop, and everything I do on my machine is photo/video related. My limitation is far more my GPU (a good part of that being its sad 1GB frame buffer), seconded only by the number of physical cores I have. Raw IPC is a non-issue. 4K playback in real time with all the grading crushes my GPU far more than it does anything to my CPUs.

In case it wasn't obvious yet, I'm on a Mac (Final Cut). And to show that you really can get away with older hardware, here are the Mac benchmarks for multi-core performance:
https://browser.geekbench.com/mac-benchmarks#2
There are still multiple 2013 and 2010 Mac Pros in the top 10, really only getting bested by the absolute fastest iMac Pros rocking 10+ core Xeons. If you had purchased a 12-core Westmere system in 2009/2010, that machine could still be chugging along just fine in terms of IPC on multi-core content. Simply add a Vega 56/64 or an NVIDIA 1080 Ti (if you're doing more in CUDA) with a PSU hack, and a nearly 10-year-old machine would crush anything you really need to do in video rendering.

Now, are there Threadripper systems (or Xeon systems, as shown with those iMacs) that are faster? Absolutely. But I could build out a 2010 Mac Pro and upgrade it to 2017 specs (2x 3.43GHz 6-core processors [which are more in the 25k+ region; you'll note they don't show up in those benchmark scores], 96GB of RAM, a Vega 64 FE, and even an LG 31MU97Z monitor) for less than $2k (keep in mind, I'd be buying literally every component used, in case you're looking at new prices). The most expensive part of that purchase is the GPU; take that off and it's $800 less.

To drive my point home, here's a video where a guy did exactly that, and then compared it via benchmarks to his Ryzen 1700X system. I'll spare you the part where he builds the machine, lists specs, etc., but it's basically a maxed-out 2009 Mac Pro with an RX 580 in it. His use case for the machine was purely to transcode his Canon C200 Cinema RAW Light files. So if you wonder why he built a 9TB RAID 0, it's because nothing stays on those HDs after he's done rendering. But anyway.

EDIT: It won't allow me to have the timecode be a part of the media clip. Skip to 11:24. Or 12:18 if you want to go straight to benchmarks.


If you don't like that route, a 2013 trashcan Mac Pro can be brought up to 2018 specs by buying a Thunderbolt GPU enclosure and using any graphics card you want with it. And, as noted, it's plenty quick.

===

tl;dr: The point of all of this: in terms of IPC, 8+ year old machines have been at least some degree of "future proof". Is there faster hardware? Yes; if that was your whole point, then obviously you're right. But so much faster that it's made previous systems obsolete? Not by a long shot. These old machines are still capable of quite a bit.
 
I typically only pay $400 every year to every other year in upgrades to stay current, rather than $1200-1500 every 5-6 years like some of my friends. It's the same amount of money over time, but I get to stay "top of the line" the whole time, whereas they like the feel of "leapfrogging".
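Rough numbers, just to show the math (the figures below are assumed midpoints of the ranges above, not anyone's actual spend):

# Back-of-the-envelope comparison of the two upgrade strategies over ~11 years.
# Dollar amounts and intervals are illustrative midpoints of the ranges quoted above.
YEARS = 11
rolling_spend = 400 * (YEARS / 1.5)    # ~$400 every year-and-a-half on average
leapfrog_spend = 1350 * (YEARS / 5.5)  # ~$1350 (midpoint of $1200-1500) every 5.5 years
print(f"Rolling:  ${rolling_spend:,.0f}")   # ~$2,933
print(f"Leapfrog: ${leapfrog_spend:,.0f}")  # ~$2,700
# Same ballpark either way; the difference is staying current continuously vs. big jumps.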

I think there are benefits to both, larger impact and all that.

I do what your friends do: usually every 5 years or so I buy a whole new system. This time was slightly different because I built my system before the 1080 Ti was out, so I built it with a regular 1080. Once the 1080 Ti came out I bought that and sold my 1080.
 
Yes, a smart buy up front bests replacement costs.
My system is about 7 years old now. I replaced the graphics card about 2 years ago because of a hardware failure, and upgraded the SSD out of necessity.

I'd be happy to compare performance vs. TCO.
 
Future proofing has generally worked out for me, but that's only because I stay in the AMD pond.

Generally I buy a good motherboard with a good amount of RAM and can typically get 2-3 generations of CPU out of it. (Many of the better AM2 boards, for example, supported AM2+ chips with a BIOS upgrade; same with AM3 and AM3+, and AMD had a series of Phenom IIs with both DDR2 and DDR3 memory controllers to bridge AM2+ and AM3 as well.) Every year or two, a major component gets an upgrade (mainboard + RAM, CPU, or GPU), and that has worked out pretty well for me. When I upgrade a component, it typically trickles down to the wife, then the kids, then the Media Center. I game regularly at 1080p and have never had a problem maintaining max detail in the games I play at that resolution at >30 fps (which, since FPS games make me motion sick, is plenty). Hardware isn't usually "retired" until there is something it can't run - the most recent retirement being a Radeon 6950 two years ago. My wife had it at the time, and it would not play Star Wars Battlefront (the release before the current one) because the game used some feature the card didn't have and crashed at startup. That card wasn't retired because it was slow...

Of course the downside is that sometimes the platform jumps are less "graduated" than Intel guys are used to, like when I went to Ryzen from an FX-8320 - the sudden lack of PCI slots came out of left field.
 
The proof is in the pudding. IPC is IPC... tl;dr: in terms of IPC, 8+ year old machines have been at least some degree of "future proof"... These old machines are still capable of quite a bit.

This shit right here. Futureproofing is about "bang for the buck", especially on the back end. I figure 90% of Windows PCs get tossed because Windows itself just plain bricks them -- "planned obsolescence", anyone?

The video was great. I love those old Mac Pro cases; they're built like tanks, and obviously still work great. I'm sure my math is way off, but that last comparo made me think he got 22 seconds for about $50 a second. Yeah, maybe high, but I'll bet that's ballpark, and if you can't wait 22 seconds more to go cheap, maybe you need to up your ADHD meds.
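For anyone checking the math, it falls out of something like this (the price gap is a guess on my part; only the 22 seconds and the ~$50/second figures come from the post above):

# Cost per second saved: assumed build-cost gap divided by render time saved.
time_saved_s = 22        # seconds faster, per the comparison above
price_gap_usd = 1100     # ASSUMED difference in build cost, for illustration only
print(f"${price_gap_usd / time_saved_s:.0f} per second saved")  # ~$50/second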

Great post.
 
This shit right here. Futureproofing is about "bang for the buck", especially on the back end... Great post.

As noted in the video, it also depends on what it is you're doing. In the synthetic benchmarks they're similar, and for transcoding he notes that the Mac Pro is faster (although it's technically different codecs). Also, I'd bet FCPX would run significantly faster than Premiere (although that's an Apples-to-Oranges comparison) because Premiere is so poorly optimized (which is Adobe's issue, not a Mac/PC issue).

So the old Mac Pro isn't definitively worse in all tasks, and the margins by which it wins or loses aren't significant, at least once the heavy weight of cost is factored in.
 
Worth it financially? Nah, not really.

Peace of mind? Fun? Bragging rights? Shits and Giggles? Yes to every one of those.
 
Well, when the cheapest one is already too expensive for you, anything is future-proof. As in: the future comes, it's still working, and you don't have money for a new one anyway so yeah!

I grew up refurbishing old hardware because of that. I could fire up my Athlon X2 4400+ from 2010 right now and it would be OK.

I praise the Lord for the day netbooks started being a thing. That prompted software developers to step up and make things go faster on crappy hardware, and I reap the rewards to this day. =D
 
My future proofing stretches only until the next-gen GPU.
Say I buy a 1080 Ti when it's released; I plan to keep it until the 1180 Ti.

CPU and RAM etc I swap out when I feel like my stuff has become old or the new tech is much cooler.

For example my CPU history (that I can remember)

i7 920 > 3770k > 6700k > 8350k

I know, I know, going from an i7 to an i3, but the 8350K is actually a fantastic gaming CPU. No difference from the best i7 out there, especially since I game at 4K, so the GPU is more important.
 
I'm still using an i5-3570K with 2x8GB of RAM and a GTX 970, and they work just fine in all the games I regularly play, so yes, I think my purchase choices have been holding up just fine.

I do want to upgrade to an 8 core with 16GB of RAM and a GTX 1070Ti, but I am in no hurry.
 
I dunno, I dropped $1500 in 2008 on my rig, and it's still going strong...

Doesn't mean it necessarily saved you any money.

The classic example is the folks who go out and spend on top-tier video cards to play on 1080p monitors. You could just as easily have gotten a mid-tier card, then in a couple of years gotten another mid-tier (which comes close to what the top tier from a couple of years ago delivered), and probably saved $200 or more overall, for what equates to about the same gaming experience.
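As a rough sketch (the card prices are assumptions picked to illustrate the point, not real quotes):

# One top-tier card now vs. two mid-tier cards over the same stretch.
top_tier_now = 700            # assumed flagship price at launch
mid_tier_now = 250            # assumed mid-range price at launch
mid_tier_later = 250          # assumed next-gen mid-range a couple of years on,
                              # roughly matching the old flagship's performance
two_mid_tiers = mid_tier_now + mid_tier_later
print(f"Top tier once:  ${top_tier_now}")
print(f"Mid tier twice: ${two_mid_tiers} (saves ${top_tier_now - two_mid_tiers})")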

Same goes for people who overload on RAM just for the sake of "future proofing": folks who just browse the web and play the occasional video game loading up on 32GB or 64GB of RAM, just in case... rather than going with what they know they need and then upgrading if and when they need more.

Not saying you do that. And that doesn't account for variations in generations and pricing, but overall it's more or less true. Some of those things are a gamble - RAM and GPU prices have been pretty crazy lately - but the general trend has been that $/byte and $/performance go down over time.
 
I find that upgrading everything all at once has paid off for me. Not sure of the exact cost, but with games not being as demanding nowadays, the amount of time you can go between upgrades is increasing, IMO. I mean, it's not like 2005, when we had Doom 3, Half-Life 2, and others, and then later Crysis, pushing us to get the highest-end hardware we could get our hands on. Hell, even the system I built for Doom 3 and Half-Life 2 I used for 5 years, and the one after that as well.

My predicament this time is different. I built my current rig when my CPU was the more affordable hexa-core processor on LGA 2011-v3, but since it looks like there are no new CPUs coming out for the X99 platform, I may be upgrading my CPU and motherboard earlier, which I've never done before. The 1080 Ti I suspect will be good for at least 2-3 years, more than likely even longer. I mostly game on my desktop nowadays, so short of getting a game that is heavily CPU intensive, I don't think I'll be upgrading any time soon, and even then my CPU is more than adequate.
 
I built the rig in my sig in 2012 and the only new thing I've had to upgrade along the way was the graphics card. And although I didn't buy my 1080 Ti to mine with, it's been a great investment!
 
I purchased a high end PSU, case and peripherals planning to keep them long term.

Beyond that, I pretty much follow a 3-year refresh rule, where my rig gets swapped out with new parts once performance has increased enough. I think my current rig will go longer, as it's already 2.5 years old and going strong.
 
The classic example is the folks who go out and spend on top-tier video cards to play on 1080p monitors... Same goes for people who overload on RAM just for the sake of "future proofing"...
I don't play anything over 1080p, so it's no big deal for my HD 4870, but that card's really long in the tooth now, so it's due. I won't upgrade to 4K until I do that with my TV, though, which may be a long while yet.

As for RAM: I wish I had upgraded a few years ago based on my usage, b/c 6GB ain't cutting it anymore, even for web browsing and HTPC and torrents. I think that's gonna be the next upgrade...
 
Until the last few weeks, it was an i7-920 (never overclocked), ASUS P6T Deluxe, 6GB RAM, ASUS HD4870 1GB, WD 1TB Black, Samsung 1TB, a couple of other storage drives, and a Corsair 650W PSU, in a Cooler Master CM690 gen1 case. The Samsung went in later; the other drives came out of my old 2003 Dell XPS, as did the optical drives.

Typically I play the HL/HL2 series, DOOM 3, some older games, and Minecraft. You may scoff at that, but that's me, and the machine is more than adequate with that setup. I also use a Hisense 55" 1080p LED HDTV (formerly a 58" Samsung plasma HDTV, and before that a Westinghouse 24" 1980x1050 monitor inherited from the XPS -- which lasted from 2003 to 2009 with one CPU and one video card upgrade, both used).

The only reason I even began this quest was that Micron 2TB SSD deal from Amazon for $370; it was too good to pass up. That got me thinking about more upgrades, which led to the X5670 (best bang for the buck at $40). Toss in a Hyper 212 at $25, a half dozen fans at ~$6 each, a $30 optical drive, and a $30 fan controller, and I'm out maybe $500 for a machine that should last another 2-3 years. And, in this day and age, at the salaries they're paying olde fartes like me, it's the ONLY way I can go.
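If you add it up (using the prices above; the total is approximate):

# Quick sum of the upgrade parts listed above.
parts = {
    "Micron 2TB SSD": 370,
    "Xeon X5670": 40,
    "Hyper 212": 25,
    "6 case fans": 6 * 6,
    "optical drive": 30,
    "fan controller": 30,
}
print(f"Total: ${sum(parts.values())}")  # $531, call it ~$500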
 
Until the last few weeks, it was an i7-920 (never overclocked), ASUS P6T Deluxe, 6GB RAM, ASUS HD4870 1GB...
Yeah, I had one of those as well. I tried to future proof by getting a 930. That didn't turn out that well, as I got less of an OC than most typical 920s would get. Could it still run games today? Sure. Would I still want to be on it? NOPE. I have a bunch of 950s at work, and those -- well, even the 980Xs feel ancient by now. They feel slow as hell compared to anything modern. I don't think X58 was a particularly great platform from Intel. X79, however, is much more future proof, I think. Sometimes I wish I'd kept my 3820 and not upgraded to the 6800K. I felt I got more from going 930 -> 3820 than I ever got from 3820 -> 6800K.
 
Generally no. Just look at the drop in CPU prices with the release of AMD's new parts. There are a few very rare exceptions - while not for future proofing, I picked up a 1070 when it first came out, and due to the current scarcity of parts it was quite a bit cheaper than today's GPUs - but that's the only case I can think of where components didn't actually drop 20% to 50%. I guess RAM prices have been all over the place the last few years, and Facebook/Google have kept SSD prices high.
 
16GB of RAM + 1440p in 2011, plus an OCZ V3 and a 2600K, was and still is a pretty good future-proof choice.
 
Generally no. Just look at the drop in CPU prices with the release of AMD's new parts...

That really depends on how you define "future proofing." Personally, I look at it as the "sweet spot" convergence of price (i.e., what I paid for it) vs. longevity (how long I can reasonably expect it to perform at an acceptable level). Being on the bleeding edge of performance is not normally desirable, because the bleeding edge is super expensive. It'll probably last a while, but the outlay is significant. But being on the low end of the scale is equally bad, because that $100 graphics card is probably not going to be doing very well in a year either.

Monitor:
I've been at 1080p for a really long time now. I see very little reason to get a higher-resolution monitor. Side-by-side demos of 1080p, 1440p and 4K images and gameplay don't really look very different to me, despite the order of magnitude of GPU/CPU power required for each step up. Thus I got a good 1080p LED-backlit monitor (an Asus for ~$120), and with brightness set to ~75% or so, it should last damn near forever. The diminishing returns of higher-resolution gaming make it unlikely I'll ever want a >1080p screen. HDR is kind of tempting though (maybe next year), and as it happens, the RX 480 is already ready to go on that front...

GPU:
There is literally NOTHING out there that an R9 290 can't render at 1080p at full detail. History: my wife and I each have an RX 480 8GB card. I got mine new for $200 to replace the 6950 my wife was using at the time (she got the R9 290 at that point) - and that 6950 was replaced because there started to be games it could not run, not because it was slow at 1080p. As a matter of fact, I gifted it to my cousin's wife, who only plays Neverwinter Nights Online and occasionally Smite, to replace the integrated Core i5 graphics on her system, and she is very happy with it to this day.

The performance level of the RX 480 is only marginally faster than the R9 290X, so it was really just a side-grade for me. My wife got her RX 480 only because I got a deal on one from a friend who tried the card and didn't like it (he prefers nVidia cards in general, so...). He let me have it last November for $150 (it doesn't hurt that he hates cryptominers with a passion). This was super useful, because I had built a gaming rig for my kids off an old Skylake i5 I got from another friend (Z170 mainboard, CPU and 16GB DDR4 for $150) and couldn't get a decent graphics card for it, so the kids at that point inherited the R9 290. They now no longer plague the wife's computer, which means she is no longer bitching at me about how the kids keep messing it up.

I bought that R9 290 on sale at $375 in March of 2014 in anticipation of Dragon Age: Inquisition. That card is still going strong in my children's system, and although it was more $$$ than I normally am willing to pay, I do not anticipate it needing to be replaced in the next few years. I'm getting my money out of all of these cards.

CPU:
I paid ~$250 for the Vishera FX 8320 when it came out. It was a drop-in replacement for my Phenom II X6 1090T (which migrated to the wife at the time, of course). Despite what Intel fanboys will tell you, this chip was perfectly adequate for gaming. In combination with the R9 290, there wasn't any game I couldn't play at 1080p full detail on this rig at 30 FPS or higher (and remember from a prior post, I don't play FPS games, so 60 FPS is not necessary for me at all). It just so happens that video encoding - the most demanding thing I actually do on my computer with any regularity - also worked pretty darn well on that processor.

I bought a Ryzen R7 1700 (OC'd to 3.8, BTW) last March (hey, Kyle dared me to do it) - not because I needed it, but because I'm a nerd and wanted to play with the new toys. I also had the money to do it at that time, so why not? I actually ended up selling the FX 8320 setup several months later for a good price, and that allowed me to pay the difference to get my wife onto a Ryzen R5 1600X. With the way DDR4 prices have spiked, I'm glad I did this when I did - I paid ~$250 in total for 4x8GB sticks.

Her Phenom II X6 1090T, mainboard, and 16GB of DDR3 RAM went to the Media Center PC in the living room, and got drop-in upgraded to an FX 8350 this past December courtesy of a deal in the FS/FT forums here ($50). This machine runs Plex and Kodi, and it so happens that Plex is another one of those apps that prefers more cores to faster cores. The Phenom II X6 was adequate, but the FX 8350 is ideal for this purpose. I gave the Phenom II to my dad. I fully expect these Ryzen systems to last at least as long as the Phenom II and FX chips they replaced. The kids on the Core i5 6700K should be alright for a good little while as well.

Mainboard and RAM:
Mind you that my primary motivation for CPUs has been video encoding (x264), and this is an activity where more cores are generally better than faster cores. There's really not a lot to say here. AMD has generally provided more "good enough" cores for less money, and their platform support cycles are way longer than Intel's - plain and simple. I got a good Asus AM2+ board with (I think) an Athlon X3 CPU, which I later replaced with a Phenom II X4 AS A DROP-IN REPLACEMENT. That system later migrated to the wife, and the good Asus AM3 mainboard I got for my new Phenom II X6 later got an FX 8320 AS A DROP-IN REPLACEMENT. That AM2+ board eventually went to her brother, because he had a chip and more DDR2 RAM for it, so I got my wife a good Asus AM3 board with 8GB of DDR3 as well - she kept her Phenom II X4 chip from the AM2+ board on her new AM3 board. When DDR3 prices dropped really low, each system ended up with 16GB of RAM, and everything I have has had 16GB of RAM ever since.


tldr; my benchmark for "future proofing" is how much I had to pay vs. how long it is before I (or my wife) can't play a game I want to play at max detail at 1080p. So far, I've been doing FAN-FUCKING-TASTIC by this metric for several years.
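If you wanted to actually put a number on that metric, it's just dollars per year of useful service at max-detail 1080p (the 5-year figure below is an assumption for illustration; the $375 is the R9 290 price from above):

# "Future proofing" score: purchase price divided by years of useful service.
def dollars_per_year(purchase_price, years_of_service):
    return purchase_price / years_of_service

print(f"${dollars_per_year(375, 5):.0f}/year")  # the $375 R9 290 over an assumed 5 years = $75/year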
 
I consider future proofing to be buying more than you need today with the expectation that it's cheaper to buy more today than to buy another system tomorrow. The reason I think this normally fails is that the price gap between what you know you need today and what you might need tomorrow is usually 30% to 50%, and what you need tomorrow will usually be relatively cheap tomorrow compared to today's prices. Today's games require more CPU/GPU than yesterday's games, but IMHO 8 years ago you could have gotten a 2500 i5 and it would still be more than adequate for most games today. There was little need to buy an i7 8 years ago in case you needed it today. An i5 today is 20-30% faster than an i5 from 8 years ago, and an i7 is 20% cheaper than an i7 from 8 years ago. Not to mention that if you had made the more expensive purchase, there's a chance it would have died before it became necessary.
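Sketched out with assumed prices (the percentages follow the paragraph above; the base price is made up):

# The premium for buying "more than you need" today vs. what that headroom is worth later.
i5_then = 200                   # assumed price of the part you actually needed
i7_then = i5_then * 1.4         # ~40% premium (middle of the 30-50% range) for headroom
i7_now = i7_then * 0.8          # same class of chip is ~20% cheaper today
premium_paid = i7_then - i5_then
print(f"Premium paid up front: ${premium_paid:.0f}")   # $80 spent on headroom
print(f"Same-class chip today: ${i7_now:.0f}")         # cheaper by the time you'd actually need it
# If the i5 stayed adequate the whole time (as argued above), that premium bought nothing.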
-
GPUs have been a bit different, in combination with DirectX changes and the radical increase in polygons in modern games.
-
In general the issue with future proofing is threefold: components wear out before you need them; prices drop faster than the premium you paid for the more expensive parts; and standards change, so what is available today is simply not suitable for tomorrow. In most cases I think 'future proofing' is a huge mistake with regard to total $$ spent.
-
For myself personally, I usually keep the mobo/RAM/CPU 7 to 12 years and replace the GPU as needed. I have two 2500K (Sandy Bridge) systems and one Haswell Refresh system. I will be upgrading one of the 2500K systems this year or next year (the system I use as a server) since the mobo is showing some issues and newer parts use a bit less power (this system does not have a dedicated GPU and is 8 years old - if they still made Sandy Bridge motherboards I would probably just replace the board).
-
BTW, I always take these future proofing questions in the context of gaming, but perhaps that is not always the case.

 
I remember when Flight Simulator X came out. It was super awesome. But no matter what, there wasn't a computer that could play the game on high settings with a reasonable frame rate. Low or medium settings with lots of graphics options turned off gave decent performance. You could buy a $10,000 rig, and it wouldn't be able to get 60 FPS at high settings. Now, 10 years later, that game can still bring a modern PC to its knees. There was no future proofing for that game!

I generally don't bother buying or playing new games. I mean, I'm playing EVE Online, and that game's over 10 years old now as well. So my upgrade cycle is more of a "when it breaks, I fix it" type of deal. But I was planning on breaking that cycle by upgrading both of my machines: my gaming rig with either just a new graphics card, or retiring it to server duties and building a new gaming PC. But with graphics card prices so stupidly high, and the stupid security flaws in Intel and AMD CPUs, I decided to hold off on either option.

Personally, I don't usually care about 'futureproofing', I just buy the best I can afford at the time and expect it to last for a while.
 
No, I buy stuff that I can afford when I need it. Usually this is a rolling upgrade.

I always have a budget with a little "fat" in it, and I always purchase within the budget. I know what is out there ('cause I like knowing) and I purchase around the price points that get me the level of tech I need to do what I need to do. Typically that means I don't get the fastest gear; it means I hit a good midrange.

I don't buy top-end motherboards; I feel they're completely superfluous for most users - you get 95-98% of the performance and features from a midrange board that you do from a high-end board. The exception is if a board has a very specific feature you need (the ASRock Z97 Extreme6 has an M.2/U.2 x4 slot, for instance). My current motherboard is the Asus X370 Prime Pro, because it has more power phases than the competition at the same price, and an Intel NIC, which the competition doesn't have. I don't need the X370 chipset; however, to get what I wanted I needed to go the extra mile for one.

I don't buy workstation/top-end processors most of the time. The fastest high-end processor I've purchased near release was the i7-5775C. I was doing non-linear video editing at the time, and it was fast enough to meet my needs. I actually found that Excel was harder on the processor than video editing - go figure.

Memory-wise, it's about what I'm running at the time. There's usually a sweet spot re: price/speed - I try to find it and go with it.

Graphics cards are a bit of a bugbear - I like having enough oomph for CUDA/OpenCL but typically don't run that workload very often. It's a paradox. I do this upgrade separately and don't buy the best; e.g., I run a Vega 56 at the moment, which I bought before the price hike.

As I've gotten older, speed doesn't matter as much. I don't upgrade as often, so my cycle is about 3-4 years vs 2-3 years.

I also don't buy games at release anymore. I buy them 4-5 months after release if I want to play them.
 
I've had a couple of lucky purchases.

Dual 2.8GHz P4 Xeon. I bought it for CGI work, not future proofing, but it just so happened to bridge the gap into when dual-core CPUs became mainstream. I stopped doing much CGI but that machine was a beast for general usage and games for far longer than a single P4 build would have been.

GTX 260. It was amazing on release and ran everything with ease. The extended console cycle with the Xbox 360 and PS3 almost completely stagnated high-end graphics, and I was happy with that card for about five years. The GTX 760 I finally upgraded to I was unhappy with almost from day one; turn on the DX11 shiny and it was back to almost the same performance as the 260.

GTX 1080. This was the most "future proofing" of my purchases instead of lucking out. Bought it at release when I still had a 1080p monitor knowing I would be getting into VR and 4K. I never would have guessed its value would have gone way *up*. I suspect I'm going to be pretty tempted to upgrade to NVIDIA's next cards though as it can struggle in 4K and in some VR titles. If the prices stay absurd maybe it'll be like the 260 where I'll be able to run with it for another couple of years while things calm down.
 
Then you don't future proof. How is this relevant to the thread?


No, I buy stuff that I can afford when I need it. Usually this is a rolling upgrade...
 
My version of future proofing is going as high-end as I can afford, then upgrading the GPU every other generation. I skipped the 10 series but will get the 11 series.

Usually I'll only upgrade the CPU if there is a noticeable FPS increase or a new feature. This most recent upgrade was because of quad-channel DDR4.

I think it's a cost-effective strategy; if nothing else, it lessens the blow of buying a new CPU/mobo/GPU/RAM/PSU every 5 years.
 
I usually replace the mobo/CPU/memory every 6-8 years and the GPU every 2-3. My goal is always the sweet spot between price and performance, not top-end gear.
Future proofing is always a crap shoot and I've mostly lost, usually by buying better motherboards than needed for features I think I'll need later... but really don't.
SLI-ready stuff has burned me at least twice - I always plan on doing it, but it never ends up being worthwhile. Same with playing with OCing.
 
I only "Future Proof" a couple things:
- Case
- Power Supply

All the rest I plan on changing before the others. That being said, I will spend more money on these parts.
 
I would say that it would probably be a better strategy now than it was historically. We have MUCH smaller leaps in features and tech (especially in the GPU department). Things do still evolve, get faster, use less power, etc., but no huge leaps lately like when hardware transform and lighting was put into the first GeForce GPUs. It kind of works both ways, though. You could buy that GeForce 256 for that killer feature, but it wasn't implemented in much until later, when faster GPUs were out. So you'd kind of waste a bit of money up front that way AND later on, when you inevitably buy that shiny new card once the features are actually implemented in games you want to play. Things were way more turbulent several years back. Now, you can stick with a fairly decent system for quite a while. I'm still running a 4690K + 1070 + 16GB, which I have been since the 1070 was first released, and I've really kind of stopped even following CPUs and GPUs for a bit. I'll pick back up in another year, I'd think. This would have been unheard of for my friends and me, say, 7-10 years ago. We ALWAYS jumped on the latest back then. Now I look for the sweet spot (that others have mentioned), usually a mid-range CPU and an upper-mid GPU, and I'm pretty happy these days.
 
Not really. I've always bought mid-range until recently, since I'm 5 years out of college and have substantially more savings and a low debt ratio.

Hopefully these high-end parts will last me a while though; I don't feel any sort of temptation to upgrade further anyways :)
 
Other than buying extra RAM, nope.

I'd actually say one of my worst purchases was CrossFire HD 6950s for future proofing. God, what a pain in the dick that was to get working..........

Honestly, I'd say they basically never worked during their useful life. This was the generation where AMD got called on the carpet over CrossFire microstutter.
 
I figure 90% of Windows PCs get tossed because Windows itself just plain bricks them -- "planned obsolescence", anyone?

Is this a thing?

Have more than one system that started with Windows 7, saw 8, 8.1, and now multiple versions of 10, all staying up to date. Have even older systems (Lynnfield-era) at work with 7.
 
Future-proofing: it's all a gamble...

Recently, the 2600K was probably the best example. I got the 2500K even though I could have picked up the 2600K, and regretted it years later. The CPU really was topped out and holding stuff back, and at the same time simply not as smooth due to lacking the extra threaded resources.

Beyond that, good enclosures, good power supplies (even lower-wattage ones; my X650 is approaching a decade old and has seen multiple multi-GPU setups!), good peripherals, and good audio stuff last.

CPUs (and accompanying motherboards and RAM) will be subject to the whims of the industry. Intel has had a hard time migrating processes, for example, which is one of the reasons that the 7- and 8-series were still on the Skylake arch, and why Intel popped out a six-core (and now potentially an eight-core) at 14nm when they'd planned to move to eight-core on a smaller process before now. Memory prices have been a shit-show all along, and memory technology shifts seem to have sped up a little. DDR5 will be interesting.

GPUs are a bit more steady though; unfortunately, that means that they're hard to 'future proof'. For those that don't have moving performance targets, i.e. are happy with 1080p60, that means that the price of entry for high-setting AAA-game performance has steadily dropped, while those of us that are interested in higher detail and higher framerate/higher motion resolution will have to keep up.
 