What is the performance jump which triggers you to buy new card?

I don't look for a specific performance bump. Mainly I wait for a game that I really want to play (Crysis and Battlefield 4, for example) and build a PC that can run that game at maximum settings at a smooth FPS.
 
I'd say the console generation helped me hold off on upgrading my GTX 570. I had been playing games left, right, and centre without even looking at the spec requirements, and was able to run them without many issues apart from a few oddities (e.g. Ubersampling in The Witcher 2, CoH2). So for me, game requirements over the past three years have been tolerable (on a 60Hz 1080p monitor).

But I think I have hit a point where a new card today is outdated tomorrow, which is why I went all out with SLI (for 144Hz 1440p). Will I regret it? I don't know yet; time will tell, though I think I will lol...
 
40%. That allows me to sell my current card before it loses too much value and put the proceeds toward a new card, so I usually end up paying only $100-$150 or so for the new one. It's nice.
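The trade-in math above can be sketched in a couple of lines (all prices below are made-up placeholders, not actual market values):

```python
# Net out-of-pocket cost when reselling the old card toward the new one.
# Both prices are hypothetical examples, not real quotes.
new_card_price = 330.0  # assumed street price of the replacement card
resale_value = 200.0    # assumed resale value of the current card

net_cost = new_card_price - resale_value
print(f"Out of pocket: ${net_cost:.0f}")  # Out of pocket: $130
```

The trick is timing: resale value falls fastest right after the next generation launches, so selling early keeps `net_cost` in that $100-$150 range.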
 
I used to jump as soon as something came out with a 10-15% improvement. Nowadays it needs at least a 25-30% improvement AND I expect some other benefit, e.g. lower power usage/heat, a smaller form factor, or free games.
 
Let's say you recently bought the best bang-for-the-buck card, e.g. a GTX 980. Suddenly, out of the blue, a new card appears in the same price range with every aspect of performance better by X%. How big does the minimum X have to be to trigger you to immediately switch to the new card? As this is a thought experiment, we assume there is no limit on X :cool:.
Good question. And I just purchased a Gigabyte GV-N980G1 GAMING-4GD to go with my i7-4790K system and my Acer H276HLbmid. :) I admit I spoiled myself, but that card gets at least 60fps in most gaming benchmarks. Yes, I know I hobbled it with a 60Hz monitor that maxes out at 1920x1080.

So even if you came out with a card that had a 100% improvement, I wouldn't see it unless I changed other parts of my system, such as a better, higher-resolution monitor with a higher refresh rate. I would like to get a G-SYNC monitor, but right now they are very expensive and first generation. Plus, who knows what will happen in the FreeSync vs. G-Sync battle. And I won't be replacing the monitor for about 2-3 years. Also, due to the technical challenges, I'm not a fan of SLI or CrossFire.

I know that's not the answer you were looking for, but that's what I'm looking at.
 
Let's say you recently bought the best bang-for-the-buck card, e.g. a GTX 980, and a new card appears in the same price range with every aspect of performance better by X%. How big does the minimum X have to be? [...]

I only saw this question now lol.

Anyway, my answer is: it also depends on how the current hardware handles the games/software that are available.

At the extreme high end, where hardware massively outpaces software, no percentage will get me to upgrade.

At the extreme low end, where software massively outpaces hardware, the percentage is whatever would let me play at tolerable levels (it certainly looks that way, anyway).

In the middle ground, I also look at what is available. If my gaming experience will noticeably improve with the upgrade (be it GPU only or GPU plus monitor, be it going from 30 to 60fps or from 60 to 144fps), then I will most likely bite.

Take, for example, my old and new rigs. Technically my 570 is still adequate for 1080p 60fps gaming, but the release of DSR gave me a taste of the difference between high and lower resolutions. That led to me buying a single 970 early. I had already planned to update my rig as a birthday present to myself; DSR just ensured the 970 arrived sooner (so I could enjoy it earlier) rather than later.

What got me to even consider 970s in SLI? The Swift. It was a monitor I wanted as soon as I knew about it: 1440p, 144Hz, and G-Sync, everything I actually wanted to try in one monitor lol (also, I found my 27" 1080p screen to be too big for that resolution).

When would the next upgrade be? Well, unless games start to struggle to even reach 60fps on high settings (I can live without ultra, but anything lower than high may present a problem) in games I want to play, maybe then. Or... if they release a 4K 144Hz 1ms G-Sync/FreeSync hybrid monitor :D (with the GPU power to run it, of course). Though the former might arrive a lot sooner than expected.
 
I have generally bought every other generation, going back to the original Riva 128 (4MB! woo!). This time is the first I've skipped two: I went from a 560 Ti 448 Core to a 970.
 
I used to buy just about every generation, but I have held onto my 680/7970 setup for close to three years now. They handle the games I play fine (Skyrim, BF3/4, CS:GO, the Batman series, etc.), and I saw no need to pick up the Hawaii/780 Ti/Maxwell series. With games like DA:I and the upcoming Witcher 3, I'm more than likely going to upgrade. I know a pair of GTX 980s would more than double my performance, but I'm holding out for big Maxwell and/or the next AMD part before dropping upwards of 2 grand to upgrade both systems.
 
I usually try to hold out for at least 100%, sometimes much longer; I like to hold off as long as I can so that when I upgrade, it really feels like one. If it gets to the point where there are a dozen or so games I'm interested in but know I can't run at a decent speed with SGSSAA on, then I start looking into it.
 
I went from a 9800 GT to a GTX 760; what's that jump in percentage?

My CPU jump was at least 300%, going from a Q9450 to a 5820K.
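For anyone wanting to put a number on jumps like these, the percentage is easy to compute from whatever benchmark score you trust (the scores below are placeholders, not real results):

```python
def percent_jump(old_score: float, new_score: float) -> float:
    """Relative gain of the new part over the old, as a percentage."""
    return (new_score - old_score) / old_score * 100.0

# Hypothetical benchmark scores, for illustration only.
print(percent_jump(100.0, 140.0))  # 40.0, i.e. a "40% faster" upgrade
```

Note the baseline matters: a card scoring 4x the old one is a 300% jump, not 400%, which is why "at least 300%" from a Q9450 to a 5820K means roughly quadrupled throughput.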
 
Ideally I'd wait for a 200% improvement on benchmarks for under $500.

My last upgrade was a 4850 to a 7970, and I think I'm going to be waiting for another generation or two to make the next jump.
 
When there is a feature that makes me go, "Take my money!" (and the wife lets me).

I just went from a 7950 w/ boost, which was fine for most everything, to a GTX 970.

The bonus, I suppose, is better performance (duh) with lower power requirements, and it makes much less noise and heat. I still like AMD's stuff, but for the price this card is hard to beat at the moment.
 
When there is a feature that makes me go, "Take my money!" (and the wife lets me).

Eyefinity was the feature in recent memory that made me do that. I went from a single GTX 280 to 5870s in CrossFire. Along with the monitor costs, the price of entry was pretty steep at the time. No regrets, as I still enjoy multi-monitor gaming to this day.
 
I buy a single GPU that is 15% faster than the dual-GPU setup I have. Then, whenever I see a really good deal on a second GPU to match my first, I buy one and have a dual-GPU setup again; rinse and repeat.

This is me as well, but in addition, the single GPU has to be a good bang-for-the-buck card, like the 970.
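That leapfrog strategy can be sketched as a tiny simulation. The 15% step comes from the post itself; the SLI scaling factor and the unit-less performance numbers are assumptions for illustration:

```python
# Alternate between one fast card and a matched dual-GPU setup.
# Performance is in arbitrary units; sli_scaling is an assumed average
# (real SLI/CrossFire scaling varies a lot per game).
perf = 100.0        # performance of the current dual-GPU setup
sli_scaling = 1.7   # assumed uplift from adding a second identical card

for gen in range(3):
    single = perf * 1.15         # buy one card ~15% faster than the old pair
    dual = single * sli_scaling  # later, grab a matching card on a good deal
    print(f"gen {gen}: single={single:.1f}, dual={dual:.1f}")
    perf = dual
```

The appeal is that each step is a modest purchase, yet every "single" already beats the pair it replaces, so there's never a downgrade along the way.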
 
Nowadays it's when the games I play don't run as smoothly, or with as much eye candy, as I like. The price range (buying used) generally stays the same. So figure every two years or so, at $150-200 a time; I'd say 40-50%?
 
When the games I'm playing drop below 60fps at my desired resolution and settings, it's time for an upgrade. That generally happens every other generation; 5870 to 7970 was my progression. My 7970 is really starting to show its age, so I'll probably pick up a new 390 whenever they come out.
 
Normally I only upgrade when there are major improvements in IQ (e.g. the 5950 Ultra for Doom 3, the 7800 GTX for Far Cry 1, the 8800 GTX for Crysis 1), but recent GPUs offer nothing but raw performance gains, and I don't game that much anymore.
These days I only buy a new GPU when there's a really good deal or when my current card can't maintain 45+fps at 1080p.
 
When I can no longer play the games I want maxed out at a framerate I'm happy with. If I wanted to play without all the eye candy at 30fps, I would play on a console.

I also like to do it while my current card(s) still fetch a decent price. That helps offset the cost of the new toys.
 
I used to upgrade fairly often but have slowed down since, probably because I'm now buying higher-end cards. I don't see myself upgrading my 680s for a while; certainly not this generation, and not likely next generation either. Most of my games scale pretty well with SLI, so the performance is still excellent. There are a couple that flirt with the VRAM wall at their highest settings.
 
Eyefinity: best flexibility with 3 screens.
Mantle: best API for gaming.

Performance is seldom, if ever, any different from one vendor to the next. The 290X still reigns supreme as the best card for performance, and having Eyefinity 2.0 and Mantle just offers plenty of reasons to choose AMD.

Since speed and performance don't come in big jumps, it's like RAM timings back when people bought expensive RAM for benchmark numbers instead of asking, "do I even notice the difference?"

I buy smart: I went with the AMD 290 a year ago for Mantle, and it still reigns supreme until whatever AMD releases next; I bought the 7970 for Eyefinity.

Performance is plenty today.
 
It's not the jump, it's necessity. If everything runs smoothly, I won't buy a new card no matter what the performance jump is.

I'll buy a new card if I'm faced with a game that doesn't run fast enough for my liking. But now, with 4K, I'm faced with another problem: I can't go any further without tri-SLI, and I don't want that.
 
Eyefinity: best flexibility with 3 screens. Mantle: best API for gaming. [...] Performance is plenty today.

http://www.lazygamer.net/general-news/nvidia-no-benefit-to-using-amds-mantle/

Essentially, while there may be some improvement and benefit to using Mantle, that benefit is wholly offset by the fact that DirectX 12 does the same thing and will have wider support, making Mantle obsolete and unnecessary.
 
Laptop-wise, I went from a 260M to a 970M; I consider that a TREMENDOUS jump ;)

My desktop had a 760; against my 970M, that's still a good 25% increase in performance...
 

The problem with DX12 is that it will only be available on Windows 8 and up, so the vast majority of us are SOL. It's also not cross-platform with Linux/Mac, nor has it even been released yet. On the whole, I honestly think something like Mantle is a better option for PC gamers, developers, and the community alike.

It's pretty shitty of nVidia not to support it, IMHO, and this is coming from someone who used to buy only nVidia. Between that, their new GameWorks libraries (which intentionally cripple performance on AMD cards), and the proprietary G-Sync stuff, I'm starting to really appreciate AMD a lot more these days. nVidia is being very anti-competitive.

I hope Samsung slaps them around in this lawsuit.
 
Usually it's related to a GPU dying, or to rebuilding the whole computer. In recent years I've gone from a 5770 1GB to a 6930 1GB to a 280X 3GB; I don't remember what cards I had before that. It's a combination of improvements in speed and memory, and I'm guessing I usually double my performance. I don't buy because a new card increases performance; I buy because the old card is holding me back. I wish I could do this with CPU upgrades :rolleyes: dreams of a 2x increase in CPU performance :rolleyes:
 
In two months I will give the Win10 consumer preview a try. I expect uptake of Win10 will be similar to the uptake of Win7. I'm also expecting the price to be lower than in the past.
 
The problem with DX12 is that it will only be available on Windows 8 and up, so the vast majority of us are SOL. [...] I honestly think something like Mantle is a better option for PC gamers, developers, and the community alike.

Still, the problem with Mantle is: why bother, unless you think Linux gaming is actually going to take off (which would be amazing :))? In the long run it is not worth programming for.

The Mac market is also not really worth programming for at the moment, in terms of gaming.

Add in that much of the market (nVidia and a handful of others) won't support it, and some AMD cards (pre-HD 7000) can't use it. Compare that to DX12: everyone on Windows will use it, which is nearly all of the market.

If other types of apps (CAD, design, and the like) really take advantage of Mantle, great! Especially in cross-platform apps.

I'm not saying Mantle or AMD is bad; they're nothing of the sort. But Mantle is not what's going to bring people back to AMD.

My prediction: people will switch to Win10, as it seems to fix most of the UI issues people have, and the Mantle buzz will quickly fade after DX12 among most gamers.

I was a huge AMD fan until I went back to an Intel CPU and realized how misguided I was ;-) I think I was brainwashed by the Athlon MPs :)
 
Eyefinity: best flexibility with 3 screens. Mantle: best API for gaming. [...] Performance is plenty today.

You either posted in the wrong thread, or you tried really hard to turn this into an AMD > nVidia topic.
 
Performance-wise, it feels like we are still not there yet for 1440p @ 120Hz. I never consider SLI/CrossFire configs reliable, because of their compatibility issues, and they cost too much. A single high-end GPU is always a more consumer-friendly upgrade.
 
The problem with DX12 is that it will only be available on Windows 8 and up, so the vast majority of us are SOL. [...]

The same thing was said about Windows XP not supporting the latest DX. At a certain point the end user needs to move forward. Most PC gamers will have to move away from Windows 7, since even now it doesn't support the latest iteration of DX.
 