The GPU Upgrade Dilemma

PC Perspective has posted an editorial today titled The Upgrade Dilemma. The article discusses whether or not you should upgrade your graphics card, whether or not you "need" to, and possible upgrade paths if you do. Definitely interesting reading, hit the link to read the full article.

In some ways, this exercise was one with a pre-determined result: is getting a newer graphics card going to give you better performance? Absolutely. What we honestly weren't sure of was HOW MUCH performance consumers were going to get for that upgrade, and the answer is just as complicated as I had thought it would be. Depending on the games you are playing, you can go from unplayable to a good experience (Metro 2033) or see little real-world advantage (Left 4 Dead 2). I think the results seen in this article show that more often we are leaning towards the former, and as 2011 progresses and titles like Crysis 2, Bulletstorm, FEAR 3 and Battlefield 3 are released, the upgraded GPUs of today will be more and more necessary.
 
Strangest article I've read yet. They mention Crysis 2, but it's a console port. If Crysis 1 can't run on consoles, then why would you need a better graphics card for Crysis 2?

Consoles are the limiting factor for PC gaming, so you shouldn't need to upgrade your graphics card. Unless you want to run with max AA, max aniso, and max resolution. Even then, it's a lot of unnecessary overkill.

The only reason to upgrade your graphics is if...

A. You have onboard graphics. I have an onboard Radeon HD 4250 for my HTPC, and GTA IV actually runs slow, but it's not unplayable. If you have Intel graphics, then it'll be the best upgrade you've ever made.

B. Your graphics card can't do DX10.1. Not that having a Radeon HD 2400 is a good thing, but anything less than DX10.1 probably can't run modern games at a good speed.

C. You have too much money on your hands. Because honestly, what better way to waste money than to buy an SLI or CrossFire setup. Especially with this economy, and the fact that modern PC games can run fast on almost any graphics card made in the past three years.

Had there been exclusive PC games that taxed the machine, I would have formed an entirely different opinion. The fact is, modern PC games are console ports, and consoles are very dated in hardware.
 
Had there been exclusive PC games that taxed the machine, I would have formed an entirely different opinion. The fact is, modern PC games are console ports, and consoles are very dated in hardware.

You had me until this statement, which, as you know, is mostly false.
 
You had me until this statement, which, as you know, is mostly false.

Care to elaborate? Game developers produce multi-platform games that need to run on weak console hardware, and with the rare exception of games like StarCraft 2 (which is PC-only because of its genre), the PC port will not make use of the superior capabilities of its platform. And since there won't be a new generation of consoles for a while (it's already "HD", after all, even though I played Quake 2 at a higher resolution back then), we will be stuck with 2006 graphics for a while.
 
Not all games are console ports, and I don't mean all but a few; I mean most games are not console ports.

Metro 2033, Starcraft 2 and Just Cause 2 are all examples of games that are both non-ported and heavily demanding.

In addition to this, a lot of console ports are still very demanding due to their terribad code. GTA4, Saints Row 2 and Rainbow Six Vegas come to mind.
 
TBH, what I would also have liked to see was a comparison between modern cards and ones from two generations back, like the Radeon 4870/4850 and GTX 280/260. My hunch is that those comparisons would tell a very different and much more interesting story.
 
Also, I really think Metro 2033 should not be used for comparison, as it uses an engine that, although beautiful, is pretty poorly optimized, as evidenced by many other equally visually impressive games that don't require $1200 worth of hardware to run at 30 fps.
 
The cards from that generation are still pretty easy to compare due to their similarities to modern hardware. The HD4870 sits a few % above the HD5770, and the GTX260-216 sits a few % above the original GTX260. In unbiased games the 4870 and GTX260-216 were basically identical.
The HD4890 and GTX280 were also about equal, halfway between the 5770 and the 5830.
The GTX285 is basically the same as the 768MB GTX460 in performance.
The GTX295 performs similarly to the GTX570 apart from the memory, and the 4870X2 performs similarly to the 5870.

kinjo: You can exclude that if you like, but you'd also have to exclude Crysis, Call of Pripyat, Arma II, Cryostasis and Lost Planet 2.
 
I'm not upgrading till I see the requirements for Arkham City.
 
As somebody who used a GeForce 4 Ti 4200 for 5 years and a GMA950 for the next 3 years before getting an 880G and adding a GTX470, this article has me quite unimpressed.
 
Not all games are console ports, and I don't mean all but a few; I mean most games are not console ports.

He's also confusing games for the PC that were ported *to* the console with games that were ported *from* it.
 
Am I the only one who noticed that all the screenshots of the graphics settings showed 1680x1050 (except Metro), yet all the other graphs and text said 1920x1200?

Their review is worthless because they don't even know what resolution they're benchmarking with. :confused:
 
I've been off the video-game-as-reason-to-upgrade-one's-video-card-merry-go-round since I can't remember when. Hasn't been anything new & original in PC gaming since Deus Ex. More pixels, more shiny, more cores, more & bigger monitors, yes, but gameplaywise, nothing comparably groundbreaking. YMMV
 
My trusty 9800GTX+ is still chugging along. Sure, I'm going to upgrade soon, but I haven't yet picked up a game I was interested in that was unplayable.
 
My trusty 9800GTX+ is still chugging along. Sure, I'm going to upgrade soon, but I haven't yet picked up a game I was interested in that was unplayable.
Well, I can play every game on an 8600GT. That doesn't mean I would want to. :eek:
 
Haven't needed to upgrade since my 8800GTX; still haven't found a game I can't run at high settings.

I agree with the above post that most games are console ports and therefore do not push PC hardware forward.
 
He's also confusing games for the PC that were ported *to* the console with games that were ported *from* it.

The majority of games on PC are console ports. If a game is running on both PC and console, it is limited by the console. There are a few exceptions, and when I say few, I mean you can probably count them on one hand.
 
The 9800GTX+ isn't a terrible performer; it's almost up there with the HD4850, and probably equals the HD5750. The issue with 9800s is they usually go pop after a few years :p
 
Well, I did just replace my 8800 GTS 320MB card with a GTX460 running at 885 MHz core and 1950 MHz memory.

The Heaven 2.0 benchmark runs at four times the frame rate of the 8800 @ 1920x1080.

I simply didn't find Nvidia's 200 series or the Radeon 5000 series to be worth the money for the games I play (Dirt 2 came closest to needing it, but I really wanted more framerate than the mid 30s-40s, and for less than $200).
 
Why are we comparing a 2008-2009 generation of GeForces to a 2009-2010 generation of Radeons exactly?
Radeons have been slightly better value than GeForces across the board for years. There's a small price premium for that Nvidia sticker...
 
The 9800GTX+ isn't a terrible performer; it's almost up there with the HD4850, and probably equals the HD5750. The issue with 9800s is they usually go pop after a few years :p

LOL... guess that'll make them the first worthwhile *working* vintage collectible video cards later...
 
The test system is rocking a Core i7 965. I'd think most people who have a 9800GTX or a similar-performing card are not rocking the latest X58 platform but more likely something older, maybe along the lines of a C2D/C2Q. The i7 is probably making some of the 9800GTX results better than they would otherwise be, something users with older systems might not even see.
 
I think that article is really BS. A 9800GTX with 4x AA and 16x AF on a modern game? What did you expect? I have a 9800GT, and it plays everything I throw at it well. Do I use all of the high settings, AA, and AF? No. It's all personal preference: some people can't stand a game without it all maxed out, and some people don't care and will play it at a lower res. I don't believe that anyone can really make an argument for anyone else but themselves. Let the people decide if they need an upgrade or not, not what someone else suggests or thinks would give anyone a better experience.
 
Meh. My 9800GX2 fried itself. So I was forced to upgrade, and I really didn't "need" it (games played: Mass Effect 2, StarCraft II, Left4Dead2). All those games also ran on my backup 7900GTX @ 1680x1050 without looking horrible. Now THAT is old-school. It was actually my CAD running like crap (and holiday sales) that incentivized me to buy a new card.

Not that I dislike my GTX470...

On one hand, if your demands for settings (high detail, AA, AF) are constant, sure, you'll need to upgrade off a 9-series. However, if you're more adaptable and are willing to drop settings down (as at least one poster has suggested already), then the "upgrade itch" may be avoidable for a while longer. I have a friend who does CAD, StarCraft II and Total War on a 9800GT @ 1680x1050. He's one of the much more common setting-compromisers. I find it hard to believe that there are a large number of people who both question the need to upgrade (for financial reasons, most likely) and demand the highest settings.
 
The other killer here is resolution. Lots of people are using 1440x900 or 1680x1050 displays. In fact, until the last two years it wasn't that affordable to get a 1920x1200 monitor. Monitors with higher resolutions than that are still a rare asset, with all but the smaller ones (27") tending to cost around $1k, the same price someone can get a 50" LCD for the living room (which will run 1920x1080).

At 1920x1200 and smaller resolutions, video cards are having a harder and harder time separating themselves, and the development time for game engines and their "generations" has increased dramatically while video card refreshes have stayed pretty consistent. So outside the latest-and-greatest crowd, you have several generations of users (GeForce 8800+ and Radeon 4800+) that are less and less inclined to upgrade their hardware. It's starting to get to the point where the mid-level cards are feeling those same pinches.

You just need to look at the benchmarking tests that HardOCP does here. Single cards reviewed with the latest games are comfortable delivering the best gaming experience at 2560x1600. Heck, even the latest mid-level card (6870) was comfortable at 2560x1600 in the Metro review here. Wanna boost your 1920x1200 frames? Faster CPU. But even then, with a modern Phenom or Core 2 Duo and a 4850 or 8800GT or better, no game should perform poorly at 1920x1200 or less.
 
Generally there just isn't that much reason to upgrade the video card anymore with all of the ports. I'm still running a GTX260 Core 216 and, aside from Metro 2033, haven't really felt the urge to upgrade from that. Sure, I could get higher fps with a newer card, but most games are perfectly playable with what I have. I'm certainly not going to upgrade until I have a real reason to. Especially not when the economy is as uncertain as it's been.
 
Depends on the resolution. At 1440x900 and even to a lesser extent 1680x1050 a GTX260+ is enough to play most games out there at reasonable settings. However, there are several games that need considerably more graphics power than that at moderate resolutions to run at high settings, and there are also those who use 1080p and above.
A couple of years ago a 4870 or GTX260-216 were the benchmark cards to have for 1920x1200. Two years on, games have got more demanding but not by an enormous extent.
 
Well, in most newer games it's only a couple of settings that can take a game from smooth to sluggish. For example, just turning down sun shadows and SSAO to medium will make Clear Sky perfectly playable on my GTX260 at 1920x1080. Sadly, many of the settings that kill the framerate are not even noticeable unless you're looking at a screenshot, and even then you usually have to look close.
 
The other killer here is resolution. Lots of people are using 1440x900 or 1680x1050 displays. In fact, until the last two years it wasn't that affordable to get a 1920x1200 monitor. Monitors with higher resolutions than that are still a rare asset, with all but the smaller ones (27") tending to cost around $1k, the same price someone can get a 50" LCD for the living room (which will run 1920x1080).

I have a 22" 1920x1080 monitor and I play everything windowed at 1440x900. Not for GPU reasons, but because when playing Multiplayer, it's easier for me to keep everything in sight without having to move my head. That way there's not anything out of my FOV that I'll miss.

Side benefit is not needing ultra powerful video cards to play at pretty high quality settings.
 
I have a 22" 1920x1080 monitor and I play everything windowed at 1440x900. Not for GPU reasons, but because when playing Multiplayer, it's easier for me to keep everything in sight without having to move my head. That way there's not anything out of my FOV that I'll miss.

Side benefit is not needing ultra powerful video cards to play at pretty high quality settings.
So you play games windowed in a 16:10 aspect ratio on a 16:9 monitor? Most people want the wider field of view and larger res.
 
So you play games windowed in a 16:10 aspect ratio on a 16:9 monitor? Most people want the wider field of view and larger res.

I'm used to 16:10 resolutions since my previous laptop was 1680x1050. I also played at 1440x900 on that in windowed mode. When I moved Steam over to my desktop with the 22" monitor, I just kept the settings the same. The benefit to 16:10 over 16:9 is a slightly taller picture WITH a wider FOV. When I play, I can sit comfortably and not need to move anything more than my eyes to see everything that happens. When I go fullscreen or native resolution, I end up having to sit further back and move my head a lot more... which also adds to fatigue and strain. So sitting closer and dropping the resolution just works better for me.
 
I'm used to 16:10 resolutions since my previous laptop was 1680x1050. I also played at 1440x900 on that in windowed mode. When I moved Steam over to my desktop with the 22" monitor, I just kept the settings the same. The benefit to 16:10 over 16:9 is a slightly taller picture WITH a wider FOV. When I play, I can sit comfortably and not need to move anything more than my eyes to see everything that happens. When I go fullscreen or native resolution, I end up having to sit further back and move my head a lot more... which also adds to fatigue and strain. So sitting closer and dropping the resolution just works better for me.
You make it sound like 16:10 adds height to 16:9 when in reality it doesn't. For properly done widescreen games, 16:9 simply adds more to the sides of 16:10 while keeping what you see in the game from top to bottom the same.
 
You make it sound like 16:10 adds height to 16:9 when in reality it doesn't. For properly done widescreen games, 16:9 simply adds more to the sides of 16:10 while keeping what you see in the game from top to bottom the same.

Not the case in Valve games. Screencaps show more height. Depends on the engine more than anything else. Bioshock simply cropped the tops and bottoms for 16:9 mode until they released a patch due to complaints about the reduced vertical FOV.
 
Not the case in Valve games. Screencaps show more height. Depends on the engine more than anything else. Bioshock simply cropped the tops and bottoms for 16:9 mode until they released a patch due to complaints about the reduced vertical FOV.
Screencaps will show more physical height, yes. What you see in the actual game is NOT larger from top to bottom. Valve uses hor+ in all of their games, and going 16:9 adds more to the sides, and that is it. 4:3, 5:4, 16:9 or 16:10 will all show the exact same amount of info from top to bottom in the actual game if it is hor+.
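
To put rough numbers on how hor+ works: the vertical FOV is held fixed and the horizontal FOV is derived from the aspect ratio, so going from 16:10 to 16:9 only ever adds width, never height. A minimal sketch of the math in Python (the 59-degree vertical FOV is just an assumed example value, not what any particular engine actually uses):

import math

def horizontal_fov(vertical_fov_deg, aspect_ratio):
    # Hor+ scaling: vertical FOV stays fixed, horizontal FOV grows with the aspect ratio.
    v = math.radians(vertical_fov_deg)
    h = 2 * math.atan(math.tan(v / 2) * aspect_ratio)
    return math.degrees(h)

vfov = 59.0  # assumed example vertical FOV in degrees
for name, aspect in [("4:3", 4 / 3), ("5:4", 5 / 4), ("16:10", 16 / 10), ("16:9", 16 / 9)]:
    print(f"{name:>5}: vertical {vfov:.1f} deg, horizontal {horizontal_fov(vfov, aspect):.1f} deg")

Every row prints the same vertical FOV and only the horizontal value changes, which is exactly the "same from top to bottom, more on the sides" behaviour described above.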
 
Strangest article I've read yet. They mention Crysis 2, but it's a console port. If Crysis 1 can't run on consoles, then why would you need a better graphics card for Crysis 2?

Consoles are the limiting factor for PC gaming, so you shouldn't need to upgrade your graphics card. Unless you want to run with max AA, max aniso, and max resolution. Even then, it's a lot of unnecessary overkill.

The only reason to upgrade your graphics is if...

A. You have onboard graphics. I have an onboard Radeon HD 4250 for my HTPC, and GTA IV actually runs slow, but it's not unplayable. If you have Intel graphics, then it'll be the best upgrade you've ever made.

B. Your graphics card can't do DX10.1. Not that having a Radeon HD 2400 is a good thing, but anything less than DX10.1 probably can't run modern games at a good speed.

C. You have too much money on your hands. Because honestly, what better way to waste money than to buy an SLI or CrossFire setup. Especially with this economy, and the fact that modern PC games can run fast on almost any graphics card made in the past three years.

Had there been exclusive PC games that taxed the machine, I would have formed an entirely different opinion. The fact is, modern PC games are console ports, and consoles are very dated in hardware.

It's like you don't realize that a port can have added textures/features etc., making it look vastly superior to the console version...

Even being a "console port", Crysis 2 looks every bit as good as the original.
 
It's like you don't realize that a port can have added textures/features etc., making it look vastly superior to the console version...

Even being a "console port", Crysis 2 looks every bit as good as the original.

meh...
 
Console games look like ass nowadays; most are running at 720p with no AA.
Frankly, the only ones I can bear to play are sports games.
 