1680x1050 - Minimum needed?

Vorret
[H]ard|Gawd
Joined: Aug 26, 2003
Messages: 1,100
Hi,

1680x1050 is starting to be obsolete, but I love my screen and I'm not gonna ditch it. That said, my current card is a 9600GT. Nothing too fancy, but it does the job, or rather it did the job... Anyway, at this resolution, what's the cheapest card I can get that'll run "everything" at max settings? ("Everything" is in quotation marks because of rare occurrences like Crysis and GTA4.)

Thanks!
(It can be CF or SLI as well, doesn't have to be 1 card, as long as it's a cheap solution)
 
1680x1050 isn't going to be 'obsolete'. Yes, there are monitors with larger resolutions, but 1680x1050 won't die.

First of all, we haven't got much to go on here without a budget. I mean, Crysis Warhead on max settings @ 1680x1050 will still require something like an ATi 5850 (£200), but for max settings on MW2 you're good with a 1GB 4870 or a GTX260, which are a lot cheaper.
 
I was in the same boat as you (9600GT @ 1680x1050) and decided to jump on a used GTX 260 (or you could pick up a used 4890). If, like me, you are not in a hurry to jump on DX11 yet, then this is a great bang-for-the-buck upgrade. Now I can run almost every game (except Crysis and GTA IV) maxed out, especially if you get a decent OC.

At this resolution there is really no need for anything more powerful right now, and buying used now means you may have money left over for a DX11 card after Fermi is released, since (hopefully!) prices will then start to come down later in the year.
 
Well, I'd like to keep it under $250 CAN.
My other system specs don't really matter since I'm in the process of upgrading. I just don't want to put $400 on a GFX card, but I was looking for something that's future-proof, so yeah, DX11 would be nice.

I've read that there's a 5830 coming soon, would it be wise to wait for that?
 
1680x1050 isn't obsolete; according to the Steam survey it's the 2nd most used resolution, only very slightly behind 1280x1024, with about 20% of all users.

I would try to aim for a 5850 if possible; that would be a decent card to future-proof you for a good few years and give you DX11 support. Or you could go for a 5770, which is a bit slower/cheaper.
 

1680x1050 is not going obsolete.
 
Using an EVGA GTX 260 [216] with a 22" Acer monitor @ 1680x1050.
With my i7-920 at 3.2GHz I can run Crysis/Warhead and everything else maxed out.
I'm an Nvidia fan for sure, but the fact is the 260 has been great since the day the rig was put together a year ago.
The 260 is like the Ti4200s of years ago to me.
It just works.
 
I think the hierarchy goes like this in terms of resolution:

Eyefinity - very high end
2560x1600 - high end
1920x1200 - above mainstream
1680x1050 - mainstream
below 1680x1050 - low end
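A rough way to see why the tiers line up this way is raw pixel count, since the GPU's fill and shading load per frame scales roughly with pixels drawn. A quick sketch (pixel counts only; AA and shader cost ignored):

```python
# Pixel counts for the resolutions discussed in this thread.
# GPU load per frame is roughly proportional to pixels rendered.
resolutions = {
    "2560x1600": (2560, 1600),
    "1920x1200": (1920, 1200),
    "1680x1050": (1680, 1050),
    "1280x1024": (1280, 1024),
}

base = 1680 * 1050  # the resolution under discussion: 1.76 megapixels

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP, "
          f"{pixels / base:.2f}x the load of 1680x1050")
```

So 2560x1600 pushes over twice the pixels of 1680x1050, while 1920x1200 is only about 30% more, which is why cards that struggle at the high-end resolutions still have headroom at 1680x1050.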
 

Apologies for the derail, but I was wondering what was up with 1680x1050 myself. You can still find them online, but good luck trying to find them in a store somewhere. I went all over town before finally finding this Acer at a computer shop, and it was their last one. Everything out there in showrooms is 21" or 23".

22" is perfect for me. Plenty of real estate and a resolution that doesn't require a $300 video card to get decent framerates.

Back on topic, I too have a 9600 GT 1GB, and although I am saving up for a beefier card, I am VERY impressed with my card's performance. I'm able to max out Dead Space and Bioshock, and with Crysis set to Medium (shaders on High) I get 31 fps on the Crysis benchmark.

Don't get me wrong, I know these aren't stellar numbers, but you don't need that big a card. Judging by my 9600 GT's performance with a pokey Kuma processor (getting an Athlon X3 Wednesday), I'm starting to think you may not need as beefy a card as you think.

I'd say the minimum for solid frame rates in every game would be the 4770 or 4850. The GTS250 or 5750 would probably be the best all-around choice, and a 5770 or 4870 would be plenty.
 
I do really like the 1920x1200 resolution, and I am still adjusting to the 1680x1050 size. I only recently "downgraded" to 22" from my Asus 25.5" because I felt that 3x22" IPS panels were much more affordable than 23-24" IPS panels. I knew I wanted Eyefinity, and I was tired of my crappy-looking TN panels. Now I have 4 IPS panels and 2 TN panels, though my wife has been stuck with one TN panel and probably wouldn't want to trade a 24" for a 20" IPS at 1680x1050.

I could see the advantages of a single 30" display, but I really like the 48:10 ratio for some games. Internet at 3x22 in portrait is great as well. I would be hard pressed to use a single 22" panel though. There's just not enough resolution!
 
I'll chime in that the 20" 1680x1050, as an exact resolution, is going to be harder to find. Samsung's new line of panels is all 1600x900, and they are a main source for a lot of monitor dealers. That said, it seems to be popular among panel manufacturers in the 22" size range, so as a resolution it's definitely not obsolete.

Personally, I'm running 3x20" 1680x1050 panels and loving it with a 5770. For single-screen gaming, pretty much everything can be maxed at this resolution, short of a couple of games that need a bit of tweaking, but you can pretty much get an idea of what to expect from the reviews of the card.
 
The 4890 is pretty great and the best bang for the buck. I run it along with a Phenom II at 3GHz and it runs everything maxed out very smoothly.
 
To max out all new games @ 1680x1050 you will need 2GB, especially if you use AA. I'd say get a 5850 at the minimum.
 

Not even close. My 512MB card does a bang-up job @ that resolution with 4-8xAA. If you want to have some headroom, a 1GB model is more than enough.
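For what it's worth, the render-target memory that MSAA itself needs at this resolution is easy to ballpark. This is a toy estimate only: the 4-byte color and depth sizes and the buffer layout are simplifying assumptions, and textures, geometry, and driver overhead (which dominate real VRAM use) are ignored. Treat it as a lower bound:

```python
# Crude framebuffer-size estimate for 1680x1050 with multisample AA.
# Assumes 4 bytes/pixel color + 4 bytes/pixel depth/stencil in the
# multisampled target, plus a resolved front and back buffer.
def framebuffer_mb(width, height, aa_samples):
    pixels = width * height
    msaa_target = pixels * (4 + 4) * aa_samples  # multisampled color + depth
    resolved = pixels * 4 * 2                    # resolved front + back buffer
    return (msaa_target + resolved) / (1024 ** 2)

for aa in (1, 4, 8):
    print(f"{aa}xAA: ~{framebuffer_mb(1680, 1050, aa):.0f} MB of render targets")
```

Even at 8xAA the render targets come out around 120 MB, which is consistent with a 512MB card coping at this resolution; the rest of the VRAM goes to textures and geometry, which is where a 1GB card buys its headroom.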
 
1680x1050 obsolete? Maybe to the snobbish enthusiast, but I assure you that a good majority of gamers still play at that res, or even lower... Just look at the Steam statistics sometime. Anyway, for a budget card at that res you could easily get away w/an HD5770 imo. Most NV cards are overpriced right now, but a GTX 260 for $150 or less would do the trick as well; it's slightly faster than the HD5770 even though it won't do DX11 (which probably won't be an issue for a while).

I game just fine w/my GTX 260 at 1920x1200; I can't run high AA in a few of the most demanding games (or any AA in a scant few), but that's about it. If you can score a 4890 (slightly faster than a GTX 260 or a 1GB 4870) for around the same price, that would be even better, but don't overpay for it. I haven't really checked the prices of these older cards lately; I've been watching the price tags of the 5850, waiting for a price drop or a timely rebate to bring 'em closer to $250, heh.

A 5850 would certainly go over your budget (I think, I've no idea what CAN to US is anymore), but it'd be a better long-term investment if you can swing it. That's why I mentioned not paying more than $150 US for any of the cards I mentioned before; a 5850 would actually run ya double that amount, unless the older cards are somehow more expensive over in Canada.

16:10 displays seem to be getting scarce, with manufacturers moving over to 16:9 (so 1680x1050 would be just as obsolete as 1920x1200 in that sense), but I don't think they'll ever be abandoned altogether... Games will certainly not stop supporting either, as it's a trivial matter, and plenty of people still prefer the higher vertical pixel count (for programming, productivity, etc.). If you ask me, the move to 16:9 is just a marketing gimmick to sell displays as 1080p capable or w/e, and probably a budget decision too so they can maybe share some parts w/low-end TVs.
 
According to the Steam hardware survey, 1280x1024 is the most popular resolution among gamers, so even 1280x1024 is not obsolete.

But this is [H], where 1680x1050 is obsolete and 1920x1200 is starting to become obsolete :D

On topic, I game like you at 1680x1050. Back in the good old GTX 2xx days I felt most 1680x1050 users chose a GTX260 or 270; the GTX280 felt like overkill. But after going through some charts I picked the GTX285. I don't regret it; at max settings I'm getting 40 to 60 FPS (FarCry2 avg 62, min 34), not to mention Crysis, Empire: Total War, BC2, and even Red Alert 3 with a big map.

I might get banned for this but I’ll have to say, minimum 5870 :D
 
At what FPS, 10?

I agree. This thread is so full of shit that it's turning brown.
@OP: I would look into a 4870 or 5770. That should be more than enough horsepower to cover almost every game at that res. A more powerful GPU wouldn't hurt, but I wouldn't worry about getting more than 1GB of video memory or a dual-GPU setup.
 
Yeah, I'm gonna have to call BS on this "you need a minimum of XXXX".

I just bought my new monitor, so I've still got the 9600GT I was using that maxed out all my games on my old 1024x768 17" monitor... except Crysis, of course (38 FPS on the benchmark on High, no AA).

I haven't bought a new GPU yet, so I'm still using my trusty old 9600GT. So far, I'm able to play Bioshock, Left 4 Dead, and Dead Space fully maxed out, and according to Fraps, I'm getting well over 60 FPS in each. With Crysis set to Medium (shaders to High, processing to Low) I'm getting 31 FPS. In DX9 I'm getting in the low 40's.

Granted, these aren't stellar frame rates, but they're most certainly playable. So if a 9600GT is able to crank out playable frame rates at a respectable graphics level, I'm sure something along the lines of a 5770 or 4870 would be more than enough for any game.
 
Depends on the title. Crysis and STALKER: Clear Sky keep an HD 5850 @ 45 avg FPS or lower, which means your mins may be below 30 in some cases. IMO, buy the best card you can afford. If I were buying today I'd grab an HD 5850 and OC if I wanted more push.

 
Crysis is at 4xAA in that benchmark, though. Warhead/Crysis is usually in the 40's on Gamer, which is fine by me. I only use 2x edge AA in autoexec. It doesn't go below 30.
 
I guess it boils down to how fast you want to play Crysis. LOL!

If you want 60 FPS with max settings, then you'll need something like a 5850 or 5870. If you don't like Crysis, then a 5770 will be more than enough. ;)
 
1680x1050 is not going to be obsolete for a very long time; most if not all current PC games are ports, sometimes an embarrassing upscale of 720p.
I'm currently playing Mass Effect 2 and I find myself constantly trying to zoom out due to the claustrophobic feel; try playing a true PC game like Battlefield 2 or Crysis, then switch back to Bioshock, and you'll understand.
 
My 4870 512MB card runs my 24" just fine. Other than Crysis, I can run most things at max or near max with no problem. Not sure how newer games will handle (DX11, etc.), but I'll cross that bridge when I get there. At 1680x1050, I would say anything in the mid-range would serve you just fine. Like most have said, you should look at the 5770 or 5830. Or, if you want older but better price/performance tech, then a used 4890 or GTX260 would do the job plenty.
 
Not trying to be pessimistic here, BUT... older games maxed out at that resolution, sure, no problem. But try a newer DX11 game with, say, an OCed 5870 and a 920 running at 4GHz, and it is going to be choppy. I bought into the suggestions given here that a 5870 was overkill for 1680x1050 and that it could push 1920x1200 in, say, Crysis maxed out... 20-45 fps is what I got in reality, though. I would have to say at least a 5870 for 1680x1050... it barely cuts it at 1920x1200, and in some cases doesn't cut it at all.

I would really like to know what formed these people's opinions, AKA "5870 overkill". I would really like to know their exact definition of acceptability: just being able to run the game, or maxed out yet choppy? A lot of the people making these claims are still running older hardware besides a 5xxx card. Was it blessed by God to run better than later hardware? What is the secret?
Honestly, with no ill intentions intended, I think some of these opinions are exaggerations at the very least.
 

I have a 5870 and have purchased over 70 games via Steam alone, including newer ones. Crysis is the exception, not the rule. Even then, I can max it out as long as I leave AA off. 2xAA is still playable, but it gets down to around 25fps in some scenes. This is at 1920x1200.
 
Once you experience larger you'll never want to go back. I can just imagine those 2560x1600 :D
 
1680x1050 is going obsolete? Damn, I'm still on 1440x900!

Step it up man!!

I run a 22in Dell @ 1680x1050. It appears to be as sharp as I'll ever need at this size, but I have not seen a PC monitor in action at 1920x1080 yet. Also, I'm in the "it can be too big" camp; 28in is as big as I'll ever go, I think. My 50in Sammy plasma is annoyingly big when playing games, lol, so I now have it hooked up to my Dell. Perfect.
 

Like I said before, I'm running a 22" monitor and a 9600 GT, for God's sake, and I'm running Dead Space, Left 4 Dead, and Bioshock at well over 60-70 FPS with the graphics at max!! So yeah, if a $70, 3-year-old, midrange GPU can play 90% of the games out there with max graphics, a $450 5870 is going to be overkill.

With Crysis, I set everything to Medium, shaders on High, and no AA or AF, and I can get a little over 30 FPS in DX10 and over 40 in DX9. For Crysis and Crysis Warhead, I agree a 5870 would be great to get all the eye candy and great frame rates on a 22", and if you can afford a $450 video card, then God bless. BUT I'm not sure how many people wouldn't rather turn off AA and AF, set Water to Medium, and enjoy 50 FPS with a $130 5750.

I don't judge what is or isn't overkill by one game that needs a $600 video card to get 60 FPS with max graphics. I judge it by the other 99% of games.
 
I play at 1680x1050 with an XFX 5770, and in Crysis: Warhead I play with no AA and all settings at Gamer. I haven't used Fraps to check the framerate yet; Crysis isn't my favorite game, so I could give two shits whether my card can handle it or not.

Every other game I play is completely maxed out with the 5770 at that res.
 
If they'd make Crysis nice and super long, I'd happily invest in a big card to play it fully maxed out. Even so, a lot of the guys here are running, or getting ready to run, 3-monitor setups. Imagine playing your favorite game that way; then those big cards start looking a lot more attractive and desirable.
 