GeForce GTX 580 vs. Radeon HD 5970 2GB Performance @ [H]

It's obvious we wouldn't have the modern without the archaic. Specifically the basic wheel. I know this. You know this, most people know this. Tell me, what does that knowledge really do for me? How does it help me enrich my life?

Just because something isn't "helpful" per se to what you're trying to do does not mean it's irrelevant to what you have today... It is very relevant. That's all I've been arguing.
 
I have a feeling we've started discussing two different things... As a refresher, you said specifically:

In any case punch cards and tape backups have no real relevance to modern gaming.

Which is where you were incorrect. It has complete relevance. Perhaps not relevance to modern troubleshooting, but your sentence, word for word, was simply untrue. And actually, until just a few years ago, I still had a tape backup. Nothing could beat it for size. In fact, we STILL use tape backup in the modern era for some tasks.
 
Just because something isn't "helpful" per se to what you're trying to do does not mean it's irrelevant to what you have today... It is very relevant. That's all I've been arguing.

Let me put it another way. Kids today may not learn anything about punch cards and 8" floppy disks. They may gain a great understanding of the concepts which make computers work through analogies and by understanding simple computers like the XT, without relating to those older devices. They may grow up to be great technicians and even programmers. In other words, you could conceivably get by without such knowledge. Would it give you a deeper understanding of the science behind computers? Certainly. Would it prove interesting? Maybe. Would it help you day to day? Doubtful.

I have a feeling we've started discussing two different things... As a refresher, you said specifically:

In any case punch cards and tape backups have no real relevance to modern gaming.

Which is where you were incorrect. It has complete relevance. Perhaps not relevance to modern troubleshooting, but your sentence, word for word, was simply untrue. And actually, until just a few years ago, I still had a tape backup. Nothing could beat it for size. In fact, we STILL use tape backup in the modern era for some tasks.

Tape backup is relevant to computing. Is it relevant to the gamer? Not likely. At least not relevant in a way they'll see and maybe even understand. Their game servers may depend on tape backups for disaster recovery. Certainly ISP's and various businesses related to their hardware certainly do. Tape media is very much alive and well. The data centers I support rely on it heavily. Gamers don't necessarily need to know that, care to know that, or would be able to apply such knowledge in a meaningful way. Punch cards are much the same in that regard.
 
Anyway, back on topic...

I was able to OC the 580 to 890 core and 2200 memory last night. Was topping 93 C but fan was only running ~80-85%.

Will try for more tonight.
 
Let me put it another way. Kids today may not learn anything about punch cards and 8" floppy disks. They may gain a great understanding of the concepts which make computers work through analogies and by understanding simple computers like the XT, without relating to those older devices. They may grow up to be great technicians and even programmers. In other words, you could conceivably get by without such knowledge. Would it give you a deeper understanding of the science behind computers? Certainly. Would it prove interesting? Maybe. Would it help you day to day? Doubtful.

OK, I now believe you really do not know what you are talking about. You said they would not need to know floppy disks, just simple computers like the XT... The XT used... Way-hay... floppy disks. In fact, the original XT in 1983 came with ONLY floppy disks. Regardless of their size, floppy disks all worked the same, just with tighter densities, like CDs to DVDs and so on.

And for that matter, even tape is still relevant. It's still magnetic media read and written by a head, just in linear tracks instead of sectors like an HDD platter. I think if you actually understood what you were talking about, we wouldn't even be having this discussion.

Anyhow, thanks AGAIN for proving my point... Again...
 
I think I've figured out your error in perspective... You're talking about 'gamers', not what you originally talked about, 'gaming'...

And by my definition, the modern 'gamer' doesn't even know squat about computers. Those console nerds are ruining the true gaming arena with the childish over-selling of half-decade-old technology. However, modern 'gaming' very much does depend on this. Everything is a part of modern 'gaming', including servers, programming, etc. You need to choose your words more carefully. Hence why you've already proved my point so many times...

Tape drives et al. DO still have a very current impact on and relevance to modern 'gaming', quoted from your original post where this all started. If you had said 'gamers', that's a completely different topic of discussion. Gamers are very much clueless 99.99999% of the time. I wouldn't trust some of them to insert a CD properly.
 
I've been speaking from the gamer's perspective this entire discussion. Not sure why anyone missed that. To gamers, even those who install their own hardware, punch cards and tape backups hold little relevance. At least none they are aware of.
 
OK, I now believe you really do not know what you are talking about. You said they would not need to know floppy disks, just simple computers like the XT... The XT used... Way-hay... floppy disks. In fact, the original XT in 1983 came with ONLY floppy disks. Regardless of their size, floppy disks all worked the same, just with tighter densities, like CDs to DVDs and so on.

And for that matter, even tape is still relevant. It's still magnetic media read and written by a head, just in linear tracks instead of sectors like an HDD platter. I think if you actually understood what you were talking about, we wouldn't even be having this discussion.

Anyhow, thanks AGAIN for proving my point... Again...

Yeah, because I don't know how floppy drives, tape drives, hard drives, or SSDs work. :rolleyes:

I've misworded this and used bad examples, I guess. The floppy drive did come with the XT; I'm very well aware of that. I've owned such boat anchors in the past. Actually, I'm not sure how learning about the XT or AT would even help them. Kids today are starting with much more modern hardware than that. Though if you were going to build machines, you could learn to do that with the AT or the earliest ATX systems; things go together pretty much the same. The XT wouldn't be totally irrelevant, but it's not as good a learning tool as newer machines.

From the average gamer's perspective, and even for those who assemble their own systems, such knowledge is largely meaningless. I know computer technicians who aren't very good and who know little about modern technology. They swap hardware until stuff works. They get by. No doubt more information could only help, but they've got no desire to learn.
 
My knowledge of punchcards gives me an extra 5 FPS in CoD, it also lets me aim quicker in TF2.
 
My knowledge of punchcards gives me an extra 5 FPS in CoD, it also lets me aim quicker in TF2.

And this bit of sarcasm proves my point. Unless you really want to excel as a computer technician or engineer, knowledge about certain technical aspects of the hardware and their archaic predecessors is useless.

That's nothing bro. Call me your senior.

I've been PC gaming for 22+ years. :D

I'm only 31. I'm not that old and I know it.
 
Getting back on topic: thanks to the article that was written and this thread, I now know what to get when I eventually upgrade my 5970. A graphics card with LOTS of RAM. :) My card prior to the 5970 was a GeForce 6800, and I don't remember having any issues with it whatsoever. But with the 5970, while I haven't encountered as many problems as others, they're definitely there. I think I might switch back to Nvidia when I eventually upgrade, but I guess I'll have to see what both camps offer at that point. :) For now, I'm fairly content with my 5970 and don't see the need to upgrade at this time. Maybe 1-2 more years down the road.
 
Wow, really? Kyle, why did you delete the two posts I had? Considering how offensive you come across and your abusive behavior towards your readers, I'm rather surprised you would delete constructive criticism of your articles. What I wrote (about using MP to test, or using an OC'd card) could only benefit this site and others. If you don't agree with it, fine, reply to it. But deleting it shows a lack of appreciation for what others might have to say.

If you wanna go around with your chest out pretending it's your way or the highway, that's fine. I'll find another place to contribute.
 
Good lord. I didn't take a piss in your breakfast cereal. I didn't define the modern era in any specific way, though now that you mention it, I'd say it started with the original XT and AT systems. Both of which I've worked on. My first jobs in this industry had me working on archaic hardware, so while I didn't experience this stuff when it was new, I did experience some of it. Though nothing going back to the punch card days.

In any case, it may have started with punch cards and tapes, but the fact is that has little relevance today. Few gamers know what those items are, and if they do, they know nothing outside of historical context. It has as much to do with gaming as the VAX machines or pocket calculators. Outside of historical context, it's meaningless. These items aren't referenced every day and aren't easy to relate to. They are too different from what we use today. So different they are archaic and almost alien.

Beyond the philosophical argument that we wouldn't be at "X" without "A" coming first, how exactly is it relevant to the modern gamer? How does being aware of such things impact your technical ability to troubleshoot a system, build modern systems, install software or play games? Beyond relating legends of such devices to your customers, coworkers and friends, such knowledge serves no practical purpose. It has as much to do with modern gaming as a Roman chariot has to do with a 2011 Mustang GT.

That doesn't mean that such knowledge is totally useless or that we shouldn't bother knowing this history, but to claim relevance beyond anecdotes used in conversation seems silly to me. This knowledge does nothing to further your understanding of what we use today, any more than the abacus helps one understand the technical workings of the pocket calculator or the GPS system in your car.

This^^
 
Well, I'm 31 now and got into PC gaming when I was about 16. I didn't have a Windows machine to start out; I started with DOS 6.2. I had a crappy 386SX 16MHz to start and had to go up from there. My 486DX2-66 ran Doom and the other games of the day OK, but Descent brought it to its knees. Forget about Quake or anything like that. Of course, it didn't come out until I had a 486DX4-100.

I have you beat, pup. MY first computer with games was an Atari 800 with, get this, the memory expanded to 32K (cost an arm and a leg too, this was back when they only came with ). We got to play Pac-Man, Zenon, Bounty Bob, Antz, and Asteroids :D

A far cry from the GTX 580. Hell, my PHONE is a couple of orders of magnitude more powerful. And here we are bickering about 4xAA vs 8xAA at 4MP.
 
From what I've seen elsewhere, and in the [H] review as well: while the frame rate of the 5970 is more erratic (and that's to be expected with multi-GPU), its frame rate lead is sufficient that its minimum frame rate is often just as good as or better than a single-GPU card's. Reviews like AnandTech's confirm these findings across a much broader set of games.

Your parting shot about horrid driver support is telling, not only that you're eyeing up Nvidia cards already, but also that you're naive enough to think that Nvidia driver support is any better. Sure, your current driver issues will go, and be replaced with an entirely new set of problems. Anyway, that's a tangent I won't follow any further.



Again, this is tangential but fundamental, I feel. These sorts of frame rates aren't really fantastic, especially for a hardware enthusiast site. If you were reviewing the 5770 or something for run-of-the-mill middle-to-low-end PCs without much rendering power to play with, you might expect 30fps to be your baseline, but £500 video cards @ 2560x1600?

Personally, as an enthusiast, I find the target frame rate too low; I've mentioned this before as a massive failing in the review structure. HardOCP's methods are more specific than the "canned benchmarks" whose usefulness they downplay, but it's a 0-or-1 result: if you don't agree with the [H] team's subjective verdict of "playable", then the benchmarks actually don't tell you much at all, and canned benchmarks, while less detailed, offer better insight.

Furthermore, the testing methodology doesn't take into account the benefit of the "remainder" frame rate. When a game is set to one configuration, such as 2560x1600 8xMSAA, and both cards get playable frame rates but one frame rate is quite a bit higher, they're seen as equal in that game. Some of us see benefit in that additional frame rate, primarily because I don't agree with the target frame rates to begin with. Put another way, with a higher standard for the optimal frame rate, the difference in FPS between the cards might be enough that, at this new standard, one card actually does need to have its settings lowered where another doesn't. That's why canned benchmarks with graphed results from a wide range of settings are actually more helpful, at least to some of us.

The additional frame rate in something like Metro 2033 is just as valuable to me as the lack of stuttering is to other people, which is why subjective reviews are a dangerous grey area. Certainly, if I'm provided with overwhelming evidence that the average as well as the min/max FPS numbers for a specific card are better, yet gameplay is deemed worse by subjective analysis, then there had better at least be an investigation into why, or some kind of detailed reason as to why I should abandon the frame rate numbers and trust a subjective measurement.



I refuse to believe we cannot measure it. Someone with a statistics background might be able to give a better method of measurement, maybe the standard deviation of the frame rate to measure a kind of jitter. The problem is that this is often just as much engine-based and different per game as all the other factors; you have to shoot for an average over many games, since you don't know what the reader will be playing. That's why I measure general/overall performance in wins/losses over a number of games, and by that measure I see the 5970 winning here, and at AnandTech.
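For what it's worth, the standard-deviation idea above is easy to prototype. Here is a minimal Python sketch (the function name and the sample frame times are hypothetical, purely for illustration): it flags a run with uneven frame pacing even when its average FPS matches a perfectly smooth run.

```python
import statistics

def jitter_stats(frame_times_ms):
    """Summarize frame-time consistency for one benchmark run.

    A higher standard deviation means less even frame pacing
    (perceived stutter), even when the average FPS looks fine.
    """
    avg = statistics.mean(frame_times_ms)
    sd = statistics.stdev(frame_times_ms)
    return {
        "avg_fps": 1000.0 / avg,
        "stdev_ms": sd,
        # Coefficient of variation: jitter relative to the mean,
        # so fast and slow cards can be compared on one scale.
        "cv": sd / avg,
    }

# Two hypothetical runs with the same average FPS but different pacing:
smooth = [16.7] * 8
stuttery = [10.0, 23.4, 10.0, 23.4, 10.0, 23.4, 10.0, 23.4]
```

Both runs average about 60fps, but only the second has a large standard deviation, which is exactly the "erratic frame rate" complaint a plain FPS average hides.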



Actually, you mean the frame rate isn't as consistent in SOME games. The frame rate looks just as consistent to me in the last three games; it doesn't look any more stable for either card. With such a small game sample size, it's likely there are games where the 5970 might appear more stable.

I registered just because I couldn't resist replying to your gibberish.

I won't get into the details of why you are wrong on so many levels, since several people have already tried explaining to you why relying on canned benchmarks is not realistic in the real world. But you're still blindly trying to fool yourself that higher frames on a chart are better, even if it's 1-2 frames apart.

but one frame rate is quite a bit higher, they're seen as equal in that game but some of us see benefit in that additional frame rate, primarily because I don't agree with the target frame rates to begin with

What on earth did you try to explain here? That one frame rate makes all the difference? Give me a break!

I can't interpret what you just wrote here, no matter how many times I've read it over and over again.

You're wrong, just accept it!
 
Wow, really? Kyle, why did you delete the two posts I had? Considering how offensive you come across and your abusive behavior towards your readers, I'm rather surprised you would delete constructive criticism of your articles. What I wrote (about using MP to test, or using an OC'd card) could only benefit this site and others. If you don't agree with it, fine, reply to it. But deleting it shows a lack of appreciation for what others might have to say.

If you wanna go around with your chest out pretending it's your way or the highway, that's fine. I'll find another place to contribute.

I can't speak for Kyle, but I'll throw my $0.02 in anyway. Using multiplayer to test with has been brought up a few times over the years. It simply introduces too many variables that make reproducing your results even harder than it already is. Actually using gameplay to test with isn't easy as it is; it's very hard to do the exact same things each time, duplicating your results pixel for pixel, shot for shot, move for move. It's already hard enough given that computer A.I. deliberately does things differently in certain games in order to give the appearance of more randomized circumstances: enemy placement is sometimes varied, as are enemy actions, etc. Going online introduces variables like network connectivity, ISPs, ping times, servers, players, their actions, player load, etc.

Testing is about reproducible results: quantifying them, as well as evaluating them in a meaningful way, in order to answer specific questions. You will not get the desired results by introducing tons of variables beyond your control. That makes the results from one gaming "run" or another totally useless for comparison.
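To make the reproducibility point concrete, here is a hedged Python sketch (the helper name, the 5% threshold, and the run numbers are all made up for illustration): it refuses to compare two cards' averages when either card's own runs are too noisy, which is exactly the failure mode multiplayer testing invites.

```python
import statistics

def runs_comparable(runs_a, runs_b, max_cv=0.05):
    """Only compare two cards' average FPS if each card's own repeated
    runs are consistent (coefficient of variation under max_cv).

    Multiplayer testing tends to fail this gate because network
    conditions and other players change every run.
    Returns the FPS difference (a minus b), or None if too noisy.
    """
    for runs in (runs_a, runs_b):
        cv = statistics.stdev(runs) / statistics.mean(runs)
        if cv > max_cv:
            return None  # run-to-run spread swamps the card-to-card gap
    return statistics.mean(runs_a) - statistics.mean(runs_b)

# Hypothetical data: tight single-player runs vs noisy multiplayer runs.
sp_card_a = [61.0, 60.5, 61.2]
sp_card_b = [58.9, 59.3, 59.0]
mp_card_a = [48.0, 70.0, 55.0]
```

With the tight single-player numbers the ~2fps gap is meaningful; with the noisy multiplayer numbers the same comparison is rejected outright.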
 
Of course it wouldn't be exact, but based on past experience moving card to card, chip to chip, I haven't seen a large disparity in results. Sure, it's not ideal, but it gives a much better representation of what we'll see in the real world w/ that card/chip combo.

For example, BC2 SP is playable on a dual-core chip, not so with MP, as there's just far too much going on. Anyway, it's just a suggestion. Maybe a separate article to have fun with?

Regardless, thank you for taking the time to explain your opinion. I feel as though you read mine and offered a great response to my feedback (as pissy as it was).

Thanks!
 
Of course it wouldn't be exact, but based on past experience moving card to card, chip to chip, I haven't seen a large disparity in results. Sure, it's not ideal, but it gives a much better representation of what we'll see in the real world w/ that card/chip combo.

For example, BC2 SP is playable on a dual-core chip, not so with MP, as there's just far too much going on. Anyway, it's just a suggestion. Maybe a separate article to have fun with?

Regardless, thank you for taking the time to explain your opinion. I feel as though you read mine and offered a great response to my feedback (as pissy as it was).

Thanks!

Edited.

I didn't mean for it to come out "pissy."
 
I believe he meant that his earlier response, the one you replied to, was "pissy", yet you still took the time to respond, and he appreciated that.

I thought that might be the case, but I went ahead and removed some text from my post which really wasn't relevant and could have been taken in a less than positive light. I didn't mean anything by it, but figured it might not be construed that way by everyone.
 
Anyway, back on topic...

I was able to OC the 580 to 890 core and 2200 memory last night. Was topping 93 C but fan was only running ~80-85%.

Will try for more tonight.

Mine's been sitting pretty on auto fan speed (maxing at 80-85C with the fan at 60% on auto after long gaming sessions) with 860 core / 2200 memory... I don't think you need your fan speed so high for your overclock there :).
 
I thought that might be the case, but I went ahead and removed some text from my post which really wasn't relevant and could have been taken in a less than positive light. I didn't mean anything by it, but figured it might not be construed that way by everyone.
I found nothing offensive in your post...maybe you should put it back. :p

I was being pissy and rude...
 
Mine's been sitting pretty on auto fan speed (maxing at 80-85C with the fan at 60% on auto after long gaming sessions) with 860 core / 2200 memory... I don't think you need your fan speed so high for your overclock there :).

Fan was on auto. I need to work on my case to improve airflow; it is kind of bad right now. I think one of the intake fans isn't working, and I know my top outlet fan isn't working (the blade assembly actually fell down into the case; I just haven't replaced it). Stuff for the weekend.

If I can lower temps well enough, I cannot wait to see what I can get the core to.
 
Only made it through the first page at 40 ppp, but one thing that really struck me as odd, coming from the likes of SirGCal and some others, is all the hate on the 16:10 format. Why the dislike for this format? I realize it is a matter of personal preference, but the way I see it, us computer users are getting shafted due to the wide-scale adoption of HDTVs.

If I have a large LCD monitor, I would like to have as much resolution as possible. All having a 16:9 monitor does to (not for, as it certainly isn't a plus) me is take away resolution from me. Why should I have to suffer from ridiculously low vertical resolution, at least from a relative standpoint? Even at "only" 1920 horizontal resolution, having only 1080 in vertical is rather annoying. Sure, in fullscreen games it isn't an issue, but it really is a hamper on productivity. I can't have as much stuff on screen, and even in games it affects me, as lower resolution results in lower picture quality and overall experience.

When I upgraded from a 20'' to a 23'' monitor, I expected more usable screen space, as far as resolution goes. It is unacceptable that my 1680x1050 20'' and my 1920x1080 23'' have almost exactly the same amount of usable vertical space. I knew when I made the purchase that it wasn't much of an increase, but I was still somewhat disappointed. For me personally, a larger screen with what I consider a low resolution creates inferior screen images. Resolution should scale appropriately with physical monitor size. And while 1080 vertical may be the standard for 23'' monitors, I want all the resolution I can get.

The move by manufacturers to make *primarily* 16:9 instead of 16:10 is a rip-off to consumers. It probably has something to do with manufacturing costs, particularly only needing one fabrication process for HDTVs and monitors, but it still sucks. This recent move to 16:9 as the common ratio instead of 16:10 is a step BACKWARD from monitors of only a few years ago.

Overall, I place the blame squarely on the huge rise in popularity of HDTVs, and what I like to think of is a result of the massive uptake in console gaming. So there, it's all console gamers' fault with their low resolutions and kiddiefied games. :p

Anyway, that was just my monitor rant, but we are [H]ard here right, so I'm allowed to be a picky elitist, aren't I? :D
 
I'm just going to slap my two cents in that black bars aren't a bad thing at all. A 24" 16:10 has more screen space than a 24" 16:9. For the majority of my uses, I have more screen space. For a few uses, I sacrifice some screen space... but it's really not that bad since a comparable 24" 16:9 would already have less screen space for everything else and only slightly more for 16:9-only applications.

Either way, I had my 1600x1200 4:3 Samsung before my 1920x1200 Dell; there was zero chance I was ever going to drop vertical resolution below 1200 once I had the Samsung. Anything less than 1200 vertical rez now is a huge regression for me.
 
16:9 is the worst thing that happened to computer monitors. Vertical resolution is absolutely horrible on them; I hate it when I have to sit in front of one. I feel like I'm back on a 15" 4:3. Fortunately, quality LCDs are still made in 16:10; 16:9 is more common among TN crap.

I don't see how black bars are an issue at all. Anyway, 16:9 screens have them too, because movies are wider.
 
I, for one, really appreciate the 2560x1600 reviews. This was exactly the information I was waiting on to refresh my video card.
 
I, for one, really appreciate the 2560x1600 reviews. This was exactly the information I was waiting on to refresh my video card.

Yeah, it doesn't make much sense to spend $500+ every year for the latest and greatest video card but not $1300 once to have the best single monitor to game on.
 
Only made it through the first page at 40 ppp, but one thing that really struck me as odd, coming from the likes of SirGCal and some others, is all the hate on the 16:10 format. Why the dislike for this format? I realize it is a matter of personal preference, but the way I see it, us computer users are getting shafted due to the wide-scale adoption of HDTVs.

If I have a large LCD monitor, I would like to have as much resolution as possible. All having a 16:9 monitor does to (not for, as it certainly isn't a plus) me is take away resolution from me. Why should I have to suffer from ridiculously low vertical resolution, at least from a relative standpoint? Even at "only" 1920 horizontal resolution, having only 1080 in vertical is rather annoying. Sure, in fullscreen games it isn't an issue, but it really is a hamper on productivity. I can't have as much stuff on screen, and even in games it affects me, as lower resolution results in lower picture quality and overall experience.<snip>

Well, that entirely depends on what you're after. Personally, I have far MORE desk space with a wider format. You can shrink the UI down to get more on a page in fewer pixels, so the productivity argument is absolute bupkis. And being a software engineer by trade, I know. Using that as an excuse is just laziness about adjusting your system to your particular tasks. The only hamper on productivity is your willingness to put in the effort. You can be just as productive on even a 4:3 unit as on a 16:9 or 16:10 unit if you set it up properly.

As for the wider format, that's the movie convention. 16:10 is closer to 4:3 than 16:9, for sure. Yeah, you might have more pixels per se, but they're all height. Useless pixels. Plus, with the same top-to-bottom viewing area, I have a wider field of view. I went to 16:9 during FPS tournaments; the others could alter their field of view to match mine, but it squished the contents, which I could not stand. So, do you just want more pixels, or do you actually want to see more content?

16:9 is the worst thing that happened to computer monitors. Vertical resolution is absolutely horrible on them; I hate it when I have to sit in front of one. I feel like I'm back on a 15" 4:3. Fortunately, quality LCDs are still made in 16:10; 16:9 is more common among TN crap.

I don't see how black bars are an issue at all. Anyway, 16:9 screens have them too, because movies are wider.

I never mentioned a problem with black bars other than when watching Blu-ray content. Bars don't bother me. But my point is I want WIDER aspects. If they made a 2.4:1 full anamorphic monitor with no lag, I'd take it over the 16:9 in a heartbeat. The 'computer' industry's knee-jerk jump to 'widescreen' didn't look ahead, and they missed with the 16:10 format.

As for panel construction, until they make a completely lag-free non-TN panel, I can't see calling any TN crap... In fact, most of them today are excellent. Even color accuracy isn't nearly the problem it used to be. Plus, the lower height of the 16:9 format makes color shift even LESS noticeable. In fact, I haven't seen many 16:10 TNs I liked, but then again, I just don't like the 10s anyhow.

In the end, to each his own. My priorities are fastest response, widest viewing angle, and a non-squished picture. That means, simply, 16:9.
 
Meh, there is a 4GB version of the 5970; you could easily test against it.
If they tested the 5970 4GB against the GTX 580, the 5970 4GB would get demolished, because they compare roughly equally priced setups; this would pit GTX 580 SLI against a single 5970. The 5970 4GB is over 1200 dollars for a single card. Of course, we wouldn't need to test that, would we, since we know what the outcome would be.
 
If they tested the 5970 4GB against the GTX 580, the 5970 4GB would get demolished, because they compare roughly equally priced setups; this would pit GTX 580 SLI against a single 5970. The 5970 4GB is over 1200 dollars for a single card. Of course, we wouldn't need to test that, would we, since we know what the outcome would be.

QFT, you could buy SLI GTX 580s and a game or two for $1200 plus tax :)
 
Yeah, it doesn't make much sense to spend $500+ every year for the latest and greatest video card but not $1300 once to have the best single monitor to game on.


Completely agree. I dropped 500 dollars on my Samsung 245BW (1920x1200) almost 3 years ago, even when I didn't have a graphics card (8800GT) that could really push the resolution. It was the most expensive single thing I'd bought in a long time, but it was an upgrade for the future.
 
How many of you guys who own 30" monitors actually play with 8x AA?

I understand that it was used for testing the VRAM limit, but is 8x AA even necessary at that high a resolution?

And tell the truth now!

There were numerous people with valid posts in here that just got deleted, or who got flamed with "you cannot afford a 30-inch monitor, so keep quiet."

And I'm still confused about that one graph where all cards were hitting minimum FPS of 0 and maxes of 70, yet the GTX was the better card?
 
Well, that entirely depends on what you're after. Personally, I have far MORE desk space with a wider format.

So, do you just want more pixels, or do you actually want to see more content?

I don't understand how you have more desk space with 16:9 versus 16:10. One is 1920x1080 and one is 1920x1200, so 16:10 has the same horizontal pixels and more vertical pixels, which makes more pixels total. So how is that less space?

I don't see how anyone would argue for less vertical height. If you want more width, okay, but why take it at the cost of height when you can have both with 16:10?
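The pixel arithmetic behind this point is easy to verify; here is a quick illustrative Python check (the variable names are made up for the sketch):

```python
def pixels(w, h):
    """Total pixel count of a w-by-h display."""
    return w * h

p_169 = pixels(1920, 1080)   # common 16:9 panel
p_1610 = pixels(1920, 1200)  # common 16:10 panel

# 16:10 keeps the same width and adds 120 extra rows of pixels:
extra_rows = 1200 - 1080
extra_pixels = p_1610 - p_169
print(extra_pixels, f"{extra_pixels / p_169:.1%} more pixels")
```

So at the same 1920 width, the 16:10 panel has 230,400 more pixels, about 11% more than the 16:9 panel, which is the "how is that less space?" objection in numbers.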
 
I agree with you, Forceman. 16:9 vs 16:10 is not going to give you more desktop space; in fact it should be less, as most websites, documents, etc. are vertical.
 
I don't understand how you have more desk space with 16:9 versus 16:10. One is 1920x1080 and one is 1920x1200, so 16:10 has the same horizontal pixels and more vertical pixels, which makes more pixels total. So how is that less space?

I don't see how anyone would argue for less vertical height. If you want more width, okay, but why take it at the cost of height when you can have both with 16:10?

I agree with you, Forceman. 16:9 vs 16:10 is not going to give you more desktop space; in fact it should be less, as most websites, documents, etc. are vertical.

You're looking at pixels, not actual use. If you have the same image height viewable on both monitors, the 16:9 model has more to the sides. It's all about the aspect and point (or field) of view; I want the better field of view. If you use the desktop the same way, you have more usable area. If you're an "everything is fullscreen" person, then you have no idea about utilizing desktop space in a development environment anyhow, so it doesn't matter what resolution you're running.
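The equal-height comparison being argued here also checks out geometrically: fix the vertical size, and a 16:9 screen is wider than a 16:10 one. A small illustrative Python sketch (the function and names are hypothetical):

```python
def width_at_height(aspect_w, aspect_h, height_px):
    """Horizontal size when both screens show the same image height."""
    return height_px * aspect_w / aspect_h

# Hypothetical shared vertical size of 1080 units:
h = 1080
w_169 = width_at_height(16, 9, h)
w_1610 = width_at_height(16, 10, h)
print(w_169, w_1610)  # 1920.0 vs 1728.0
```

So the two posters are measuring different things: at equal pixel width, 16:10 has more total pixels, but at equal physical (or rendered) height, 16:9 shows more to the sides.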
 