In summary, all NVIDIA-based video cards were playable at this resolution in GTA IV, albeit with some lowered settings. On the AMD side, only the Radeon HD 4870 X2 was playable at 2560x1600; the 4870 1GB, 4870 512MB, and 4850 were not playable at this resolution.
Yes, I *really* like sites that include minimum framerates in their reviews, because that's at least as useful to know as averages, if not more so. Xbitlabs does it as well, they just get reviews out slower.
When writing a head-to-head article and we see one of the models dominating the other, it's not so difficult to come up with a conclusion. That was the case with our article two weeks ago. NVIDIA's offering proved to be faster, more power efficient, and cheaper. It's impossible to misinterpret that. NVIDIA clearly had the better card. It's simply something that couldn't be argued.
Rockstar sucks, not ATI. Before the beta drivers, Nvidia cards were as bad as the Radeons.

Sure, picking one graph is always fun. Like this one:
Now show me one where the 260 sucks as hard as the HD4870 does here...
Showing graphs is even better.
Another good graph: the GTX260 edges out the 4870 just barely without AA, while the 4870 shines with AA.
Hi all:
I'm the one who wrote the Techgage article, and I'd just like to respond to a few comments here.
130FPS is not impossible at 1680x1050, as the chart shows. In addition to the information shown on the graph itself, we also include direct screenshots from each game to show exactly how it was configured (the L4D screenshots show Vsync enabled, though it was actually disabled in order to achieve high FPS), and I can assure you, our results are as accurate as possible. Like [H], we don't utilize timedemos, and we re-run each setting/game multiple times in order to achieve accurate results.
The exact run goes as follows:
Benchmarking begins in the last safe house for that mission, and Fraps' FPS recording is started right before the door is opened. I proceed up the stairs and through the doors, hop on the trash can that's lying on its side and shoot the zombies up and down the hall. I then proceed to run down the hall, stopping at each doorway to take care of zombies that might be in each room.
At the end of the hallway is the metal grate which must be shot out, and after taking care of zombies, I make my way up the ladder to the roof and run to the top-left corner, then look down at the helipad. That's when I stop Fraps recording.
If you don't trust my results, apply the same settings and follow my general instructions there. Each run varied between ~1 - 3 FPS, so I wouldn't expect anyone's results to stray far.
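The repeat-run check described above can be sketched in a few lines of Python. The FPS samples here are invented for illustration; real numbers would come from Fraps' per-second log for each run.

```python
# Compare average FPS across repeated runs of the same benchmark section.
# Sample data is hypothetical, not from the actual review.

def avg_fps(samples):
    """Average FPS over one run's per-second samples."""
    return sum(samples) / len(samples)

# Three hypothetical repeats of the same Left 4 Dead section.
runs = [
    [128, 131, 135, 126, 133],
    [130, 129, 136, 127, 134],
    [127, 132, 134, 128, 132],
]

averages = [avg_fps(r) for r in runs]
spread = max(averages) - min(averages)
print([round(a, 1) for a in averages])  # [130.6, 131.2, 130.6]
print(round(spread, 1))                 # 0.6
```

A spread under 1 FPS between repeats is well within the ~1-3 FPS run-to-run variance mentioned above.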
I have to agree. We should have used 8xAA, and to be honest, I'm unsure why the thought didn't cross my mind. To call us biased is truly ridiculous, however, especially due to a chosen setting in one game. We put a ton of effort into both our testing process and methodology, and we aren't just going to blow that by being biased. Besides, I'm not sure how we can be biased when we concluded that both cards are essentially equal...
That aside, thanks to anyone who commented on the article, and we'll be sure to better choose our anti-aliasing settings for our future GPU content.
Just giving a number is different from showing a graph.
One frame drop to 10FPS over the whole run is something I don't care about, compared to a drop to 20FPS every 5 seconds. That's something a number won't tell you, but a graph will.
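The point above can be made concrete with a small sketch: two runs can share the exact same average FPS yet play completely differently. The sample data is invented.

```python
# Two runs with identical averages but very different behavior.

def summary(samples):
    """Return (average, minimum) -- the two numbers a review might report."""
    return sum(samples) / len(samples), min(samples)

one_bad_drop   = [60]*4 + [10] + [60]*5                    # a single dip to 10
constant_drops = [70, 70, 20, 70, 70, 20, 70, 70, 20, 70]  # stutter to 20 every few seconds

print(summary(one_bad_drop))    # (55.0, 10)
print(summary(constant_drops))  # (55.0, 20)
```

Identical 55 FPS averages, and the minimum even favors the second run; only a full frame-rate graph reveals that the second run stutters constantly while the first dips once.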
If you read what you quoted from the review, you might notice that the conclusion you're referring to was reached two weeks ago in a previous review that did not include the new ATI drivers. In this review, the GTX260 won every test.

Your conclusion:
Not difficult to come up with a conclusion? Only in three of the eight games was the 260 faster, and one of those could go the other way with 8x AA (and you couldn't get the 260 to run the highest resolution). The rest were about equal, or the victory went to the 4870. I would find it difficult to come to that conclusion even from your own results, unless my conclusion wasn't based on the results themselves.
The conclusion? There is no conclusion. Given the pricing information above, I think both cards come out equal. ATI's card costs $20 less, but isn't quite as powerful as NVIDIA's card in certain games (most notably, Call of Duty: World at War). On the other hand, NVIDIA's card costs $20 more, but it runs a bit cooler, is more power efficient, and supports PhysX, which may be a big thing next year. It's really difficult to conclude on this one, so it's a matter of choosing what's more important, money saved now, or the certain perks that NVIDIA's card carries (namely PhysX). The good thing? It's difficult to go wrong with either.
If you read what you quoted from the review, you might notice that the conclusion you're referring to was reached two weeks ago in a previous review that did not include the new ATI drivers. In this review, the GTX260 won every test.
So you have effectively taken something that Techgage said over two weeks ago in a previous review, and used it to support your argument that their current review is flawed.
If you read past what you quoted from the Techgage review, you can find the updated conclusion:
The GTX260 is the better pick IMHO, mainly because the 1GB of VRAM isn't really that useful on the HD4870 even at higher resolutions due to the 256-bit bus, and Nvidia has better drivers, PhysX support right on the card, and more games optimized for its hardware. Personally I would rather have the HD4870 512MB over the 1GB version, because the 1GB HD4870 can't reach the high memory clocks that the vanilla HD4870 can.
Dude, are you on something?
I see a lot of those around $230 on Newegg.
Dude, are you on something?
Picture quality has been superior on ATI cards for years. Nvidia drivers are a main reason Vista got a bad first impression. ATI has looked sharper and more realistic side by side, IMHO.
Rockstar sucks, not ATI. Before the beta drivers, Nvidia cards were as bad as the Radeons.
They most certainly were not.
180.48 WHQL runs it just as well as 180.84 Beta or 181.00 Beta, and I've conducted over a dozen benchmarks to prove it.
Are you? Take a look at Crytek, Eidos, Valve, BioWare, etc.
All of those companies develop their game engines around Nvidia cards, so their games run a lot better on Nvidia hardware, and new Nvidia drivers are constantly being released and are better than ATI's. With all of the GTX 260 deals out there at the moment, I see no reason anyone would go with an HD4870 unless they're on a CrossFire motherboard.
Now that Nvidia has bought PhysX and is starting to utilize it on their cards, Nvidia is even more appealing. To be honest, I would only buy an ATI card if I was on a budget and PC gaming wasn't my main concern.
Hmmm, many people reported a jump in FPS. Maybe it was only for specific cards, I don't know, but Nvidia suggested the beta drivers too, so there must be something special in them for GTA.
Well, he is on something if he thinks a 512MB card is better in any way, shape, or form than a 1GB card. He brought absolutely NOTHING to the table other than stating that the GTX260 is a good deal/on sale right now.
The 512MB is the better card because its memory overclocks a lot higher than the 1GB version's, and the 512MB version is a lot cheaper. I don't think you read my full post if you think that's all I brought to the table. ^^ The 1GB on the HD4870 can't be fully utilized because of the 256-bit bus.
The 1GB on the HD4870 can't be fully utilized because of the 256-bit bus.
HD 4870 Memory Bandwidth: 115.2 GB/sec
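For what it's worth, that 115.2 GB/sec figure follows directly from the bus width and the memory's effective data rate: the HD 4870's GDDR5 runs at 900 MHz, which works out to 3600 MT/s effective. A quick sketch of the arithmetic:

```python
# Memory bandwidth = bus width (bytes) x effective data rate.
# HD 4870: 256-bit bus, 900 MHz GDDR5 -> 3600 MT/s effective.

bus_width_bits = 256
effective_mtps = 3600  # mega-transfers per second

bandwidth_gbs = (bus_width_bits / 8) * effective_mtps / 1000
print(bandwidth_gbs)  # 115.2
```

So the 256-bit bus isn't starving the card of bandwidth; whether 1GB of VRAM helps is a separate question from bus width.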