6800GT & Far Cry @ 1600x1200? No Problem!

oozish said:
what the hell happened to my thread?

Post in the nVidia Flavor forum next time, or this will happen in every thread you start.
 
oozish said:
what the hell happened to my thread?

A disease called fanboi. :D You had an excellent thread going, showing some good stats to give users a great idea of how the card would perform. Thanks for sharing.....
 
Blackwind said:
The only person "ignoring reality" would be you burningrave101. The reality is that FRAPS is an excellent tool and always has been.

No one ever said it wasn't. I said the results gathered with FRAPS are unrepeatable and influenced by a lot of things other than just the GPU. I've said several times that testing in-game with FRAPS is noteworthy, but I also said it was not as accurate as testing with demos, even if it's a more "realistic" overview of performance.

Blackwind said:
If every single person within this thread ran FRAPS in Far Cry at 1600x1200 with 4xAA/8xAF? YES. The results would not be far apart and an average could be obtained. This is what the reviewers are attempting to do for us here, and whether you like it or not...their findings are very relevant and accurate.

That just shows you don't know anything about testing with FRAPS. If you did then you would know the performance would change considerably according to the level, part of the level, AI reaction, and other hardware in the system.

Have you not looked at all the SM 3.0 Far Cry benches on the net? Not a single one of them gave the same results, even though they ran the same settings and levels for the most part. You could look at the same level and settings on three different sites and there would often be a 10-15 fps or more difference. And that's all because they recorded their own demos in-game from all different parts of the level.

Blackwind said:
Some people do very much care about numbers, because that's what they have. They can't afford to go out and buy both cards and compare themselves. [H] was not the only site to declare the X800 better. Again....who isn't living in reality? :rolleyes:

lol

Blackwind said:
The point isn't always to sit around and compare numbers and see whose is highest, but to use those figures to gather data on the performance of the card and to task it accordingly. If all we cared about were numbers we'd sit around and run 3DMark non-stop with no AA or AF.

Once again you're contradicting yourself. First numbers weren't important and now they are. You say the exact opposite in each of your posts just so you can try to avoid the fact you don't know what you're talking about.

I think we should all get back on topic though and drop it. I'm going to post some benches here in a little bit probably.
 
burningrave101 said:
No one ever said it wasn't. I said the results gathered with FRAPS are unrepeatable and influenced by a lot of things other than just the GPU. I've said several times that testing in-game with FRAPS is noteworthy, but I also said it was not as accurate as testing with demos, even if it's a more "realistic" overview of performance.

Yuh think? :rolleyes: That is the entire point of reviewing gameplay. I want you to take a minute and pay special attention to what I have highlighted in your statement and then relate it to the term ...GAMEPLAY.

[H] is reviewing video cards. Their focus is gameplay. The results from both their methods of testing and what some label an "apples to apples" comparison give one a very good idea of the behavior and performance of a given video card tested.


burningrave101 said:
That just shows you don't know anything about testing with FRAPS. If you did then you would know the performance would change considerably according to the level, part of the level, AI reaction, and other hardware in the system.

Right. I've only used FRAPS since its release. I know nothing about it. Again, I've highlighted for you..... Yuh think? Hence the reason that every other site and [H] list the hardware involved in order to give a reader a starting point of reference. Take your time on the following sentence because I know reading comprehension isn't your strong suit.

Obtaining an average.

I'm sure many [H] forum readers do very much the same as I do. Take the figures from various sites and various platforms and compare them to the hardware they use at home in their machines, to better understand the various pieces affecting the puzzle: front-side bus, processor, GPU, memory and many other facets involved.

burningrave101 said:
Have you not looked at all the SM 3.0 Far Cry benches on the net? Not a single one of them gave the same results, even though they ran the same settings and levels for the most part. You could look at the same level and settings on three different sites and there would often be a 10-15 fps or more difference. And that's all because they recorded their own demos in-game from all different parts of the level.

Yes, I've read the various reviews available.....and have seen the expected results. While a variance of 10-15 fps is unacceptable to you...it is exactly what I expect from the testing and tools involved. When all the elements are taken into account it gives me the information I needed from the reviews in question....
 
Good for the original poster, and good for everybody else who gets in at the sweet spot in the nVidia product curve, which appears to be the GT, hands down. I've yet to see somebody talk about a GT that wouldn't clock as high as an Ultra.

I drive a couple of 191Ts with the eVGA 6800GT I put in over the weekend, so I don't ever go over 1280x1024, but it is really nice to go into a game, even at that rez, turn everything to very high, and still get a smooth gameplay experience. My old 9700 Pros couldn't do it, heh.

I tried out Far Cry, then installed Thief: Deadly Shadows (smooth while maxed), and have to check out Hitman 3 with this card on its crazy max settings tonight.

Unless people are looking to drive things at 1600x1200 with full AA/AF, you don't need anything more than an X800 Pro or a 6800GT.

Oh, and boo to the threadcrappers. I don't know what you guys think you're trying to prove. The only point of this post was that the original poster is enjoying his new card. Are you trying to say he isn't enjoying it? lol.
 
Blackwind said:
Right. I've only used FRAPS since its release. I know nothing about it. Again, I've highlighted for you..... Yuh think? Hence the reason that every other site and [H] list the hardware involved in order to give a reader a starting point of reference. Take your time on the following sentence because I know reading comprehension isn't your strong suit.

Obtaining an average.

I'm sure many [H] forum readers do very much the same as I do. Take the figures from various sites and various platforms and compare them to the hardware they use at home in their machines, to better understand the various pieces affecting the puzzle: front-side bus, processor, GPU, memory and many other facets involved.

An average is worthless and you just can't seem to realize that. Your reading comprehension comments are getting quite old when in fact you cannot seem to understand a damn thing I've tried to tell you. I've explained over and over and over again why averages and results with FRAPS can be inaccurate, but your reading comprehension just keeps getting the best of you. I've explained over and over again why testing in-game brings added variables you do not want if you're going to compare actual NUMBERS and not just how it feels. If you want to test in-game and comment on the IQ and playability, that's one thing. But the actual figures will be inaccurate and unrepeatable. Anandtech even said that about FRAPS testing in their last review, because they couldn't come up with the same results twice.

Several people here on HardOCP have said that when they tested an X800 Pro and 6800GT, the 6800GT felt smoother, even with high AF. Some of you obviously don't know that FRAPS can be detected in-game and driver teams from nVidia or ATI can optimize for it. That right there would give you higher frame rates on one card while the other card was still more playable.

Welcome to the world's first in-game 3DMark03.

Blackwind said:
Yes, I've read the various reviews available.....and have seen the expected results. While a variance of 10-15 fps is unacceptable to you...it is exactly what I expect from the testing and tools involved. When all the elements are taken into account it gives me the information I needed from the reviews in question....

I'm not talking about a 10-15 fps variance in results for both cards there, Mr. Reading Comprehension. I'm talking about a 10-15 fps difference and more in the results for EACH card. That means if you look at the same level and same settings on three different sites, sometimes the 6800U is ahead by 15 fps, sometimes by 3 fps, and sometimes not at all.

BOTH cards should go up or drop in performance according to the hardware.

The reason we're seeing so many variances in performance in the different levels of Far Cry, across different hardware sites, is because a lot of them recorded their own demos, and I'm sure none of them were recorded in exactly the same place.

Then you have to take SM 3.0 into account. There is a reason why the 6800s perform so much better in the nVidia-recorded demos, and that's because they knew exactly where SM 3.0 was used the most in the levels they recorded demos for. The rest of these review sites can guesstimate, but that's about it.

The CPU is very much involved in the performance results when you test in-game. The exterior parts of Far Cry are very CPU-intensive. And with the AI reacting differently each and every time, you can't possibly expect to get accurate numbers for comparing two cards when the difference in their performance is a mere 5-10 fps or less the majority of the time.
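
Just to make the point concrete, here's a rough little Python sketch of what I mean (the FPS figures are made up purely for illustration, and this is my own script, not anything the review sites actually run): if the gap between the two cards' averages is smaller than the spread between your own repeated FRAPS runs, the numbers can't tell the cards apart.

[code]
from statistics import mean, stdev

# Made-up average FPS figures from five repeated FRAPS runs of the same
# Far Cry level at the same settings (illustration only, not real results).
runs = {
    "6800GT":  [61.2, 68.5, 55.9, 64.1, 70.3],
    "X800Pro": [63.0, 57.4, 66.8, 60.1, 69.5],
}

for card, fps in runs.items():
    print(f"{card}: mean {mean(fps):.1f} fps, run-to-run spread +/-{stdev(fps):.1f} fps")

# If the difference between the two means is smaller than either card's
# own run-to-run spread, the FRAPS numbers alone can't separate the cards.
gap = abs(mean(runs["6800GT"]) - mean(runs["X800Pro"]))
noise = max(stdev(fps) for fps in runs.values())
verdict = "meaningful" if gap > noise else "lost in the run-to-run noise"
print(f"gap between cards: {gap:.1f} fps vs. noise: {noise:.1f} fps -> {verdict}")
[/code]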
 
FRAPS may have been good back then, but with games like Far Cry with dynamic AI (well, more dynamic than other games out right now ;)), surely an update or something is in order? FRAPS is a great tool and I'm not denying that; I'm just saying that games of now and the future will make FRAPS less useful. Half-Life 2, for example, with its incredibly dynamic AI, would surely make no two benchmarks with FRAPS the same. It wouldn't be the only benchmarking tool that would become less useful, but I mention it because it's pretty on-topic right now :)
 
burningrave101 said:
An average is worthless and you just can't seem to realize that.

An average is everything when so many variables are present....


burningrave101 said:
Your reading comprehension comments are getting quite old when in fact you cannot seem to understand a damn thing I've tried to tell you. I've explained over and over and over again why averages and results with FRAPS can be inaccurate, but your reading comprehension just keeps getting the best of you. I've explained over and over again why testing in-game brings added variables you do not want if you're going to compare actual NUMBERS and not just how it feels. If you want to test in-game and comment on the IQ and playability, that's one thing. But the actual figures will be inaccurate and unrepeatable. Anandtech even said that about FRAPS testing in their last review, because they couldn't come up with the same results twice.

If you are tired of hearing it then attempt to stop putting words in people's mouths and actually read what has been said. How long has Anandtech used FRAPS? I would venture to say just as long as any other reviewer out there. No tool is perfect and anything can be altered. That is why we as readers have our choice of sites to trust and read.

burningrave101 said:
Several people here on HardOCP have said that when they tested an X800 Pro and 6800GT, the 6800GT felt smoother, even with high AF. Some of you obviously don't know that FRAPS can be detected in-game and driver teams from nVidia or ATI can optimize for it. That right there would give you higher frame rates on one card while the other card was still more playable.

Every tool that becomes popular will become a target for each manufacturer to cheat at. This is a given and nothing new.


burningrave101 said:
I'm not talking about a 10-15 fps variance in results for both cards there, Mr. Reading Comprehension. I'm talking about a 10-15 fps difference and more in the results for EACH card. That means if you look at the same level and same settings on three different sites, sometimes the 6800U is ahead by 15 fps, sometimes by 3 fps, and sometimes not at all.

That's exactly what I was talking about. You are the one mistaken in your assumption.

burningrave101 said:
BOTH cards should go up or drop in performance according to the hardware. The reason we're seeing so many variances in performance in the different levels of Far Cry, across different hardware sites, is because a lot of them recorded their own demos, and I'm sure none of them were recorded in exactly the same place.

In a perfect world, sure....this isn't one.



burningrave101 said:
Especially when SM 3.0 is taken into account. There is a reason why the 6800s perform so much better in the nVidia-recorded demos, and that's because they knew exactly where SM 3.0 was used the most in the levels they recorded demos for. The rest of these review sites can guesstimate, but that's about it.

Of course...you don't have an issue with that....go figure. :rolleyes:


eastvillager said:
Good for the original poster, and good for everybody else who gets in at the sweet spot in the nVidia product curve, which appears to be the GT, hands down. I've yet to see somebody talk about a GT that wouldn't clock as high as an Ultra.


There are several who have not been able to, especially on the BFG hardware. Try nvNews......if you can wade through the fanboi drivel and find the information.
 
Blackwind said:
There are several who have not been able to, especially on the BFG hardware. Try nvNews......if you can wade through the fanboi drivel and find the information.

"Has anyone NOT gotten their GT to run @ Ultra speeds? ( 400/1100 )"

http://www.nvnews.net/vbulletin/showthread.php?t=31513

My XFX 6800GT auto-detects in CoolBits at around 420/1130. The "Detect Optimal Frequencies" option in CoolBits is usually fairly conservative too, especially with the RAM.

BTW, nV News just got their new review up covering the 6800's performance. And they tested Morrowind :).

http://www.nvnews.net/articles/6800_games/index.shtml
 
Yeah, I gotta say it looks like people are hitting Ultra speeds pretty consistently, at least on the core, with most people able to go beyond 400 MHz. I try to think of it the same way people have been OC'ing AMD processors to match/beat Intel CPUs, or at least be more competitive, while still laying out a smaller amount of cash. Or like, instead of buying a Ferrari, you buy a cheaper car and boost it up. Although in this case it's like getting a free supercharger; you just have to flip the switch.
 
Man, this site just can't have a civil video card discussion. The thread starter bought a 6800GT and loves the performance it gives in Far Cry. Great! I bought an X800 Pro and I love the performance it gives in Far Cry. Excellent! Why do we then have to argue endlessly over which card is better? They're both blazingly fast and offer smooth gameplay with excellent IQ at high settings. Why can't we just leave it at that?
 
edit/update: AI does have an impact on real-world gameplay, of course. I find that although I'm CPU-limited in benchemall at all resolutions, when I use FRAPS and watch things in-game, I'm much more comfortably staying above 60 fps at 1280x1024 than at 1600x1200.

Now, I'm an FPS junkie, and a twitch-knob (er...nevermind about that)...so I really can't stand it when the minimum framerate gets down to 30-40. To keep my minimum near 50-60, I needed to drop down to 12x10. Other people may not have any problem playing at a higher rez with a slightly lower minimum fps.

BTW, I think I could enable 2xAA/4xAF and have no trouble at this rez either; those low AA/AF settings hardly impact performance at all, and not noticeably in my gaming at this rez. Knock it up to 4xAA/8xAF and I start to notice the minimum frames again.

These comments are my own, based on my own subjective experience and watching FRAPS in-game.
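
For anyone who wants to turn that kind of FRAPS-watching into actual numbers, here's a quick Python sketch of how you could pull an average and a rough worst-second figure out of a per-frame timestamp dump. It assumes a plain text file with one timestamp in milliseconds per line (roughly what a frametimes log gives you); the filename is just a placeholder, so adjust the parsing to whatever your log actually looks like.

[code]
from collections import Counter

def fps_stats(path):
    # One frame timestamp (in milliseconds) per line is assumed.
    with open(path) as f:
        stamps = [float(line) for line in f if line.strip()]

    total_s = (stamps[-1] - stamps[0]) / 1000.0
    avg_fps = (len(stamps) - 1) / total_s

    # Bucket frames into whole seconds and take the emptiest complete
    # second as a rough "minimum FPS" figure.
    buckets = Counter(int((t - stamps[0]) // 1000) for t in stamps)
    full_seconds = [buckets[s] for s in range(int(total_s))]
    min_fps = min(full_seconds) if full_seconds else avg_fps
    return avg_fps, min_fps

if __name__ == "__main__":
    avg, low = fps_stats("farcry_frametimes.csv")  # placeholder filename
    print(f"average: {avg:.1f} fps, worst full second: ~{low} fps")
[/code]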
 
oozish said:
edit/update: AI does have an impact on real-world gameplay, of course. I find that although I'm CPU-limited in benchemall at all resolutions, when I use FRAPS and watch things in-game, I'm much more comfortably staying above 60 fps at 1280x1024 than at 1600x1200.

Now, I'm an FPS junkie, and a twitch-knob (er...nevermind about that)...so I really can't stand it when the minimum framerate gets down to 30-40. To keep my minimum near 50-60, I needed to drop down to 12x10. Other people may not have any problem playing at a higher rez with a slightly lower minimum fps.

BTW, I think I could enable 2xAA/4xAF and have no trouble at this rez either; those low AA/AF settings hardly impact performance at all, and not noticeably in my gaming at this rez. Knock it up to 4xAA/8xAF and I start to notice the minimum frames again.

These comments are my own, based on my own subjective experience and watching FRAPS in-game.


I 100% agree with you.....everything you have detailed is what I have seen from my 6800GT thus far. I also prefer 1280x1024. I'm usually running at 4xAA/8xAF. It can become annoying at points, with a little jump here and there, but playable. My drops in framerate are very brief. Will have to try your idea of 2xAA/4xAF tonight and see what difference I see....or don't see, for that matter. :D All in all I believe the 6800GT is the bang for the buck when it comes to the latest releases from ATI and nVidia. I'm supposed to get an X800 early next week to compare.....pretty sure I'll stick with the 6800GT on my main gaming machine.
 
Dijonase said:
Man, this site just can't have a civil video card discussion. The thread starter bought a 6800GT and loves the performance it gives in Far Cry. Great! I bought an X800 Pro and I love the performance it gives in Far Cry. Excellent! Why do we then have to argue endlessly over which card is better? They're both blazingly fast and offer smooth gameplay with excellent IQ at high settings. Why can't we just leave it at that?

ditto
can't lose either way
 
Dijonase said:
Man, this site just can't have a civil video card discussion. The thread starter bought a 6800GT and loves the performance it gives in Far Cry. Great! I bought an X800 Pro and I love the performance it gives in Far Cry. Excellent! Why do we then have to argue endlessly over which card is better? They're both blazingly fast and offer smooth gameplay with excellent IQ at high settings. Why can't we just leave it at that?

I completely agree, and although I'm on the fence, I've noticed that since the 9800/9700/9600 many people have found it hard to believe that nVidia can make quality cards. (The 6800GT is the best value-for-money card out there right now if we're talking new; otherwise the 256-bit 9800 Pro is still great.) I can imagine it will go back and forth: the NV4x will probably have this round with its SM 3.0 etc., while with the R5x00 who knows how well ATI will do with SM 3.0. Just sit back and buy the cheapest one.
 
To try and complement what Dijonase wrote, I have something I'd like to say.

I think people would be much more open about the fact that Nvidia is back on track with its new line of cards and that ATI doesn't have its former advantage this round if, IF, some people would just tone it down a tad. Typing paragraph after paragraph, quoting someone repeatedly, and calling them a moron when they don't agree with you is Fanboy'ism if you ask me. Since I'm an ATI supporter at the moment, I find it hard to support anything Nvidia with the level of insanity that surrounds it. Nvidiots who felt the burn last year are HELL BENT on proving something this year. And it's dumb. When you act like an ass pimping your brand of video card, all you do is create more Fanboy'ism in the opposite direction from what you originally intended.

See Fanboy's new card. See Fanboy yell and w00t. See Fanboy flame at non-believers. See Fanboy crash and burn and get banned. DON'T BE A FANBOY!! :D
 
ShowMeThe$$$$$$ said:
See Fanboy's new card. See Fanboy yell and w00t. See Fanboy flame at non-believers. See Fanboy crash and burn and get banned. DON'T BE A FANBOY!! :D

Amen. I went from a Voodoo 5 5500 to a GeForce 4 Ti4400 to my current Radeon X800 Pro. I'm the complete opposite of a fanboy. I'm completely non-brand-loyal. Though my last two cards were both from Visiontek, so maybe that makes me a Visiontek fanboy! ;)
 
Oh, thank the lord Jesus, a [H]ard 6800GT review! Frickin' woot! Thank you, Brent.
 
Well, for the record, mine auto-detects between 418/1090 and 420/1100 every time I run the detection...mostly the latter, so I've set up two overclock settings in PowerStrip: Ultra (400/1100) and Ultra Extreme (420/1100). The highest setting results in about a 3-4°C bump in my load temp.
 
oozish said:
Well, for the record, mine auto-detects between 418/1090 and 420/1100 every time I run the detection...mostly the latter, so I've set up two overclock settings in PowerStrip: Ultra (400/1100) and Ultra Extreme (420/1100). The highest setting results in about a 3-4°C bump in my load temp.

Ultra Extreme is 450/1100.
 