Would it benefit me

Your brain AND eyes can tell the difference between 70 and 72; the minimum suggested refresh rate to limit eye strain is 72 Hz ;), NOT 70.

~Adam
 
CleanSlate said:
Your brain AND eyes can tell the difference between 70 and 72; the minimum suggested refresh rate to limit eye strain is 72 Hz ;), NOT 70.

~Adam

Note: I did not say Hz, I said fps in my statement. It's not hard to tell the difference between 60 and 72 Hz, or 72 and 85 Hz. It's different for FPS, since frames are rendered and then displayed by the monitor at a constant 72 or 80 Hz.

Perhaps you forgot that he's not talking about a CRT here. The same rules don't apply to LCDs, even though Windows assigns them a "refresh rate". They won't cause eye strain the way a CRT will when set to 70 Hz; their response time is the limiting factor. You should also know that LCDs work best at their native resolution, not emulated ones. Setting the "refresh rate" too high on one will just cause blurring.
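To make the response-time point concrete: an LCD only looks as fast as its pixels can settle within one refresh interval. Here's a rough Python sketch of that idea; the 25 ms and 16 ms response times are typical-for-the-era assumptions, not the specs of his monitor.

Code:
# Quick illustration: whether a panel's pixel response can keep up with its
# refresh interval. The 25 ms / 16 ms figures are assumptions for the example.

def frame_time_ms(refresh_hz: float) -> float:
    """Time between refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for response_ms in (25.0, 16.0):
    for hz in (60, 70, 75):
        ft = frame_time_ms(hz)
        ok = "keeps up" if response_ms <= ft else "smears across frames"
        print(f"{response_ms:.0f} ms panel at {hz} Hz "
              f"(frame time {ft:.1f} ms): {ok}")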

Think before you post please.
 
I saw no reference to either a CRT or an LCD.

Either way, to compare 72 and 85 fps there has to be the ability to see the fps, so the assumption about Hz is necessary for that to work; you can't compare the two unless you're talking about at least 72 and 85 Hz..... Think before YOU speak.

~Adam
 
BossNoodleKaboodle said:
Your brain can't tell the difference between 70 and 85 fps. Even the best CRTs won't do much more than 75 Hz at 1600x1200. You'd benefit from a CRT if your monitor doesn't go above 1600x1200 and has a poor response time/contrast ratio.

http://www.3dcenter.de/artikel/2003/11-21_a_english.php
http://graphics.tomshardware.com/graphic/20040603/index.html

There are two good articles on ATI's optimizations that try to trick the eye. I didn't write the reports ;)
Did you even read those?
Nvidia does that too, and is apparently worse; they used to have shimmering problems, though they fixed it.
I really like how you seem to love Nvidia so much that you manage to conveniently not mention that Nvidia does the same thing, and was the first to do it with their FX series.
 
Moloch said:
Did you even read those?
Nvidia does that too, and is apparently worse; they used to have shimmering problems, though they fixed it.
I really like how you seem to love Nvidia so much that you manage to conveniently not mention that Nvidia does the same thing, and was the first to do it with their FX series.

The difference is, you can turn it off with Nvidia's drivers.
 
RLich8 said:
I really just can't decide.

I just upped the core to 504 and the memory to 524 on mine, played HL2 for about 10 minutes, no artifacting. It plays noticeably differently than before. That's making me want to keep the card. I'm thinking of just putting a new cooling system on it and getting a new monitor. I have an LCD right now and it sucks for gaming because the refresh rates are either 60 or 70 on it, which means I can't go higher than 60-70 FPS.

I just really don't know right now, I just don't know...I want the 6800GT, but then again I want to stick with the X800 and overclock it...

If you'd read the entire post from the guy who asked the initial question, you'd see that I was addressing an LCD. Think before you post, once again lol. If you knew how LCDs work vs CRTs, you'd realize that the emulated Windows "refresh rates" aren't important and it's best to run them at their native resolution. What affects image quality is their response time, which is how long it takes a pixel to change color, rather than the time it takes for a CRT's pixels to be redrawn by the electron beam. Try not to misinform the guy asking the questions too much. He's looking for information, not speculation.
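A side note on the refresh-rate point in his quoted post: with vsync on, the display can only show as many unique frames per second as it refreshes, so the visible frame rate is the smaller of the render rate and the refresh rate. A minimal Python sketch of that cap; the 120 fps render rate is just an assumed number for the example.

Code:
# Rough sketch: how vsync ties the FPS you actually see to the refresh rate.

def visible_fps(render_fps: float, refresh_hz: float, vsync: bool = True) -> float:
    """Frames per second the monitor can actually show."""
    if vsync:
        # With vsync the card waits for the next refresh, so you can never
        # display more unique frames per second than the panel refreshes.
        return min(render_fps, refresh_hz)
    # Without vsync the card renders freely, but the panel still refreshes
    # only refresh_hz times a second (the extra frames show up as tearing).
    return render_fps

for refresh in (60, 70, 85):
    print(f"{refresh} Hz panel, card rendering 120 fps -> "
          f"{visible_fps(120, refresh)} visible fps")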
 
BossNoodleKaboodle said:
The difference is, you can turn it off with Nvidia's drivers.
Why would you want to do that?
It didn't fix the shimmering problem, so why would you sacrifice performance for a small increase in IQ? Is Nvidia's filtering really that bad?
 
BossNoodleKaboodle said:
If you'd read the entire post from the guy who asked the initial question, you'd see that I was addressing an LCD. Think before you post, once again lol. If you knew how LCDs work vs CRTs, you'd realize that the emulated Windows "refresh rates" aren't important and it's best to run them at their native resolution. What affects image quality is their response time, which is how long it takes a pixel to change color, rather than the time it takes for a CRT's pixels to be redrawn by the electron beam. Try not to misinform the guy asking the questions too much. He's looking for information, not speculation.

I read your arguments. Perhaps if you argued correctly, i.e. stated specifics for your arguments rather than being overly general and thereby invalidating them, I wouldn't have to read back 3 pages to get information that really doesn't matter anyhow.

And.. are you joking, thinking you have to explain how LCDs work to me? Assumption number 99; add 1 more and I'll give you a cookie for being the biggest idiot on the forum.

~Adam
 
Moloch said:
Why would you want to do that?
It didn't fix the shimmering problem, so why would you sacrifice performance for a small increase in IQ? Is Nvidia's filtering really that bad?

I've been running Nvidia and ATI cards on my two systems for some time, with lots of different settings. Currently I've got trilinear filtering forced on my Nvidia and I've never seen the shimmering you're talking about in ANY application. My 9800 runs hot as hell and draws more power, but it's only artifacted a few times ever.

"Unfortunately ATI went a bit further than the competition did with respect to texture filtering logic simplifications and, as a consequence, deviations from textbook quality." article quote.

The LOD clamp that you're talking about DID in fact fix the shimmering, which was only a problem with certain drivers. You're obviously not trying to inform, but rather to persuade and spin information for absolutely no one's gain. Do you work for ATI or something :p
 
Moloch said:
Why would you want to do that?
It didn't fix the shimmering problem, so why would you sacrifice performance for a small increase in IQ? Is Nvidia's filtering really that bad?

The difference also is that ATi actually created the optimisation correctly in the first place and didn't need to turn it off because it is a true optimisation and doesn't hurt IQ like Nv's most likely does.

~Adam
 
CleanSlate said:
I read your arguments. Perhaps if you argued correctly, i.e. stated specifics for your arguments rather than being overly general and thereby invalidating them, I wouldn't have to read back 3 pages to get information that really doesn't matter anyhow.

And.. are you joking, thinking you have to explain how LCDs work to me? Assumption number 99; add 1 more and I'll give you a cookie for being the biggest idiot on the forum.

~Adam

Tsk tsk tsk, name calling now, are we? It should be obvious that he was talking about LCD monitors because it's clearly stated in his question, which is what the thread is meant to answer... isn't it? Or is it a place where you come to call people names and act like a child?

Those optimizations ALL only do one thing... Give better FPS at the expense of image accuracy and quality.
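For anyone who wants the concrete version of what that optimization is: both linked articles describe a reduced form of trilinear filtering. Full trilinear blends the two nearest mipmap levels for every pixel; the optimized version only blends in a narrow band around each mip transition and uses cheaper bilinear everywhere else. Here's a hedged Python sketch of the idea; the 0.15 band width and the LOD values are made-up illustration numbers, not either vendor's actual parameters.

Code:
# Illustrative sketch of the "reduced trilinear" idea from the linked articles.
# The band width is an assumed number, not ATI's or Nvidia's real setting.

def trilinear_blend(lod: float) -> float:
    """Full trilinear: always blend the two nearest mip levels."""
    return lod - int(lod)               # fractional LOD = blend factor

def reduced_trilinear_blend(lod: float, band: float = 0.15) -> float:
    """Optimized: blend only near the mip transition, bilinear elsewhere."""
    frac = lod - int(lod)
    lo, hi = 0.5 - band, 0.5 + band
    if frac <= lo:
        return 0.0                      # pure bilinear from the lower mip
    if frac >= hi:
        return 1.0                      # pure bilinear from the upper mip
    return (frac - lo) / (hi - lo)      # short blend band = fewer texture fetches

for lod in (1.10, 1.45, 1.55, 1.90):
    print(f"LOD {lod}: full={trilinear_blend(lod):.2f} "
          f"reduced={reduced_trilinear_blend(lod):.2f}")

The whole argument in this thread is over whether that shortcut is actually visible in games.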
 
BossNoodleKaboodle said:
Tsk tsk tsk, name calling now, are we? It should be obvious that he was talking about LCD monitors because it's clearly stated in his question, which is what the thread is meant to answer... isn't it? Or is it a place where you come to call people names and act like a child?

You weren't specific in your argument; you failed to make the case correctly. Why point the finger elsewhere? Be specific if you're going to take the time to make an argument.

~Adam
 
CleanSlate said:
The difference also is that ATi actually created the optimisation correctly in the first place and didn't need to turn it off because it is a true optimisation and doesn't hurt IQ like Nv's most likely does.

~Adam
That's what I thought the general consensus was. I don't have a new ATI card so I dunno; I find my 8500's bilinear OK for the most part, can't really see the line unless the game uses some messed-up LOD bias, like Serious Sam 1/2.
Since there seem to be some doubts about Tech Report saying the X800 Pro is dropping to 300, I'll prolly get a 6800 NU.
And please stop saying I'm trying to spin info; you're killin' me, since you're linking to an article talking about ATI filtering tricks when Nvidia does the same thing by default, you dolt.
You seem to be related to the trendz fella..
 
CleanSlate said:
You weren't specific in your argument; you failed to make the case correctly. Why point the finger elsewhere? Be specific if you're going to take the time to make an argument.

~Adam

No one pointed any fingers at anyone. You simply stated things about CRT refresh rates when he was asking about LCDs and frames per second. Perhaps you need to read more carefully to avoid such mistakes.

You're trying to pick apart my answers to the thread starter's questions for some odd reason. He was specific when he asked his question, and I gave him a direct answer. You tried to correct my answer to his question, perhaps without reading his question, or simply not knowing the facts. Whatever the case, his question was answered, which was the only point of my post, not to get into some uninspired banter with someone who's arguing about eye strain from low-Hz refresh rates on an LCD monitor, which is totally moot.
 
BossNoodleKaboodle said:
Those optimizations ALL only do one thing... Give better FPS at the expense of image accuracy and quality.

A true optimisation does NOT give a noticeable decrease in IQ; that's in the definition of OPTIMISATION. Would it be an optimisation if half your screen was black but hey, the other half was going twice as fast? That's an obvious "optimisation", pffft.

+1 assumption *hands you cookie*

~Adam
 
I think the 6800GT is overall a better card than the X800 Pro. You're probably going to take a $40+ loss in selling the card and buying a 6800GT, though.

Is your Pro a Vivo? If it is... I would seriously consider trading you my BFG 6800GT for it. I wouldn't trade my GT for a regular Pro, but for a Pro flashed to an XT I think I would.
 
BossNoodleKaboodle said:
No one pointed any fingers at anyone. You simply stated things about CRT refresh rates when he was asking about LCDs and frames per second. Perhaps you need to read more carefully to avoid such mistakes.

You made the argument about 72 as compared to 85 fps, yet said nothing about CRT monitors. Obviously, if you were specifically talking about CRT monitors you should specifically say CRTs; you failed to do so, thus your argument becomes incorrect, by definition.

~Adam
 
CleanSlate said:
You made the argument about 72 as compared to 85 fps, yet said nothing about CRT monitors. Obviously, if you were specifically talking about CRT monitors you should specifically say CRTs; you failed to do so, thus your argument becomes incorrect, by definition.

~Adam

What you don't seem to realize is that #1, you can't hand me a cookie through a computer screen, no matter how high its refresh rate is, and #2, I wasn't answering a question from the likes of you. I was answering the question the guy asked about FPS on his LCD. If you can't use deductive logic to figure that one out, you've got my sympathy. If the best thing you've got is to argue my semantics, that's pretty sad.

A true optimization would be to have NO need for the optical trickery designed to give higher fps at all, i.e. more processing power. If Mr. Cookie can tell the difference between 71.00005 Hz and 72, surely he'll be able to tell the difference between the 4x AF areas of the screen and the 2x optimized areas. So for his sake, we should get rid of the low-quality optimizations in ATI's driver sets, or at least have the option to turn them off like you can with Nvidia's. I'd love to be able to turn off the optimizations on my 9800 Pro, but I can't, because they think I care more about 5 fps than a quality image. And yes, you can tell the difference between 2x AF and 4x.
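To put rough numbers on the 2x-vs-4x point: anisotropic filtering chooses how many texture samples to take per pixel from how stretched the pixel's footprint is in texture space, clamped to the AF level you picked, and an adaptive "optimization" lowers that clamp on some surfaces. The sketch below is an illustration only; the footprint ratios and the angle-based reduction are assumptions, not either driver's real algorithm.

Code:
# Rough illustration of per-pixel AF level selection, and how an adaptive
# scheme can quietly lower it. The angle rule and numbers are assumptions.
import math

def af_samples(footprint_ratio: float, max_af: int) -> int:
    """Plain AF: sample along the stretched axis, up to the user's max."""
    return min(max_af, max(1, math.ceil(footprint_ratio)))

def adaptive_af_samples(footprint_ratio: float, max_af: int,
                        surface_angle_deg: float) -> int:
    """Hypothetical adaptive AF: surfaces far from the 'preferred' angles
    get half the clamp to save fillrate (assumed behaviour, for illustration)."""
    off_angle = min(surface_angle_deg % 45, 45 - surface_angle_deg % 45)
    clamp = max_af if off_angle < 10 else max_af // 2
    return min(clamp, max(1, math.ceil(footprint_ratio)))

# A floor seen at a glancing angle (footprint stretched 6:1), user asked for 4x AF:
print(af_samples(6.0, 4))                 # 4 samples everywhere
print(adaptive_af_samples(6.0, 4, 22.0))  # 2 samples on an off-angle surface

If part of the screen ends up at 2x while the rest gets 4x, that's exactly the kind of difference being argued about here.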
 
BossNoodleKaboodle said:
What you don't seem to realize is that #1, you can't hand me a cookie through a computer screen, no matter how high its refresh rate is, and #2, I wasn't answering a question from the likes of you. I was answering the question the guy asked about FPS on his LCD. If you can't use deductive logic to figure that one out, you've got my sympathy.

A true optimization would be to have NO need for the optical trickery designed to give higher fps at all, i.e. more processing power. If Mr. Cookie can tell the difference between 71.00005 Hz and 72, surely he'll be able to tell the difference between the 4x AF areas of the screen and the 2x optimized areas. So for his sake, we should get rid of the low-quality optimizations in ATI's driver sets, or at least have the option to turn them off like you can with Nvidia's. I'd love to be able to turn off the optimizations on my 9800 Pro, but I can't, because they think I care more about 5 fps than a quality image. And yes, you can tell the difference between 2x AF and 4x.
I'm sure you'd love to, since you think the filtering optimizations are an evil, horrible thing.
If something is going to greatly increase my framerate (4 fps, sure..) I'll take it, since ATI has been doing it since the 9600 and no one noticed

:rolleyes:
 
I'm sure you've seen the side-by-side comparison pictures, and if you get the good low-compression JPGs you can see the difference where the 2x AF is used vs 4x, and even bilinear vs true trilinear.

No one noticed? Why are there articles all over written about that exact subject? Curious...
 
BossNoodleKaboodle said:
No one noticed? Why are there articles all over written about that exact subject? Curious...

Because it is there to be written about. They made it no big secret.

~Adam
 
CleanSlate said:
Because it is there to be written about. They made it no big secret.

~Adam
QFT, look at the damn date on the articles.
It would do you good to stop replying to threads, BossNoodleKaboodle, cuz with each post mine, and I'm sure others', opinion of you goes lower and lower.
 
Moloch said:
QFT, look at the damn date on the articles.
It would do you good to stop replying to threads, BossNoodleKaboodle, cuz with each post mine, and I'm sure others', opinion of you goes lower and lower.

Your grammar seems to get worse the more you post too lol. Conjunctions are typically used one at a time.
Are you posting here to gain the approval and admiration of others, or to get help with technical issues from people who know what they're talking about?
 
BossNoodleKaboodle said:
Your grammar seems to get worse the more you post too lol. Conjunctions are typically used one at a time.
Are you posting here to gain the approval and admiration of others, or to get help with technical issues from people who know what they're talking about?
There's a good reason for that, ass: it's late.
I am not posting to gain approval of anything; I don't have to prove myself to anyone, and that goes for real life also. I post to gain knowledge, and to correct people.
 
Moloch said:
I am not posting to gain approval of anything; I don't have to prove myself to anyone, and that goes for real life also. I post to gain knowledge, and to correct people.

QFT.

~Adam
 
The Nvidias are a lot better for 3D.
Owning both a BFG 6800GT and an X800XT PE, I know which does 3D better, to my eyes anyway :p I will leave it at that ;)
I agree with what the other people are saying, in that the X800 Pro really should have been marketed against the 6800NU, not against the 6800GT ;)
 
RLich8 said:
That's it, I've made up my mind, I'm done with the X800. I've played some games on 2 6800GT's and I can tell a difference, or at least I think I can. Either way, I want the card more.

It's on ebay.

If anyone is interested, here's the link. http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&item=5148441632

Good call. It's not worth the hassle of modding the card and having the mod not even work or make it unstable, when you can get better performance for less or equal money right out of the box, not to mention Nvidia's drivers are much simpler to change settings in. You did the right thing by comparing it for yourself, because that's the only way you'll know for certain.
 
RLich8 said:
That's it, I've made up my mind, I'm done with the X800. I've played some games on 2 6800GT's and I can tell a difference, or at least I think I can. Either way, I want the card more.

It's on ebay.

If anyone is interested, here's the link. http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&item=5148441632


At least you played on the new one; the way you were talking at the beginning of your thread, you made it sound like the rumour mill had turned you against your X800.

I'm still not convinced that didn't happen, but you played on the new one, and that's hot. Much better than modding the old one.
 
glynn said:
Owning both a BFG 6800GT and an X800XT PE, I know which does 3D better, to my eyes anyway :p I will leave it at that ;)
I agree with what the other people are saying, in that the X800 Pro really should have been marketed against the 6800NU, not against the 6800GT ;)
Why would they market it against a slower card like that?
If the only game in the world was Doom 3, perhaps, but due to Doom 3's lackluster multiplayer, I think Doom 3 has nothing going for it right now except pretty lighting.
In other games it's on par with the 6800GT, and it has a more usable 6x FSAA for guys that play at 1280 or above, where the 6xxx cards take a huge fps hit since they use supersampling.
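On the supersampling point, the rough arithmetic: supersampling shades every sub-sample, while multisampling shades each pixel once and only multiplies the coverage/depth work, which is why a supersampled mode hurts so much at 1280x1024 and up. A back-of-the-envelope Python sketch; the cost model is deliberately oversimplified (it ignores bandwidth, ROPs, etc.).

Code:
# Back-of-the-envelope: pixel shading work for supersampled vs multisampled AA.
# Deliberately simplified; real cards also differ in bandwidth, ROPs, etc.

def shaded_samples(width: int, height: int, samples: int, supersample: bool) -> int:
    """Number of shader evaluations per frame under this simple model."""
    pixels = width * height
    return pixels * samples if supersample else pixels

w, h = 1280, 1024
print("No AA:              ", shaded_samples(w, h, 1, False))
print("6x multisampled AA: ", shaded_samples(w, h, 6, False))  # shading work unchanged
print("6x supersampled AA: ", shaded_samples(w, h, 6, True))   # ~6x the shading work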
 
Well, the whole thing started with me seeing my buddy's PC w/ a 6800GT in it and a 19" NEC/Mitsubishi monitor. I was like wtf, why isn't my $400 card getting that? So he suggested selling the X800 and convinced me about the 6800GT. So I messed w/ my settings, no dice. I went back over by him and played some games on 2 of his PCs that have 6800GT cards, and I got hooked.

So I'm selling the X800 Pro, picking up a 6800GT, and also getting a 19" flatscreen CRT. I know I'll be MUCH happier with my new video setup...I've never disliked an nVidia in the past. I've disliked 2 of my ATi cards that I've had now. I think it's safe to say I'll be buying nVidia in the future again.

Thanks for your guys help and opinions, it was a big help.

Roy
 
Was his CRT aperture grille or shadow mask? Try to get a good aperture grille Sony or NEC/Mitsu. Beware of the cheapo shadow masks out there; it's hard to get good color accuracy with one.
 
To me, anyone who says there is any image difference between the X800 series and the 6800 series is just seeing differences they WANT TO SEE. I have both cards and have tested them inside out and back to front, and the only difference I could find was, of course, that the X800XT PE was a little faster than the 6800GT I have, which is to be expected. But image quality was identical. :confused:
 
I don't see how you could feasibly get identical images when one doesn't even do true trilinear filtering. I work with graphics for a living, so it's kind of hard not to notice the differences between anisotropy and bilinearity in textures, especially on my reference monitor.
 
BossNoodleKaboodle said:
I don't see how you could feasibly get identical images when one doesn't even do true trilinear filtering. I work with graphics for a living, so it's kind of hard not to notice the differences between anisotropy and bilinearity in textures, especially on my reference monitor.
You don't stop, do you?
What is Nvidia doing by default?
What will most users have it set to?
Default.
Jesus...
 
Let us know how it goes after you switch. I'm encouraged that folks are willing to try new things and switch vendors based on the merits of the product. ATI and Nvidia will both produce better products if they know that their customers will purchase the better technology instead of following a label.

Heck, this thread will probably still be open with the great FPS/MHz/AF/grammar debate anyway, so you've got some time to install and put the 6800GT through its paces for a while before coming back with your thoughts.

Good luck!

(and good choice)
 
Moloch said:
You don't stop, do you?
What is Nvidia doing by default?
What will most users have it set to?
Default.
Jesus...

Yes, most people don't know how to use a graphics card's drivers properly; yes, most people just leave it on default; yes, most people don't even know the difference between bilinear and trilinear filtering. I do, however, and I like the fact that it's available and actually usable with Nvidia. What's so frustrating to you about that? ;)
 