CleanSlate
Supreme [H]ardness
- Joined
- Mar 28, 2003
- Messages
- 5,258
Your brain AND eyes can tell the difference between 70 and 72; the minimum suggested refresh rate is 72 Hz to limit eye strain, NOT 70.
~Adam
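To put the 70-vs-72 Hz argument in concrete numbers, the gap works out to less than half a millisecond per redraw. A quick sketch (the function name is mine, not from the thread):

```python
# Frame period for a handful of CRT refresh rates, to put the
# 70 Hz vs 72 Hz debate in concrete terms.
def frame_period_ms(refresh_hz: float) -> float:
    """Time between screen redraws, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 70, 72, 85):
    print(f"{hz} Hz -> {frame_period_ms(hz):.2f} ms per redraw")
# 70 Hz is ~14.29 ms, 72 Hz is ~13.89 ms: a ~0.4 ms difference per redraw.
```

Whether flicker at that difference is perceptible is exactly what the posters are arguing about; the ergonomic 72 Hz recommendation is about flicker fusion, not frame-to-frame motion.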
CleanSlate said:Your brain AND eyes can tell the difference between 70 and 72; the minimum suggested refresh rate is 72 Hz to limit eye strain, NOT 70.
~Adam
BossNoodleKaboodle said:Your brain can't tell the difference between 70 and 85 fps. Even the best CRTs won't do much more than 75 Hz at 1600x1200. You'd benefit from a CRT if your monitor doesn't go above 1600x1200 and has a poor response time/contrast ratio.
Did you even read those?
http://www.3dcenter.de/artikel/2003/11-21_a_english.php
http://graphics.tomshardware.com/graphic/20040603/index.html
There are two good articles there on ATI's optimizations that try to trick the eye. I didn't write the reports.
Moloch said:Did you even read those?
Nvidia does that too, and is apparently worse; they used to have shimmering problems, though they fixed that.
I really like how you seem to love Nvidia so much that you manage to conveniently not mention that Nvidia does the same thing, and was the first to do it with their FX series.
RLich8 said:I really just can't decide.
I just upped the core to 504 and the memory to 524 on mine, played HL2 for about 10 minutes, no artifacting. It plays noticeably differently than before. That's making me want to keep the card. I'm thinking of just putting a new cooling system on it and getting a new monitor. I have an LCD right now and it sucks for gaming because the refresh rates are either 60 or 70 on it, which means I can't go higher than 60-70 FPS.
I just really don't know right now, I just don't know...I want the 6800GT, but then again I want to stick with the X800 and overclock it...
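The 60-70 FPS cap described above is what happens with vsync on a fixed-refresh panel: each finished frame waits for the next refresh, so output snaps to refresh/n. A minimal sketch of that quantization (function name is mine, and this models plain double-buffered vsync only, not triple buffering):

```python
import math

def vsync_fps(render_fps: float, refresh_hz: float) -> float:
    """Effective frame rate with double-buffered vsync: each frame is
    held until the next refresh, so output snaps to refresh_hz / n."""
    intervals = math.ceil(refresh_hz / render_fps)  # refreshes each frame occupies
    return refresh_hz / intervals

# On a 60 Hz LCD, anything the GPU renders above 60 fps is capped,
# and just missing a refresh halves the output rate:
print(vsync_fps(100, 60))  # -> 60.0
print(vsync_fps(55, 60))   # -> 30.0
```

This is why a panel limited to 60 or 70 Hz limits usable frame rate even when the card could render much faster.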
BossNoodleKaboodle said:The difference is, you can turn it off with Nvidia's drivers.
Why would you want to do that?
BossNoodleKaboodle said:If you'd read the entire post from the guy who asked the initial question, you'd see that I was addressing an LCD. Think before you post once again lol. If you knew how LCDs work vs CRTs, you'd realize that the emulated Windows "refresh rates" aren't important and it's best to run them in their native resolution. Their response time, which is how long it takes a pixel to change color, rather than the time it takes a CRT's electron beam to redraw the pixels, is what affects image quality. Try not to misinform the guy asking the questions too much. He's looking for information, not speculation.
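The response-time point above can be made concrete: a pixel has to finish its transition within one frame or moving images smear. A rough sketch (my own names; note that quoted response times of that era were typically optimistic black-to-white figures):

```python
def keeps_up(response_time_ms: float, fps: float) -> bool:
    """True if a pixel can finish its color transition within one frame."""
    frame_time_ms = 1000.0 / fps
    return response_time_ms <= frame_time_ms

# A 25 ms panel smears at 60 fps (frame time ~16.7 ms); a 16 ms panel just keeps up:
print(keeps_up(25, 60))  # -> False
print(keeps_up(16, 60))  # -> True
```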
Moloch said:Why would you want to do that?
It didn't fix the shimmering problem, so why would you sacrifice performance for a small increase in IQ? Is Nvidia's filtering really that bad?
CleanSlate said:I read your arguments. Perhaps if you argued correctly, i.e. stating specifics rather than being overly general and thus invalidating your arguments, I wouldn't have to read back 3 pages to get information that really doesn't matter anyhow.
And.. are you joking thinking you have to explain how LCD's work to me? Assumption number 99, add 1 more and I'll give you a cookie for being the biggest idiot on the forum.
~Adam
BossNoodleKaboodle said:Tsk tsk tsk, name calling now, are we? It should be obvious that he was talking about LCD monitors because it's clearly stated in his question, which is what the thread is meant to answer... isn't it? Or is it a place where you come to call people names and act like a child?
CleanSlate said:The difference also is that ATi actually created the optimisation correctly in the first place and didn't need to turn it off, because it is a true optimisation and doesn't hurt IQ like Nv's most likely does.
~Adam
That's what I thought the general consensus was. I don't have a new ATI card so I dunno; I find my 8500's bilinear OK for the most part, can't really see the line unless the game uses some messed-up LOD bias, like Serious Sam 1/2.
CleanSlate said:You weren't specific in your argument; you failed to make the case correctly. Why point the finger elsewhere? Be specific if you're going to take the time to make an argument.
~Adam
BossNoodleKaboodle said:Those optimizations ALL only do one thing... Give better FPS at the expense of image accuracy and quality.
BossNoodleKaboodle said:No one pointed any fingers at anyone. You simply stated things about CRTs' refresh rates when he was asking about LCDs and frames per second. Perhaps you need to read more carefully to avoid such mistakes.
CleanSlate said:You made the argument about 72 as compared to 85 fps, yet said nothing about CRT monitors. Obviously, if you were specifically talking about CRT monitors you should specifically say CRTs; you failed to do so, thus your argument becomes incorrect by definition.
~Adam
BossNoodleKaboodle said:What you don't seem to realize is that #1, you can't hand me a cookie through a computer screen, no matter how high its refresh rate is, and #2, that I wasn't answering a question from the likes of you. I was answering the question the guy asked about FPS on his LCD. If you can't use deductive logic to figure that one out, you've got my sympathy.
A true optimization would be to have NO need for the optical trickery designed to give higher fps at all! i.e. more processing power. If Mr. Cookie can tell the difference between 71.00005 Hz and 72, surely he'll be able to tell the difference between the 4x AF areas of the screen and the 2x optimized areas. So for his sake, we should get rid of the low-quality optimizations in ATI's driver sets, or at least have the option to turn them off like you can with Nvidia's. I'd love to be able to turn off the optimizations on my 9800 pro, but I can't, because they think I care more about 5 fps than a quality image. And yes, you can tell the difference between 2x AF and 4x AF.
I'm sure you'd love to, since you think the filter optimizations are an evil, horrible thing.
BossNoodleKaboodle said:No one noticed? Why are there articles all over written about that exact subject? Curious...
CleanSlate said:Because it is there to be written about. They made it no big secret.
~Adam
QFT, look at the damn date on the articles.
Moloch said:QFT, look at the damn date on the articles.
It would do you good to stop replying to threads, BossNoodleKaboodle, because with each post mine, and I'm sure others', opinion of you goes lower and lower.
BossNoodleKaboodle said:Your grammar seems to get worse the more you post too lol. Conjunctions are typically used one at a time.
There's a good reason for that, ass; it's late.
Are you posting here to gain the approval and admiration of others, or to get help with technical issues from people who know what they're talking about?
Moloch said:I am not posting to gain approval of anything; I don't have to prove myself to anyone, and that goes for real life also. I post to gain knowledge, and to correct people.
Owning both a BFG 6800GT and an X800 XT PE, I know which does 3D better, to my eyes anyway; I will leave it at that.
The Nvidias are a lot better for 3D.
CleanSlate said:QFT.
~Adam
Next time make sure your corrections aren't erroneous like they were this time.
RLich8 said:That's it, I've made up my mind, I'm done with the X800. I've played some games on 2 6800GT's and I can tell a difference, or at least I think I can. Either way, I want the card more.
It's on ebay.
If anyone is interested, here's the link. http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&item=5148441632
glynn said:Owning both a BFG 6800GT and an X800 XT PE, I know which does 3D better, to my eyes anyway; I will leave it at that.
Why would they market it against a slower card like that?
I agree with what the other people are saying, in that the X800 Pro really should have been marketed against the 6800NU, not against the 6800GT.
BossNoodleKaboodle said:I don't see how you could feasibly get identical images when one doesn't even do true trilinear filtering. I work with graphics for a living, so it's kind of hard not to notice the differences between anisotropy and bilinearity in textures, especially on my reference monitor.
You don't stop, do you?
Moloch said:You don't stop, do you?
What is Nvidia doing by default?
What will most users have it set to?
Default.
Jesus...