Search results

  1.

    Noise Levels: X1900XTX vs 7900GTX, videos @ techreport.com

    Do you think that's a bad thing? I for one think that's a great thing. That means Nvidia is, dare I say it, actually working on optimizations and implementing features in those games to maximize the experience. To say the least, they push the envelope to fix issues or structure things better...
  2.

    Noise Levels: X1900XTX vs 7900GTX, videos @ techreport.com

    Turning off Nvidia's optimizations is not as significant a hit as turning ATI's optimizations off. In the games where it beats the XTX, it will still beat it with optimizations off. The only reason I mention turning optimizations off is the shimmering in BF2, which hopefully will be fixed in the next driver...
  3.

    Noise Levels: X1900XTX vs 7900GTX, videos @ techreport.com

    It's not just a little noise; did you even watch the videos in the links? And if I gave you two identical screenshots, do you think you could pick which one is the X1900XTX? If you answer yes, then I will surely go through the trouble, because I can guarantee you, having Nvidia's...
  4.

    Noise Levels: X1900XTX vs 7900GTX, videos @ techreport.com

    The XTX may be faster in Oblivion, but god forbid there are other games out there, including future games like Quake Wars and RTCW2, that the 7900GTX will presumably be faster at. I still can't believe people blatantly say the XTX outperforms the GTX. It outperforms it in less than 50% of all...
  5.

    Hdr + Aa

    Oh the madness.... I think HDR+AA is awesome, no matter what the HDR procedure. I can't wait until both Nvidia and ATI users can play with HDR+AA in the upcoming Half-Life 2: Episode One expansion. I applaud Valve for incorporating a usable HDR for both cards.... Bravo!
  6.

    Hdr + Aa

    My bandwidth limit is already maxed out. I'll upload two to ImageShack for you... http://img125.imageshack.us/my.php?image=dedust00028vo.jpg http://img84.imageshack.us/my.php?image=dedust00019vx.jpg
  7.

    Hdr + Aa

    Some people clearly stated that Nvidia just cannot do HDR+AA. A few posts up there is a direct quote. I'm not trying to compare anything; I'm just posting screenshots of HDR+AA on an Nvidia card because some people believe it to be completely untrue. But I am going to have to remove those...
  8.

    Hdr + Aa

    Well then, let the argument over float-based HDR+AA begin.... :confused: I mean, come on, High Dynamic Range is High Dynamic Range no matter what the procedure. The bottom line is that the 7900GTX can run HDR+AA.
  9.

    Hdr + Aa

    Here they are, fellas: HDR+AA on a 7900GTX. Let the argument finally end.... Some shots are purposely overexposed. EDIT: Had to change servers, so here are two: http://img125.imageshack.us/my.php?image=dedust00028vo.jpg http://img84.imageshack.us/my.php?image=dedust00019vx.jpg
  10.

    Hdr + Aa

    Well, I just played the new de_dust for Counter-Strike: Source (HDR added) and it was amazing. And you know what, I'm using a 7900GTX and I had HDR+AA in full force, and it was stunning. So I don't know what to assume. I don't know much about FP16 blending and whatnot, but it is possible.
  11.

    7900 GTX reliability

    I have an XFX 690/1750, overclocked it to 705/1800, and played HL2, BF2, and Doom 3 for hours with no problems or artifacting whatsoever.
  12.

    Best 7900 gtx maker ?

    They are all based on reference designs; there's no difference besides warranty. Evga may be nice for their Step-Up program, but Evga, XFX, and BFG, as far as I know, all offer lifetime warranties. Any one of those would be an equally good choice.
  13.

    x1800xt or 7900gt?

    So please, tell me why every single review out there, when they test "image quality," compares apples to apples to see which filtering method is better. They compare the same filtering method so it's fair. Now, why do they do that? I'm just saying, all these people who say ATI has better...
  14.

    x1800xt or 7900gt?

    Trying to say ATI has "better image quality" just because it can run 5-15 FPS faster with 4X AA & 16X AF in Oblivion, BF2, and FEAR doesn't make sense. So out of those three games, ATI has better image quality because it can perform better? So in Quake 4, Doom 3, Far Cry HDR, HL2: Lost Coast...
  15.

    x1800xt or 7900gt?

    Gotcha. I haven't used an ATI card since my 9800 Pro died last year.
  16.

    x1800xt or 7900gt?

    Having better performance when enabling AA & AF does not mean better image quality. Image quality would be comparing the ground with 16X AF enabled on an Nvidia card and on an ATI card, and seeing which is sharper. Image quality would be the texture shimmering BF2 gets when Nvidia's optimizations are on...
  17.

    x1800xt or 7900gt?

    For the love of god, quit saying the X1800 has better IQ. If you turn off the optimizations in the Nvidia control panel, as many review sites do, the IQ is identical. I've heard countless testimonies of Nvidia having better IQ, and of ATI having better IQ. They are both too similar to even...
  18.

    Hdr + Aa

    First off, who even cares? But if you think about it, it makes a lot more sense for Nvidia or ATI to implement their hardware support and stability into a game before it's released; that way, we consumers benefit from it. It doesn't sell more games, but TWIMTBP tells the consumer, "Hey, if I have an...
  19.

    7900GTX or X1900XTX for GATEWAY FPD2185W 21" (Single Card) ?

    Who the? The man asked which card would work for his monitor; he didn't ask which card ran Oblivion better. And according to a more recent review... http://www.xbitlabs.com/articles/video/display/geforce7900gtx_13.html Xbit shows pretty much equal single-card performance. If you only...
  20.

    HDR+AA in Oblivion for Ati X1K!

    http://www.xbitlabs.com/articles/video/display/geforce7900gtx_13.html
  21.

    Good Oblivion Benchmarks on Firingsquad

    I think you should take another look at the graph.
  22.

    Good Oblivion Benchmarks on Firingsquad

    This is a joke, right? Oblivion is probably the worst engine ever programmed. Its performance is a joke. It's a console port built for the X360 (ATI chipset) and will never be used again, unless Bethesda releases expansions. You can judge a card's performance on those three...