Search results

  1. Shader model 3 was ATI right or nVidia right?

    1280x1024 @ forced 4xAA and 8xAF (can't really tell if it's on, though; I get jaggies sometimes, but I think it's just poor texture mapping on the devs' part) on CoR and I get nothing below 32... o-O... In PS3 mode...
  2. The first... "X800XL vs. 6800GT?"

    And um, no offence -- because I dun really care... I like my GT, won't be switching because I have NEVER had a problem with it or any other card from either company (just don't mess with what isn't broken), but does anyone remember the last time Driverheaven put out a "credible" benchmark...
  3. This may sound crazy but....

    ... It's been discussed that the GT doesn't have the number of capacitors to support a second voltage input.
  4. 512MB of Video RAM is out

    I know you're joking and all, but minus a written PCB etching, they are no different (other than color) from the PC version :confused:
  5. correct me if i'm wrong, but

    For '01 your timings are HORRID... x-4-4-7... If you could close that down to 2-3-3 you'd get about a ~600 point boost (simply because of the memory timings), at least in my experience. Is that the default, or by SPD? Also make sure your RAM is running in sync with your HT bus -- basically running it 1:1...
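The "run it 1:1" advice above is just divider arithmetic. A minimal sketch, assuming an Athlon 64-era 200 MHz reference clock and illustrative divider ratios (the function name and figures are my own, not from the post):

```python
def mem_clock_mhz(ref_clock_mhz: float, ratio_cpu: int, ratio_mem: int) -> float:
    """Actual memory clock implied by a CPU:MEM divider against the reference clock."""
    return ref_clock_mhz * ratio_mem / ratio_cpu

# 1:1 ("in sync") at a stock 200 MHz reference clock -> DDR400 territory
print(mem_clock_mhz(200, 1, 1))  # 200.0 MHz actual (effective 400 MHz DDR)

# A 5:4 divider at the same reference clock runs the RAM out of sync, slower
print(mem_clock_mhz(200, 5, 4))  # 160.0 MHz actual (effective 320 MHz DDR)
```

The point of the post: anything other than 1:1 underclocks the memory relative to the bus, which is where the benchmark points go.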
  6. Damnit NVIDIA get those chips out faster!!!!

    Probably should have thought of that before... Availability reportedly won't be solved until the second half of this year. It looks like 600 to ~700 is about what you'll have to pay for an Ultra... I've seen some PCI-Express GTs around for 500... (apparently these clock to real Ultra...
  7. correct me if i'm wrong, but

    You said 1GB of some green RAM... what kind of RAM, 2700, 3200? What... Also, that and the timings will make a HUGE difference in '01... '03 is all video card.
  8. What to upgrade?

    Softmodding the NU doesn't always work... and even if it does, you'll have a hell of a time getting them to 400/1100...
  9. Gigabyte's dual core solution gets PWNed

    We already knew this would happen, because using the SLI interface, two 6600GTs have the exact same results...
  10. Glad i got an x800xtpe rather than a 6800 series

    I have yet to find a game (except FarCry) that I can't run at 1600x1200 with 4xAA and 16xAF without a hitch. Although most games I play at 1280x1024 with 4xAA and 16xAF... simply because there is no noticeable difference (to me)... Either card should do you without a...
  11. X800XT or BFG 6800 Ultra?

    ... The card still plays it. WTF is the big deal? You gonna go play a game while watching it? I can understand if you do a lot of encoding/decoding, sorta, but at that point you'd probably have a P4...
  12. ati heritage or not?

    Put names in a hat and draw. The most important factor... You won't be disappointed with either..
  13. first nvidia dual gpu card

    But dual 6600GT cards still lose to the GT and Ultra in most benchmarks at or above 1280x1024 and with any sort of AA... This is probably going to be the only core available in this setup (considering the heat factor)... I wonder why the COMPANY went out of their way to do this..
  14. Would putting Arctic Silver 5 on my eVGA 6800 be a good idea?

    There actually seems to be something wrong with the card if it idles at that... My card idles at 45 with AS5 on it.
  15. x850 xt pe or 6800 ultra pci-e?

    o-O That's what I said, lol.
  16. Am I going nuts?

    I'd actually bet it's the AF optimizations (if you're using AF) when you're not on Highest Quality (in the CP).
  17. FP16 blending (HDR) performance "broken" on Nv40, fixed in Nv45 (PCI-E)?

    Or maybe not? Perhaps, http://www.xbitlabs.com/images/video/farcry13/volcano_hdr.gif http://www.xbitlabs.com/images/video/farcry13/regulator_hdr.gif -- and it looks to me like that is the only test they did where it performs worse than the 6600GT, and they both perform AT HALF the...
  18. Playstation 3 to use nvidia graphics?

    Considering most of the recent ones have come from developers themselves... I'd say they're pretty damn close..
  19. ATI learns some to us more on Longhorn

    Because ATi is just one of about 400 different companies working with Microsoft on Longhorn... they took a "poetic license" in a sense, saying "yeah, we're working with them, sure... but we're not gonna mention everyone else."
  20. ATI learns some to us more on Longhorn

    This isn't ATi specific; hell, Matrox cards can run most of it. It's just "hey, we can use Longhorn too"...
  21. ATI learns some to us more on Longhorn

    Um dude... that isn't just ATi specific... the interface is scalable. That's how it looks on most current DX9-compatible cards..
  22. Crytekdemo, enabling HDR for Ati card's possible

    Well, from what I've just learned (from both a post here and doing more research), I don't believe ATi uses an FPB method for their HDR implementation (but I could be wrong); I believe the colors are statically adaptive to the light source. I don't know if that qualifies as FPB, but whatever. As...
  23. Crytekdemo, enabling HDR for Ati card's possible

    The ATi effect is similar to the "nVidia implementation" in FarCry 1.3, but it only uses 8-bit FPB, whereas the OpenEXR implementation (1.3) uses 16-bit FPB.
  24. Crytekdemo, enabling HDR for Ati card's possible

    The HDR that is implemented in OpenEXR is closer to the HDR that CryTek actually wanted. So in terms of which HDR looks better, it's subjective. I do however like the color blending of the nVidia version (probably just because it's 16-bit), but the ATi thing is nice; it really doesn't lose much...
  25. TO ALL GF6800 AGP owners

    I turned off HyperThreading and I still get 50% CPU usage on the SiL video at 1920x1200.
  26. Crytekdemo, enabling HDR for Ati card's possible

    Now many of you see, lol. Looks good -- but I wasn't too impressed by watching the CryTek demo... It just seems like they added the cinematic film filter over the render (sorta like when you get hit in FarCry, or die... the blur/desaturation type effect). It's all been seen before and is possible...
  27. weird occurrence

    Drivers, OS, and game version.
  28. No nVidia fall refresh?

    Okay... lemme think... For the price range:
    6200 = X300
    6600 (series) = X700 (series)
    Now on the 6800s, it gets tricky:
    6800 = ? (this has majorly hurt them in their top market)
    6800GT = X800 Pro
    6800 Ultra = X800XT
    SLI = X850XT:PE
    ^^ Based on price at the moment. But...
  29. No nVidia fall refresh?

    X800XT:PEs have been out for the same time... and they're selling for 560+ in most cases, close to 700 in others... IMO, nVidia knows that people who are buying the 6800 either know what they're getting, will be overclocking, or will put faith in them to do driver revisions (as I have, but I'm...
  30. Speculation Discussion - Developers preference of engine

    Am I the only one that would like to see companies stop endorsing games, and the game companies just make sure it runs on both vendors' hardware? After the game is out, whatever the card company wants to do -- they can do... I'm just sick of this nVidia = FarCry, ATi = Half Life 2 stuff... It should...
  31. TO ALL GF6800 AGP owners

    The 6600GT is the only card I would recommend for this kind of playback... It beats every other card on the market in the same category (gaming..)
  32. x850 is official, now what nvidia?

    Drivers; 2x vs 4x in FC makes a difference (they do playable levels); and was it AFR or SFR? I dun remember.
  33. TO ALL GF6800 AGP owners

    Um... if I remember correctly, they only advertised the actual modes that are supported. I'll dig up both my Ultra and GT boxes and see what they say... but... So I dun think they technically did much wrong. Plus, the decoder was already fixed.
  34. TO ALL GF6800 AGP owners

    What videos (if you haven't, try SiL, as it's the one most refer to)? What drivers? And what OS... looks like 2000... hm..
  35. x850 is official, now what nvidia?

    I dun get why people say to set Doom3 aside... it'd be the same as someone saying to set Half Life 2 aside for nVidia. Both cards have leads in those games, and both companies are catching up quickly. Other than that, the thing is still split 50/50... I stand by my post that because nVidia has the...
  36. Valve sucks

    I said the game calls for full precision... but anyway, they (nVidia) still do it, and offer the same performance for, theoretically, more quality (and future..). But anyway, yes, the FX series borked both FP and PS/VS requirements... I just don't buy the whole FP precision thing as the...
  37. TO ALL GF6800 AGP owners

    Couple things... One, a driver fix won't fix it; it is a hardware-related issue. Two, how many of your friends will be doing another task while watching HD videos, which are meant to take up the whole screen... And three, even if the video processor is borked, at 100% usage, I don't...
  38. Valve sucks

    I already know all that... but switching the name doesn't switch the physical features of the card (driver-level orientation); thus the game, regardless of the name, will recognize that the card physically can do 32-bit precision, and because it calls for full precision, it will run 32-bit...
  39. Help me with a Decision..... X800XT-PE or BFG 6800Ultra OC.

    What level of AA and AF are you running, and what drivers? (Sorry for the hijack..) As for the thread -- it's personal preference. Neither card wins 100% of the time in one API or the other...
  40. Valve sucks

    1) You cannot blame nVidia anymore; it's done with. So what, Valve knew it at the time. 2) How exactly does that explain, then, that when running in DX9 mode, when an FX card is changed to an ATi card, it runs without graphical glitches? *** That #2 is still the kicker...