1280x1024 @ forced 4xAA and 8xAF (can't really tell if it's on though, I get jaggies sometimes, but I think that's just poor texture mapping on the devs' part) on CoR and I get nothing below 32... o-O... In PS3 mode...
And um, no offence -- because I dun really care... I like my GT, and I won't be switching because I have NEVER had a problem with it or any other card from either company.. (just don't mess with what isn't broken) but does anyone remember the last time Driverheaven put out a "credible" benchmark...
For '01 your timings are HORRID... x-4-4-7.. if you could even tighten that down to 2-3-3 you'd get about a ~600 point boost (simply because of faster memory reads), at least in my experience.. Is that the default, or set by SPD?
Also make sure your RAM is running in sync with your HT Bus -- basically running it 1:1...
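To see why tighter timings matter, here's a rough back-of-envelope sketch in Python. It assumes DDR400 (200 MHz command clock) and treats the timings as a CAS-tRCD-tRP triple; both assumptions are illustrative, not taken from the posts above.

```python
# Rough latency comparison of RAM timings, assuming DDR400.
# A 200 MHz command clock means one cycle = 5 ns. The triple is
# read as CAS-tRCD-tRP; the DDR400 assumption is hypothetical.

CLOCK_MHZ = 200  # DDR400: 200 MHz real clock

def timings_to_ns(cas, trcd, trp, clock_mhz=CLOCK_MHZ):
    """Convert a CAS-tRCD-tRP triple (in cycles) to nanoseconds
    for a worst-case access (precharge + activate + read)."""
    cycle_ns = 1000.0 / clock_mhz
    return (cas + trcd + trp) * cycle_ns

loose = timings_to_ns(4, 4, 7)   # the "x-4-4-7" timings above
tight = timings_to_ns(2, 3, 3)   # the suggested 2-3-3

print(f"4-4-7: {loose:.0f} ns, 2-3-3: {tight:.0f} ns")
# 4-4-7: 75 ns, 2-3-3: 40 ns
```

Nearly half the worst-case access latency, which is roughly where that '01 score bump comes from -- 3DMark01 is far more CPU/memory-bound than '03.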
Probably should have thought of that before... Availability reportedly won't be solved until the second half of this year.. it looks like 600 to ~700 is about what you'll have to pay for an Ultra.. I've seen some PCI-Express GT's around for 500... (apparently these clock to real Ultra...
You said 1GB of some green RAM... what kind of RAM, 2700, 3200? What... Also, that and the timings will make a HUGE difference in '01... '03 is all video card.
I have yet to find a game (except FarCry) that I can't run at 1600x1200 with 4xAA and 16xAF without a hitch.
Although most games I play at 1280x1024 with 4xAA and 16xAF... Simply because there is no noticeable difference (to me)...
Either card should do you without a...
... The card still plays it. WTF is the big deal? You gonna go play a game while watching it? I can understand if you do a lot of encoding/decoding sort of stuff, but at that point you'd probably have a P4...
But dual 6600GT cards still lose to the GT and Ultra in most benchmarks at or above 1280x1024 and with any sort of AA...
This is probably going to be the only core available in this setup (considering heat factor)... I wonder why the COMPANY went out of their way to do this..
Or maybe not?
Perhaps,
http://www.xbitlabs.com/images/video/farcry13/volcano_hdr.gif
http://www.xbitlabs.com/images/video/farcry13/regulator_hdr.gif
-- and it looks to me like that is the only test they did where it performs worse than the 6600GT, and they both perform AT HALF the...
Because ATi is just one of about 400 different companies working with Microsoft on Longhorn... they took a "poetic license" of sorts in saying "yeah, we're working with them, sure... but we're not gonna mention everyone else."
Well from what I've just learned (from both a post here and doing more research) I don't believe ATi uses an FPB method for their HDR implementation (but I could be wrong); I believe the colors are statically adapted to the light source. I don't know if that qualifies as FPB, but whatever.
As...
The ATi effect is similar to the "nVidia implementation" in FarCry 1.3, but it only uses an 8-bit FPB, whereas the OpenEXR implementation (1.3) uses a 16-bit FPB.
The HDR implemented via OpenEXR is closer to the HDR that CryTek actually wanted. So in terms of which HDR looks better, it's subjective. I do however like the color blending of the nVidia version (probably just because it's 16-bit), but the ATi thing is nice; it really doesn't lose much...
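The 8-bit vs 16-bit FPB difference above can be shown with a toy sketch (this is NOT FarCry's actual code, just an illustration, and the luminance values are made up): scene values brighter than 1.0 get clipped by an 8-bit integer buffer but survive in a 16-bit float buffer, which is what gives the nicer highlight blending.

```python
# Toy illustration of 8-bit integer vs 16-bit floating-point
# buffers for HDR. Values above 1.0 are "brighter than white"
# and only survive in the FP16 buffer. Luminances are made up.
import numpy as np

hdr = np.array([0.25, 1.0, 4.0, 16.0])  # linear scene luminances

# 8-bit buffer: clamp to [0, 1], quantize to 255 steps
lo8 = np.round(np.clip(hdr, 0.0, 1.0) * 255) / 255

# FP16 buffer: bright values survive, just at reduced precision
fp16 = hdr.astype(np.float16).astype(np.float64)

print(lo8)   # the two highlights both collapse to 1.0
print(fp16)  # 4.0 and 16.0 are still distinguishable
```

Once the highlights are clipped to 1.0 in the 8-bit path, no later tone-mapping pass can tell a bright window from the sun; the FP16 path still can, which is why the bloom/exposure effects blend more smoothly.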
Now many of you see, lol. Looks good -- but I wasn't too impressed watching the CryTek demo.. It just seems like they added a cinematic film filter over the render (sort of like when you get hit in FarCry or die.. that blur/desaturation type effect). It's all been seen before and is possible...
Okay.. Lemme think... For the price range...
6200 = X300
6600(series) = X700(series)
Now with the 6800s, it gets tricky.
6800 = ? This has majorly hurt them in their top market.
6800GT = X800 Pro
6800Ultra = X800XT
SLI = X850XT:PE
^^ Based on price at the moment.
But...
X800XT:PE's have been out for the same amount of time.. and they're selling for 560+ in most cases, close to 700 in others...
IMO, nVidia knows that people buying the 6800 either know what they're getting, will be overclocking, or will put faith in them to do driver revisions (as I have, but I'm...
Am I the only one that would like to see card companies stop endorsing games, and the game companies just make sure it runs on both vendors' hardware.. after the game is out, whatever the card company wants to do -- they can do...
I'm just sick of this nVidia = FarCry, ATi = Half Life 2 stuff... It should...
Um.. if I remember correctly, they only advertised the actual modes that are supported. I'll dig up both my Ultra and GT boxes and see what they say.. but... So I dun think they technically did much wrong. Plus, the decoder was already fixed.
I dun get why people say "Doom3 aside".. it'd be the same as someone saying "Half Life 2 aside" for nVidia. Each card has a lead in one of those games, and both companies are catching up quickly in the other. Other than that, the thing is still split 50/50... I stand by my post that because nVidia has the...
I said the game calls for full precision... but anyway.
They (nVidia) still do it, and offer the same performance for theoretically more quality (and future..) But anyway. Yes, the FX series borked both the FP and PS/VS requirements..
I just don't buy the whole FP precision thing as the...
Couple things..
One, a driver fix won't fix it. It is a hardware related issue.
Two, how many of your friends will be doing another task while watching HD videos, which are meant to take up the whole screen...
And three, even if the video processor is borked, at 100% usage, I don't...
I already know all that.. but switching the name doesn't switch the physical features of the card (driver-level orientation); thus the game, regardless of the name, will recognize that the card physically can do 32-bit precision, and because it calls for full precision, it will run 32-bit...
What level of AA and AF are you running, and what drivers.. (Sorry for the Hijack..)
As for the thread -- it's personal preference. Neither card wins 100% of the time in one API or the other...
1) You cannot blame nVidia anymore, it's done with. So what, Valve knew it at the time.
2) How exactly does that explain, then, that when running in DX9 mode, an FX card whose ID is changed to an ATi card's runs without graphical glitches?
***
That #2 is still the kicker...