Wow, you're clueless. That's what happens when you make assertions based on trivial differences between cards: you get corrected and brought back into the realm of reality.
It's like saying, "Gee, I see in randomreviewer.com's review they get 0.5fps more on card X, so I'm going to buy it," when it's, well...
Did you bother to look at the charts? It's a difference of a few decibels... you're kidding when you imply it's too loud, right?
And it barely consumes any more power than the 3870 at idle... and actually less under load.
That's totally illogical and invalid.
You can tell when your ping is higher (though I don't think you can tell between the numbers you cite): shots that should have killed someone in a game don't, and the outcome is different from what you expected.
The (questionable)...
Um... your PSU sounds fairly unstable, and it might already be having bad effects on the GPU (underpowering electronics is bad news).
I'd stop running that system on that PSU ASAP... before you fry your motherboard or CPU.
Gee, no, SLI is the first two, not the game support. Of course it becomes useful when game support is there, but game support by itself is by no means something that can be productized... only the technology can be.
My original point is you should be fair in the distribution of responsibility.
Seeing that the issue spans two cards, I'm willing to bet your issue is your PSU. It's not necessarily failing to supply the correct power overall, but some PSUs (like some Antecs) are known to have issues supplying power to 8800GTs correctly.
Do you have another PSU to swap into the...
Eh? You can't compare spec for spec. This isn't 2003 anymore; different architectures have totally different setups. You can't compare clocks anymore, as ATI still rates everything off the core clock while NVIDIA runs a lower core clock with way higher shader processor clocks. It's comparing apples to...
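To make that concrete, here's the back-of-the-envelope math on the paper specs (a rough sketch only; theoretical peaks say nothing definitive about in-game performance):

```python
# Rough paper-spec arithmetic; real performance depends on architecture,
# drivers, and the game, not these theoretical peaks.
def gflops(shaders, shader_clock_ghz, flops_per_clock):
    return shaders * shader_clock_ghz * flops_per_clock

# 8800GT: 600MHz core, but its 112 shaders run at 1.5GHz (MADD+MUL, ~3 flops)
# HD3870: 775MHz core, and its 320 shaders run at that core clock (MADD, ~2 flops)
print(gflops(112, 1.5, 3))    # ~504 GFLOPS
print(gflops(320, 0.775, 2))  # ~496 GFLOPS
```

Nearly identical paper numbers from wildly different clocks and shader counts, which is exactly why clock-for-clock comparisons are meaningless now.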
Actually, SLI and Crossfire are not inconsistent... developers are inconsistent in how well they design their games to take advantage of more than one GPU. So it really comes down to them, as NVIDIA and ATI can't recode their games on the fly for them :)
For all the insulting you're doing, you seem to be pretty clueless yourself.
The point of physics technologies is not to improve framerates... it's not a faster CPU or GPU. The point is to add to the IMMERSION/experience of the game by adding an element that hasn't really been there up until...
Are you kidding me? No major performance difference? Try 50-65% of the GT's performance.
That's not really a bargain for a card that delivers 50-65% of a GT's performance at 65% of the price...
If you don't believe me, here's a random review I found which happened to have both cards in it...
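To put numbers on it (using an assumed midpoint of that 50-65% span, not figures from any specific review):

```python
# Hypothetical arithmetic only: a card at ~55% of a GT's performance for
# 65% of a GT's price delivers *less* performance per dollar, not more.
gt_perf, gt_price = 1.00, 1.00        # GT as the baseline
other_perf, other_price = 0.55, 0.65  # assumed midpoint of the quoted range
print(gt_perf / gt_price)             # 1.00 perf per price unit
print(other_perf / other_price)       # ~0.85 perf per price unit
```

Even at the top of the range (0.65 / 0.65), you only break even. That's not a bargain.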
Crossfire is Crossfire is Crossfire. Why? Because AFR is AFR is AFR.
Having two chips on one PCB doesn't eliminate *any* of the gamble.
Games that traditional two-board Crossfire doesn't have a profile for still don't have a profile, and still don't scale. Likewise, games that Crossfire does well...
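A toy model of the point (invented names and scaling factors, nothing from real drivers): AFR scaling is gated on whether the driver has a profile for the game, and that gate has no idea how many PCBs the GPUs sit on.

```python
# Hypothetical sketch: AFR only helps when the driver has a profile
# telling it how to alternate frames safely for that game.
PROFILES = {"game_a": 1.8, "game_b": 1.6}  # invented scaling factors

def effective_fps(game, single_gpu_fps, num_gpus=2):
    scaling = PROFILES.get(game, 1.0)  # no profile -> no scaling at all
    return single_gpu_fps * min(scaling, num_gpus)

print(effective_fps("game_a", 30))  # 54.0: profiled, so it scales
print(effective_fps("game_c", 30))  # 30.0: unprofiled, single-GPU speed
```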
Doubt it... the 8800GT is notoriously 10-20% faster than the 3870 pretty much across the board.
This is why, now that the cards are the same price (the 8800GT is even cheaper in some cases), this card is such a steal.
Do you mean have the drivers take a different path for this particular proprietary instruction set that's only on this particular family of chips? I guess it would be cool for them to be able to accommodate all instruction sets in this manner, but it would make things really hard to QA, with all...
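Something like this hypothetical dispatch (every name here is invented, not real driver internals) is what "a different path" would mean, and every extra path is another thing to QA per driver release, per game:

```python
# Purely illustrative sketch of per-chip-family code paths in a driver.
def generic_path(shader):
    return f"compiled {shader} with the generic backend"

def family_x_path(shader):
    return f"compiled {shader} with family-X-only instructions"

CODE_PATHS = {"family_x": family_x_path}  # grows with every special case

def compile_shader(chip_family, shader):
    # Fall back to the generic path for chips without a special case.
    return CODE_PATHS.get(chip_family, generic_path)(shader)

print(compile_shader("family_x", "water.vs"))
print(compile_shader("some_older_chip", "water.vs"))
```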
None of the sites in question use a system that will alter results drastically or bottleneck the cards... the review sites know better. That's why I'm curious if something else is at fault here. I just have a hard time believing that benchmarking a rendered cutscene is SO different from in-game...
OK, maybe not many sites showed the X2 beating the Ultra in Crysis itself, but in other games they did.
This is not in line with what [H] showed for CoD4 and UT3. These would probably be better apps to normalize with since their Crysis results may not have been drastically different from what...
*Again*, the difference between medium and high settings doesn't turn an Ultra killer into a card beaten by a GTX, so that argument is out the window.
The other thing to point out is that you made a fatal error here... performance is *ALWAYS* a matter of relative percentages. 5fps is...
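For example (made-up framerates, purely to show the arithmetic):

```python
# The same 5fps gap means completely different things at different baselines.
delta = 5
for base in (10, 100):
    print(f"{delta}fps on top of {base}fps is a {delta / base:.0%} difference")
# 5fps on 10fps  -> 50%
# 5fps on 100fps -> 5%
```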
FYI, I'm not questioning the methodology. I'll rephrase one more time for those who still don't get the problem here.
----[H] shows the GTX beating the X2. Most other sites show the X2 beating the Ultra.
----[H] says the difference is because they tested in real-world gameplay and not using canned...
Who cares about the absolute framerates right now? I'm not disputing that the in-game and built-in benchmark framerates will be different, I'm disputing the *relative* performance between the X2, Ultra, and GTX according to [H] vs. the relative perf between the cards according to other sites...
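To make that concrete (the framerates below are invented; only the ratios matter): normalize each site's numbers against its own GTX result, so the absolute fps differences between sites cancel out. The card *ordering* still disagrees, and that's exactly the problem.

```python
# Invented numbers, not from any review; they only show how relative
# performance within one site's own results is computed.
site_h     = {"X2": 40, "Ultra": 48, "GTX": 45}  # hypothetical: GTX beats X2
site_other = {"X2": 60, "Ultra": 52, "GTX": 45}  # hypothetical: X2 beats Ultra
for label, results in (("[H]-like", site_h), ("other-sites-like", site_other)):
    baseline = results["GTX"]
    print(label, {card: round(fps / baseline, 2) for card, fps in results.items()})
```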
The graphics card won't output video if it doesn't detect a monitor hooked up to it. It sounds like it's not detecting anything there for some reason when it's hooked up by itself.
Run GPU-Z and see what it reports for each card. It's likely that it's Windows reporting your system is running PCIe x4 for some reason (maybe it really is somehow).
The NVIDIA CP just reports what the system tells it, so I doubt the NVIDIA CP itself is at fault.
If GPU-Z shows the same...
Remember, it's not just the CPU score itself that suffers from a CPU bottleneck... the GPU score will as well. If the CPU can't feed the GPU draw calls fast enough to keep it happy, performance on the GPU front will suffer. I'm 99% sure this is what is happening here.
It's probably all on...
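A crude way to picture the bottleneck (a toy model, not how any benchmark actually computes its scores):

```python
# Toy model: each frame costs some CPU time (game logic, draw-call
# submission) and some GPU time (rendering). Whichever side is slower
# sets the frame time, so a slow CPU drags the GPU's framerate down too.
def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000 / max(cpu_ms_per_frame, gpu_ms_per_frame)

print(fps(cpu_ms_per_frame=5, gpu_ms_per_frame=10))   # 100.0: GPU is the limit
print(fps(cpu_ms_per_frame=25, gpu_ms_per_frame=10))  # 40.0: CPU-bound, GPU waits
```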
Still no answer to my question. I'm getting annoyed now by how it's just being ignored.
Initially, with their X2 review, [H] published that the reason their results are different is that they use real-world benchmarks (A) while other sites use "canned benchmarks" (B).
In this article [H] has...
Yes, I beat a pirate planet and took it over with 4 capital ships and many, many frigates. I was a little worried, but I took them all out. They almost got one of my capital ships, but I retreated that one and it was fine. They couldn't stand up to the rest of my forces ;)
They don't just keep...
Obviously, I addressed that I knew this. The point is: why in the world would the devs design the game to basically tire me out after fighting one EASY CPU opponent at normal speeds? Why wouldn't they just let me scale up the number of opponents if I wanted a longer game?
I shouldn't need to move the...
I used to use Opera, but the constant mis-rendering of sites really got on my nerves. Certain controls or options simply wouldn't show up like they did in IE or Firefox. I got tired of it and switched to Firefox.
Wow, Dell sells 680i's?
Don't let anyone knock your Dell if it has a 680i... the only reason people knock them is that they typically screw you by selling you your system cheap, and then when you open it up it has some proprietary motherboard and PSU which happen to be pieces of shit and...
Of course it's a great choice, but the question is: what are your other system specs? Depending on those, you may end up CPU-limited and unable to experience the full potential of the GT, in which case you might want to look at an upgrade, or wait until then to get a better card.
Actually, you're quite wrong. Fraps is more accurate than the Crysis in-game benchmarks when used carefully (and Kyle made a point of noting that they used it very carefully). I've seen multi-GPU cases where the in-game counter shows 90FPS with 3 GTXs when it's actually running at 20fps or less...
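Roughly, the distinction (a minimal sketch of the *idea* behind an external counter like Fraps, not its actual code): an external tool counts real presented frames against the wall clock, while an engine's built-in counter reports whatever the engine believes, and in multi-GPU setups those two can diverge badly.

```python
# Minimal sketch of wall-clock FPS measurement; render_frame is an
# assumed callable that presents one frame, not a real API.
import time

def measure_fps(render_frame, num_frames=100):
    start = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
    elapsed = time.perf_counter() - start
    return num_frames / elapsed

# e.g. measure_fps(lambda: time.sleep(1 / 60)) reports roughly 60fps
```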
I think the in-game testing is the way to go, but:
Just to be the devil's advocate here... there is still the question of why, when you run the timedemo like everyone else, the X2 loses to the GTX. I understand that Fraps and the engine-reported FPS may not agree; that's not really in question...