This thread had me seriously thinking of getting another 5850 to CrossFire rather than a 7970. $120 vs $699 (the 7970 price where I live) for the same or better performance sounds like a no-brainer.
The 7950 is clearly the worst value among the second-tier cards. It's the most cut down, with 13% fewer shaders compared to 10% and 8% for the 5850 and 6950.
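For anyone wondering where those percentages come from, here is my own quick tally using the commonly quoted stream processor counts (not figures from any particular review):

```python
# Back-of-the-envelope check of how cut down each second-tier card is,
# using the commonly quoted stream processor counts for each chip.
cards = {
    "HD 7950 vs 7970": (1792, 2048),  # Tahiti Pro vs Tahiti XT
    "HD 5850 vs 5870": (1440, 1600),  # Cypress Pro vs Cypress XT
    "HD 6950 vs 6970": (1408, 1536),  # Cayman Pro vs Cayman XT
}

for name, (cut, full) in cards.items():
    pct = (1 - cut / full) * 100
    print(f"{name}: {cut}/{full} shaders, {pct:.1f}% fewer")

# Prints roughly 12.5%, 10.0% and 8.3% -- the ~13/10/8% figures above.
```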
AMD is not enforcing its reference design, which means cost-cutting designs from day one. The only model that stuck to the $459 MSRP has no VRM...
This is without deferred 4x MSAA.
This is from your link:
69fps vs 73fps. The 7950 does very badly compared to the 6970 as well, especially when you consider an unlocked 6950 costs about half the price for 5% less performance. Frankly this is an even worse result than I expected in BF3.
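Rough value maths on that, using the $459 MSRP mentioned above, an assumed half-price figure for an unlockable 6950 and the ~5% performance gap -- illustrative numbers, not benchmark data:

```python
# Illustrative performance-per-dollar comparison using the figures thrown
# around in this thread (prices and the 5% gap are rough, not measured).
hd7950_price, hd6950_price = 459.0, 230.0   # 6950 at roughly half price
hd7950_perf, hd6950_perf = 100.0, 95.0      # relative performance, 7950 = 100

for name, perf, price in [("7950", hd7950_perf, hd7950_price),
                          ("6950 unlocked", hd6950_perf, hd6950_price)]:
    print(f"{name}: {perf:.0f} perf / ${price:.0f} = {perf / price:.2f} per dollar")

# The unlocked 6950 works out to nearly twice the performance per dollar.
```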
Funny how every game the 7900 performs poorly in gets accused of using driver cheats. Speculation and rumours repeated often enough become the truth, and get used to excuse AMD in every game they perform badly in. The irony is that BF3, Dirt 2/3 and Dragon Age 2 are AMD GE (Gaming Evolved) games, and they perform worse in those games.
Less than a 10% OC with added voltage. You can leave PowerTune at 0% and that will get you a lot better clocks like the rest of the guys here, but it will ultimately throttle performance and defeat the purpose of overclocking.
Reading this thread gives me the feeling a lot of the 7970 overclocking hype is based on cherry-picked review units. Real-world results do not look good at all.
Yup. It's all down to drivers, which can make or break CF scaling. If there were any innovation in hardware architecture that directly improved scaling, AMD would be all over it in their 6000-series marketing presentations. And I can never understand the logic of going 6850 over 5850 just because they...
900MHz? I think he is referring to the 6870. The 5850 outperforms the 6850 by almost 20% in BF3 and is only a hair slower than the 6870.
And yes, overclocked to the same clocks a 5850 will obviously outperform a 6870 as well.
For balance reasons, obviously. They can't just turn off shadows/vegetation or cut them off abruptly at a distance, otherwise it would be quite unfair to players who run the game with all the eye candy on. Mesh and terrain quality don't do much to performance either, because the main bottleneck in this...
This is just after a 64-player game on Caspian Border. All Ultra except no MSAA and no motion blur. My overclock is considered mild, as there are people who clock theirs over 1GHz. Frankly this game just isn't that hard to run without MSAA. I use medium post-AA and it still looks smooth.
It's a 32-player server. Though looking at my AB log again, you are right, it does drop to the 50s at times, but the average is well above 60fps. At 1920x1080 he should still play fine, averaging 50-60fps.
Well, I am running Ultra without MSAA and it almost never drops below 60fps in a multiplayer match. Now this is at 1680x1050, but 1920x1080 should not be too far off. This is with a modestly overclocked 5850.
I suspect disabling HT simply made the CPU render slower and helped keep the two GPUs' frames in sync. Try using a single GPU and test whether HT creates the same problem.
Barts shaders are VLIW5, same as on Cypress. They get higher performance per die area because of aggressive stock clocks. Only Cayman uses the more efficient VLIW4 architecture.
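To put rough numbers on that, here is how the shader counts break down by SIMD layout (my own tally from the commonly quoted specs, so treat it as a sketch):

```python
# How the shader counts break down by SIMD layout, using commonly quoted
# specs for each chip.
chips = {
    # name: (SIMD engines, stream cores per SIMD, ALUs per stream core)
    "Cypress (5870)": (20, 16, 5),   # VLIW5
    "Barts (6870)":   (14, 16, 5),   # VLIW5, same building block as Cypress
    "Cayman (6970)":  (24, 16, 4),   # VLIW4
}

for name, (simds, cores, alus) in chips.items():
    total = simds * cores * alus
    print(f"{name}: {simds} SIMDs x {cores} cores x {alus}-wide = {total} SPs")

# 1600, 1120 and 1536 SPs respectively: Barts is the same VLIW5 block as
# Cypress with fewer SIMDs pushed to higher clocks; only Cayman is VLIW4.
```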
So there will be three different 560s. Way to confuse buyers and cause buying mistakes when they could have named it the 565 or something. Still not as bad as AMD releasing a cut-down 6800 series with fewer shaders than the 5800 series. That was downright deceptive.
First off, everyone can definitely see MS. We are all human and have the same capabilities, only some are fussier than others. Microstutter is simply perceived as a lower visible framerate than what's expected from the Fraps counter. If you say you can't see MS, it's more likely you have low tolerance...
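A crude way to picture what I mean by "lower visible framerate than the Fraps counter" (the 5ms/28ms split below is made up purely for illustration, not measured from any real card):

```python
# Toy model of AFR microstutter: Fraps averages the frame times, but what
# you perceive is closer to the longer gap in each short/long pair.
frame_times_ms = [5.0, 28.3] * 30   # 60 frames over roughly one second

fraps_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# Rough "perceived" estimate: judge each pair of frames by its worst gap.
worst_gaps = [max(frame_times_ms[i:i + 2]) for i in range(0, len(frame_times_ms), 2)]
perceived_fps = 1000 / (sum(worst_gaps) / len(worst_gaps))

print(f"Fraps-style average: {fraps_fps:.0f} fps")            # ~60 fps
print(f"Perceived (worst of pair): {perceived_fps:.0f} fps")  # ~35 fps
```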
I am enjoying the game overall. Not having any texture pop-in or poor performance on my ATI card either. Sure, they screwed up pretty badly on PC and some textures are plain bad, but technicals aside, the overall visuals and atmosphere are top notch. There are so many areas that look like pure...
Capping fps at 60 doesn't eliminate tearing; in fact it makes it more visible, which is why UT3-engine games always cap at 62fps, not 60. Only vsync solves tearing.
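A toy model of why the 60fps cap is the worst case on a 60Hz screen (my own back-of-the-envelope reasoning, not anything from the engine itself): the tear line shifts each frame by the mismatch between frame time and refresh period, so at exactly 60fps it sits in one spot while at 62fps it keeps moving.

```python
# Toy model: how far the tear line moves per frame on a 60 Hz display,
# as a fraction of the screen.  Purely illustrative.
refresh_ms = 1000 / 60

for cap_fps in (60, 62):
    frame_ms = 1000 / cap_fps
    drift = abs(refresh_ms - frame_ms) / refresh_ms   # screen fraction per frame
    print(f"{cap_fps} fps cap: tear line moves {drift * 100:.1f}% of the screen per frame")

# At 60 fps the tear stays put (0.0%) and is very obvious; at 62 fps it
# drifts about 3% per frame, crossing the screen in roughly half a second.
```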
At higher fps it is still possible to notice MS, especially with a 120Hz monitor. But most games become increasingly CPU-bottlenecked at high framerates, which helps space out the frames uniformly and reduces the perception of MS.
It depends on the game you play and the resolution, as well as CPU speed. When I had the 4870X2 I found MS not noticeable in COD:WaW but very noticeable in Crysis. Usually, any time a single 4870 can run the game fine I won't have problems with MS on the 4870X2 either.
I am the only person you've read about, so that makes it fake? How many people report running out of VRAM at Ultra settings, 1680x1050, with a 1GB card? And you know for a fact that 1GB is not enough for Ultra because you have a 1GB card and a 1080p monitor to test with?
Edit: Just adding this for...
1080p is not much more than 1680x1050. Others have tested it and it runs fine on 1GB cards. It's caching more memory than it needs, which is why you are seeing that much. I can confirm from GPU-Z readings that the game uses only 850MB at max settings at 1050p.
About ultra not being ultra, we'll see about that.