Is Your PC Ready For Battlefield 3?

That's right, in three of those games, the GTX560 is even faster than the GTX275!
youfunnykid.jpg
 
That's right, in three of those games, the GTX560 is even faster than the GTX275!

Goalposts.

You move them.

k4xEk.jpg


http://hardforum.com/showpost.php?p=1037801536&postcount=84

The GTX560 standard is a good 25% or so faster than the GTX275 (and as said, it has DX11 where the 275 does not)

http://hardforum.com/showpost.php?p=1037802299&postcount=110

Red Orchestra 2: 0.583M, 0.641A -> Overall comparison to GTX560: 0.622/0.875 = 0.711 -> GTX560 40.7% faster
Hard Reset (FSAA): 0.742M, 0.732A -> Overall comparison to GTX560: 0.737/0.875 = 0.842 -> GTX560 18.7% faster
Dead Island: 0.711M, 0.719A -> Overall comparison to GTX560: 0.715/0.875 = 0.817 -> GTX560 22.4% faster
W40K Space Marine: 0.850A -> Overall comparison to GTX560: 0.850/0.875 = 0.971 -> GTX560 2.9% faster
Call of Juarez The Cartel: 0.693M, 0.717A -> Overall comparison to GTX560: 0.705/0.875 = 0.806 -> GTX560 24.1% faster
Heroes 6 Demo: 0.673A -> Overall comparison to GTX560: 0.673/0.875 = 0.769 -> GTX560 30.0% faster

4 out of 6 games. Not a "good 25% or so faster"...

Try again, troll.
 
Nobody's set down a mathematical rule for the expression 'or so'. I'm going to choose 15% variance either way. That covers me for two of those scenarios, which leaves us with 4/6. There will always be exceptions to these cases, some of them extreme; it is for this reason that averages are used. It's fine. I'm quite enjoying the fact that you're clinging to your GTX275, as it happens, because I can't wait to see how pissed off you get when it's woefully inadequate for Battlefield 3.
 
This app says my 2x 4870x2 doesn't meet recommended settings even though I have little doubt it will shit all over this game at high or greater.

But many of you seem to be under the impression that it's doing some kind of benchmark and actually analyzing the performance of your hardware, when it clearly is not. At best it simply takes an inventory of what you have in your system and gives you a recommendation based on a checklist.
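If that's right, the whole check amounts to a table lookup, not a measurement. A rough Python sketch of what that logic might look like (the card rankings and threshold here are invented for illustration; the real tool's data and code are unknown):

Code:
# Hypothetical sketch of a checklist-style "system check": it compares an
# inventoried GPU name against a static ranking table. Nothing is measured
# or benchmarked; all rankings below are invented for illustration.
GPU_RANK = {
    "GeForce GTX 275": 40,
    "GeForce GTX 560": 60,
    "Radeon HD 4870 X2": 50,
}
RECOMMENDED = GPU_RANK["GeForce GTX 560"]

def meets_recommended(detected_gpu: str) -> bool:
    # Unknown or under-ranked hardware simply fails; no analysis is done.
    return GPU_RANK.get(detected_gpu, 0) >= RECOMMENDED

# A 2x 4870x2 rig is inventoried as a single "4870 X2" entry, so it can
# fail the checklist regardless of its real-world performance.
print(meets_recommended("Radeon HD 4870 X2"))  # False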
 
Jesus, nobody is that retarded. Why do you think I am dividing by 0.875?

For the pre-schoolers among us: the top half is me calculating the exact difference between the GTX560 and the GTX560Ti (since GameGPU do not regularly test the basic GTX560).

This factor is then applied to the results in the second half to arrive at the final score.

The GTX560Ti is 40.72% faster than the GTX275.
The GTX560 standard is 23.13% faster than the GTX275.
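If anyone wants to check that arithmetic, here it is in Python, using the 0.875 GTX560-to-GTX560Ti factor and the per-game GTX275/GTX560Ti ratios from the posts above:

Code:
# Rescale each per-game GTX275/GTX560Ti ratio by the 0.875 factor to
# estimate the GTX275/GTX560 ratio, convert to "GTX560 is X% faster",
# then average across the six games.
TI_FACTOR = 0.875  # GTX560 performs at ~87.5% of a GTX560Ti (derived above)

ratios_vs_ti = {
    "Red Orchestra 2": 0.622,
    "Hard Reset (FSAA)": 0.737,
    "Dead Island": 0.715,
    "W40K Space Marine": 0.850,
    "Call of Juarez The Cartel": 0.705,
    "Heroes 6 Demo": 0.673,
}

advantages = []
for game, r_ti in ratios_vs_ti.items():
    r_560 = r_ti / TI_FACTOR              # estimated GTX275/GTX560 ratio
    pct = (1.0 / r_560 - 1.0) * 100.0     # GTX560 advantage in percent
    advantages.append(pct)
    print(f"{game}: GTX560 {pct:.1f}% faster")

print(f"Average: {sum(advantages) / len(advantages):.2f}%")  # ~23.1%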

These people are so blinded by their love for questionable, almost-three-year-old hardware that they can't fathom benchmarks and facts.

I tried, you tried; I guess they will just have to keep telling themselves that, when they turn all the DX11 effects off and play at a lower quality, what they get with their old DX10-era hardware is "just as good".

I agree with the sentiment that graphics aren't everything, and that gameplay is more important. I am almost done spending 40 hours replaying the original Deus Ex from 2000 in preparation for HR to drop in price (I refuse to buy at full price), and as I'm playing it I'm still wishing new games could be as good as it is, despite its rather limited graphics by modern standards.

Graphics are, however, not irrelevant, especially in a game that is trying to be as much of a simulation as it possibly can be, and as immersive in its telling of a story as it can be. The less real it looks, the less you are going to get sucked into it, plain and simple.

The developers cared enough about the appearance of the game to list the 560 in their recommended specs, so they are essentially saying that this is what is required to get the experience out of this game that they intended.

If these people want to deal with less, that's their prerogative. I'm tired of arguing with people who don't listen to reason.
 
Automated "system check" tools have always been of dubious value. In this case, made even worse given the nVidia bias. General rules of thumb? perhaps.. Enough to prompt a wholesale upgrade? Hell no..

We all know that every painstaking detail will be researched before making a GPU decision :)
 
Zarathustra[H];1037802648 said:
I tried, you tried; I guess they will just have to keep telling themselves that, when they turn all the DX11 effects off and play at a lower quality, what they get with their old DX10-era hardware is "just as good".

DX11:
tesson.png


DX10/10.1
tessoff.png


Enjoy your amazing DX11 graphics bro, whatever helps you pat yourself on the back.

Fact is, DX10 users aren't missing much in DX11 games and that is by design.
 
DX11:
tesson.png


DX10/10.1
tessoff.png


Enjoy your amazing DX11 graphics bro, whatever helps you pat yourself on the back.

Fact is, DX10 users aren't missing much in DX11 games and that is by design.

Depends on the game.

Metro 2033 was a DX9 game sprinkled with a few barely noticeable DX11 effects (unless you turned on advanced DOF, which was pretty noticeable).

4A Games also listed their recommended specs for Metro 2033 as a GTX260, so they obviously didn't find those limited DX11 effects necessary for the game.

Since EA has listed the recommended hardware as a GTX560 and not a 200-series board, there is obviously something about the game for which they deem you need a GTX560 to truly enjoy it the way it was intended. It may be DX11, or it may just be the higher raw performance of the GTX560.

One thing is for sure: game developers don't take their system recommendations lightly. They have everything to gain from more people being able to play their games, so they'd rather the specs be low. If they went with a 560 for recommended specs instead of a 275, there is probably a reason for it.

Now, on to DX11.

As mentioned, it is applied differently in different games. Some games, Metro 2033 being an example, are simply DX9 engines with a few minor tweaks so they can advertise DX11; most DX11 games fall into this category. For these, having DX11 hardware won't make a hell of a lot of difference (though for some reason rendering in DX11 with the effects turned off is usually a few percent faster than rendering in DX10, which is odd).

Then there are games like Civ 5, which in DX11 mode doesn't really look like it is taxing the video hardware that much, until you try to switch to DX9 mode and realize it was using DX11 for things you hadn't even thought of, like real-time texture compression via DirectCompute, making the DX9/DX10 experience miserable by comparison in this game.

I don't know what the case is for BF3. I don't have some secret pre-release version, but if the publisher recommends a higher-specced card, I tend to believe them, as their bias tends toward recommending lower-specced hardware so they can sell more copies.
 
I checked those filenames - they're from a tessellation test, and there's more to DX11 than just tessellation :p
Still, I agree in principle that there isn't a big difference between DX10 and DX11 in a lot of games; that's only a small part of the ridiculousness of calling a GTX275 the same as a GTX560, though.
 
Zarathustra[H];1037802648 said:
These people are so blinded by their love for questionable, almost-three-year-old hardware that they can't fathom benchmarks and facts.

I tried, you tried....blah blah blah....

...and a bunch of self aggrandizing bullshit cut out....

BHal3.jpg


If I may politely ask. Why are you commenting on this thread?

Why does any Mac user comment in a thread about PCs, or an iPhone owner in a thread about Android phones? I didn't know there were strict limits on who could talk where. I tried the tool, I called it into question, and a lot of people seemed to agree that what it claimed was suspect. After all, this is a TECH SITE, and talking about things like this shouldn't require preorder confirmation before someone can say something.

I thought BC2 was a great game, so it's not an easy decision to boycott BF3 because of EA's continuing involvement in the franchise.
 
and you thought that without playing it, since BC2 was EA too?
His point was, if you aren't going to play the game, why do you care how it runs?
 
I passed the test with 6870s in CrossFire.

I disabled CrossFire and it simply told me that the 6800 series is not enough.

Either way, I will probably not be playing this game anyway; if I did, my 6870 CrossFire setup would slap it silly at 1080p.
 
and you thought that without playing it, since BC2 was EA too?
His point was, if you aren't going to play the game, why do you care how it runs?

Good god, the sheer hubris of your exceptionally manipulative replies. There's a good reason you fail to quote nearly every post you respond to (you wouldn't be able to get away with implying someone said something if their own words were right above yours)... Everything you say is as misleading as possible, angled to negatively shape what you want other people to think about the people you're replying to.

What you imply and ASSume in your responses and questions is not the reality of what is being said. Suggesting otherwise is not going to make you correct, no matter how much you want to twist and manipulate the words on the screen.

Since your reading comprehension has been called into question... multiple times, at that... I'll just point you to page 6 of this thread. Reread the entire page until it sinks in. Until then, stop responding if all you're going to do is mislead, conflate and manipulate.

My response was *never* about wanting to say the BF3 site is wrong about my ability to play BF3 to supposed "satisfaction" on my 275. (Herp, derp, that's why they have multiple resolutions and quality levels to choose from!)

It was about the misleading nature of the marketing page's classification of hardware. People with SLI setups far better than a single GTX 560 are being told their setup isn't good enough.
 
The only person being looked on negatively here is you, and I need no help with that, because everyone else already feels the same.
As for your statement retraction, glad you've realised what you said was wrong :)
 
I checked those filenames - they're from a tessellation test, and there's more to DX11 than just tessellation :p

The filenames are what they are because that is what I named them when I uploaded them to my web server; not sure what "tessellation test" you are referring to. Those shots are 100% representative of DX10 vs DX11 in Metro 2033. Yes, tessellation is the main feature you would be missing in that situation, but IMO you're not missing much...

If you have screenshots that show a clear difference between DX10 and DX11 that doesn't have to do with tessellation, then please post them.
 
A friend is reporting back from his experience:

CPU: i7 920 (not sure what clock)
GPU: HD5850 (stock)
Resolution: 1920x1200

The game plays flawlessly, apparently (and he's quite fussy about frame rate, so I trust his judgement), but he hasn't enabled anti-aliasing. Everything else is max detail.
 
A friend is reporting back from his experience:

CPU: i7 920 (not sure what clock)
GPU: HD5850 (stock)
Resolution: 1920x1200

The game plays flawlessly, apparently (and he's quite fussy about frame rate, so I trust his judgement), but he hasn't enabled anti-aliasing. Everything else is max detail.

Not really relevant, considering that Ultra appears to be disabled in the beta (or at the very least isn't working).

The graphics appear to be the same as those found in the Alpha, from the screenshots I've seen.
 
Apparently someone's set Ultra in the beta on a live stream, and it's meant to look very good.
 
I'm in the BF3 beta; gonna test this weekend just how "unqualified" my system is... bastards.
 
HAHA fuck you nvidia




Yeah, that seems like a little bit of a stretch. Depending on the title a 6870 and a 560 are roughly tied. The 6870 takes some titles, the 560 takes others.

Maybe they know something about the final version of BF3 we don't? Otherwise I would have called the 6870 a pass.
 
Nah, it's just because they lump it in with 'series' as it can't detect which card is which. HD6850 falls below the GTX560 so it groups the 6800 series as a whole with a fail.
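If that's the behaviour, the failure mode is easy to sketch in Python (hypothetical rankings; the grouping logic is inferred from what the tool reports, not taken from its code):

Code:
# Hypothetical sketch of series-lumping: when detection only yields a
# family string ("HD 6800 series"), rate the family by its weakest member.
# All rankings are illustrative.
SERIES_MEMBERS = {"HD 6800 series": {"HD 6850": 55, "HD 6870": 62}}
GTX560_RANK = 60  # the recommended-spec bar

def series_passes(series: str) -> bool:
    # The whole family fails if its slowest card falls below the bar,
    # so the HD6870 gets dragged down by the HD6850.
    return min(SERIES_MEMBERS[series].values()) >= GTX560_RANK

print(series_passes("HD 6800 series"))  # False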
 
Nah, it's just because they lump it in with 'series' as it can't detect which card is which. HD6850 falls below the GTX560 so it groups the 6800 series as a whole with a fail.

Oh right, I forgot that AMD/ATI's identification strings are product families vs. Nvidia identifying individual GPUs...

This is what hands AMD the top spot in the Steam hardware survey GPU list.
 
Zarathustra[H];1037803427 said:
Oh right, I forgot that AMD/ATI's identification strings are product families vs. Nvidia identifying individual GPUs...

This is what hands AMD the top spot in the Steam hardware survey GPU list.

Actually, peeking at the steam hwsurvey, they seem to have fixed that now :confused:
 
Zarathustra[H];1037803427 said:
Oh right, I forgot that AMD/ATI's identification strings are product families vs. Nvidia identifying individual GPUs...

This is what hands AMD the top spot in the Steam hardware survey GPU list.

ORLY.

voLm7.png
 
Yup, them not being at the top any more will be the result of those series splitting into individual cards. The HD5700 series would have taken 5.15%, thus sitting on top of the 9800 series as a whole. Likewise, the HD4800 series at 7.22% (higher still, as the HD4830 is not in that list) would far surpass any of the GeForce series.
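The aggregation effect is trivial to illustrate; in this sketch the per-card splits are made up, and only the 5.15% HD5700-series total comes from the post above:

Code:
# A series' survey share is the sum of its members' shares, so a family of
# mid-range cards can out-rank any single strong card. Per-card numbers
# here are invented; only the 5.15% series total is from the discussion.
shares = {
    "HD 5750": 1.60,          # hypothetical split
    "HD 5770": 3.55,          # hypothetical split
    "GeForce 9800 GT": 4.90,  # hypothetical single-card share
}

def family_total(prefix: str) -> float:
    return sum(v for k, v in shares.items() if k.startswith(prefix))

print(family_total("HD 57"))      # 5.15 -> the family tops...
print(shares["GeForce 9800 GT"])  # 4.90 -> ...the strongest single card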
 
All of those people with 8800 series cards are going to need a huge upgrade soon... lol
 
Nah, it's just because they lump it in with 'series' as it can't detect which card is which. HD6850 falls below the GTX560 so it groups the 6800 series as a whole with a fail.

My 6850 OC'd will probably beat a 560. Alas, I don't care; it's a silly program detecting what GPU people have.
 
Zarathustra[H];1037803357 said:
Yeah, that seems like a little bit of a stretch. Depending on the title a 6870 and a 560 are roughly tied. The 6870 takes some titles, the 560 takes others.

Maybe they know something about the final version of BF3 we don't? Otherwise I would have called the 6870 a pass.

I've a 6850, albeit OC'd, FWIW.
 
When you say slap it silly, you mean 50-60fps or so...

I bet you my CrossFire 6870s at 1000 core and 1200 mem will be well over 60fps at 1080p.

The 6870 by itself can max out most games at 1080p, with few exceptions, and either way it still runs close to max detail at 1080p; adding a second card doubled my FPS. I did that on the cheap, not only for Bitcoin mining, but also to max out The Witcher 2 and Metro 2033, and I ventured into Eyefinity with ease.


6850s, even overclocked, will only perform on par with a 6870 or slightly above it, depending on the clocks your card can achieve.

Anyway, anyone with a 6850 or 6870 should be just fine for BF3.
 