Anno 2205 crippled on all AMD & Kepler GPUs

What I got out of it is that Nvidia doesn't support last year's cards at all. The chart clearly shows that a 960 is faster than a GTX 780 / 780 Ti, and last year those cards were the equivalent of the 980 / 980 Ti. So when Pascal launches, expect your 980 / 980 Ti to be only as fast as the slowest sub-$200 Pascal card.

Show me when a 780 / 780 Ti was equivalent to a 980 / 980 Ti and I will show you a world at peace.

Also, this game is just OK, not great. Why such a fuss about an OK game not running well on older hardware? If this were FO4 or something, I'd get it, but this game is just OK. I wouldn't get too worked up over it.
 
He means they filled the same tiers previously, not that the performance was the same.
 
But isn't that the way it goes? I mean, you cannot expect a 780 and 780 Ti to remain top dog forever; those cards are two years old now. This isn't AMD we are talking about: Nvidia doesn't have a top-end card that lasts longer than two years as top dog. Not that I'm saying it's right, but history shows that's not how Nvidia operates. Hell, I had a 680 that was top dog for less than a year before the 780 / Titan came out.
 
Show me when a 780 / 780 Ti was equivalent to a 980 / 980 Ti and I will show you a world at peace.

You misunderstood his analogy entirely.

But isn't that the way it goes? I mean, you cannot expect a 780 and 780 Ti to remain top dog forever; those cards are two years old now. This isn't AMD we are talking about: Nvidia doesn't have a top-end card that lasts longer than two years as top dog. Not that I'm saying it's right, but history shows that's not how Nvidia operates. Hell, I had a 680 that was top dog for less than a year before the 780 / Titan came out.

So basically, if a card is two years old, you should upgrade. The fact that a GTX 960 is beating a GK110 chip (GTX 780/780Ti/Titan/Titan Black) is not out of the norm.

Gotcha.
 
You misunderstood his analogy entirely.



So basically, if a card is two years old, you should upgrade. The fact that a GTX 960 is beating a GK110 chip (GTX 780/780Ti/Titan/Titan Black) is not out of the norm.

Gotcha.

No, I am not saying you should upgrade; I am saying you shouldn't be surprised it cannot run as fast as newer hardware in newer games. Changes in game engines are made to take advantage of newer hardware, and that can hurt older hardware. So no, don't upgrade; just be happy you had a 780 when it was a monster, and realize it is getting older and might not be as impressive with newer technology. It's a necessary cycle if you want visuals to improve, new techniques to be developed, etc. How can I be the only one who gets this?
 
No, I am not saying you should upgrade; I am saying you shouldn't be surprised it cannot run as fast as newer hardware in newer games. Changes in game engines are made to take advantage of newer hardware, and that can hurt older hardware. So no, don't upgrade; just be happy you had a 780 when it was a monster, and realize it is getting older and might not be as impressive with newer technology. It's a necessary cycle if you want visuals to improve, new techniques to be developed, etc. How can I be the only one who gets this?

http://www.pcinvasion.com/nvidia-353-06-drivers-released-witcher-3-boost-for-kepler-cards/

Either the Anno 2205 developers made the game with only Maxwell in mind, or Nvidia just focused their drivers on Maxwell and may or may not release a driver update to boost Kepler performance in Anno 2205.
 
Changes in game engines are made to take advantage of newer hardware, and that can hurt older hardware. So no, don't upgrade; just be happy you had a 780 when it was a monster, and realize it is getting older and might not be as impressive with newer technology. It's a necessary cycle if you want visuals to improve, new techniques to be developed, etc. How can I be the only one who gets this?

The problem is that there are developers who can utilize advanced visual effects without punishing older hardware. Taking this game into account, it seems like they were only catering to one GPU architecture while ignoring the rest.
 
People keep ignoring the elephant in the room:

[attached image: geforce-kepler-maxwell.jpg]


Some reading here might also help:
http://devblogs.nvidia.com/parallel...ould-know-about-new-maxwell-gpu-architecture/

This should also give some hints:
http://www.hardocp.com/article/2014...orce_gtx_980_video_card_review/1#.VjuMlzZdF9M

People who think the Kepler -> Maxwell (GM10x) -> Maxwell v2 (GM20x) transitions are "minor cosmetic" changes should stop posting now.

Again... the title of this thread is pure garbage.
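
For what it's worth, the Kepler/Maxwell split those links describe is visible even from a trivial device query. Here is a minimal sketch in CUDA (my own illustration, not taken from the linked articles), relying only on the fact that Kepler parts report compute capability 3.x and Maxwell parts 5.x:

```
// Minimal sketch: label installed GPUs as Kepler (sm_3x) or Maxwell (sm_5x).
// Illustrative only; the mapping of compute capability to marketing names
// here covers just the two architectures discussed in this thread.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        const char *arch = "other";
        if (prop.major == 3) arch = "Kepler";        // e.g. GTX 780 / 780 Ti (GK110)
        else if (prop.major == 5) arch = "Maxwell";  // e.g. GTX 960 / 980 (GM20x)
        printf("GPU %d: %s (sm_%d%d) -> %s\n",
               i, prop.name, prop.major, prop.minor, arch);
    }
    return 0;
}
```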
 
No, I am not saying you should upgrade; I am saying you shouldn't be surprised it cannot run as fast as newer hardware in newer games. Changes in game engines are made to take advantage of newer hardware, and that can hurt older hardware. So no, don't upgrade; just be happy you had a 780 when it was a monster, and realize it is getting older and might not be as impressive with newer technology. It's a necessary cycle if you want visuals to improve, new techniques to be developed, etc. How can I be the only one who gets this?

Except games in which a 960 beats a 780 are very much the exception rather than the norm.

Dying Light
Battlefield Hardline
GTA V
Witcher 3
Mad Max

All 2015 games; even in Witcher 3 the 780 is still 10% faster, while everything else shows a 20%+ gap.
 
This is a game no one has ever heard of or cares about.
I'm sure the companies don't care if their cards don't perform well in some no-name title.
But I saw the gameplay and I really liked it. Is anyone here playing it?
 
This is a game no one has ever heard of or cares about.
I'm sure the companies don't care if their cards don't perform well in some no-name title.
But I saw the gameplay and I really liked it. Is anyone here playing it?
It's a Ubisoft game, and the Anno series is huge, like any other AAA game.
 
The problem is that there are developers who can utilize advanced visual effects without punishing older hardware. Taking this game into account, it seems like they were only catering to one GPU architecture while ignoring the rest.

Agreed. I felt the same about TW3's initial release, although it wasn't this extreme; it was fixed after about a month, but that was enough time to spur some Maxwell sales. Even the guide sponsored by Nvidia read more like a Maxwell sales pitch (some of that is expected, but it was much stronger than in the previous ones I have used).
 
Again... the title of this thread is pure garbage.
The title is a fact. Performance is crippled on non-Maxwell architectures. You're assuming I was implying malice. I'm not.

If a game is designed to run well solely on one architecture and poorly on everything else, that's bad game design. And for clarity... that's an opinion.
 
LOL, I love people bashing Nvidia for not optimizing for some DX12 feature on current hardware that came out before DX12, all while bashing a developer for overusing a DX11 feature that's been around for five years and that AMD cards still can't handle...
 
LOL, I love people bashing Nvidia for not optimizing for some DX12 feature on current hardware that came out before DX12, all while bashing a developer for overusing a DX11 feature that's been around for five years and that AMD cards still can't handle...

That's how it goes with AMD fans. Async compute is the future and as important as the second coming of Jesus even though it has zero support right now, but any established feature like tessellation that gives Nvidia hardware an advantage is cheating and unfair, even if that claim isn't substantiated. Go read that reddit link tainted squirrel posted; the conspiracy theories about GameWorks are off the charts with those guys.
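
To be clear about what "async compute" even means before arguing over it: it's the GPU overlapping independent queues of work instead of serializing them. As a loose analogy only (my own sketch, nothing to do with Anno 2205's code), CUDA streams expose the same idea, since kernels launched into different streams may run concurrently when resources allow:

```
// Loose analogy for async compute: two independent workloads issued to
// separate CUDA streams, so the GPU is free to overlap them.
#include <cuda_runtime.h>

__global__ void busy(float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        float v = out[i];
        for (int k = 0; k < 1000; ++k) v = v * 1.0001f + 0.5f; // burn ALU time
        out[i] = v;
    }
}

int main() {
    const int n = 1 << 20;
    float *a, *b;
    cudaMalloc(&a, n * sizeof(float));
    cudaMalloc(&b, n * sizeof(float));
    cudaMemset(a, 0, n * sizeof(float));
    cudaMemset(b, 0, n * sizeof(float));

    cudaStream_t s1, s2;
    cudaStreamCreate(&s1);
    cudaStreamCreate(&s2);

    // Launched into different streams, these two kernels are independent
    // and may execute concurrently rather than back to back.
    busy<<<(n + 255) / 256, 256, 0, s1>>>(a, n);
    busy<<<(n + 255) / 256, 256, 0, s2>>>(b, n);

    cudaDeviceSynchronize();
    cudaStreamDestroy(s1);
    cudaStreamDestroy(s2);
    cudaFree(a);
    cudaFree(b);
    return 0;
}
```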
 
But of course HDMI 2.0 is a must-have for the five people who own 4K 60 Hz TVs...
 
But of course HDMI 2.0 is a must-have for the five people who own 4K 60 Hz TVs...

At least it's available; too bad you can't say the same for AMD's premium $650+ cards like the Nano or Fury X.
 
The 780 tessellates better than the 960, and the Fury X is at least on par with the 960.

I refuse to believe it's a tessellation problem, or at least SOLELY a tessellation problem.
 
https://www.reddit.com/r/pcmasterra...dia_seems_to_be_tessellating_the_fuck_out_of/

Another report of tessellation problems, fixed by capping the tessellation level in CCC.

That's great news for AMD users. So glad that AMD includes options to fix developer silliness. Guess I'll grab the game this weekend. :D

I still feel bad for the Kepler users. They are at the mercy of GeForce Experience or forced hardware upgrades to play their games at 60 fps. At least a 780 Ti owner only has to spend $200 on a GTX 960 to get a better gaming experience than what they already have. Still not right, but what can you do? /shrug

Addendum:
Damn out of stock @greenmangaming. I really like this series. :(
 
That's how it goes with AMD fans. Async compute is the future and as important as the second coming of Jesus even though it has zero support right now, but any established feature like tessellation that gives Nvidia hardware an advantage is cheating and unfair, even if that claim isn't substantiated. Go read that reddit link tainted squirrel posted; the conspiracy theories about GameWorks are off the charts with those guys.

In the end, tessellation is a stick rather than a useful, productive development tool; async has been a somewhat more productive feature.

In the end, async will not cause stupid stuff like 200 tessellation points in a cape in the name of improving 3D graphics or pushing graphical boundaries ....

I'm not sure why you think Nvidia has been good at tessellation anyway; it was a DirectX 8 feature which Nvidia never supported in the first place.
 
LOL, I love people bashing Nvidia for not optimizing for some DX12 feature on current hardware that came out before DX12, all while bashing a developer for overusing a DX11 feature that's been around for five years and that AMD cards still can't handle...

You missed that Nvidia can't handle it either, at least on Kepler. Sorry, I don't share your cynicism; I prefer the rational approach. When a dev castrates itself by pandering to the minority, it invites harsh questions about its intent in doing so. And yes, Maxwell is still a minority in the grand picture of what users own. I still stand by my claim that this is some back-alley dealing with Nvidia to promote their Maxwell cards and force sales. Just like TW3, it will get patched after a month, and the bench results will look like every other game, with the same parity we usually see from a decently optimized game.
 
Try my solution if you are into PC gaming: upgrade video cards often, and always buy Nvidia.
Thank God I don't have problems with my games.
 
In the end, tessellation is a stick rather than a useful, productive development tool; async has been a somewhat more productive feature.

In the end, async will not cause stupid stuff like 200 tessellation points in a cape in the name of improving 3D graphics or pushing graphical boundaries ....

I'm not sure why you think Nvidia has been good at tessellation anyway; it was a DirectX 8 feature which Nvidia never supported in the first place.

^^Just proved my point about AMD fans.
 
Try my solution if you are into PC gaming: upgrade video cards often, and always buy Nvidia.
Thank God I don't have problems with my games.

I have no issues playing my games, new and old, so your strategy isn't necessary or sound. Not everyone (and I mean the majority of users) can afford to upgrade often. Honestly, it isn't necessary.
 
Good for you. The upgrade part applies to maxing out details in games at release.
The no-AMD part applies to playing games at release without issues.

Of course, neither is necessary. If you are happy with your games, that is what matters. For me, a second 980 Ti wasn't necessary, so I sold it off and am still enjoying games all the same, albeit at 1440p/1620p rather than 2160p.
 
^^Just proved my point about AMD fans.

You mean that parallel compute makes you an AMD fan? Or that having lots of tessellation points that do nothing for game design or gameplay does?

The only thing it proves is that crippling competitors' cards in software is bad; if AMD stooped that low, I wouldn't mind sending AMD a message either.

You ramble on about things that you know nothing about ...
 
You mean that parallel compute makes you an AMD fan? Or that having lots of tessellation points that do nothing for game design or gameplay does?

The only thing it proves is that crippling competitors' cards in software is bad; if AMD stooped that low, I wouldn't mind sending AMD a message either.

You ramble on about things that you know nothing about ...

The point is that AMD fanboys forget to be reasonable and just become zombie trolls.. you are the type of fanboy that is a cancer on the gaming market, just trash-talking in forums without any reason, grumbling because you can't properly use the competitor's features :D

I ramble on about things that I know nothing about ...

Fixed for you. =)
 
Actually, he may be a little extreme, but he is correct. You tout tessellation, but would you still hold your stance if async compute were used to the same degree tessellation is?
 
Actually, he may be a little extreme, but he is correct. You tout tessellation, but would you still hold your stance if async compute were used to the same degree tessellation is?

Yes, of course, if it were really able to push in-game graphics features. Can you play a game without tessellation and/or anisotropic filtering (as every AMD benchmark used to do to show competitive numbers against Nvidia)? Man, that just sucks.. over-tessellation or not, I don't care.

I just want to be able to enjoy more and more graphics features and better immersion whenever possible. Why did people never whine about Dragon Age: Inquisition's tessellation? It's actually pretty hard to run that game with ultra tessellation, but because it's an AMD-sponsored game, nobody cares? Why did AMD people never cry about TressFX being DirectCompute-based, which plays to the strongest point of AMD GPUs, yet now Nvidia is bad because they base their graphics-pushing features on tessellation, which is their strongest point? That kind of hypocritical bullshit is really tiresome..
 
Yes, of course, if it were really able to push in-game graphics features. Can you play a game without tessellation and/or anisotropic filtering (as every AMD benchmark used to do to show competitive numbers against Nvidia)? Man, that just sucks.. over-tessellation or not, I don't care.

I just want to be able to enjoy more and more graphics features and better immersion whenever possible. Why did people never whine about Dragon Age: Inquisition's tessellation? It's actually pretty hard to run that game with ultra tessellation, but because it's an AMD-sponsored game, nobody cares? Why did AMD people never cry about TressFX being DirectCompute-based, which plays to the strongest point of AMD GPUs, yet now Nvidia is bad because they base their graphics-pushing features on tessellation, which is their strongest point? That kind of hypocritical bullshit is really tiresome..

Dragon Age: Inquisition runs like butter on my system. I would suspect that it runs well on Intel / Nvidia hardware also; that's most likely why you NEVER see people complaining about it. DirectCompute is Microsoft's technology. Again, no complaints.
 
Yes, the Frostbite engine is leagues ahead of all other game engines in terms of performance and optimization, IMHO.

Frankly, it's amazing that games built on it can look so gorgeous, scale so well, and just plain run so well that it can't really be used for benchmarking.
 
Yes, of course, if it were really able to push in-game graphics features. Can you play a game without tessellation and/or anisotropic filtering (as every AMD benchmark used to do to show competitive numbers against Nvidia)? Man, that just sucks.. over-tessellation or not, I don't care.

I just want to be able to enjoy more and more graphics features and better immersion whenever possible. Why did people never whine about Dragon Age: Inquisition's tessellation? It's actually pretty hard to run that game with ultra tessellation, but because it's an AMD-sponsored game, nobody cares? Why did AMD people never cry about TressFX being DirectCompute-based, which plays to the strongest point of AMD GPUs, yet now Nvidia is bad because they base their graphics-pushing features on tessellation, which is their strongest point? That kind of hypocritical bullshit is really tiresome..

And claiming excessive tessellation isn't hypocritical? It may be fine and good to you because you have Nvidia, but that doesn't make it right for the others who own Nvidia. Just remember this moment if or when async manages to see heavy use in a game and a 290X is beating a 980 Ti. I am sure you won't be crying to Nvidia for them to immediately reconstruct their architecture, but rather griping at the developer.

Take a step back and look at this rationally. No one believes tessellation should not be used. Rather, the issue is this dramatic move to releasing games at 64x tessellation when fidelity is nearly equal at 16x and performance is far better for ALL. I have AMD, and I want both you and me to be able to play games without any concern about optimizations or possible dealings from either side to gain an advantage at the other's cost.
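
To put rough numbers on the 64x-versus-16x point: under uniform integer partitioning, a triangle patch tessellated at edge factor f yields on the order of f² sub-triangles, so 64x emits roughly 16 times the geometry of 16x per patch. A back-of-the-envelope sketch (illustrative only; real tessellators vary by partitioning mode):

```
// Rough cost model: sub-triangles per patch grow with the square of the
// tessellation factor, which is why capping 64x down to 16x helps so much.
#include <cstdio>

int main() {
    const int factors[] = {16, 64};
    for (int f : factors)
        printf("tess factor %2d -> ~%5d triangles per patch\n", f, f * f);
    printf("64x vs 16x: ~%dx more triangles for near-identical fidelity\n",
           (64 * 64) / (16 * 16));
    return 0;
}
```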
 