Assassin's Creed 5: Unity

"The game (in its current state) is issuing approximately 50,000 draw calls on the DirectX 11 API. Problem is, DX11 is only equipped to handle ~10,000 peak draw calls. What happens after that is a severe bottleneck with most draw calls culled or incorrectly rendered, resulting in texture/NPCs popping all over the place"

very interesting...

This could actually explain why people with 970/980 cards see far fewer problems than everyone else.

But as a developer, I do at least have to give kudos to Ubisoft for even being able to get 50k draw calls. I just tried something similar with just triangles (as simple as you can get it), and my machine cried out in pain. Less than 18 fps with a 5960x & 980, whereas I'm running Assassin's Creed at 60 fps in 1080p

[attached image: vdHqKRC.png]
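As a quick sanity check on the numbers in that quote (taking both the 50,000 per-frame figure and the "~10,000 peak" claim at face value; neither is an official DX11 spec), a minimal sketch of the per-second math:

```python
# Back-of-the-envelope draw-call budget, using the figures quoted above.
# Both numbers come from the quote, not from any official DX11 documentation.
TARGET_FPS = 60

def draws_per_second(calls_per_frame, fps=TARGET_FPS):
    """Convert a per-frame draw-call count into an aggregate per-second rate."""
    return calls_per_frame * fps

unity_calls = 50_000      # what Unity is said to issue per frame
claimed_budget = 10_000   # the "~10,000 peak" figure from the quote

print(draws_per_second(unity_calls))   # 3000000 draws per second at 60 fps
print(unity_calls / claimed_budget)    # 5.0x over the claimed budget
```

So if both figures hold, the game is asking the API for roughly five times the per-frame budget the quote claims DX11 can sustain.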
 
Guys, as rough as performance has been reported, the silver lining of this debacle is the images.

It's probably already been posted, but this will go down in the annals of gaming history as the single most hideous texture cluster f*ck the world has ever seen.

[five attached screenshots of the texture glitches]



LONG after the franchise is dead and forgotten, these images will endure! The Ubisoft team has achieved immortality.

But let's not forget all the help they had as a GameWorks title. Thank you, Nvidia; the way it's meant to be played!
 
"The game (in its current state) is issuing approximately 50,000 draw calls on the DirectX 11 API. Problem is, DX11 is only equipped to handle ~10,000 peak draw calls. What happens after that is a severe bottleneck with most draw calls culled or incorrectly rendered, resulting in texture/NPCs popping all over the place"

very interesting...

This series has always been one of the first that came to my mind when the advantages of DX12 and Mantle first came up; that, and Total War.
 
"The game (in its current state) is issuing approximately 50,000 draw calls on the DirectX 11 API. Problem is, DX11 is only equipped to handle ~10,000 peak draw calls. What happens after that is a severe bottleneck with most draw calls culled or incorrectly rendered, resulting in texture/NPCs popping all over the place"

very interesting...

Then it sounds like Ubisoft should have implemented Mantle in addition to the GameWorks stuff to see how that would affect the results, since that is exactly what Mantle is supposed to address.
 
Then it sounds like Ubisoft should have implemented Mantle in addition to the GameWorks stuff to see how that would affect the results, since that is exactly what Mantle is supposed to address.

Since Ubi signed an exclusive company-wide deal with Nvidia, that isn't going to happen any time soon; hopefully DX12 will save us all ;)
 
So they designed the game such that it issues way more draw calls than what is feasible on the only API they support?

In addition they designed it so that apparently the CPU load scales quickly with resolution (the baked GI stuff) so that even when they [should] have GPU available (see PS4) they can't raise the resolution. Or does it run like shit on the PS4 because they're simply designed around DX11 everywhere?

It's like they applied their 'whatever, just buy more hardware' ideology from the PC to the consoles, where it's literally impossible.
 
This could actually explain why people with 970/980 cards see far fewer problems than everyone else.

But as a developer, I do at least have to give kudos to Ubisoft for even being able to get 50k draw calls. I just tried something similar with just triangles (as simple as you can get it), and my machine cried out in pain. Less than 18 fps with a 5960x & 980, whereas I'm running Assassin's Creed at 60 fps in 1080p

[attached image: vdHqKRC.png]

The low CPU usage in that picture suggests something else is the matter, but whatever. Maybe if you don't submit to the shackles of Microsoft, you can do better?

https://www.youtube.com/watch?v=DxTrbPhGFt0

"The application achieves a rate of roughly 3.5 million independent draws per-second and is limited by the ability of the tested hardware (an AMD Radeon HD 7970) to process vertices. By making each draw much smaller, I’ve seen rates of consumption of 20 to 40 million draws per-second, depending on the underlying hardware. It seems that we’re not that far off from being able to push through one million draws per frame at a steady 60Hz."

http://www.openglsuperbible.com/2013/10/16/the-road-to-one-million-draws/
 
If you're lucky, FC4 won't suffer from any of the same issues. I know it's not the same development team per se, but both are from the same publisher. Who knows.

The PC version of Far Cry 4 is being handled by Ubi Kiev, the same team responsible for the highly polished PC versions of Ass Creed 3, Ass Creed 4 and Ass Creed Unity.
 
Finally got FRAPS installed, and I'm getting about 25 FPS at 2560x1080 or 3440x1440. I can't drop the resolution lower than that (even if I run it in a window), but surprisingly the game is still pretty playable, without any of the graphics glitches that AMD cards are supposedly having showing up yet.
 
The low CPU usage in that picture suggests something else is the matter, but whatever. Maybe if you don't submit to the shackles of Microsoft, you can do better?

https://www.youtube.com/watch?v=DxTrbPhGFt0

"The application achieves a rate of roughly 3.5 million independent draws per-second and is limited by the ability of the tested hardware (an AMD Radeon HD 7970) to process vertices. By making each draw much smaller, I’ve seen rates of consumption of 20 to 40 million draws per-second, depending on the underlying hardware. It seems that we’re not that far off from being able to push through one million draws per frame at a steady 60Hz."

http://www.openglsuperbible.com/2013/10/16/the-road-to-one-million-draws/

I realize I'm clearing the background, but that only removes around 2 fps.

http://paste.ofcode.org/FYfvcBj9fNedsTMrPEzVsT

You're right, it should be CPU bound. If I remove the for loop, one of the processors does hit 95-100% usage. Right now the program is GPU bound. DX12 also removes a lot of these limitations (along with OpenGL), and you're right, there are better ways to do this, even with DX11. I just aimed for simplicity as they stated 50,000 draw calls.

*EDIT*

Have to redact this. I made a stupid error with the sheer size of the triangle. When I shrank it so that (draw calls × triangles) covered the screen, I was able to get about 1.75 million draw calls per frame to render at 60 fps. Granted, we're talking tiny triangles here with only shading, and no other effects or processes going on.
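Converting that corrected result to a per-second rate makes it comparable with the OpenGL figures quoted earlier in the thread (20 to 40 million draws per second, depending on hardware). A rough sketch using my own measured number; since these are about the cheapest draws possible, treat it as an upper bound rather than anything representative of a real game workload:

```python
# Convert the corrected benchmark result (per frame) to a per-second rate.
# 1.75M is the measured figure from the tiny-triangle test above; nothing
# here is a property of DX11 itself.
draws_per_frame = 1_750_000
fps = 60

rate = draws_per_frame * fps
print(rate)  # 105000000 -> 105 million flat-shaded draws per second
```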
 
The cutscenes didn't seem laggy? Some are reporting 17fps for certain cutscenes with GTX 980 SLI.

Nothing to that kind of egregious degree.

What I'm wondering about in this sea of chaos is whether we're seeing the all-too-common issues for folks on multi-GPU setups in the early going.

I'll run FRAPS today and let everyone know what I see.

<snip>


But let's not forget all the help they had as a GameWorks title. Thank you, Nvidia; the way it's meant to be played!

Wow. I just played through those parts last night and didn't see that nightmare. I wonder what in the world happened there?


It's a longshot but I wonder if whoever had that problem needed or needs to "verify file integrity" because that just flat out looks like missing data.




The PC version of Far Cry 4 is being handled by Ubi Kiev, the same team responsible for the highly polished PC versions of Ass Creed 3, Ass Creed 4 and Ass Creed Unity.

This is good news.
 
You are running an ATi card. What a joke.

/Ubi and nVidia

Oh, give me a break. That's even worse than if the game were purposefully crippled for hardware that didn't pay to play. You've been here almost 12 years; stop acting like a 12-year-old in a "which console is better" thread.
 
I wasn't aware people still buy Nvidia products, what with their anti-consumer Nazi-esque business practices designed around ruining the gaming experience for anyone who isn't willing to buy their overpriced rehashed proprietary garbage.
Some of us want to support companies who are interested in moving the industry forwards, not backwards. But I understand not everyone has the same principles. Gotta have that PhysX! /rolleyes.
 
Er, the consoles are both AMD based from top to bottom.

No, the problem is likely that AMD have yet to properly implement deferred contexts / multithreaded rendering into their Windows driver. The new Call of Duty, Dead Rising 3, and Watch Dogs all exhibit that same behavior on high-end AMD GPUs when coupled with low-end CPUs. It's been a well known issue for years now; it just didn't matter because games didn't push that many draw calls before.
 
Er, the consoles are both AMD based from top to bottom.

No, the problem is likely that AMD have yet to properly implement deferred contexts / multithreaded rendering into their Windows driver. The new Call of Duty, Dead Rising 3, and Watch Dogs all exhibit that same behavior on high-end AMD GPUs when coupled with low-end CPUs. It's been a well known issue for years now; it just didn't matter because games didn't push that many draw calls before.
But NVIDIA has and they still exhibit similar problems, though not to the extent AMD does. I really think this just boils down to a lack of QA by Ubisoft, and they're trying to obfuscate the truth. It's hard to release a game like AC in 9 months, so they just need to stop with the annual release schedule.
 
But NVIDIA has and they still exhibit similar problems, though not to the extent AMD does. I really think this just boils down to a lack of QA by Ubisoft, and they're trying to obfuscate the truth. It's hard to release a game like AC in 9 months, so they just need to stop with the annual release schedule.

They use different dev teams and stagger them.

It's not one team that's pumping out new AC games in a single year of development time.

AC Unity had four years of development, not one.
 
So this is the first time since I built my PC and have it overclocked since day 1 that it crashed with a CPU over temperature error. lol.

Even by putting this game on SSD it crashes and has constant hitching. I have tried lowering settings but they don't do anything.

Anyone know of SLI bits that actually work and might be worth a look? I think I will just put it aside until the performance issues are fixed.

Game has the potential to be great, but the bugs, LOD transitions, cutscene issues and performance are just making it not worthwhile to put time into it right now.
 
I had some kind of update for the game come over Steam just now. Not sure what it was. Didn't look big. 332.6 MB.
 
But NVIDIA has and they still exhibit similar problems, though not to the extent AMD does. I really think this just boils down to a lack of QA by Ubisoft, and they're trying to obfuscate the truth. It's hard to release a game like AC in 9 months, so they just need to stop with the annual release schedule.

Depends how you define 'similar'. Watch_Dogs sees ~40% better performance on low-end CPUs when coupled with a Geforce over a Radeon. Dead Rising 3 (another draw call heavy game) more than doubles performance under the same scenario while CoD can see a ~60% improvement.

These aren't small numbers. Most sites just tend to benchmark at high resolutions using the fastest CPUs which masks the performance differential.
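A toy model (my own simplification, not anything taken from those benchmarks) of why that masking happens: frame time is roughly bounded by the slower of the CPU and GPU sides, so once the GPU is the bottleneck at high resolution, driver submission overhead simply vanishes from the results.

```python
# Toy model: frame time is bounded by whichever side (CPU or GPU) is slower,
# assuming CPU and GPU work overlap fully. All millisecond figures below are
# hypothetical, chosen only to illustrate the masking effect.
def fps(cpu_ms, gpu_ms):
    """Approximate framerate given per-frame CPU and GPU times in ms."""
    return 1000.0 / max(cpu_ms, gpu_ms)

fast_driver_cpu = 8.0    # hypothetical low submission overhead
slow_driver_cpu = 16.0   # hypothetical driver with 2x the overhead

print(fps(fast_driver_cpu, gpu_ms=25.0))  # 40.0 fps under heavy GPU load
print(fps(slow_driver_cpu, gpu_ms=25.0))  # 40.0 fps -> difference masked
print(fps(fast_driver_cpu, gpu_ms=5.0))   # 125.0 fps at low GPU load
print(fps(slow_driver_cpu, gpu_ms=5.0))   # 62.5 fps -> 2x gap exposed
```

Same drivers, same CPUs; only the GPU load changed, and the apparent gap went from 0% to 2x.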
 
Looks like a new patch fixes the crashes in the pantheon area. Will try it when I'm home from work
 
Notes for Patch 2


  • Fixed an issue where under some circumstances, Arno can fall through ground when free running on various props or when dropping down on to an NPC.
  • Fixed an issue where it is possible for Arno to get stuck in a hay cart next to the Pantheon in the Bièvre district.
  • Fixed an issue that may cause a game crash when joining the co-op session of another player who has a specific single player and side missions progression.
  • Improved performance in cinematic on low HW configurations
  • Fixed various flickering issues on nVidia SLI and AMD CrossFire configurations
  • Fixed various Graphic issues
  • Fixed light artifacts in PCSS shadows
  • Fixed problem with refresh rate on a first game launch.
  • Fixed various Input issues
  • Other keyboard/mouse fixes in several menu pages.
  • Fixed issues with “Invite to game” option when it was becoming unavailable after host migration, when it was active for all recent players instead of only for friends


Note that even with this patch, players may still experience

  • Most framerate issues that have been reported since the release of the game
  • Performance issues

...
 
With latest FRAPS on:

I did see the fps range from the mid-teens to below 30 throughout cutscenes. A cutscene between Arno and Elise right after the opening credits, and a certain other event, were cruising in the teens, and this was noticeable to the eye for sure, as you would expect.

In game I'm seeing 30s and 40s. Vsync is on. I'm running with 4x MSAA. TXAA was worse. The rest of the eye candy is maxed out. As stated before this is on 1920x1200 60Hz, GTX 780 3gb, 16 GB of DDR3, Intel 4770k. Latest drivers and everything.

I have not attempted anything from outside of the game yet.


EDIT: Turned MSAA down to 2x and turned vsync off just to get a better feel for what's going on here. Very inconsistent performance. Probably no surprise, but I notice the higher I climb away from crowds and such, the higher the FPS gets.


EDIT: Dropped HBAO+ to SSAO and dropped MSAA to FXAA. Dropped shadow quality to "high". Vsync off and I can get into the 70s. Never drops below 55-ish anywhere. I have some tearing without vsync.


^^ Frame of reference.



Note that even with this patch, players may still experience

Most framerate issues that have been reported since the release of the game
Performance issues

Fo' sho'.
 
once they patch in the tessellation I'll take the plunge...until then I'll patiently wait on the sidelines
 
The game does not need tessellation, it needs more performance.

it needs both...more graphics options are never a bad thing...the game looks gorgeous as is (so I hear) so adding tessellation will only make it better...

"In Unity, GeometryWorks is being used to add tessellation to roof shingles, roof tiles, cobblestones, brick roads and paths, archways, statues, architecture, and much, much more. And because this tessellation is real geometric detail, as opposed to simulated detail from bump mapping or normal mapping, tessellated detail in Unity is accurately shadowed by NVIDIA HBAO+ and NVIDIA PCSS, significantly improving image quality."
 
This was priceless:

[attached image: pV200Sf.png]

Hahaha, I just looked that review up on Amazon, and he has since adjusted his rating from 5 stars down to 1. And he updated it to add this:

Update: Ubisoft has mentioned that they will be issuing a patch that may "fix" some of these historically accurate non-issues. For this reason, I have lowered my score to 1 star. If I wanted to play a game that was finished, I would not have purchased a Ubisoft game on day one. Boo to Ubisoft, we expect less of you.
 
Stupid iPad. I meant fraps.

I am finally able to run this game with the settings that I desired. I am on sequence 3, memory 1. The game has a staggering amount of content. I love that I can now walk into most buildings and watch people go about their daily lives. It is just phenomenal what they have done with this game.

I am actually enjoying it again. Most of the issues were pertaining to:
- Install on HDD instead of SSD
- CPU overheating (I spent an hour cleaning out my case after a year)
- Me changing graphics settings and not restarting the game

1080p
Ultra High everything, HBAO+ except Shadows at High
4xMSAA
No vsync and 144 Hz

Runs about 45-60 fps depending on where I am at.
 