Assassin’s Creed Origins

Denuvo cracker claims the high CPU overhead is being caused by a layer of VMProtect, as part of the Denuvo implementation.

Update:
About AC: O, a guy at [REDACTED] sent me the binaries of the game and asked me which version of Denuvo it uses, so I decided to take a look. I found it very strange to see that VMProtect layer over Denuvo, because Denuvo dropped VMProtect early this year. I thought it was some desperate move by Denuvo, but it now seems Ubisoft may have implemented it themselves, tanking your game performance by a lot, especially if they hooked Denuvo and VMP into time-critical functions of the game. So, if you have high CPU usage while playing the game, this is the reason.

It seems Ubisoft simply does not care about you nor your gaming experience.

Original comments below:
ACO is NOT using 4.8/4.9/5.0/50.0; it's actually just a minor update to what SP had... The VMP layer is also NOT coming from Denuvo but directly from Ubisoft (as they still have a valid license).
I'm just stating some facts I noticed while tracing the code.
That VMP layer on top makes it extremely difficult to trace things... The only thing I can tell is that Denuvo itself adds ~30-40% to CPU usage (!!!), which is why the game is so CPU demanding.
Yes, because of that the actual CPU usage is nuts, up to 30-40% of additional load...
 
Denuvo cracker claims the high CPU overhead is being caused by a layer of VMProtect, as part of the Denuvo implementation.

I believe it. The game generally seems to run fine, but out of nowhere I get weird frame drops, among other things.

I'm going to shelve the game for now. It runs well enough, but I'm about a quarter of the way through now and I'm hitting bugs in almost every side mission at this point. None have prevented progression, but they are pretty laughably bad with some of the dumb shit the NPCs are doing.
 
Despite "popular opinion", this is a poorly optimized game.

I realize 6C is now mainstream with the R5 1600 and the 8400/8600K, but both of those chips are <6 months old. And prior to that, any chip beyond 4C/4T was $300 minimum. You shouldn't need a brand-new, six-month-old mid-range PC or a $300+ CPU to get 60 fps.

Someone who did a $250 7600K build in Q1/Q2 (pre-Ryzen) can barely maintain 60 fps in AC:O. That chip is only 10 months old.

If this were October 2018, then maybe we could talk about multi-threaded optimization. Like I said on Reddit, if Ryzen didn't exist, this game would be getting raked over the coals for being a bad port. Affordable 4C/8T and 6C chips are far too new to the market to justify these kinds of demands. Budget and mid-range builders are getting fucked across the board here.

Scalable graphics options that let you lower settings to hit 60 fps on weaker hardware would be fine, but AC:O doesn't have that; even on the lowest settings, those chips still won't hit 60. Using extra cores to hit higher fps targets (like 144 Hz) would also be fine, but AC:O doesn't do that either.

This kind of short product lifecycle reminds me of the 1990s PC industry, and that's not a good thing. We could see this repeat in 2018 or 2019 with the 6C -> 8C transition.
 
Playing from SSD?

Yes. It has nothing to do with disk usage or anything like that. BF1, Wolf 2, GTA 5, Forza 7, etc. all run just fine. Forza 7 and BF1 in particular are pretty decent indicators of an unstable system, and they run no problem.

This is about the only recent game I can recall with these problems, and I assume it's due to this weird DRM they are running.

The only thing I haven't tried is disabling G-Sync. I know G-Sync causes issues with some games some of the time, but it's not something I should have to deal with. Nvidia's shoddy driver interaction with G-Sync doesn't help either: the application-specific profiles for disabling G-Sync do not work. G-Sync is either completely off or completely on, regardless of whatever application-specific setting you choose. So having to manually flip it on and off is not something I like doing.
 
Denuvo cracker claims the high CPU overhead is being caused by a layer of VMProtect, as part of the Denuvo implementation.

Has it ever actually, truly been the case that Denuvo totally caused fucked up performance in any of its titles?

And I mean a case where it was removed or cracked and that made a blatant difference.
 
Has it ever actually, truly been the case that Denuvo totally caused fucked up performance in any of its titles?

From what I recall, any game using that version of Denuvo tends to use more CPU than it should. So when a game is already fairly poorly optimized, it really doesn't help.

Needless to say, while I recognize my 5820K is getting old, it's still way more than enough for every other AAA title out there, and AC shouldn't be any different.
 
Has it ever actually, truly been the case that Denuvo totally caused fucked up performance in any of its titles?

And I mean a case where it was removed or cracked and that made a blatant difference.
Rime, allegedly. Although I don't think many games use Denuvo + VMP like this, at least not added by the developer themselves.
Also, the cracks are still running emulated Denuvo, so there won't be a performance difference.
 
I'm running a 5820K @ 4.5 GHz. It absolutely does not run smoothly. The benchmark may report 80+ FPS for me with my 1080 Ti, but there are weird frame drops and stuttering.

It generally runs pretty smooth, but something in some areas causes frame drops.

What does it say your processors are at?

For me, Core 0 is always at 100%, and the rest barely go above 30%. CPU 1 barely goes above 10%. There are definitely issues with the performance of the game though.
 
Despite "popular opinion", this is a poorly optimized game.

I realize 6C is now mainstream with the R5 1600 and the 8400/8600K, but both of those chips are <6 months old. And prior to that, any chip beyond 4C/4T was $300 minimum. You shouldn't need a brand-new, six-month-old mid-range PC or a $300+ CPU to get 60 fps.

Someone who did a $250 7600K build in Q1/Q2 (pre-Ryzen) can barely maintain 60 fps in AC:O. That chip is only 10 months old.

If this were October 2018, then maybe we could talk about multi-threaded optimization. Like I said on Reddit, if Ryzen didn't exist, this game would be getting raked over the coals for being a bad port. Affordable 4C/8T and 6C chips are far too new to the market to justify these kinds of demands. Budget and mid-range builders are getting fucked across the board here.

Scalable graphics options that let you lower settings to hit 60 fps on weaker hardware would be fine, but AC:O doesn't have that; even on the lowest settings, those chips still won't hit 60. Using extra cores to hit higher fps targets (like 144 Hz) would also be fine, but AC:O doesn't do that either.

This kind of short product lifecycle reminds me of the 1990s PC industry, and that's not a good thing. We could see this repeat in 2018 or 2019 with the 6C -> 8C transition.

If the CPU overhead is due to DRM then, yes, optimization is an issue. However, that might not be the full answer. Games are eventually going to require more and more threads. Game engines are becoming increasingly heavily threaded, which means that, in time, more games will see bigger advantages on CPUs with more cores and threads. We've had mainstream quad-core CPUs for the last decade; the stagnation of core count on the consumer side is almost entirely Intel's fault for continuing to release dual- and quad-core processors while putting the higher-thread-count chips at enthusiast prices. With the consoles relying so much on higher core-count CPUs, it is inevitable that this will affect PC games.
 
If the CPU overhead is due to DRM then, yes, optimization is an issue. However, that might not be the full answer. Games are eventually going to require more and more threads. Game engines are becoming increasingly heavily threaded, which means that, in time, more games will see bigger advantages on CPUs with more cores and threads. We've had mainstream quad-core CPUs for the last decade; the stagnation of core count on the consumer side is almost entirely Intel's fault for continuing to release dual- and quad-core processors while putting the higher-thread-count chips at enthusiast prices. With the consoles relying so much on higher core-count CPUs, it is inevitable that this will affect PC games.

I'm not debating that.

A 5820k is 6 core w/ 12 logical.

For those interested my usage, even on Core 0 isn't going beyond 70-80%.
 
I'm not debating that.

A 5820k is 6 core w/ 12 logical.

For those interested my usage, even on Core 0 isn't going beyond 70-80%.

I'm seeing the same CPU performance as you are with the same model. I'm hoping that an eventual patch will equalize the workload across all cores.
 
I'm seeing the same CPU performance as you are with the same model. I'm hoping that an eventual patch will equalize the workload across all cores.

If the high core 0 usage is due to VMP, I doubt they can release a patch to fix it short of removing VMP entirely. If VMP only runs on a single core, that would explain the abnormally high core 0 load.
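If the protection code really does execute serially on one thread, extra cores can't absorb it; Amdahl's law puts a hard cap on scaling. A small illustrative sketch (the fractions are hypothetical, not measured from the game):

```python
def amdahl_speedup(parallel_frac: float, cores: int) -> float:
    """Amdahl's law: a serial portion of the workload (e.g. a hypothetical
    single-threaded DRM/VMP layer) caps total speedup no matter the core count."""
    return 1.0 / ((1.0 - parallel_frac) + parallel_frac / cores)

# If 30% of each frame were serial work pinned to core 0, six cores would
# only buy a 2.4x speedup over a single core:
print(round(amdahl_speedup(0.7, 6), 1))  # 2.4
```

That is why, under this assumption, a patch that "spreads the load" could not help much: the serial chunk would have to shrink or disappear.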
 
If the CPU overhead is due to DRM then, yes, optimization is an issue. However, that might not be the full answer. Games are eventually going to require more and more threads. Game engines are becoming increasingly heavily threaded, which means that, in time, more games will see bigger advantages on CPUs with more cores and threads. We've had mainstream quad-core CPUs for the last decade; the stagnation of core count on the consumer side is almost entirely Intel's fault for continuing to release dual- and quad-core processors while putting the higher-thread-count chips at enthusiast prices. With the consoles relying so much on higher core-count CPUs, it is inevitable that this will affect PC games.
I haven't played the game yet. Is there anything revolutionary going on in the game world to justify that performance?
Keep in mind, CPU-bound frame rates in Ubi's recent games like Watch Dogs 2 and Syndicate are roughly twice as high as AC:O's. Syndicate and Unity already offered incredibly dense and detailed open worlds; you can speed down the streets of San Francisco or London in a sports car or carriage and not drop a single frame. Perhaps they have other elements running off the CPU, like shadows, but who knows.

Those are my concerns, anyway. If it's being caused by VMP then that would explain a lot. Otherwise it just looks like needlessly lost performance, aka bad optimization. And just because a game uses 8+ threads doesn't mean it's well optimized, especially if devs are offloading their sloppy work onto excess cores.
 
I haven't played the game yet. Is there anything revolutionary going on in the game world to justify that performance?
Keep in mind, CPU-bound frame rates in Ubi's recent games like Watch Dogs 2 and Syndicate are roughly twice as high as AC:O's. Syndicate and Unity already offered incredibly dense and detailed open worlds; you can speed down the streets of San Francisco or London in a sports car or carriage and not drop a single frame. Perhaps they have other elements running off the CPU, like shadows, but who knows.

Those are my concerns, anyway. If it's being caused by VMP then that would explain a lot. Otherwise it just looks like needlessly lost performance, aka bad optimization. And just because a game uses 8+ threads doesn't mean it's well optimized, especially if devs are offloading their sloppy work onto excess cores.

Again, I have no idea what is actually causing the issue. All I know is that the game generally runs pretty smooth, but the cinematic cut scenes in particular stutter quite frequently. During normal gameplay it's mostly fine, maybe dropping a frame every once in a while. I'm getting around 90 FPS average at 1440p on Ultra settings; it's just the random frame drops that make it a sub-optimal experience.

For reference, Syndicate and all the other games I've mentioned run just fine. There is nothing revolutionary about this game from a technical standpoint. While it is a beautiful game, that's mostly due to solid shader and texture work rather than pushing technical boundaries.
 
My ancient 4690K is at 100% across all cores most of the time. No other game this year has done that.
 
Sooooo, I seem to have lost my mount. I whistle and it doesn't come anymore; I even bought a new one, and that one doesn't show up either. It's really frustrating having to run everywhere. Anyone know a solution to this?
 
Sooooo, I seem to have lost my mount. I whistle and it doesn't come anymore; I even bought a new one, and that one doesn't show up either. It's really frustrating having to run everywhere. Anyone know a solution to this?

No. But you can walk up to any riderless horse/camel and get on it.

I think I have called my mount twice in 8 hours.
 
No. But you can walk up to any riderless horse/camel and get on it.

I think I have called my mount twice in 8 hours.
See, I don't usually do the whole mount thing either, but it worked in the beginning and now it doesn't, and that bugs the hell outta me. I want my stupid mount back.
 
See, I don't usually do the whole mount thing either, but it worked in the beginning and now it doesn't, and that bugs the hell outta me. I want my stupid mount back.

Yeah, your issue is kind of odd. I did notice that after leaving Siwa my mount no longer showed in my inventory, but it would still come when called. However, once I learned you can just jump on any horse/camel or hijack another rider's horse, I stopped bothering.
 
OK, so I played the game for a bit and tried to trace what is happening, and here it is: proof that the game is calling the VMProtect section (.vmp0) at run time non-stop. God only knows how deep it goes.

Proof: https://image.prntscr.com/image/_6qmeqq0RBCMIAtGK8VnRw.png
While playing, I put memory breakpoints on both VMProtect sections in the exe to see whether they are called during gameplay. Once the breakpoint was enabled, I immediately landed in .vmp0, called from the game's own code. That means it's invoked every time this particular piece of game code executes, and since that code is responsible for player movement, it's called non-stop.
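The tracing workflow described above starts from knowing where the VMProtect sections live in the executable. As a hedged illustration (not the cracker's actual tooling), here is a minimal sketch of walking PE section headers to find `.vmp0`/`.vmp1`. The `fake_pe` helper builds a synthetic headers-only buffer purely for demonstration; a real analysis would read the game's actual exe instead.

```python
import struct

def pe_sections(data: bytes):
    """Return (name, rva, virtual_size) for each section header in a PE image."""
    pe_off = struct.unpack_from("<I", data, 0x3C)[0]       # e_lfanew -> "PE\0\0"
    assert data[pe_off:pe_off + 4] == b"PE\0\0"
    num_sections = struct.unpack_from("<H", data, pe_off + 6)[0]
    opt_size = struct.unpack_from("<H", data, pe_off + 20)[0]
    sec_off = pe_off + 24 + opt_size                       # section table start
    out = []
    for i in range(num_sections):
        off = sec_off + i * 40                             # each header is 40 bytes
        name = data[off:off + 8].rstrip(b"\0").decode()
        vsize, rva = struct.unpack_from("<II", data, off + 8)
        out.append((name, rva, vsize))
    return out

def fake_pe(names):
    """Build a minimal, headers-only 'PE' buffer for demonstration purposes."""
    hdr = bytearray(0x40)
    struct.pack_into("<I", hdr, 0x3C, 0x40)                # e_lfanew
    coff = bytearray(b"PE\0\0") + bytearray(20)
    struct.pack_into("<H", coff, 6, len(names))            # NumberOfSections
    secs = bytearray()
    for i, n in enumerate(names):
        s = bytearray(40)
        s[:len(n)] = n.encode()
        struct.pack_into("<II", s, 8, 0x1000, 0x1000 * (i + 1))  # vsize, rva
        secs += s
    return bytes(hdr + coff + secs)

# In practice you would read the real executable instead of this synthetic buffer.
data = fake_pe([".text", ".vmp0", ".vmp1"])
vmp = [s for s in pe_sections(data) if s[0].startswith(".vmp")]
print(vmp)  # the RVAs where memory breakpoints would be placed
```

Once the virtual addresses of those sections are known, a debugger's memory breakpoints on them will fire whenever the protected code is entered, which is exactly the signal the post above describes.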
 
Interesting. This cannot be accidental.

Of course it isn't.

My guess is that Ubi will likely remove the VMProtect layer in the first 1.1 patch, which we'll probably receive at the end of this week or next week. They'll claim they 'optimized' the game when in fact all they did was remove this protection. They'll have achieved their goal of holding off pirates for about a week to push more people into purchasing the game, and the VMProtect performance issues will mostly be swept under the rug as week-1 'optimization'.
 
Hearing that this uses up to 8 cores actually makes me want to try it out... I've had a 6-core for years that barely any games take advantage of.
 
I could see them removing it if it's creating enough of a fuss due to performance reasons. I agree that they'd typically never remove it otherwise.
 
I really can't stand how they've always stuck in the cheesy shit with the Animus and the present-day scenes. They have enough with these titles to make a good game using anything they want from history. I just gained access to Aya, and of course they have you go through some absolutely worthless bullshit in the present day.
 
From what I recall, any game using that version of Denuvo tends to use more CPU than it should. So when a game is already fairly poorly optimized, it really doesn't help.

Citation needed. It has never been proven with a shred of evidence.

Rime, allegedly. Although I don't think many games use Denuvo + VMP like this, at least not added by the developer themselves.
Also, the cracks are still running emulated Denuvo, so there won't be a performance difference.

RIME was a buggy mess in the first place. And it was still a buggy mess after the developer removed Denuvo.
 
Denuvo cracker claims the high CPU overhead is being caused by a layer of VMProtect, as part of the Denuvo implementation.

That's not actually what he claimed. All he stated was that he noticed calls to VMP happening during gameplay. Well, no shit. How much CPU overhead that actually means, if any at all, he has no way of quantifying, and neither does anyone but the developer, not without a Denuvo/VMP-free copy. Salty pirates are going to remain salty that they can't play AC:O, and will kick and scream and spread all the FUD they can.
 
That's not actually what he claimed. All he stated was that he noticed calls to VMP happening during gameplay. Well, no shit. How much CPU overhead that actually means, if any at all, he has no way of quantifying, and neither does anyone but the developer, not without a Denuvo/VMP-free copy. Salty pirates are going to remain salty that they can't play AC:O, and will kick and scream and spread all the FUD they can.

I own the game, so I'd love to know if the Denuvo/VMP concoction is actually bogging down performance. Regardless of what's being claimed, I'd still like to know whether a future patch would actually increase the framerate, with or without Denuvo/VMP. If not, is the game engine utilizing all cores to good purpose?
 
I own the game, so I'd love to know if the Denuvo/VMP concoction is actually bogging down performance. Regardless of what's being claimed, I'd still like to know whether a future patch would actually increase the framerate, with or without Denuvo/VMP. If not, is the game engine utilizing all cores to good purpose?

I'm running an aging Haswell i5 and am seeing 100% CPU utilization.
 
Not really surprised. Watch Dogs 2 made me shelve my old 4670K, and I still haven't finished Syndicate lol...
 