Assassin's Creed Odyssey Will Only Work on CPUs That Support AVX

There isn't, because an i7 930 is miles ahead of the shitty Jaguar inside the PS4/Xbox One. Miles and miles ahead.

Irrelevant, it's missing a feature and is thus out of parity with what they want as a baseline.

It's technical debt on the codebase, and I wouldn't doubt they have a solid chunk of hand-optimized, vectorized AVX code that needs a fallback - certain PC processors are the exception here. It doesn't matter that they're faster; they simply lack the feature, and supporting them means more code to maintain.
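
For what it's worth, the usual way to carry that debt is runtime dispatch: detect AVX once and route the hot function to either the vectorized path or a scalar fallback. A minimal sketch of the pattern (GCC/Clang builtins and a made-up scale function, nothing from the actual game):

```cpp
#include <immintrin.h>
#include <cstddef>

// Hand-vectorized AVX path. The target attribute lets this one function use
// AVX intrinsics without enabling AVX for the whole translation unit.
__attribute__((target("avx")))
static void scale_avx(float* v, float s, std::size_t n) {
    const __m256 vs = _mm256_set1_ps(s);
    std::size_t i = 0;
    for (; i + 8 <= n; i += 8)
        _mm256_storeu_ps(v + i, _mm256_mul_ps(_mm256_loadu_ps(v + i), vs));
    for (; i < n; ++i) v[i] *= s;   // scalar tail
}

// The fallback this thread is talking about: same result, no AVX required.
static void scale_scalar(float* v, float s, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i) v[i] *= s;
}

// Detect AVX on first use and cache the choice (GCC/Clang builtin; on MSVC
// you would use __cpuid plus an XGETBV check instead).
static void scale(float* v, float s, std::size_t n) {
    static const bool avx = __builtin_cpu_supports("avx");
    (avx ? scale_avx : scale_scalar)(v, s, n);
}
```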
 
Irrelevant, it's missing a feature and is thus out of parity with what they want as a baseline.

In all honesty, I'd doubt AVX is used out of "laziness"; more likely it's an attempt to squeeze every last drop of power out of the ancient AMD-based consoles....

Path of Exile also recently implemented AVX for something or other, but it's not a requirement.

Am I disappointed I can't play? Certainly.

Am I surprised? .... Not. Really....
 
https://forums.ubi.com/showthread.php/1941337-Update-on-AVX-Support

"Hey everyone,

We’ve been actively monitoring player reports about PC crashes and want to take a moment to provide you with an update on this.

First and foremost, we want to thank everyone who got in touch with us via the various channels to provide additional information. Thanks to your support, we were able to identify the common cause of a few instances of reported crashes: the impacted CPUs didn’t support AVX – more details below.

We heard your feedback and are now actively working on a solution to extend the supported CPUs for our players to be able to run Assassin’s Creed Odyssey without AVX support, within the minimum requirements.
As this is an ongoing process, we are not able to provide an ETA just yet, but rest assured this is a high priority for us and we will keep you updated on the progress."



.... Gonna pop in the X3470 later this week. (prolly just OC it to 3.8-3.9ish)

Andddd my birthday is in 10 days, so hopefully we get an update soon :)
 
I'm still on X79 (Sandy Bridge-E), so I guess the time of doom and gloom is nearing for this build as well.

To be honest though, it seems like a lot of people here fail to realize you can upgrade GPUs separately from CPUs.

My 4.6GHz 3930K doesn't feel long in the tooth, especially now that I've finally overclocked my RAM and 980 Ti. What the fuck do I care though, I wasn't gonna play the new AC anyway. I've got every game back to the first one in my backlog already.

They made this decision easy.
 
We heard your feedback and are now actively working on a solution to extend the supported CPUs for our players to be able to run Assassin’s Creed Odyssey without AVX support, within the minimum requirements.
As this is an ongoing process, we are not able to provide an ETA just yet, but rest assured this is a high priority for us and we will keep you updated on the progress."

This always happens... I think Monster Hunter World and a few other AAA releases over the past few years launched requiring instruction sets that older CPUs lack, and were patched almost immediately.
 
This always happens... I think Monster Hunter World and a few other AAA releases over the past few years launched requiring instruction sets that older CPUs lack, and were patched almost immediately.

Reminds me of a scene from Frasier, where Martin asks Roz why there aren't more radio shows aimed at his generation:

Roz: Because your generation doesn't spend all its money on fast food and beer.
Martin: Yes I do!
 
So, early in development, someone switched on the AVX compile flag and now they can't turn it off without it blowing up?
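
Quite possibly. With MSVC's /arch:AVX or GCC/Clang's -mavx applied globally, the compiler is free to emit VEX-encoded instructions anywhere in the binary, so a pre-AVX CPU can fault the moment the game launches, and backing the flag out late means re-testing everything built on top of it. A sketch of the less invasive alternative, assuming GCC/Clang function multiversioning (dot is just a made-up example function):

```cpp
#include <cstddef>

// Sketch only: instead of a global -mavx / /arch:AVX build flag (which can
// scatter AVX encodings through the entire binary), GCC and Clang can build
// several clones of one hot function and pick the right one at load time
// based on what the running CPU supports.
__attribute__((target_clones("avx2", "avx", "default")))
float dot(const float* a, const float* b, std::size_t n) {
    float acc = 0.0f;
    for (std::size_t i = 0; i < n; ++i)
        acc += a[i] * b[i];   // the compiler auto-vectorizes this per clone
    return acc;
}
```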
 
Old-school Nehalem i7 systems are still pretty capable, yes, but I can tell you there was a noticeable improvement in frame times and minimum frame rates when I went from a 4.3GHz i7 930 to a 4.4GHz i7 4770K back in 2014. Not just in benchmarks; it was easily felt in the gameplay too. And that was 4 years ago.

So while I agree that this is likely an arbitrary decision by the developer, you are really selling some of the processor advancements made in the last decade short. It's worth the upgrade to a newer platform these days.
 
I'm sure the missing instructions could be emulated with a patch... but why bother? If you're still gaming on a Pentium 4 or Athlon II, it "might" be about time to go spend $200 and upgrade your rig.
 
Who cares? The unsupported CPUs are 10 years old. If you're still using a CPU that old, then more than likely the rest of the system isn't anywhere close to being able to run this game.

That's fantastically inaccurate; Bloomfield is still quite capable when it comes to PC gaming. The only thing holding Bloomfield back is its lack of PCIe 3.0, which takes a toll in certain games at around GTX 1080 levels of horsepower, but never to the point of being tangibly noticeable.

https://www.pcgamer.com/bloomfield-takes-on-skylake/

[Benchmark charts from the PC Gamer Bloomfield-vs-Skylake comparison]
 

That's fantastically inaccurate; Bloomfield is still quite capable when it comes to PC gaming. The only thing holding Bloomfield back is its lack of PCIe 3.0, which takes a toll in certain games at around GTX 1080 levels of horsepower, but never to the point of being tangibly noticeable.

And there are a ton of good games available that are slightly older and do not use the AVX instruction set.

I should also note that there are plenty of good games seven years old or older which are still quite playable and enjoyable today, even if they require a little tweaking to get running. Try saying that about older consoles and their games. In some cases, thanks to improved graphics cards, higher-resolution monitors with higher refresh rates, and faster CPUs, I would suggest some of them play better now than they did at release. There are exceptions, of course (cough Stalin vs Martians cough).
 
Well, while it is faster, it is an older first-generation Core i7 quad-core, and at 2.8GHz it works out to only around ~7% faster overall, in a fully multi-threaded load, than the Jaguar CPUs in the PS4 Pro and XBoneX consoles.
Not bad for a CPU from 2008, though!

Mmmmmmmmm nope.

Jaguar is a very, very weak CPU core to begin with. It comes from AMD's low-power branch and doesn't hold a candle to a fully fledged i7, even one this old.

https://www.neogaf.com/threads/anan...-cpu-beats-amds-jaguar-in-performance.677101/

[Cinebench R11.5 chart from the AnandTech review: a quad-core Jaguar @ 1.5GHz scores ~1.5]


According to hwbot.org, an i7 920 @ 2.8GHz scores 4.5 points.

So... yeah. A Jaguar core is slow as fuck. And it should be, for it is a low-power, low-performance SKU.
 
I think you misunderstood a bit - you aren't taking into account that the Jaguar in the consoles is an 8-core CPU, not a quad-core like the desktop/OEM variant in the Cinebench chart you posted.
In that chart, a quad-core Jaguar @ 1.5GHz scores 1.5 - let's match that up with the 8-core Jaguar @ 2.1GHz in the PS4 Pro:

2.1 ÷ 1.5 = 1.40
(so, assuming performance scales linearly with clock speed, 2.1GHz is roughly 40% faster than 1.5GHz)

1.40 × 1.5 = 2.1
(so at 2.1GHz, a quad-core Jaguar would score roughly 2.1 on Cinebench)

Now, we take that 2.1 Cinebench score and multiply it by 2, since we need to double the core count from 4 to 8 to match the PS4 Pro's CPU:
2.1 × 2 = 4.2

So the overall score for an 8-core Jaguar @ 2.1GHz would be roughly 4.2.
Let's compare that to the quad-core i7 920 @ 2.8GHz and its score of 4.5:

4.5 ÷ 4.2 ≈ 1.07
So, roughly, the quad-core Intel i7 920 @ 2.8GHz is about 7% faster than the 8-core AMD Jaguar @ 2.1GHz in multi-threaded Cinebench.
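
If anyone wants to sanity-check that back-of-the-envelope estimate, here it is as a tiny program (the 1.5 and 4.5 Cinebench R11.5 scores are the ones quoted in this thread; linear scaling with clock speed and core count is an assumption, and an optimistic one):

```cpp
#include <cstdio>

int main() {
    const double jaguar_quad_1_5ghz = 1.5; // 4-core Jaguar @ 1.5GHz (AnandTech chart)
    const double i7_920_2_8ghz      = 4.5; // 4-core i7 920 @ 2.8GHz (hwbot)

    // Scale the Jaguar score to 2.1GHz, then double it for 8 cores.
    const double jaguar_octa_2_1ghz = jaguar_quad_1_5ghz * (2.1 / 1.5) * 2.0;

    std::printf("estimated 8-core Jaguar @ 2.1GHz: %.2f\n", jaguar_octa_2_1ghz); // 4.20
    std::printf("i7 920 advantage: %.1f%%\n",
                (i7_920_2_8ghz / jaguar_octa_2_1ghz - 1.0) * 100.0);             // ~7.1%
}
```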

Well, while it is faster, it is an older first-generation Core i7 quad-core, and at 2.8GHz it works out to only around ~7% faster overall, in a fully multi-threaded load, than the Jaguar CPUs in the PS4 Pro and XBoneX consoles.
Not bad for a CPU from 2008, though!


Hope that helps with the point I was trying to make! :)
 
So, roughly, the quad-core Intel i7 920 @ 2.8GHz is about 7% faster than the 8-core AMD Jaguar @ 2.1GHz in multi-threaded Cinebench.

Hope that helps with the point I was trying to make! :)

Cinebench is hardly gaming though.
 
It could be 32 cores and Jaguar would still suck. Weak single-threaded performance is the main reason so many games are stuck at 30fps on console.

Ryzen in consoles cannot come soon enough.
Haha, I never try to defend the Jaguar CPU unless absolutely necessary - I'm one of the biggest proponents on here (with a few others) of the view, proven first-hand, that the 8-core Jaguar in the current-gen consoles isn't enough to push them beyond 30fps in most AAA games. Though as I have stated before, if the game is 2D or doesn't put much demand on the CPU, like lower-end 3D games, the consoles can easily do 60fps at 4K resolutions - it really depends on the game.
I know what you are saying, though, and I agree. I don't believe scaling outwards with more weak cores is going to help as much as sticking with 8 threads - either 4 Ryzen-based cores with SMT, or 8 Ryzen-based cores without SMT (probably lower-clocked for the TDP and heat envelope) - but that is just a guesstimate about the new consoles right now.
 