PlayStation 5 Rumored to Sport Ryzen 8-Core CPU, Cost $500

Remember when the PlayStation 3 was supposed to be 8 cores, and how amazing that was at the time? They weren't even full CISC cores.
It was pretty amazing at that time!
The Cell was PowerPC-based, which would have made it RISC, at least on the PPE.

The PlayStation 3 units are all equipped with an IBM Cell CPU @ 3.2GHz with 1 PPE (a PowerPC general-purpose CPU core with SMT) and 7 SPEs (8 natively on the Cell, with 1 SPE disabled in the PS3's Cell and 1 dedicated to the OS/hypervisor), with roughly 25 GFLOPS of single-precision (FP32) throughput in the PPE and 150 GFLOPS in the 6 game-available SPEs (25 GFLOPS per SPE).
PPE = Power Processing Element
SPE = Synergistic Processing Element (FPUs)
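As a quick sanity check of those figures, here is a back-of-envelope sketch in Python. The 8-FLOPs-per-cycle rate (a 4-wide single-precision FMA each cycle) is my own assumption based on commonly cited Cell specs, not something stated in this thread:

```python
# Back-of-envelope FP32 peak for the PS3's Cell, assuming each PPE/SPE
# retires a 4-wide single-precision FMA per cycle (8 FLOPs/cycle).
CLOCK_GHZ = 3.2
FLOPS_PER_CYCLE = 8  # 4 SIMD lanes x 2 ops (multiply + add) - assumed rate

per_core_gflops = CLOCK_GHZ * FLOPS_PER_CYCLE  # ~25.6 GFLOPS per PPE/SPE
game_spes = 6  # 8 on die: 1 disabled for yield, 1 reserved for the OS

print(f"PPE:    {per_core_gflops:.1f} GFLOPS")              # ~25.6
print(f"6 SPEs: {per_core_gflops * game_spes:.1f} GFLOPS")  # ~153.6
```

That lands right on the rough 25/150 GFLOPS split quoted above.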

The Cell also allowed software technologies like NVIDIA PhysX to run on the PS3. The GPU in it (the RSX) was from the generation right before the G80, which means it did not have a unified-shader architecture and was not itself capable of running PhysX; the Cell CPU did all of that with its SPE units.
Without the Cell, that wouldn't have been possible on the PS3 in games like Batman: Arkham Asylum.

The Cell really was the middle-man, so to speak, between classic CPUs and APUs with GPGPU compute, and it was certainly a huge technological innovation at the time, especially when x86-64 CPUs were only just becoming dual-core designs.
 
The 8-core Jaguar is much less powerful than even the FX Bulldozer 8-core CPUs, even clock-for-clock, so it would be more similar to gaming on an AMD Athlon 64 (San Diego) Socket 939 "8-core" (just pretend one existed, heh) clocked between 1.6-2.3GHz.
I know what you mean, though, and I do agree.

The Jaguar was a low-power CPU meant for embedded systems and thin clients back in 2013, never mind in 2018+, so it was hardly ever a high-performance CPU!
You are being ridiculous again. The Jaguar CPU, clock for clock, is at least as fast as Bulldozer and nearly in line with Phenom II IPC. It doesn't take a whole lot of research to figure that out, but I'll post some links later tonight when I get home so you can be wrong about something else here. In the meantime, please entertain us by trying to get even 30 FPS on a Socket 939 CPU at 1.6 to 2.3GHz in one of the many games that run at 60fps on the consoles... Good luck.
 
In the meantime, please entertain us by trying to get even 30 FPS on a Socket 939 CPU at 1.6 to 2.3GHz in one of the many games that run at 60fps on the consoles... Good luck.

What drives framerates is how the game engine works, not the hardware. If you compare the Wikipedia pages for the PS4 Pro and the PS4 and their different specs, you will see that it makes no sense hammering this further into the ground, and older consoles had a good deal of 60 fps games. Battlefield was ~60 fps on the PS4...
 
You are being ridiculous again. The Jaguar CPU, clock for clock, is at least as fast as Bulldozer and nearly in line with Phenom II IPC.
Well, clock-for-clock, the Jaguar is a bit faster than 939, I will give you that.

Let's do the math with an SMP-based bench:
https://www.cpubenchmark.net/compare/AMD-GX-420CA-SOC-vs-AMD-Athlon-64-X2-3800+/2121vs1511

AMD GX-420CA 4-core @ 2.0GHz - 2362
AMD Athlon 64 X2 3800+ 2-core @ 2.0GHz - 958

Now to directly compare them core-for-core:
2362 ÷ 4 = 590.5
958 ÷ 2 = 479

So my math was a bit off; clock-for-clock, the Jaguar is about 23% faster, so I suppose that wasn't the best comparison.


You said the Jaguar is similar to Bulldozer, so let's do that comparison as well:
https://www.cpubenchmark.net/compare/AMD-GX-420CA-SOC-vs-AMD-FX-4100-Quad-Core/2121vs255

AMD GX-420CA 4-core @ 2.0GHz - 2362
AMD FX-4100 4-core @ 3.6GHz - 4070

Now let's downclock that FX-4100 to 2.0GHz (on paper) for a direct comparison:
3.6 ÷ 2.0 = 1.8
4070 ÷ 1.8 ≈ 2261

AMD GX-420CA 4-core @ 2.0GHz - 2362
AMD FX-4100 4-core @ 2.0GHz - 2261

Now core-for-core:
2362 ÷ 4 = 590.5
2261 ÷ 4 ≈ 565

This shows the Jaguar is about 4.5% faster, which means your claim was pretty damn accurate. Very nice, you were right! (y)
I will do my best to remember this and have gone back and edited the post you referenced.

Also, if my math is off, please correct me on that as well.
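For anyone who wants to check or tweak the arithmetic above, here it is as a small Python script. The scores are the PassMark numbers quoted in this post, and dividing the FX-4100's score by 1.8 is the same linear clock-scaling idealization used above, not a real downclocked measurement:

```python
# Normalize a multi-core benchmark score to one core at a target clock.
# Scores are the PassMark results quoted above; the linear clock scaling
# is an idealization, not a measured downclock.
def per_core_at_clock(score, cores, clock_ghz, target_ghz=2.0):
    return score / cores * (target_ghz / clock_ghz)

jaguar    = per_core_at_clock(2362, cores=4, clock_ghz=2.0)  # GX-420CA, ~590.5
athlon_x2 = per_core_at_clock(958,  cores=2, clock_ghz=2.0)  # X2 3800+, ~479
fx4100    = per_core_at_clock(4070, cores=4, clock_ghz=3.6)  # FX-4100,  ~565

print(f"Jaguar vs Athlon 64 X2: {jaguar / athlon_x2 - 1:+.1%}")  # ~+23%
print(f"Jaguar vs FX-4100:      {jaguar / fx4100 - 1:+.1%}")     # ~+4.5%
```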



In the meantime, please entertain us by trying to get even 30 FPS on a Socket 939 CPU at 1.6 to 2.3GHz in one of the many games that run at 60fps on the consoles... Good luck.
Well, I did say:
AMD Athlon 64 (San Diego) Socket 939 "8-core" (just pretend one existed, heh) clocked between 1.6-2.3GHz
For real-world Socket 939 CPUs playing modern games, yep, there is no way those would reach 60fps on most titles other than perhaps low-end 3D games or 2D games that aren't heavy on the CPU.

It doesn't take a whole lot of research to figure that out, but I'll post some links later tonight when I get home so you can be wrong about something else here.
At least I can admit when I'm wrong, and I just proved myself wrong above by going with what you said.
I don't tout things for ego, and I legitimately want to learn more - that's more than I can say for most individuals on this forum.
 
What about the graphics? I don’t think AMD can make anything competitive with any type of reasonable power consumption for a console.
The PS4 and Xbox 360 both used AMD/ATI graphics before. What makes you think they'd need a competitor's GPU, unless they only plan on shipping it with one title?

SPACE INVADERS
 
Here is a list of games that run at 60 fps on the One X:
https://www.windowscentral.com/xbox-one-x-enhanced-60-fps-list

It seems to be a good mix of DX12, DX11, and Vulkan. Not bad for 2.3GHz of Jaguar.

Now look at what it takes for some of these games to hit 60 fps. This test puts the focus on CPU performance. Let's pretend for now that the One X has processing power similar to an R3, given perfect scaling (there's a rough sketch of that idealization after the list below):
https://www.techpowerup.com/reviews/Intel/Core_i7_9700K/12.html

Things to note:
1. Certain games like Wolf 2 and Tomb Raider could hit 60 fps using a potato as a CPU.

2. The fact that games like Hitman and Hellblade hit 60 fps is rather impressive, as they are a challenge for some modern CPUs.

3. Others like Assassin's Creed simply do not scale that well with cores and need a higher frequency or better IPC if they are ever to hit 60 fps.

Overall, it's rather impressive what they managed to leverage out of the One X CPU.
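To make that "perfect scaling" idealization explicit, here's a rough sketch that treats aggregate CPU throughput as cores x clock x per-clock performance. The 0.6 Jaguar-vs-Zen per-clock ratio is purely a placeholder guess of mine (nothing in this thread measures it); the R3 1300X figures are its stock core count and base clock:

```python
# "Perfect scaling" idealization: aggregate throughput ~ cores * clock * IPC.
# The 0.6 Jaguar-vs-Zen per-clock ratio is a placeholder guess, not a
# measured number - swap in real benchmark data to refine the estimate.
def aggregate(cores, clock_ghz, perf_per_clock):
    return cores * clock_ghz * perf_per_clock

one_x  = aggregate(cores=8, clock_ghz=2.3, perf_per_clock=0.6)  # Jaguar (guess)
ryzen3 = aggregate(cores=4, clock_ghz=3.5, perf_per_clock=1.0)  # R3 1300X baseline

print(f"One X as a fraction of an R3 1300X: {one_x / ryzen3:.2f}")  # ~0.79
```

Even with that guessed ratio, the One X lands in the same ballpark as a low-end Ryzen, which is the spirit of the comparison.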
 
What drives framerates is how the game engine works, not the hardware.
Looks like the game engine is holding Fallout 76 back as well.

 
I haven’t had an issue with any specific generation of consoles and their graphics capability.

Sure, better graphics are always better... but better games trump all.
 
This spec bickering is like arguing over a Honda Civic having +/- 5 more horsepower than a Corolla. Who cares?

ALL consoles are garbage from a hardware spec standpoint. They live and die by exclusives (I realize this is obvious). And if MS would ever remember that (they once understood it very well), then they'd finally and actually give Sony a run for their money. The One X didn't light the world on fire, unfortunately, since multiplats and backwards compatibility won't carry a console.
 
ALL consoles are garbage from a hardware spec standpoint.

Not sure that an RX580 goes in the garbage category, even today.

Here is a look at some older games and what it takes to hit 60 fps:
https://www.techpowerup.com/reviews/AMD/Ryzen_3_1300X/11.html

Red Falcon mentioned Fallout 76. Here we see that Fallout 4 takes a lot of CPU to hit 60 fps. In fact, open-world third-person games like Watch Dogs, Assassin's Creed, and Fallout seem to be really the only place where the One X will not hit 60 fps.

These games are mostly locked at 30 fps on the One X. A lot of people here will turn their noses up at that, but it seems to work all right for those types of games. It is still a big improvement over the Xbox One / PS4, which struggle to do even that.
 
It matters if even casual gamers ever want those exclusives to get beyond 30fps.

From what I have seen, casuals don't care about frame rates; they base their purchases either on word of mouth (which mostly won't mention how the game runs unless it runs really badly) or on screenshots (which means graphics sell, not frame rates).

The PS4 Pro's handling convinced me that games will remain at 30fps for the foreseeable future; the extra hardware will be used to push graphics instead. I don't know how bad Jaguar is, but my view is that 60fps isn't a problem on that CPU unless you are talking about online gameplay.
 
ALL consoles are garbage from a hardware spec standpoint.
Dude, consoles are great. They show how much you can do when dedicated programmers push the hardware to the limit, which is something we will never see on the PC, where compatibility between all kinds of hardware is more important than anything else. And yes, there are PC titles that bog down high-end hardware, but those are just poorly coded, and we know that because of consoles as well.
The PS4 Pro's handling convinced me that games will remain at 30fps for the foreseeable future; the extra hardware will be used to push graphics instead.

You expected the "maximum profit" business to adjust their game engines for a piece of hardware that is out there as a niche (the way it was marketed, it was not a replacement for the PS4)?

The console market was created to make money; many devs have been driven there by the sales pitch that sales are better than on the PC, and currently I'm not too sure that is still the case, because everyone is making games on consoles...
 
If the exclusives are not there, I'm not interested. It will be interesting to see how they change the design, but I'm not interested in anything else.
 
Here we see that Fallout 4 takes a lot of CPU to hit 60 fps.
There's them good good CPU-processed-shadows doing us all huge framerate favors! Thanks Bethesda! (y) lol

You know... thinking about it, it may actually be a bad thing to give the consoles more CPU power without a GPU that is overly capable. In other words, it's another GPU-bound console, and developers will continue to offload things to the CPU, even on PC. Just seems silly.

Looking at the shader files in No Man's Sky, for example: if a company like Bethesda removed the hardware-shadows code entirely (assuming they did, and it's not something caused by GameWorks in FO4) just to save some space in the code, then there must be a ton of code for it. NMS, by comparison, packs everything into every shader file (Xbox, PS4, OpenGL, and DirectX 12), and the game on PC doesn't even have DX12, running locked to OGL! Seriously, there's a lot of fluff in their code. Maybe it's just the PC files that contain all the code, and the consoles have that code-fat trimmed out.

Anyways, the point being that if it wasn't a huge change to the Creation Engine, it would have been nice to at least have an INI option for hardware rendering of shadows on PC. So if devs defer rendering to the CPU when there is limited GPU power, what's going to happen in the future if the consoles get an even better CPU that overshadows the GPU? :confused: Now I really hope Navi delivers, but on the console side, so that more devs don't start taking a page from Bethesda's book...
 
Yeah right, folks are going to pay $500 for a PS5? LOL. They would not do that with a PS4 and most definitely would not do that with a PS3. I would also imagine that the next major Xbox will have the same processor, since the Jaguar cores are getting a bit long in the tooth.
 
Yeah right, folks are going to pay $500 for a PS5? LOL.

I may be in the minority, but I don't look too closely at the price tag on something like a console.

If the XB is $20 cheaper than a PS, that isn't going to make me jump over to XB if I were going to buy a PS.

Consoles are all about the investment in the ecosystem. If you have a lot of friends on one platform, and/or that platform has the exclusives you want, that's the platform you're going to get as well. The price tag isn't entirely irrelevant (you do need to be able to afford it), but that's more of a binary condition, and it's way down the priority list.
 
I may be in the minority, but I don't look too closely at the price tag on something like a console.

Yeah, but you and I are not the norm. I bought the original Xbox One for $500 and gave it to my niece last year as a Christmas gift. Her son plays on it a lot and it is still going strong; I definitely got $500 worth out of it. I also purchased the Xbox One X on release for $500, and it is just kicking along without an issue, worth every penny I spent on it.

That said, I also have 3 desktop computers at home that I built myself. I am an IT pro and love building desktop computers. I have no tablets and only one phone. (I do have a laptop that is good enough for me.) Consoles are worth what we pay for them, but so many people think that MS and Sony must sell them at a loss.
 
I disagree; devs are lazy, and they always have been. That is why consoles hold back PC development. It was never about VRAM, or CPU, or anything else; it has always been about devs developing for the largest demographic with the lowest average performance (consoles, not iGPUs), then sloppily porting it to PC. By raising that lowest average, things are pushed forward.

Yeah, some studios won't be lazy, but the reality is many are and will be. So moving that bar moves quality forward.

People get twisted up in specifications and lost in the weeds. They lose sight of how business operates: on a limited production budget, you optimize for the largest audience and the rest get what they get.

Edit: how many games today do we still see with 420p or 720p textures for objects? Why make everything 1080p when consoles will need to checkerboard-render anyway just to hit 30fps?

I can't expect devs to stop being lazy, because lazy is the wrong word; they are businesses and operate to make a profit. I can't fault console manufacturers, because they aren't devs. It's not anyone's fault, it's just the reality of business, so when consoles improve, the rising tide raises all ships.

You are so wrong on so many levels it's laughable.
Example:

@KyleBennet, KazeoHin, and Flogger23m have the same PC, with the exact same specs down to the clocks, but one of them is having huge problems (crashes, micro-stuttering, etc.) while the other two have no problems and everything is butter smooth.

Are the developers lazy then, or is it user incompetence? PC gamers should stop blaming devs and calling them incompetent; it's not the devs' fault you installed malware or your PC is filled with performance-sapping junk.
 