Sony Working on AMD Ryzen LLVM Compiler Improvements, Possibly for PlayStation 5

The CPU that ends up in the PS5 will likely be clocked much lower than your traditional 8-core Ryzen. I wouldn't be shocked if it ran at 1.8GHz or thereabouts. HBM makes sense, even though it's expensive. Console makers tend to like to avoid price fluctuations and would rather get a contract that guarantees them a fixed cost for HBM production. We're also years away from a PS5 release, so who knows what the memory market will look like by then.


It definitely won't be a 4+GHz part, but I could see it being over 2.5GHz with HT. AMD can already fit a 4-core with HT alongside a 10-core Vega GPU in a 35W envelope on 14nm, and clock it reasonably high. So getting a high-clocked 8-core with HT plus a Vega GPU into the ~200W budget of a console on 7nm should be completely doable.
 
Hang on a second, it doesn't say anywhere in that article that Sony will be using an AMD GPU in the PS5. The article is specific to CPU optimization. Sony could well be using an nVidia GPU for the PS5.
 
I really thought Sony would go with nVidia for the GPU for the PS5. I fail to see what Vega offers that is above and beyond the current Xbox One X. Do they really think AMD will be able to engineer something better in time? This is a risky move by Sony. Sure, Ryzen is a fantastic CPU choice. But I can really see the GPU side of things ending very badly for them.

Not really. Nvidia doesn't do semi-custom. If the next generation of consoles wants something other than what's already on the market, they can't commission it from Nvidia. And even then, if they want something more powerful than an ARM chip, they'd need a separate chip commissioned for the CPU, since Nvidia only does ARM.

AMD can not only provide semi-custom CPUs and GPUs, they can do it as a single chip, which is cheaper on the BOM and simpler to integrate into the product as well.

Not to mention, the next generation of products will be on 7nm, which will drastically change how current architectures perform (for better or for worse). MicroSony don't need the best, most efficient architecture; they need a good-enough architecture that doesn't drive up their costs.

Hang on a second, it doesn't say anywhere in that article that Sony will be using an AMD GPU in the PS5. The article is specific to CPU optimization. Sony could well be using an nVidia GPU for the PS5.

Not likely. As I said above, Nvidia doesn't do semi-custom anymore.
 
Not really. Nvidia doesn't do semi-custom. If the next generation of consoles wants something other than what's already on the market, they can't commission it from Nvidia. And even then, if they want something more powerful than an ARM chip, they'd need a separate chip commissioned for the CPU, since Nvidia only does ARM.

AMD can not only provide semi-custom CPUs and GPUs, they can do it as a single chip, which is cheaper on the BOM and simpler to integrate into the product as well.

Not to mention, the next generation of products will be on 7nm, which will drastically change how current architectures perform (for better or for worse). MicroSony don't need the best, most efficient architecture; they need a good-enough architecture that doesn't drive up their costs.



Not likely. As I said above, Nvidia doesn't do semi-custom anymore.
Perhaps Volta will be so efficient that they won't need anything "semi-custom". Heck, Pascal probably is efficient enough already.
 
Perhaps Volta will be so efficient that they won't need anything "semi-custom". Heck, Pascal probably is efficient enough already.


Pascal is more than enough. It has more than enough performance for next-gen console games at 4K; it's just more than enough dollars as well.

Even if Microsony got a special deal and a 1080 Ti for literally half price, the cost would still be astronomical, and that's just the GPU. They'd still need a CPU and the rest of the BOM. Imagine trying to sell that for $499.
 
Extremely bad speculation, considering that backwards compatibility is a real thing now in the Xbox ecosystem. Also, the Zen cores would have to run at around 2GHz just to fit into the proper power envelope, and I just don't see an 8-core with 48 Vega cores unless folks are willing to spend at least $750 on the console, at least from the way I am seeing it.
I think you quoted the wrong person.
It was KazeoHin who stated the Zen clock speed and Vega cores...
 
Pascal is more than enough. It has more than enough performance for next-gen console games at 4K; it's just more than enough dollars as well.

Even if Microsony got a special deal and a 1080 Ti for literally half price, the cost would still be astronomical, and that's just the GPU. They'd still need a CPU and the rest of the BOM. Imagine trying to sell that for $499.

Well... don’t forget running 4k on consoles means something different to the peasants than to us at [H].
 
Pascal is more than enough. It has more than enough performance for next-gen console games at 4K; it's just more than enough dollars as well.

Even if Microsony got a special deal and a 1080 Ti for literally half price, the cost would still be astronomical, and that's just the GPU. They'd still need a CPU and the rest of the BOM. Imagine trying to sell that for $499.
Sony has done it before. Would you really put this past them? Do you not remember the Cell CPU in the PS3?

I could also point to the GPU in the Xbox 360. It was revolutionary and beyond anything available for the PC at the time.

Sony's console division is flush with money right now. I would be shocked if the PS5 were anything but stunning and revolutionary. And I would be equally shocked if they put an AMD GPU in it.
 
Extremely bad speculation, considering that backwards compatibility is a real thing now in the Xbox ecosystem. Also, the Zen cores would have to run at around 2GHz just to fit into the proper power envelope, and I just don't see an 8-core with 48 Vega cores unless folks are willing to spend at least $750 on the console, at least from the way I am seeing it.

48 Vega cores would net a ~175W GPU at ~1000MHz on 14nm. Now put that on 7nm. Now eliminate any mining demand, since it's a semi-custom part. It's not only achievable, it's actually conservative. The XB1X already uses a 36-core Polaris chip on 14nm; do you really think the next gen, on 7nm, would be hard-pressed to squeeze out a 50% upgrade on a new node?


Sony has done it before. Would you really put this past them? Do you not remember the Cell CPU in the PS3?

I could also point to the GPU in the Xbox 360. It was revolutionary and beyond anything available for the PC at the time.

Sony's console division is flush with money right now. I would be shocked if the PS5 were anything but stunning and revolutionary. And I would be equally shocked if they put an AMD GPU in it.

The PS3 was Sony's least successful console, and their current console, the PS4, based on all-AMD hardware, is doing exceptionally well. AMD simply offers the industry something nobody else can, and Sony has seen huge success using AMD hardware. I don't really follow your train of thought.
 
I really thought Sony would go with nVidia for the GPU for the PS5. I fail to see what Vega offers that is above and beyond the current Xbox One X. Do they really think AMD will be able to engineer something better in time? This is a risky move by Sony. Sure, Ryzen is a fantastic CPU choice. But I can really see the GPU side of things ending very badly for them.

Nvidia will never supply a GPU for an x86-based console. Consoles are expensive enough to manufacture without having to include both a CPU and a discrete GPU. The only way Nvidia gets back into a PlayStation is if Sony switches to ARM.
 
48 Vega cores would net a ~175W GPU at ~1000MHz on 14nm. Now put that on 7nm. Now eliminate any mining demand, since it's a semi-custom part. It's not only achievable, it's actually conservative. The XB1X already uses a 36-core Polaris chip on 14nm; do you really think the next gen, on 7nm, would be hard-pressed to squeeze out a 50% upgrade on a new node?




The PS3 was Sony's least successful console, and their current console, the PS4, based on all-AMD hardware, is doing exceptionally well. AMD simply offers the industry something nobody else can, and Sony has seen huge success using AMD hardware. I don't really follow your train of thought.
My train of thought is that Vega is no better and really no different from what we currently have in the Xbox One X. Are you saying Sony will release a new console with a Vega GPU? Because AMD is still several years away from their follow-up to Vega. And no, a die-shrunk Vega core is not going to be significantly better or faster. The only hope they would have is if they somehow found a bottleneck in Vega's design that they could easily fix. More ROPs, perhaps. But I would be shocked if that were the case. And if it is, Raja should have been *fired* by AMD, never mind the vacation/exit he just had. The same goes for multiple others in AMD's management. I would be shocked. Not completely shocked. But shocked.

Why would Sony come out with the PS5 years after the Xbox One X with nothing but a faster CPU? Because it's cheaper? That's the only reason I can think of.
 
My train of thought is that Vega is no better and really no different from what we currently have in the Xbox One X. Are you saying Sony will release a new console with a Vega GPU? Because AMD is still several years away from their follow-up to Vega. And no, a die-shrunk Vega core is not going to be significantly better or faster. The only hope they would have is if they somehow found a bottleneck in Vega's design that they could easily fix. More ROPs, perhaps. But I would be shocked if that were the case. And if it is, Raja should have been *fired* by AMD, never mind the vacation/exit he just had. The same goes for multiple others in AMD's management. I would be shocked. Not completely shocked. But shocked.

Why would Sony come out with the PS5 years after the Xbox One X with nothing but a faster CPU? Because it's cheaper? That's the only reason I can think of.

Well, first things first, the bottlenecks experienced by AMD GPUs are a non-issue when you're designing software specifically for them. Secondly, if you take Vega 64 running at 300W, it is 40-50% faster than an RX 580, which uses a similar CU setup to the XB1X but is clocked a LOT higher. Let's assume that 7nm reduces the power consumption, and lower the clock speed to be in line with the XB1X (around 1000MHz), and you would easily have a 150W part, if not 100W. That would be at least 40% faster than the X, which is already 30% faster than a PS4 Pro and a REALLY impressive console. Couple that with a 50W-equivalent Ryzen 8/16 part, and you have a 64 CU Radeon GPU and an 8/16 CPU in a 150W to 200W package.

So essentially, you'll have at minimum 40% more GPU power and a CPU that is 4-5x faster.


What is so wrong with that?
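If anyone wants to sanity-check that, here's a quick back-of-envelope sketch in Python using the numbers above. The Vega 64 boost clock and the 7nm power factor are my own rough assumptions (not from the post), and the linear clock-to-power scaling is deliberately conservative, so treat it as a ballpark only.

```python
# Rough back-of-envelope estimate for a down-clocked, die-shrunk Vega part.
# The clock and node-scaling numbers below are loose assumptions for
# illustration, not measured figures.

VEGA64_POWER_W = 300        # board power quoted in the post above
VEGA64_CLOCK_MHZ = 1500     # approximate Vega 64 boost clock (assumption)
TARGET_CLOCK_MHZ = 1000     # console-style clock assumed in the post
NODE_POWER_FACTOR = 0.6     # assumed 14nm -> 7nm power reduction (hypothetical)
CPU_POWER_W = 50            # the "50W-equivalent Ryzen 8/16 part" from the post

# Conservative model: dynamic power scales roughly linearly with clock
# (ignoring the extra savings from running at lower voltage at lower clocks).
gpu_power = VEGA64_POWER_W * (TARGET_CLOCK_MHZ / VEGA64_CLOCK_MHZ) * NODE_POWER_FACTOR

print(f"Estimated GPU power: {gpu_power:.0f} W")                  # ~120 W
print(f"Estimated APU package: {gpu_power + CPU_POWER_W:.0f} W")  # ~170 W
```

Even ignoring the voltage savings, that lands inside the 150W to 200W package described above.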
 
Well, first things first, the bottlenecks experienced by AMD GPUs are a non-issue when you're designing software specifically for them. Secondly, if you take Vega 64 running at 300W, it is 40-50% faster than an RX 580, which uses a similar CU setup to the XB1X but is clocked a LOT higher. Let's assume that 7nm reduces the power consumption, and lower the clock speed to be in line with the XB1X (around 1000MHz), and you would easily have a 150W part, if not 100W. That would be at least 40% faster than the X, which is already 30% faster than a PS4 Pro and a REALLY impressive console. Couple that with a 50W-equivalent Ryzen 8/16 part, and you have a 64 CU Radeon GPU and an 8/16 CPU in a 150W to 200W package.

So essentially, you'll have at minimum 40% more GPU power and a CPU that is 4-5x faster.


What is so wrong with that?
It's possible, but doubtful, quite frankly. Have you seen the lack of progress on node shrinks lately? Transistors are reaching a point where they are only a few atoms wide. We are physics-limited. Universe-limited. I would be shocked to see a node shrink suddenly "fix" Vega. I'm glad you bring up the power efficiency, and the general lack of efficiency, of the chip. For that reason it is not suitable for a game console; an nVidia chip would be much better. Sure, they could potentially code Vega to the metal, but I still don't think that could fix it. Vega is a real turkey. It would be like Sony using a Matrox Parhelia chip in the original PlayStation console. It's madness IMO.
 
So essentially, you'll have at minimum 40% more GPU power and a CPU that is 4-5x faster.
Your math is a bit off on that.
I did the math earlier in this thread comparing Ryzen to Jaguar clock-for-clock, and if Ryzen stayed at 2.0GHz like everyone here is speculating due to the power envelope, the performance would be around 2 to 3 times faster (in general), not 4-5x, unless a medium-to-large clock speed boost is included.
 
I wouldn't just compare the GFLOPS rating, as floating-point operations are only one part of the CPU (and GPU) and don't directly correlate with total CPU performance, since integer performance and IPC both play a large role.
This is one area where a synthetic benchmark can give a better rough guesstimate or ballpark of where CPUs and GPUs sit next to one another performance-wise in general.

The best comparison I can attempt to give between Jaguar and Ryzen is this:
https://www.cpubenchmark.net/compare/AMD-Ryzen-5-1500X-vs-AMD-GX-420CA-SOC/3001vs2121

Both the Ryzen 5 1500X and AMD GX-420CA are quad-cores, so we can do a nearly identical comparison of them, at least in terms of IPC.
Please keep in mind that the synthetic benchmark scores in the link above don't mean anything by themselves; they're just placeholders for where each CPU falls performance-wise relative to the other.

The Ryzen 5 1500X quad-core @ 3.5GHz scored 10685.
The AMD GX-420CA (Jaguar) quad-core @ 2.0GHz scored 2299.

Now lets do the math on this!


So, if we want an apples-to-apples GHz-to-GHz comparison, we want to bring the Ryzen 5 1500X down to 2.0GHz as well, along with its score:
2.0 ÷ 3.5 = ~0.571428571
(2.0GHz is roughly 57% of 3.5GHz)

Now, we want to scale the Ryzen 5 1500X's score down in the same manner:
~0.571428571 x 10685 = 6106 (rounding up from a long decimal)
(scaling the Ryzen CPU's score down to ~57% as well, to match the reduced clock speed and allow a direct comparison)

So, a Ryzen 5 1500X quad-core @ 2.0GHz would have a score of 6106.
How does this compare, in terms of IPC and general performance, to the AMD GX-420CA?

6106 ÷ 2299 = ~2.655937364

Thus, core-for-core and clock-for-clock, a Ryzen 5 1500X quad-core @ 2.0GHz would be roughly 2.66 times faster overall than the AMD GX-420CA quad-core @ 2.0GHz.



Now, I should state that these scores do not specifically compare Whetstone (floating point) or Dhrystone (integer) operations or performance; this is just a general-performance comparison.
Hope this helps! :D
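For anyone who wants to rerun the comparison with different clocks or scores, here is the same calculation as a few lines of Python. The scores are just the PassMark numbers from the link above, used purely as placeholders for overall performance.

```python
# Clock-for-clock comparison of Ryzen vs Jaguar using synthetic benchmark
# scores as rough placeholders for overall performance (PassMark link above).

ryzen_score = 10685        # Ryzen 5 1500X (quad-core) @ 3.5GHz
ryzen_clock_ghz = 3.5
jaguar_score = 2299        # AMD GX-420CA (Jaguar, quad-core) @ 2.0GHz
jaguar_clock_ghz = 2.0

# Scale the Ryzen score down linearly to the Jaguar's clock speed,
# assuming performance scales roughly linearly with clock.
scaled_ryzen = ryzen_score * (jaguar_clock_ghz / ryzen_clock_ghz)
speedup = scaled_ryzen / jaguar_score

print(f"Ryzen 5 1500X scaled to 2.0GHz: {scaled_ryzen:.0f}")   # ~6106
print(f"Clock-for-clock speedup over Jaguar: {speedup:.2f}x")  # ~2.66x
```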
 
I'm starting to think that this whole Vega-GPU-inside-the-PS5 idea is wishful thinking on the part of AMD and their employees. I would think that Sony using nVidia in the PS5 would set off a bit of a panic inside AMD's graphics division. The current consoles have been a big win for them, and I really think that without those contracts, AMD's graphics division would no longer exist in its current form, if at all. They would be relegated to integrated CPU graphics, at best.
 
I'm starting to think that this whole Vega-GPU-inside-the-PS5 idea is wishful thinking on the part of AMD and their employees. I would think that Sony using nVidia in the PS5 would set off a bit of a panic inside AMD's graphics division. The current consoles have been a big win for them, and I really think that without those contracts, AMD's graphics division would no longer exist in its current form, if at all. They would be relegated to integrated CPU graphics, at best.
Unless they do officially move to ARM, why would Sony switch to NVIDIA for their next-generation GPU?
 
Unless they do officially move to ARM, why would Sony switch to NVIDIA for their next-generation GPU?
How could they not? Vega is a turkey and the Xbox One X is already using it. You think Sony wants to release a new console that is an also-ran? What would be the point?
 
An Athlon X4 is not a Jaguar processor - your comparison makes absolutely no sense.
Clock-for-clock and core-for-core, Ryzen is around 2.66 times faster than Jaguar, and I gave a direct comparison of the two above.

I don't know why you are even comparing Ryzen to an Athlon X4, since it has nothing to do with anything in this thread or consoles, and is totally nonsensical.
Do you have anything to back your statement up?

I could have sworn Bristol Ridge was a spin-off of Jaguar; I stand corrected.
 
I could have sworn Bristol Ridge was a spin-off of Jaguar; I stand corrected.
Thanks for stating that, I really appreciate your honesty. (y)


How could they not? Vega is a turkey and the Xbox One X is already using it. You think Sony wants to release a new console that is an also-ran? What would be the point?
Well, to be fair, by the time the PS5 is released, which could easily be more than a year away, we may not even be using Vega anymore, so it might just be a stepping stone for the prototype or development stage right now.
You could definitely be right, though.
 
Well, to be fair, by the time the PS5 is released, which could easily be more than a year away, we may not even be using Vega anymore, so it might just be a stepping stone for the prototype or development stage right now.
You could definitely be right, though.

No way would modern console manufacturers go for a split-chip design again. They've both found out that good enough is good enough. Neither the PS4 nor the XB1 was a crazy cutting-edge supercomputer at launch, and the PS4 Pro continues to outsell the XB1X, even though the XB1X is much faster.

Expect an AMD APU in both Microsoft's and Sony's next-gen consoles.
 
The nvidia fanboi love is a bit over the top. Vega is on par with the 1080. Regardless, Sony and MS are not looking for top-end performance; they are looking for very good mid-range performance they can sandwich into an SoC package. (No, Sony and MS are never going to do a CPU + discrete GPU design ever again.)

Nvidia is not capable of supplying Sony with what they would be looking for.

Integrated Vega+Zen is more than good enough to cover Sony for years. They don't need 10x the performance of a PS4... 2x would be more than required. Let's all face reality a little bit: console developers are not going to stop targeting PS4/Xbone hardware for YEARS. Why would Sony feel the need to make nothing on hardware for years trying to jam some overpriced NV junk into a box, when developers will still be targeting the AMD hardware in the PS4? It's far more logical to use beefed-up AMD hardware. GNM and GNMX make the AMD GPU sing, and I have no reason to believe they won't do a great job on Vega hardware. Sony's in-house API guys are very good at what they do; they will squeeze every drop out of a die-shrunk Vega if that is what they have to work with.

Sticking with AMD is the only logical option at this point... game developers are not going to want to target drastically different GPU and CPU architectures, and Sony has done as well as they have by keeping developers happy. Perhaps there is an extreme outside chance of Sony or MS going with an Intel/AMD Vega APU... but that only happens if someone at Intel decides they need a big, expensive marketing win and drops their pants.
 
Your math is a bit off on that.
I did the math earlier in this thread comparing Ryzen to Jaguar clock-for-clock, and if Ryzen stayed at 2.0GHz like everyone here is speculating due to the power envelope, the performance would be around 2 to 3 times faster (in general), not 4-5x, unless a medium-to-large clock speed boost is included.

You are forgetting the CCX. They could make a custom 4-core CCX. Even if they were clocked slower, they would be pretty powerful. Add integrated Vega and you have a pretty substantial console. Maybe it's even Navi, or a die shrink of Vega.
 
Guys what about Infinity Fabric? What if AMD uses Infinity Fabric to strap multiple Polaris GPUs together? That would be interesting actually. Much more interesting than a Vega GPU going in the PS5.
 
Guys what about Infinity Fabric? What if AMD uses Infinity Fabric to strap multiple Polaris GPUs together? That would be interesting actually. Much more interesting than a Vega GPU going in the PS5.

That's interesting, but would probably be cost prohibitive and I dunno that they are there yet with infinity fabric. They are still trying to perfect CCX on Zen.
 
FAKE NEWS. Faked entry by a pumper troll trying to give AMD stock a 'helping hand'.

PS5/XBOX Next will use ARM CPUs in AMD/NVIDIA SoCs.
 
The PS3 was Sony's least successful console, and their current console, the PS4, based on all-AMD hardware, is doing exceptionally well. AMD simply offers the industry something nobody else can, and Sony has seen huge success using AMD hardware. I don't really follow your train of thought.
I see his train of thought. And oh boi is it obvious.
 
Guys what about Infinity Fabric? What if AMD uses Infinity Fabric to strap multiple Polaris GPUs together? That would be interesting actually. Much more interesting than a Vega GPU going in the PS5.
Vega at low power is much more efficient; it currently struggles at break-neck voltages.
So of all things, why would you want the soon-to-be-phased-out Polaris with Infinity Fabric, and not Vega(+) with Infinity Fabric?
Why would you choose an architecture that isn't shrunk, uses more power, and supports fewer features?
 
You are forgetting the CCX. They could make a custom 4-core CCX. Even if they were clocked slower, they would be pretty powerful. Add integrated Vega and you have a pretty substantial console. Maybe it's even Navi, or a die shrink of Vega.
This is true, and while a more powerful GPU would certainly help, the main bottleneck on the existing top-end consoles is the CPU, which in turn is holding back the existing GPU far more than it should.
By "holding back" I mean making the consoles incapable of native 4K @ 60fps in AAA games, and mostly limiting them to 4K @ 30fps.

As for 1080p @ 60fps in AAA games, which was the original goal of the first iteration of this console generation, that has pretty much been met in quite a few AAA titles, save for a few like Fallout 4 - and again, that is mainly due to the low-power (weak) CPU in the consoles.
 
Pascal is more than enough. It has more than enough performance for next-gen console games at 4K; it's just more than enough dollars as well.

Even if Microsony got a special deal and a 1080 Ti for literally half price, the cost would still be astronomical, and that's just the GPU. They'd still need a CPU and the rest of the BOM. Imagine trying to sell that for $499.
The PS3, and I believe the PS2, were sold at a loss. The days of Sony doing that are over. They will never use a high-priced solution, end of story.

Guys what about Infinity Fabric? What if AMD uses Infinity Fabric to strap multiple Polaris GPUs together? That would be interesting actually. Much more interesting than a Vega GPU going in the PS5.
I would expect this in a dGPU first, possibly Navi, but the updated roadmaps suggest Navi is still using GCN, so I would expect it in the 'next generation' solution AMD has mentioned. There is a possibility they will remove the geometry-starvation bottleneck with Navi, which would breathe some more life into GCN, hence this move.
Polaris already has IF links, and they are used for the SSG and oil/gas exploration; it revolutionised that field.
Vega has 500GB/sec IF link(s) and more than enough bandwidth to do it; they are likely using Vega dies to test this in the lab right now.


Those of you arguing for game streaming are making the same mistake as pcgamer. Not everyone can stream, so by default you cut yourself out of emerging markets, the third world, and many rural areas. You also run into the same bandwidth and net-neutrality issues that Netflix and similar services are having.
It's too early for mass adoption. Maybe in 10 years it will be more feasible, especially if the satellite internet that SpaceX is developing pans out.
 
Vega at low power is much more efficient; it currently struggles at break-neck voltages.
So of all things, why would you want the soon-to-be-phased-out Polaris with Infinity Fabric, and not Vega(+) with Infinity Fabric?
Why would you choose an architecture that isn't shrunk, uses more power, and supports fewer features?
Yeah, but if you're going to undervolt and underclock Vega, you're just going to wind up with something slower than the current Xbox One X. It would be a huge embarrassment for Sony to release the PS5 only for it to be slower than a console that has already been available for well over a year. I just think Polaris is a better architecture overall. If they die-shrink it and strap a bunch of them together using Infinity Fabric, it could turn out to be a very potent GPU with very decent efficiency. I don't think Vega's inefficiency was simply due to being pushed too far in terms of clock speed; I think the architecture itself is inherently inefficient. I actually think it's the worst GPU design I have seen since the Matrox Parhelia, and that says a lot. And we all know what happened to Matrox after the Parhelia came out (and flopped).
 
Yes, with compression you lose a bit for sure...

The point is that the VAST majority of gamers are not on 1080 Ti-level hardware; they are on a 1060... a 750 Ti... and the number on integrated chips is even higher. For average gamers, streaming compression is nowhere near an issue if they are streaming games at 60fps on ultra PC settings. I will take a little bit of compression at 60fps ultra over uncompressed 1060-level hardware any day.

Problem is, Ultra settings don't do much for image quality today; you'll find it hard to see any extra benefit from using them. Also, compression will lose enough image quality that the benefits of Ultra are lost, so something like a GTX 1050 is going to produce a sharper image than a video stream. If you use 4K, the compression artifacts are going to be even more noticeable.
There is no doubt that a super-high-end gaming PC is still going to be superior... the real question is how streaming compares to the average gamer's experience. IMO, from what I have seen, it destroys what most people would expect out of a current console and is beyond anything the average gamer has even seen at a rich friend's house. ;)
From what I've heard, cloud gaming has not been good. The most positive thing I've heard is that the delay is something you get used to. Not very encouraging. Mind you, very few people share their experiences with cloud gaming services.

The compression-artifact arguments remind me of exactly what the movie industry said about streaming five years ago: "but Blu-ray has so much less compression." Yeah, guess what: for average consumers, Netflix was a massive step up from the Walmart-bin DVDs they were buying at the time. As fantastic as Blu-ray was, average people just didn't, and still don't, have cinephile-level gear at home. The same applies to gamers: the masses don't have high-end gaming PCs. (Streaming movies, TV, and music all have one thing in common: they're aimed at the masses, not aficionados... gaming services will be no different.)
Video streaming and cloud gaming are two very different things, and you know that. Blu-ray got fucked because Sony got greedy. Remember, in 2006/2007 the average cost of a Blu-ray player was $1000 if you didn't buy a PS3, and the cost of a movie was like $50-$60. On top of that, most movies were just upsampled DVD transfers, which gave no real reason to buy them. Also, up until Netflix there was no way to watch movies without putting on sneakers and sneaker-netting your way to your local Blockbuster.

Also, unlike cloud gaming, Netflix offered a much cheaper alternative to Blu-ray. Plus, you don't need to buy each movie to watch it, unlike with cloud gaming services. See how well Netflix would work out if each new movie released required a $20 purchase. Old games are free on cloud gaming services, but new AAA games are not. Also, people definitely do run Plex and Emby servers at home, despite the electricity cost. I know I do.
 
I really thought Sony would go with nVidia for the GPU for the PS5. I fail to see what Vega offers that is above and beyond the current Xbox One X. Do they really think AMD will be able to engineer something better in time? This is a risky move by Sony. Sure, Ryzen is a fantastic CPU choice. But I can really see the GPU side of things ending very badly for them.
Sony won't be using a Vega 64 or 1080 Ti equivalent GPU. They'll be using whatever is mid-range, and AMD has done extremely well in the mid-range PC gaming market. That is, when people weren't buying their cards up to mine crypto.

Also, the benefit of going with AMD for both the CPU and GPU is that you can have an all-in-one package, whereas Nvidia would need to be on a separate chip. You have to factor in cost, because after all, a console is just a cheap gaming PC. Think Kaby Lake G.
 
Sony won't be using a Vega 64 or 1080 Ti equivalent GPU. They'll be using whatever is mid-range, and AMD has done extremely well in the mid-range PC gaming market. That is, when people weren't buying their cards up to mine crypto.

Also, the benefit of going with AMD for both the CPU and GPU is that you can have an all-in-one package, whereas Nvidia would need to be on a separate chip. You have to factor in cost, because after all, a console is just a cheap gaming PC. Think Kaby Lake G.
Well, wouldn't it make a whole lot more sense if the PS5 used an nVidia Tegra ARM CPU paired with an integrated Volta GPU? This rumor seems like complete nonsense to me. Sony is nuts to use AMD for the GPU in the PS5.
 
Also, unlike cloud gaming, Netflix offered a much cheaper alternative to Blu-ray. Plus, you don't need to buy each movie to watch it, unlike with cloud gaming services. See how well Netflix would work out if each new movie released required a $20 purchase. Old games are free on cloud gaming services, but new AAA games are not. Also, people definitely do run Plex and Emby servers at home, despite the electricity cost. I know I do.

Sure, Sony being in charge of any format has never been a good idea. lol

When I talk about the PS5 or PS6 going streaming, I'm not talking about a simple buy-a-game-and-stream-it service. That is what is out now; the tech is still in its cradle. In 3-5 years I suspect someone is going to launch a proper streaming service that includes a good number of AAA games. Heck, we may even live in a world in 10 years where every big publisher like EA attempts to start their own service with exclusive games. If more countries do things like make loot boxes illegal, they will be quite open to the idea of ongoing subscription money. Having seen streaming on a good connection... I know it's technically already very possible; what needs to improve is the average quality of broadband. That will happen, no question... how long it takes is the only real unknown. (I do doubt the PS5 will be anything less than a Zen/Vega powerhouse... I would bet it's the last console that runs its games locally, but I could be wrong and that may still be further out.)
 
Well, wouldn't it make a whole lot more sense if the PS5 used an nVidia Tegra ARM CPU paired with an integrated Volta GPU? This rumor seems like complete nonsense to me. Sony is nuts to use AMD for the GPU in the PS5.

HAHAHA, yes, wouldn't it make so much more sense to try and sell $2000 PlayStations that play the same games developers will be targeting to run smoothly on PS4 hardware for years after its release.

Nintendo could go Tegra because they were building a mobile-plus gaming console... and they are Nintendo. Their games are 90% in-house and designed to be good games, not 3D effects demos. As development was in-house and their own teams had made plenty of ARM-compiled titles for earlier systems, it likely wasn't a big conversion for those teams.

If Sony goes ARM + Nvidia, developers have to target two completely different architectures. Sony could build GNMX tools to make that seamless for average PS developers... but for the higher-end PS developers that use GNM (their lower-level API that allows direct control of the hardware), Sony would be doubling their workload.

Tegra is not remotely equal in horsepower to an AMD Ryzen/Vega APU. And there really is no way in hell Sony is going to shove a discrete GPU into a console when, even with a smoking deal from Nvidia, it would at least DOUBLE their production cost.

I get it, you love your nvidia. Still, I think you need to be a bit honest with yourself. In the mid-range where Sony is buying, between NV and AMD... AMD wins. That isn't likely to change when NV releases their new cards either. The reason miners have loved AMD so much is simple: their cards are more powerful and operate very well at low voltages... and Sony has done a really good job harnessing that with their GNM APIs. Nvidia in PC land has done a really great job of paying off developers, I'll give them that... sometimes I really wish AMD would get their evil on like everyone else. (Joking about that last bit.)
 