Next gen PS5, close to 2080 level

I dunno, isn't that extreme look all the rage with the kids these days? I wouldn't be too surprised if the consumer model does look similar to that design. Perhaps the 'Pro' version will take the aesthetic a step further and add RGB elements.
 
I dunno, isn't that extreme look all the rage with the kids these days? I wouldn't be too surprised if the consumer model does look similar to that design. Perhaps the 'Pro' version will take the aesthetic a step further and add RGB elements.
It is just the dev kits. The consumer version will be nothing like it. Dev kits normally look radically different from the shipping unit.
 
It is just the dev kits. The consumer version will be nothing like it. Dev kits normally look radically different from the shipping unit.

One can dream.

The rumored PS5 dev kit isn't too far removed from the PS4 dev kit. Similar venting and bulkiness.

[Image: PS4 dev kit]

https://www.extremetech.com/gaming/...otype-with-move-a-display-and-a-speaker-leaks
 
That thing is ugly as shit and overdesigned to be a dev kit. Unless it needs all those vents to even function as a dev kit.

Dev Kits are for developers. They don’t need to be particularly pretty or even quiet. They need to function.
 
It's insane how much people are building up the performance of a console that's due for release in a year or less, as if some groundbreaking performance advancements are going to happen between now and then.

Even something like the PS4 Pro was the equivalent of a low/mid-range gaming PC at launch. I think best case scenario you might get a 3600/3700 + 5700/5700 XT in an APU package. And even that level of performance I find doubtful for $400.

A 5700 XT will, next year, be right in line with the level of mid-to-high tier GPU that has historically gone into most recent consoles, e.g. 7850-ish in the PS4, 570/580 in the PS4 Pro/Xbox One X.

I do agree that we won't see the glory days of past consoles (PS3, Xbox 360, OG Xbox) of the early 2000s where top-end GPUs were de rigueur, but a 5700 XT is not out of bounds.

CPU is a bit less clear, but I'm guessing frequency-limited 3600 level.
 
None of these received 'top-end' GPUs ;)

Actually the 360 was... the embedded DRAM let it do 4x AA with virtually no performance hit, and it was faster than any desktop GPU at its release. It was faster than an X1900 on Team Red and faster than the 68xx series from Nvidia IIRC.

The 360 I believe had a custom X1800 GPU (with the eDRAM). I believe it was faster than AMD's own offerings, but I'm not sure about the 7800 series, which barely beat the 360 to market that same year.

Not that it mattered much since the 8000 series launched the next year eclipsing anything earlier.

But that's rather irrelevant, as top end doesn't mean "best"; it means in the top range of products. And the PS3 had a custom chip based on the 7800: arguably top-end despite not being "best".
 
As long as it does decent upscaled 4K at 30 fps, that will be good enough; 60 fps 4K TVs are pretty few and far between. This can only help the PC gaming market, as it will mean less work for developers and a better starting point for overall game design. At this stage I will take whatever bones they throw at us; I am old and cantankerous.
 
I mean, I think MOST people would be hard-pressed to tell the difference between 75% upscaled 4K (16XXp) with newer contrast-adaptive sharpening algorithms versus native 4K. Since RDNA seems to do this CAS basically 'free', I'd imagine it will be used heavily in the newer consoles to do exactly that: upscale lower resolutions to 4K.
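
For the curious, here is a rough, hypothetical sketch of the contrast-adaptive sharpening idea (my own simplified Python/NumPy toy, not AMD's actual FidelityFX CAS shader): each pixel gets a sharpening weight scaled by how much contrast headroom its local neighborhood has, so flat areas get punched up while already-contrasty edges are left mostly alone.

import numpy as np

def cas_like_sharpen(img, strength=0.3):
    """img: float32 grayscale in [0, 1], shape (H, W). Returns a sharpened copy.
    A toy illustration of 'contrast-adaptive' sharpening, not AMD's shader."""
    pad = np.pad(img, 1, mode="edge")
    h, w = img.shape
    out = np.empty_like(img)
    for y in range(h):
        for x in range(w):
            c = pad[y + 1, x + 1]                      # centre pixel
            n, s = pad[y, x + 1], pad[y + 2, x + 1]    # vertical neighbours
            wst, e = pad[y + 1, x], pad[y + 1, x + 2]  # horizontal neighbours
            lo = min(c, n, s, wst, e)
            hi = max(c, n, s, wst, e)
            # Adapt: more sharpening where the local range has headroom,
            # less where it is already near clipping (high local contrast).
            headroom = max(min(lo, 1.0 - hi), 0.0)
            wgt = -min(strength * float(np.sqrt(headroom)), 0.2)  # negative-lobe weight
            out[y, x] = np.clip(
                (c + wgt * (n + s + wst + e)) / (1.0 + 4.0 * wgt), 0.0, 1.0)
    return out

The idea would be: upscale a 16XXp frame to 2160p with whatever resampler, then run a pass like this to restore perceived edge crispness.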
 
As long as it does decent upscaled 4K at 30 fps, that will be good enough; 60 fps 4K TVs are pretty few and far between. This can only help the PC gaming market, as it will mean less work for developers and a better starting point for overall game design. At this stage I will take whatever bones they throw at us; I am old and cantankerous.

4k 60hz TVs are rare? I didn't know that, but then again I bought one that is basically a gaming monitor supersized.
 
I mean, I think MOST people would be hard-pressed to tell the difference between 75% upscaled 4K (16XXp) with newer contrast-adaptive sharpening algorithms versus native 4K. Since RDNA seems to do this CAS basically 'free', I'd imagine it will be used heavily in the newer consoles to do exactly that: upscale lower resolutions to 4K.
Hell, just look at current consoles with their checkerboard method to hit 4K. It’s a damn good solution for what the hardware is. Noticeably more detail than the 1080 option. (Though the smoothness on 1080...)
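
Some back-of-the-envelope shaded-pixel counts (simple arithmetic, not measurements of any particular engine) show why checkerboarding and 75%-scale rendering are such good deals:

# Pixels/samples shaded per frame, back-of-the-envelope:
native_4k   = 3840 * 2160      # 8,294,400  native 2160p
checker_4k  = native_4k // 2   # 4,147,200  checkerboard: shade half, reconstruct the rest
upscaled_75 = 2880 * 1620      # 4,665,600  75% scale (0.75^2 = ~56% of native 4K)
native_1080 = 1920 * 1080      # 2,073,600  the "smooth" 1080 option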

4k 60hz TVs are rare? I didn't know that, but then again I bought one that is basically a gaming monitor supersized.

I think they used to be rare... before HDMI 2.0 was standard. I recall those relatively cheap Seiki sets on Amazon only being 30Hz years ago.

I got a high-ish end Sony in 2016 and it's a 120/60Hz panel (1080p/4K respectively). I cannot imagine that 3 years later 60Hz 4K is still rare at all.

Honestly, outside of the HDMI 1.4 spec era, I'm not sure it was ever rare.
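
It mostly comes down to link bandwidth. A rough sketch with the standard 4K pixel clocks, assuming 8-bit RGB and the commonly quoted HDMI data-rate limits:

# Why 4K60 needed HDMI 2.0 while HDMI 1.4-era sets topped out at 4K30.
def video_gbps(pixel_clock_mhz, bits_per_pixel=24):
    """Raw video payload in Gbit/s for a given pixel clock (8-bit RGB = 24 bpp)."""
    return pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

HDMI_1_4 = 8.16   # Gbit/s usable video rate (10.2 Gbit/s TMDS, 8b/10b coding)
HDMI_2_0 = 14.4   # Gbit/s usable video rate (18 Gbit/s TMDS)

print("4K30 needs", round(video_gbps(297.0), 2), "Gbit/s")   # ~7.13  -> fits HDMI 1.4
print("4K60 needs", round(video_gbps(594.0), 2), "Gbit/s")   # ~14.26 -> needs HDMI 2.0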
 
4k 60hz TVs are rare? I didn't know that, but then again I bought one that is basically a gaming monitor supersized.
Not rare, but not the norm either. Probably 70% of the 4K TVs sold over the last few years have been 60Hz screens that use motion scaling to simulate a higher refresh rate, with the remaining 30% made up of 120 and 240Hz panels. Even a lot of the TVs that have 120Hz screens are still using older HDMI versions that limit them to 60Hz, and even the new stuff from the "budget" brands is 60Hz if it is less than 60".
 
Not rare, but not the norm either. Probably 70% of the 4K TVs sold over the last few years have been 60Hz screens that use motion scaling to simulate a higher refresh rate, with the remaining 30% made up of 120 and 240Hz panels. Even a lot of the TVs that have 120Hz screens are still using older HDMI versions that limit them to 60Hz, and even the new stuff from the "budget" brands is 60Hz if it is less than 60".
I feel like you're sending mixed messages. Your previous post states that 4K 60Hz panels are "few and far between", and then in this post you postulate that most 4K TVs are 60Hz, yet not the norm, with a few being 120Hz and 240Hz.

Which... are there even true 240Hz 4K TVs (TVs, not monitors)? Many manufacturers use their motion-blur smoothing tech and tout higher-than-actual refresh rates. Hell, I think my Sony touts 960Hz, which is obvious marketing BS.
 
I feel like you're sending mixed messages. Your previous post states that 4K 60Hz panels are "few and far between", and then in this post you postulate that most 4K TVs are 60Hz, yet not the norm, with a few being 120Hz and 240Hz.

Which... are there even true 240Hz 4K TVs (TVs, not monitors)? Many manufacturers use their motion-blur smoothing tech and tout higher-than-actual refresh rates. Hell, I think my Sony touts 960Hz, which is obvious marketing BS.
A 60Hz TV only refreshes at 30 fps. TV screens always display interpolated regardless of source, so you have to halve their numbers right out of the gate.
 
Not rare, but not the norm either. Probably 70% of the 4K TVs sold over the last few years have been 60Hz screens that use motion scaling to simulate a higher refresh rate, with the remaining 30% made up of 120 and 240Hz panels. Even a lot of the TVs that have 120Hz screens are still using older HDMI versions that limit them to 60Hz, and even the new stuff from the "budget" brands is 60Hz if it is less than 60".

Thanks, but you just proved my point... you know, that nearly 70% of all modern 4K sets are using a native 60Hz panel. We weren't discussing whether they have the motion-smoothing "120/240Hz" crap.

The guy said that nearly all 4K TVs only do 2160p30, when I knew that was false.

My set is really great, in that you get the option of running 4K 60Hz with 4:4:4, or you can run 1080p/1440p 120Hz (native, not motion-smoothed) with FreeSync to boot. I think it fills the gap of 4K 60Hz not being achievable with great eye candy and FPS numbers on current hardware outside of the 2080 Ti and the edge-case RTX Titan.
 
Uhhhhhh. Okay....
A TV advertised at 120Hz can mean one of two things: the actual screen displays at 120Hz, or the backlight strobe is capable of 120Hz and they have not stated what the screen can do. A TV's rated Hz more often than not has nothing to do with the actual panel.
 
Furthermore, the advertised refresh rate of the TV is not always that of the screen; they often state the rate of the strobing backlight.

You are reaching here. Nearly every non-Chinese-brand set, and I'd wager 98% of the ones sold in the last 2 years, has at least 1 HDMI 2.0 port and a native 60Hz panel.
 
A TV advertised at 120Hz can mean one of two things: the actual screen displays at 120Hz, or the backlight strobe is capable of 120Hz and they have not stated what the screen can do. A TV's rated Hz more often than not has nothing to do with the actual panel.
Fuck me, well it looks like most 4K TVs use real 60Hz panels, with the exception of Vizio and a lot of the house brands for anything under 60". Anything over 60" and you are basically guaranteed a bare minimum of a 60Hz panel, from the standpoint that nobody seems to make a panel slower than that in that size outside of China, and those ones stay in China.
 
You are reaching here. Nearly every non-Chinese-brand set, and I'd wager 98% of the ones sold in the last 2 years, has at least 1 HDMI 2.0 port and a native 60Hz panel.
Yep, you are correct and I am wrong; unless you are buying under 55" from Walmart, you are getting 60Hz bare minimum.
 
None of these received 'top-end' GPUs ;)

The X1800 XT-like GPU in the 360 and the 7800 GTX in the PS3 were top-end $400+ ($500+ inflation-adjusted) PC cards in 2005/6. The OG Xbox had a GeForce 3 in it, which was top dog at the time of its release. A far cry from the cut-down 78xx derivatives in the PS4 and XBOne.
 
At 360 launch, the first-party games (Kameo, for example) looked better than anything I'd ever seen in PC gaming.

I haven't seen a game on modern consoles that beats PC graphics.

Next gen will be good though. You can do an awful lot with a base hardware of 16 threads and 1080 Ti speeds!!! (Though it'll be 2-3 year old top-end PC speeds by the time it launches.)
 
The 360 I believe had a custom X1800 GPU (with the eDRAM). I believe it was faster than AMD's own offerings, but I'm not sure about the 7800 series, which barely beat the 360 to market that same year.

Not that it mattered much since the 8000 series launched the next year eclipsing anything earlier.

But that's rather irrelevant, as top end doesn't mean "best"; it means in the top range of products. And the PS3 had a custom chip based on the 7800: arguably top-end despite not being "best".

Damn straight the 8000 series blew both the 360 and PS3 out of the water on release a year later. It basically obsoleted the consoles for years, at least on graphics performance. G80 nearly doubled performance over the prior top-end GPUs of the time. I recall it being most apparent in Oblivion, where the console version had to be massively gimped in detail and AA on the 360 versus what a PC with an 8800 GTX could do. The PS3 version was gimped even more in textures (mostly due to the smaller available 256 MB of VRAM). I fondly recall looking down with disdain on console peasants from my 8800 SLI ivory tower.
 
None of these received 'top-end' GPUs ;)
The PS2 had a custom chip which was competitive with PC GPUs at the time of release.

The Xbox pre-dated the GeForce 4 and had a custom chip in it which, features-wise, was basically a tweaked GeForce 3 and outperformed the GeForce 3 due to an extra vertex unit.

The Xbox 360 was the first product to market with fully unified shaders, and it performs as well as or better than GeForce 7, which was top dog at the time. The Xbox 360 enjoyed this market position, as the only fully unified shader hardware, for about a year.

The RSX chip in the PS3 is based on the GeForce 7900 and performs similarly. At the time the PS3's specs were finalized, the 7900 was Nvidia's top dog. However, the GeForce 8 series released in the same month as the PS3 (around a year after the Xbox 360). The 8800 GTX is, of course, MUCH faster than a 7900, and has a fully unified shader architecture, thus ending the 360's one-year feature advantage.

The PS4/Xbone were really the first time the consoles took a modest and affordable approach to their hardware, the reason being a strict aim not to lose money on each hardware sale. That was a large issue with the PS360 generation.
 
At 360 launch, the first-party games (Kameo, for example) looked better than anything I'd ever seen in PC gaming.

I haven't seen a game on modern consoles that beats PC graphics.

IDK, RDR2 with 4K and HDR looks really fucking good on the Xbox with a Z9D.

But I totally agree, it is hard to find a game that looks better on the console compared to PC.
 
Yeah, the launch 360 games looked mostly on-par with the PC games at the time. Red Dead 2 looks great, but I still feel like the framerate hinders it. It looks like a PC game when it isn't moving (or in an interpolated video), but it's immediately an obvious console game when you see it in motion.

The Xbox boss keeps making references to this generation also focusing on high framerates, so I'm encouraged by that. He keeps mentioning 120 specifically. Hopefully they realize 90%+ of their base is going to be capped at 60fps for a while.
 
Solid 60fps is fine for me. I haven't made the ascension to 144hz, and don't plan on it. Read too many posters state that they can never go back to 60 after experiencing 144, and my wallet isn't ready for that jump. I'd rather live with my ignorance.
 
You know, people that keep crying about how weak consoles are, they are comparing them to a current mid-range PC. The PS4 is going on 6 years now. The consoles are hard to beat for the price when they are released. Sure, they don't compete against the tip-top PC specs of the time, which itself is a small market; it's just that the past few generations have been dragged out. The PS5 and the next Xbox are going to be a beast when they are released, but again, 5 years down the line the so-called PCMR people are going to look down on the console. I have a PS4, a Switch and a high-end PC, and guess which gets the most play time. It ain't the PC.
 
Solid 60fps is fine for me. I haven't made the ascension to 144hz, and don't plan on it. Read too many posters state that they can never go back to 60 after experiencing 144, and my wallet isn't ready for that jump. I'd rather live with my ignorance.

Honestly, 144 is beautiful, and it’s a “sacrifice” going back to 60 but to me it’s hardly a never can go back situation.

It did make me bitch at my console buddies that they don't play Destiny 2 on PC though. 144 on that and then going back to the PS4 to play with them, it felt so much slower, but not impossible to adjust to.
 
Honestly, 144 is beautiful, and it’s a “sacrifice” going back to 60 but to me it’s hardly a never can go back situation.

It did make me bitch at my console buddies that they don't play Destiny 2 on PC though. 144 on that and then going back to the PS4 to play with them, it felt so much slower, but not impossible to adjust to.

Stop trying to tempt me with your sweet words, Temptress.
 
Solid 60fps is fine for me. I haven't made the ascension to 144hz, and don't plan on it. Read too many posters state that they can never go back to 60 after experiencing 144, and my wallet isn't ready for that jump. I'd rather live with my ignorance.

I ain't going back. I was already on the PC for higher framerates and higher IQ with lower input lag all around (and no godsdamned controller for everything), and going to 120Hz+ along with VRR is a game-changer.

These consoles aren't going to hit 4k60 at full resolution, and certainly won't be hitting 4k120 with anything that isn't a cartoon. MS and Sony can keep them.
 
They stopped locking games' frame rates forever ago... unless they got back into it. Why on Earth would the Xbone support FreeSync if the frame rate was limited and synced with vsync? That would make zero sense. This hasn't been the case since, like, the PS1 and maybe a couple of early PS2 games.
Because they use dynamic resolution to meet 60 fps, it isn't running at native, and dynamic frame rates are better than image-quality drops, so it's better to have that option.
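
For illustration, dynamic resolution is usually just a feedback loop on GPU frame time; a toy sketch of the idea (my own, not any console SDK's actual API) looks something like this:

TARGET_MS = 1000.0 / 60.0   # 16.7 ms budget for 60 fps

def update_render_scale(scale, gpu_ms, lo=0.6, hi=1.0, step=0.02):
    """scale = fraction of native resolution per axis (1.0 = native, 0.75 = 1620p from 2160p)."""
    if gpu_ms > TARGET_MS * 0.95:      # running hot: shed pixels before missing the budget
        scale -= step
    elif gpu_ms < TARGET_MS * 0.80:    # comfortable headroom: claw resolution back
        scale += step
    return max(lo, min(hi, scale))

FreeSync/VRR then covers the frames where even the lowest scale can't hold 60, which is why the two complement each other.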
 
Solid 60fps is fine for me. I haven't made the ascension to 144hz, and don't plan on it. Read too many posters state that they can never go back to 60 after experiencing 144, and my wallet isn't ready for that jump. I'd rather live with my ignorance.
I went from 60Hz to 144Hz; it was good, but nothing game-changing, unless you play Pong, which can actually run at 144 fps. I went back to 75Hz from 144Hz and it took about 2 seconds to get used to it again. I choose IQ over high refresh rate any day.
 
I went from 60Hz to 144Hz; it was good, but nothing game-changing, unless you play Pong, which can actually run at 144 fps. I went back to 75Hz from 144Hz and it took about 2 seconds to get used to it again. I choose IQ over high refresh rate any day.

I choose both, including a large FreeSync range. It is incredible what can be done on a Vega 56 and RX 5700. Oh, and although the Pong reference was sarcasm, you do realize that most games from 2017 and earlier can be played at high fps, right?

Heck, I just checked out Batman Arkham Knight at 1440p 144hz Freesync and the fps were extremely high and smooth. :)
 
Maybe they will make a special adapter so that you can get 2 consoles and hook them up in SLI...
They did that with the PS2, which was perfect for Beowulf cluster projects; then with the PS3 it kinda failed because the stacks kept falling over.

not sure about PS4, but I'm sure people have tried
[Image: stacked PS4 consoles]
 