Will PC Gaming Requirements Jump with New Console Release?

psy81

Gawd
Joined
Feb 18, 2011
Messages
605
With the new PS4 and Xbox 720 around the corner, do you think game requirements on the PC will jump significantly over the next couple of years? Knowing that developers are able to squeeze more out of console hardware, do you foresee this happening? If I remember correctly, when the PS3 and Xbox 360 dropped, Nvidia's GeForce 7800 GTX was top dog. For those of you who remember, did the 7800 become dated quickly when those systems launched? I'm just a bit concerned that my GTX 670s in SLI will become dated much more quickly as a result of the new consoles coming to market this year. Maybe I should've held off, but oh well, everything gets dated quickly.

Let me know what you think.
 
If you are scared of your hardware being outdated quickly, then you would practically be waiting forever. There is a thin line between buying hardware that is future-proof and waiting forever without buying anything at all! If you bought two 670s, your rough guess is that the setup will serve you well for some years to come, and I think that is what everybody should do. Electronics change at a very brisk pace, and at this rate nobody can keep up. You would have to be Bill Gates to always own the latest and greatest in tech! So what I am saying is: just make a rough guess, go for the best setup you can afford at the moment, and don't fret after buying.

About the console thing: I don't think consoles are ever going to come close to gaming rigs in the near future. The gap is immense. Console makers have no need to invest in rigorous R&D to create something special and powerful, because console gamers don't expect more than a modest 30 fps with reduced detail, so manufacturers will obviously try to cut manufacturing cost rather than raise it to come on par with PC hardware. So, to answer your question: a big NO to "Will consoles increase PC game requirements?"
 
I think dual 670s will be considered high end for several years, especially at 1080p. Certainly much more horsepower than any PS4/720 hardware will have!
 
I would be more worried about that CPU, as it can't even fully push your current GPUs in every game at 1080p.
 
Depends on what Sony and MS do. They might just give us a Wii-style console, bumping the resolution without any other change, or just run games at the PC equivalent of 1080p ultra... then you would be OK. If they go full-fledged, then we will all crawl, as has been the norm ever since 3D consoles have existed.
 
If you've got a 670/680/7950/7970 (very close in performance), you should be good for some time; if anything, getting two of these would future-proof you for a while.

And like someone said, if you want to be top of the line forever, you'll never get around to buying anything, because electronics move pretty fast.
That, I believe, goes for the latest i5, i7, and AMD FX-83xx series CPUs too.
 
If the new Xbox ends up having a true 8-core CPU, it's possible we could see an 8-core CPU as a requirement in a couple of years. The rumors aren't really showing anything that would call for new GPU requirements yet.
 
I'm aware that my CPU is not as fast as what Intel has to offer, but at least it is an 8-core CPU, so if the next-gen consoles are 8-core as rumoured, we'll probably see more PC games that utilize all cores. Most games currently run on 2 or 4 cores while the remaining cores idle. For $200, it's not a bad CPU. I'm planning an Intel build in 2014, but this is beside the point. The point of this thread is to discuss GPUs.

The question is: do GPUs become dated more quickly with the arrival of new consoles, as opposed to, say, the last four years when no new consoles came to market? Keep in mind that although the hardware in consoles will never be on par with the PC, game developers are able to squeeze more out of the hardware since they do not have to deal with the fragmentation.

I'm just curious, for those of you who had the 7800 GTX/8800 GTX: did you find that your GPUs weren't cutting it shortly afterwards? Is this the trend?
 
If the new Xbox ends up having a true 8-core CPU, it's possible we could see an 8-core CPU as a requirement in a couple of years.

rofl. no.

I'm aware that my CPU is not as fast as what Intel has to offer, but at least it is an 8-core CPU, so if the next-gen consoles are 8-core as rumoured, we'll probably see more PC games that utilize all cores.

I hope that the move to x86 and multiple cores will force developers into actually utilizing multiple cores, which will translate very, very nicely to desktops since you're running on the same platform. Your CPU is OK, not perfect, but multitudes more powerful than what will be in the new consoles.
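To make that hope concrete: the pattern being described is an engine splitting per-frame work evenly across every available core instead of pinning it to one or two threads. A toy sketch of the idea, with hypothetical function names (Python threads are shown only to illustrate the work-splitting pattern; real engines use C++ job systems, and CPU-bound Python would need processes to actually occupy multiple cores):

```python
# Toy illustration of spreading per-frame game work across all cores,
# the way an engine job system might, instead of using only 1-2 threads.
from concurrent.futures import ThreadPoolExecutor
import os

def simulate_chunk(entity_ids):
    """Stand-in for per-entity work (AI, physics, animation)."""
    return sum(i * i for i in entity_ids)  # dummy workload

def update_world(entities, workers=None):
    """Split entities into one chunk per core and process them in parallel."""
    workers = workers or os.cpu_count() or 4
    chunks = [entities[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(simulate_chunk, chunks))

if __name__ == "__main__":
    entities = list(range(10_000))
    # Same total as a single-threaded loop, but the work is divided
    # across however many cores the machine has.
    print(update_world(entities))
```

The point of the pattern is that the result is identical no matter how many cores are present; more cores just mean each chunk is smaller, which is exactly why a shared x86 platform makes this kind of scaling carry over from console to desktop.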

Secondly, the only reason those GPUs from 2004/2005 were outdated quickly is that the GPU inside the Xbox 360 was actually relevant, and pretty darn good by the standards of the day; if I'm not mistaken, it was considered high end at the time. Not sure if you have heard the rumors, but the GPU inside the new consoles is said to be around the same heft as an HD 6670, which is the low end of the GPU line from two generations ago. By the time the consoles actually come out, that GPU will be three generations old. So no, your hardware will by no means become irrelevant.

We will see a nice jump in graphics though, from consoles to the desktop space. I believe the most beneficial thing will be moving everyone to the same platform (x86 processors), which will let developers code for desktops MUCH more easily, and everything will scale very nicely.
 
PC graphics have gradually overhauled the consoles ever since the arrival of the 5870, so there is a sweet spot where less investment is required to provide similar visual quality on the PC.

That said, I don't expect it to happen until at least three years after the arrival of the 720/PS4.

So yes, certainly for the next three years console gaming will haul up visual quality in PC gaming rather than drag it down as at present, which will push up hardware requirements.
 
My speculation is that current GPUs will get dated quickly, much like the 7800/8800 series cards did. It wasn't until the GTX 200 series came out that the hardware caught up to the software. The transition from 4:3 to 16:9 displays during this period may also have driven up hardware requirements.
 
... I believe the most beneficial thing will be moving everyone to the same platform (x86 processors), which will let developers code for desktops MUCH more easily, and everything will scale very nicely.

Good point!
 
I'm aware that my CPU is not as fast as what Intel has to offer, but at least it is an 8-core CPU, so if the next-gen consoles are 8-core as rumoured, we'll probably see more PC games that utilize all cores. Most games currently run on 2 or 4 cores while the remaining cores idle. For $200, it's not a bad CPU. I'm planning an Intel build in 2014, but this is beside the point. The point of this thread is to discuss GPUs.

The question is: do GPUs become dated more quickly with the arrival of new consoles, as opposed to, say, the last four years when no new consoles came to market? Keep in mind that although the hardware in consoles will never be on par with the PC, game developers are able to squeeze more out of the hardware since they do not have to deal with the fragmentation.

I'm just curious, for those of you who had the 7800 GTX/8800 GTX: did you find that your GPUs weren't cutting it shortly afterwards? Is this the trend?

It honestly depends.

You have to think about how game developers have been treating console and PC games for the past several years, since the 360 and PS3 were released in 2005 and 2006.

The advantage the consoles had over the desktop PC was programming close to bare metal. Almost always, only a small OS kernel sits between the hardware and the game. On the PS3, for example, the game is separated from the hardware by a hypervisor (for security) and the OS. The 360 is modeled similarly to Windows on the PC but is obviously far more optimized: a small kernel, and a single hardware configuration that allows more focused drivers for each component of the console, and so on.

To put it in perspective: it was not until the first DirectX 10 games released around 2007 that the PC showcased graphics the 360 and PS3 had been doing since launch. That put PC games a good two years behind the next-gen consoles of the time. And those consoles were sporting a modified ATI GPU in the 360 (Xenos, closer to an X1900-class part than any off-the-shelf card) and a modified Nvidia 7800-series GPU in the PS3. The 360 used a highly modified DirectX 9 with features that wouldn't show up until DirectX 10 in Windows Vista, released in 2006, and the first DX10 games didn't arrive until a year after that. Crysis in 2007, as unoptimized as the game was and as unprepared as the hardware was at the time, showcased what was possible with DirectX 10. In other words, the 360 and PS3 were showing graphics on par with DX10 games released roughly two years after their respective launches. As for the PS3, I would not be surprised if its graphics API and console drivers/extensions added features to OpenGL 2.0 that weren't available until OpenGL 3.0 hardware released later on.

That's how optimized a game console is compared to a PC. You're talking about a single hardware configuration focused on doing one thing: playing games. And that's not even counting media services such as movie streaming and music playback.

It wasn't until better games were released on the PC with higher-resolution textures and higher polygon counts, and then DirectX 11 came about, that the consoles actually fell behind in graphics quality. So we're talking about sometime around 2010, about five years after the 360 and PS3 launched.

It's unfortunate, though, that many PC games today are console ports that don't take advantage of the full potential of a computer's graphics hardware. Only a handful of games on the market today or due this year really do push PC graphics hardware: games such as the upcoming BioShock title, Battlefield 3, Far Cry 3, Metro: Last Light, and a few others.

So, when we talk about the next-generation consoles from Sony and Microsoft, which may be released either at the end of this year or some time in 2014, we can apply the same logic to console and PC games again. Why? Just take a good hard look at game development between consoles and PCs, and at console ports in general. Then take what I said above. Put it all together and you'll have a very good idea of what the games will be.

The next-generation consoles will, again, use mid-range graphics hardware from the last six to seven years, if I assume research into a console successor began one year after the 360's and PS3's launch dates. And if AMD is the rumored GPU supplier in both Sony's and Microsoft's consoles, we're looking at a mid-range Radeon 4000-, 5000-, or 6000-series GPU.

Now we begin the process of elimination. Assuming that Microsoft will opt for DirectX 11 at minimum, we can eliminate the Radeon 4000 series altogether. That leaves the Radeon 5000 and 6000 series. For the PS4, it'll be OpenGL, and for that to have features on par with DX11, we're looking at OpenGL 4.0 minimum. That would mean OpenGL 4.2-capable hardware: the Radeon 5000 and 6000 series.

Next, we consider power usage and the case size of the next consoles. The 360 has been known to have overheating problems. The PS3 has had similar cases, but not at the same failure rate as the 360. Therefore, we eliminate the Radeon 5800 series and Radeon 6900 series. That leaves us with the Radeon 5700 series and Radeon 6800 series.

Since Microsoft is unlikely to repeat the same mistakes, the next GPUs will have to be on a 32 nm or 28 nm process. That should allow something like a Radeon 5770 or Radeon 6850/6870 to fit within the power profile of either console. And that would be the highest possible GPU, in my assumption.

If they are going to use an AMD APU of some kind, we're looking at no higher than a Radeon 6550 GPU (Llano) or Radeon 7660D (Trinity). These would be the lowest GPUs possible in either console.

At bare minimum, I assume these games will run at a constant 30 or 60 FPS, with 2x to 4x AA (MSAA or similar), at a maximum resolution of 1080p (1920x1080), and at the equivalent of a PC game's medium to medium-high settings. Textures and 3D data will need to fit within 1 GB to at most 2 GB of VRAM at any one time.
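Those targets translate into simple budget numbers. A quick back-of-the-envelope check (illustrative arithmetic only; the color buffer is a small fraction of real VRAM use, which is dominated by textures and geometry):

```python
# Rough frame-budget and framebuffer math for the 1080p targets above.

def frame_budget_ms(fps):
    """Milliseconds available to render one frame at a given frame rate."""
    return 1000.0 / fps

def framebuffer_mb(width, height, bytes_per_pixel=4, msaa=1):
    """Approximate color-buffer size in MiB for a given MSAA factor."""
    return width * height * bytes_per_pixel * msaa / (1024 ** 2)

print(f"30 FPS budget:   {frame_budget_ms(30):.1f} ms")            # 33.3 ms
print(f"60 FPS budget:   {frame_budget_ms(60):.1f} ms")            # 16.7 ms
print(f"1080p, no AA:    {framebuffer_mb(1920, 1080):.1f} MiB")    # 7.9 MiB
print(f"1080p, 4x MSAA:  {framebuffer_mb(1920, 1080, msaa=4):.1f} MiB")  # 31.6 MiB
```

So even with 4x MSAA, the 1080p color buffer alone is only a few percent of a 1 GB budget; the pressure on that 1-2 GB of VRAM comes almost entirely from textures and scene data.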

Given that, apply the same hardware optimization the PS3 and 360 enjoyed before them, and we're talking about next-gen console games on par with PC games that are currently DX10.1 or DX11. Give the developers time to optimize further, and next-gen console games will probably look better than PC games within two years, going by my two-year assumption from above.

Looking at it that way, the DirectX 11/OpenGL 4.2 graphics hardware we have today will be the bare minimum needed to keep up with the next-generation consoles. So we're talking Radeon 5000 or GeForce 400 series at the bare minimum. To stay on par with the consoles coming out, or slightly ahead of the curve, a Radeon 6000 or 7000 series, or GeForce 500 or 600 series, minimum.

Unfortunately, if the trend continues that PC games look no different from their console counterparts, I wouldn't expect PC games to actually match or exceed the next Xbox or PS4 in graphics quality until at least two years after release, again assuming the same two-year gap as between the PS3/360 launches and DirectX 10 games. So we're looking at graphics cards like the Radeon 8000/9000/10000 or GeForce 700/800/900 series, with DirectX 11.1 or DirectX 12 capability. And that's assuming PC game developers keep pushing game graphics on the PC beyond what's possible within the limitations of a console. For that to happen it'll take time, which is why I assume at least two years, going by the previous 360/PS3 launch and the DirectX 10 game releases.

So, if you have a Radeon 4000-series or GeForce 200-series card, upgrade to a generation or two above that: Radeon 5000 and higher, or GeForce 400 and higher. You will be fine until next-gen console games become more optimized and look better than their PC counterparts. And since most PC games will be nothing more than console ports once these next-gen consoles are released, you can keep that video card for a while. Once PC game developers take advantage of the more powerful hardware available to PC users, and of possible new DirectX or OpenGL APIs like DX11.1, DX12, or OpenGL 4.3 or 5 (?), then consider upgrading again. But you will have to wait for that to happen, and that's probably some time after these next-generation consoles are released: games and video cards released in 2015 at minimum, more likely 2016.
 
I imagine, given the current state of things, that console manufacturers will put a more pronounced emphasis on controlling hardware costs than in the previous generation. I would hope Sony especially knows that now would not be a good time to launch a monstrously overpowered console with a resulting $600 price tag. The Vita is their shining example that awesome hardware will not save you from glaring inadequacies such as a lack of 3rd-party support. On top of all this, they're undoubtedly going to have to work hard to offset the fact that all the games you buy will be worthless to anyone else.

There are three big reasons I think the newest generation of consoles will not grossly overtake the PC in graphics quality and overall power: 1. Seven years since the last generation means that even modest current-gen hardware will be a massive improvement. 2. Anything too expensive is going nowhere in the mainstream right now. 3. Experience with the current gen has likely shown console manufacturers that adequate hardware with good games, features, and support wins in the long run, while powerful hardware with high costs, high failure rates, and lacking features will hold you back considerably.

Still, I'm hugely looking forward to these new consoles coming out, for the sole reason that developers can finally lose the ball and chain holding back the PC versions of so many games.
 
There are three big reasons I think the newest generation of consoles will not grossly overtake the PC in graphics quality and overall power: 1. Seven years since the last generation means that even modest current-gen hardware will be a massive improvement. 2. Anything too expensive is going nowhere in the mainstream right now. 3. Experience with the current gen has likely shown console manufacturers that adequate hardware with good games, features, and support wins in the long run, while powerful hardware with high costs, high failure rates, and lacking features will hold you back considerably.
I believe you are 100% right on all 3 points. To add to point #3, a high failure rate is something that neither Sony nor MS can afford right now in the current economy, more so Sony. Sony has been repeatedly kicked in the balls on its balance sheets for the past few years, and they can't afford to subsidize $300 of the cost of each console sold like in the early days of the PS3.

I firmly believe that, if anything, the graphics capabilities of the next-gen consoles will be nothing more than what a $150 off-the-shelf graphics card can produce. The console makers order in massive quantities and can get the parts cheaper, but I'm willing to bet you could build a computer with similar specs within $100 of the console's launch price.
 
The fact that a mid-range graphics card from 2008 can still play everything relatively well at a high resolution leaves me very little reason for concern.
 
It is most likely both consoles are going to have an AMD 6670 GPU. People like to talk a lot about optimization, but please be serious: you can't take a 6670 and optimize 4-8x the performance out of it. Maybe under the worst conditions, i.e. no SLI support or horrible driver performance, and it's not very likely that the same company that was that lazy on the PC will be incredibly diligent about optimizing on the console.

I don't know about the Xbox, but the PS4 is looking to run 1080p at 60, which we are all already running or better. This generation of consoles looks like it is going to be about giving consumers even less hardware, IMO.
 
It is most likely both consoles are going to have an AMD 6670 GPU. People like to talk a lot about optimization, but please be serious: you can't take a 6670 and optimize 4-8x the performance out of it. Maybe under the worst conditions, i.e. no SLI support or horrible driver performance, and it's not very likely that the same company that was that lazy on the PC will be incredibly diligent about optimizing on the console.

I don't know about the Xbox, but the PS4 is looking to run 1080p at 60, which we are all already running or better. This generation of consoles looks like it is going to be about giving consumers even less hardware, IMO.

Fair enough. It sounds like both Microsoft and Sony are playing it safe this time around rather than going all out. And you're right, you can only squeeze so much juice out of the orange. Hardware specs are still all speculative. I was just looking at the Fox Engine, and the graphics look amazing. If that demo runs on the PS3, it's amazing what they're doing with ancient hardware!

From what I've read, based on litigation between former AMD employees and AMD, Microsoft is going with an 8-core APU, whereas Sony is going with a 4-core APU paired with discrete graphics. Not sure how reliable the source of the article is...
 
I imagine, given the current state of things, that console manufacturers will put a more pronounced emphasis on controlling hardware costs than in the previous generation. I would hope Sony especially knows that now would not be a good time to launch a monstrously overpowered console with a resulting $600 price tag. The Vita is their shining example that awesome hardware will not save you from glaring inadequacies such as a lack of 3rd-party support. On top of all this, they're undoubtedly going to have to work hard to offset the fact that all the games you buy will be worthless to anyone else.

There are three big reasons I think the newest generation of consoles will not grossly overtake the PC in graphics quality and overall power: 1. Seven years since the last generation means that even modest current-gen hardware will be a massive improvement. 2. Anything too expensive is going nowhere in the mainstream right now. 3. Experience with the current gen has likely shown console manufacturers that adequate hardware with good games, features, and support wins in the long run, while powerful hardware with high costs, high failure rates, and lacking features will hold you back considerably.

Still, I'm hugely looking forward to these new consoles coming out, for the sole reason that developers can finally lose the ball and chain holding back the PC versions of so many games.

I agree that consoles won't overtake PCs anytime soon, mainly because console design places a lot of emphasis on quiet operation and low power use. That is obviously contrary to what the best silicon can do; while such chips are great in PCs, they simply don't have a place in consoles.

In 2005, the Xbox 360 had a GPU *at release* that was roughly on par with then-current PCs, although it was surpassed within a few months by better PC GPUs. Now, that just isn't going to happen. Something like a 7970 or GTX 680 would add far too much cost, not to mention that its cooling and noise characteristics would not be feasible for a console-type device. The Xbox 360 GPU uses something akin to what a Voodoo5 5500 used for cooling, a super tiny ball-bearing fan. By contrast, a GTX 680 blower cooler just will not work in a console form factor. Even if Sony or Microsoft wanted to, they would not be able to use anything even close to the best silicon for graphics rendering.

Basically, PCs will easily retain the performance lead this time around. That said, I do expect developers of multi-platform games to take more advantage of high-resolution textures and DX11 features with the release of next-gen consoles, and that will have a trickle-down effect: PC titles will be overall higher quality than current games.
 
PC requirements will continue to climb gradually, as always. They've already surpassed the current consoles; new consoles would only be playing catch-up. I know my 9800GTX+ isn't as old as a PS3 or 360, but it still beats what a 360 or PS3 can churn out.
 
I think dual 670s will be considered high end for several years, especially at 1080p. Certainly much more horsepower than any PS4/720 hardware will have!

Dual 670s might have more raw computing power than a new console, but those dual 670s won't have thousands of developers poking and prodding and learning and utilizing their capabilities for two-thirds of a decade. The optimization just isn't there.

Most of the horsepower in a given PC goes to waste relative to the hardware utilization levels found in console games. It's pretty remarkable that a $300 box can remain a target platform for five years, yet it keeps happening.

That said, the real threat to high-end PC GPUs is the fact that said market is stagnating. Mobile GPUs are the big thing now.
 
It's possible they might climb a little more than usual. A lot of games are made multi-platform, so a game has to support both the latest and greatest PCs and the aging Xbox 360 and PS3. Once the old consoles are left behind, developers may make different design decisions, knowing they don't need to worry about running their game on old hardware.
 
There is no answer to your question, because none of us knows what the specs will be for the new Xbox and PS consoles. You will just have to wait like the rest of the world and find out.
 
Evidently the Xbox 720 will be using a Radeon 6670 and will be roughly 20% faster than the Wii U.

http://www.guru3d.com/news_story/xbox_720_will_get_gpu_based_on_radeon_6670.html

If that thing's going to be sharing titles with the PC another 7 years from now, they'd better go ahead and start signing deals with Big Fish Games so they can get something both platforms can handle.

I didn't expect them to shoot for the stars but damn.

Old, actually.

The latest reports have AMD in both the PS4 and the 720 (both processor and GPU), and say the GPU is going to be from the 8xxx series family. I have a feeling it's going to be based on a mobile 8xxx GPU to keep everything on one PCB, though. The PS4 has been confirmed to have 8 GB of DDR3 (at least in the prototype) and an 8-core AMD processor running at around 1.8 GHz.

Not sure how this is going to work in a machine that's supposed to be sub-$500.

Still: 'direct to metal' optimization for a GPU from the 8xxx series will probably be impressive.

EDIT: Revised my assessment. They could very well be putting an 8450 in there for all I know.
 
Sony and MS aren't going to put out a machine that's only a small jump; no one would buy it.

I'm expecting early PS4/next-Xbox titles to look as good as or better than the most demanding PC titles out now, though granted, the PC will still have the edge in terms of resolution. I'm betting they still target 720p.
 
The next consoles will be targeting 1080p @ 30 fps. The main reason is that UHD/4K TV is just around the corner (even if affordability is another matter). Playing games at 720p on a 4K TV will look like 480p looks on a 1080p set today, at least if you sit at the correct distance from the TV.
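The 480p-on-1080p analogy is roughly borne out by raw pixel counts. A quick check (simple arithmetic, nothing console-specific):

```python
# Pixel counts for common resolutions, to show how far 720p content
# has to be stretched on a 4K panel versus 480p content on 1080p.
resolutions = {
    "480p (854x480)":    854 * 480,
    "720p (1280x720)":   1280 * 720,
    "1080p (1920x1080)": 1920 * 1080,
    "4K (3840x2160)":    3840 * 2160,
}

scale_720_to_4k = resolutions["4K (3840x2160)"] / resolutions["720p (1280x720)"]
scale_480_to_1080 = resolutions["1080p (1920x1080)"] / resolutions["480p (854x480)"]

print(f"720p -> 4K upscale factor:    {scale_720_to_4k:.1f}x")    # 9.0x
print(f"480p -> 1080p upscale factor: {scale_480_to_1080:.1f}x")  # 5.1x
```

So a 720p image spread across a 4K panel is stretched over nine times as many pixels, even a bit worse than the familiar 480p-on-1080p case, which supports the comparison above.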
 
I doubt that the new generation of consoles is going to push anything like Crysis 2/Metro 2033-level graphics in the first place. I'm not that worried.
 
Given that the next generation consoles will be significantly weaker than current high-end PCs, I'm going to say "no."
 
Aren't both new consoles (PS4 and Xbox 720) going to use DX11? So I don't see how that will make PC requirements go up, as the PC has had DX11 for two years now. The only thing that will change is that almost every new game released will support DX11, with DX9 getting the boot.
 
Now that they've confirmed the specs, it looks like they are moving to x86, which hopefully translates to more games coming to the PC. Even if it's a straight port, I'd rather have the option of playing it on the PC than have no option. From the press conference, it looks like the next-gen consoles won't exactly be pushing our GTX 600 series. I'm sort of relieved, but sort of bummed that the consoles won't be pushing PC gaming graphics, which is consistent with what the guy at Crytek was alluding to before the press conference. Watch Dogs looked like it was struggling on the PS4...
 
Your old-ass PC from 3 years ago will pwn the PS4, so who cares?
 
Your old-ass PC from 3 years ago will pwn the PS4, so who cares?

When the PS3 came out, it was a high-end PC. Sadly, not this time around... I care because consoles often influence the graphical improvements in games (console ports). I recognize that games like Crysis will push the PC regardless, but that is not the norm...
 
When the PS3 came out, it was a high-end PC. Sadly, not this time around... I care because consoles often influence the graphical improvements in games (console ports). I recognize that games like Crysis will push the PC regardless, but that is not the norm...

I wouldn't call it a high-end PC; more of an upper-mid-range one. Still, because it was sold as a loss leader, it was an acceptable piece of gaming hardware.

Now, new consoles are always good for getting better-quality PC games. Even if the current bleeding-edge PCs are 1000% faster, I'd rather be playing the majority of ports built on 2010 technology than on 2005 technology.

But the one thing that scares me is the whole motion-control integration. AAA developers write for accepted hardware, not homebrewed drivers.

I do think the safe route Sony and Microsoft are taking makes sense, though. Video games have been declining in popularity for the past couple of years, and with no real breakthrough technology (4K still isn't here for the masses yet), it's probably better not to go all out.
 
The PS4 will have an Nvidia Titan: low noise and a small form factor will power any game for the next 6 years.
 
Actually, the PS4 has a custom Radeon chip, slightly faster than a 7850 (1.84 TFLOPS, versus 1.76 TFLOPS for a 7850, for comparison).

The PS4 will actually be pretty badass considering the GPU it'll be using. I was expecting something along the lines of a budget card (like an HD 77xx or GT 650), but a 7850+ is quite nice for a console.
 