4K = Temporary stagnation?

nakedhand

Will real time graphics development stagnate temporarily during the onset of a 4K mainstream gaming reality? #Skip to bottom for short version :)

We are entering a time when video cards will suddenly have to push four times as many pixels as in the days of 1080p. Does this mean that GPU power will not, to the same extent, be available to handle new GPU-cycle-dependent bells and whistles? And that we will therefore see a slowdown in new real-time technology implementations that require more GPU power, meaning fewer improvements to IQ beyond resolution?

Surely there are many new implementations that result in improved execution of already existing technologies (hurray). And I am aware of DX12, but not of the extent to which it is being adopted by devs yet.

Personally, I would hate for the resolution race to slow down progress toward more photoreal, real-time graphics, not only in terms of 3D capability but also in terms of better post-processing (take DoF, which so far looks horrific). Should I worry, or will these aspects of visual computing not affect each other as negatively as I fear, despite GPU power being finite?

Having to choose between the two, I would rather see a focus on improvements in real-time 3D (such as shaders and lighting) and post-process capability (DoF, AA, CC etc.) than an endless increase in resolution alone. I am aware that resolution is straightforward and realistic real-time shader technology or decent DoF is not :D


#BOTTOM
Should I be an optimist and think that we are getting all three below:

  1. 4x resolution
  2. Significantly lowered power draw while getting more GPU power
  3. General advancements in real time graphics resulting in improved visual fidelity
If yes, is number 3 going to progress slower because of 1?





PS: I know that many here already game at these resolutions and beyond; that is beside the point of the discussion.
 
I think the biggest 2 factors here are going to be:
1) A lot of games are developed with consoles in mind and those often can't even handle 1080p
2) 4k is a good bit pricier still

Pushing 4 times the pixels does not require 4 times the GPU, fortunately. It probably requires more like 2 times the GPU (including memory bandwidth).
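To illustrate why the cost is sub-linear, here is a rough toy sketch in Python (the 50/50 split at 1080p between resolution-independent work such as geometry, draw calls and simulation, and per-pixel work such as shading and fill, is purely an assumed number for illustration, not a measurement):

# Toy frame-cost model: total cost = resolution-independent work + per-pixel work.
# The 50/50 split at 1080p is an assumption for illustration only.
def relative_frame_cost(pixels, base_pixels=1920 * 1080, fixed_share=0.5):
    # Cost relative to a frame at the base resolution (1.0 = one 1080p frame).
    per_pixel_share = 1.0 - fixed_share
    return fixed_share + per_pixel_share * (pixels / base_pixels)

p1080 = 1920 * 1080   # 2,073,600 pixels
p2160 = 3840 * 2160   # 8,294,400 pixels, exactly 4x as many
print(p2160 / p1080)                # 4.0  -> four times the pixels...
print(relative_frame_cost(p2160))   # 2.5  -> ...but only ~2.5x the frame cost under this split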

It is certainly true that as the graphical assets of games increase in quality, more GPU will be required, even without an increase of resolution. This may indeed mean that you can play some games in 4k but others only in 2k/1080p.

I think GPU improvements are going to keep happening for a while, but unfortunately slower. Nvidia used to release 2 GPUs (or "rounds" of GPUs, anyway) a year. But as nodes get smaller, R&D gets harder and more expensive, hence why we've had 28nm GPUs for several generations now. Unfortunately it's only going to get worse as more and more people start finding integrated graphics to be good enough. This means Nvidia/AMD will need to do more research with less money.

So in short, improvements in GPUs and in graphical assets will keep coming, but don't expect progress to speed up anytime soon. Hopefully 20nm (or lower) will allow for some good improvements in GPUs next year, and it probably will, but they will probably stick on that node (be it 20nm, 18nm or 16nm) for several years again.
 

extent

1) A lot of games are developed with consoles in mind and those often can't even handle 1080p

That's the rub right there. Until 4K is mainstream for TVs, 1080p is the standard developers will design to.

Pushing 4 times the pixels does not require 4 times the GPU, fortunately. It probably requires more like 2 times the GPU (including memory bandwidth).

Yeah, it pretty much does. Try maintaining 60FPS in any recent game at 4K. It takes a beastly rig to do it.
 
But isn't 4x resolution an improvement in fidelity too?
 
But isn't 4x resolution an improvement in fidelity too?

I think they mean improvements in assets to match. Increasing only resolution doesn't typically give you much improvement without increases in asset resolution/detail. You'll get smoother lines and such, but that's about it, on its own.

Yeah, it pretty much does. Try maintaining 60FPS in any recent game at 4K. It takes a beastly rig to do it.

Gaming in 4k being tough doesn't mean it takes 4x the GPU. You've not made any valid point there.
 
I think they mean improvements in assets to match. Increasing only resolution doesn't typically give you much improvement without increases in asset resolution/detail. You'll get smoother lines and such, but that's about it, on its own.

Yet I've experienced the switch to 4K as such a leap forward; nothing I've seen since the jump from 320x200 (software rendering days) to Glide and 640x480 compares.
 
Maybe I wasn't clear enough in my original post, but I was wondering what the consequences are going to be (as described) once we are in a 4K mainstream gaming reality.


I think the biggest 2 factors here are going to be:
1) A lot of games are developed with consoles in mind and those often can't even handle 1080p
2) 4k is a good bit pricier still

Pushing 4 times the pixels does not require 4 times the GPU, fortunately. It probably requires more like 2 times the GPU (including memory bandwidth).

I think GPU improvements are going to keep happening for a while, but unfortunately slower. Nvidia used to release 2 GPUs (or "rounds" of GPUs, anyway) a year. But as nodes get smaller, R&D gets harder and more expensive, hence why we've had 28nm GPUs for several generations now. Unfortunately it's only going to get worse as more and more people start finding integrated graphics to be good enough. This means Nvidia/AMD will need to do more research with less money.

So in short, improvements in GPUs and in graphical assets will keep coming, but don't expect progress to speed up anytime soon. Hopefully 20nm (or lower) will allow for some good improvements in GPUs next year, and it probably will, but they will probably stick on that node (be it 20nm, 18nm or 16nm) for several years again.

Interesting input, however depressing. I am just hoping that the speed of progress is not slowed too much, and I really hope that we PC enthusiasts are not going to be completely neglected going forward. There sure are many trends pointing in that direction, consoles being the good old primary roadblock.

So in essence you think the mechanism I mention and fear becomes negligible and might not have much impact, solely because consoles will continue to dictate and dominate any and all progress in real-time 3D fidelity?

What about improvements in architecture and instruction sets at existing nanometer scales - would that not allow for some improvements?



Parja, I am sure no one would understand the meaning without you pointing that out. It was a great catch, man ;) ...I am Danish, English is not really my forte. I am sure you can find more in many of my posts here.

That's the rub right there. Until 4K is mainstream for TVs, 1080p is the standard developers will design to.

Consoles again, killing every hope in their wake of mediocrity. Yes, like I said in the original post, a 4K gaming reality means TVs are 4K. So at that point, will visual fidelity beyond resolution stagnate because of the enormous computational demands of 4K? Will there be room, so to speak, for progress in shaders, polycount, PP, CC etc. for a good while?


But isn't 4x resolution an improvement in fidelity too?

Sure, but we are talking about visual fidelity beyond resolution.

I think they mean improvements in assets to match. Increasing only resolution doesn't typically give you much improvement without increases in asset resolution/detail. You'll get smoother lines and such, but that's about it, on its own.

Exactly. I want higher polycounts and better implementation of tessellation, displacement mapping and shattering, beyond the improvements in shaders and lighting I already called for. Barrels are still not round; weapon models look good, but plain old barrels still have way too many edges. I don't care about barrels, it's just an example. Years ago we saw videos showcasing DX11 running destructible geometry on a per-vertex basis, not predefined volumes of geometry that shatter off. I think it was an alien being shot with an energy weapon. Adding geometry on a per-vertex basis would also be useful; I can't recall if that was possible.
 
I don't think visual improvements will stagnate. 4K gaming will not be considered by developers for at least a few years to come. They'll make games to run at 1080p, but those with a good enough PC will be able to experience the same improvements in 4K.

But it's also a question of how expensive it is to create content for games with better fidelity. That's why I'm looking forward to a game changer. The same old technology has been used for at least 20 years now; the only things changing are polygon count, texture resolution and shaders. Other things like tessellation can only be used in some very specific situations. So in my opinion we have been stagnating for decades. Something needs to happen.
 
So in essence you think the mechanism I mention and fear becomes negligible and might not have much impact, solely because consoles will continue to dictate and dominate any and all progress in real-time 3D fidelity?

Well, I don't think consoles are 100% dominating here, but they have a very large impact since a lot of developers work around the consoles and then port to PC. Why? Probably so they know their starting point is something consoles will be able to handle rather than having to potentially rewrite a lot more of the game just to get it to a point where the consoles can run it.

What about improvements in architecture and instruction sets at existing nanometer scales - would that not allow for some improvements?

Sure. That's what we've been getting. Nvidia's 900 series does cut down on power usage by a fair amount while delivering a little bit of a performance upgrade. But the improvements we're getting within process nodes, and as technology itself gets more advanced, are getting smaller. There is no Geforce 256, Geforce 2 GTS, Radeon 8500, Radeon 9800, Geforce 8800, etc. anymore. The increases in performance those products (and some others) delivered were great AND we were getting these new products on a more regular basis than what we get today.



Consoles again, killing every hope in their wake of mediocrity. Yes, like I said in the original post, a 4K gaming reality means TVs are 4K. So at that point, will visual fidelity beyond resolution stagnate because of the enormous computational demands of 4K? Will there be room, so to speak, for progress in shaders, polycount, PP, CC etc. for a good while?

I think all of these things are going to continue evolving, sure, and there will likely always be PC games that push the technology to its limits in this way. But, at the moment, consoles are where most people do their gaming, and most of these people don't seem to care about the technology or the resolution or anything. Most of my friends, including gamers, can barely (if at all) tell the difference between composite video and component (or even HDMI at a decent resolution). This is going to limit the number of people who take advantage of this, and will mean that most games will not focus on it too heavily. Or at least that's what I think.
 
4K is still a pipe dream for the uber-niche of PC gamers. Too expensive to even get started right now.

People do not even have mass adoption of 1080p, much less 4K. Hell, broadcasters only send out a 720p signal and STILL charge an HD premium fee.

4K will be stagnant for a LONG time.
 
The hardware just isn't there at the moment for 4K, not only on the graphics side but on the display side. Yes, there are 4K monitors, but how many of them aren't TN panels, support G-Sync, have 10+ bit color and support 60 Hz+ in both portrait and landscape?
 
Assuming a 4K mainstream reality (which won't happen for several years), you can probably expect some stagnation in terms of graphical fidelity. I personally don't consider that a problem — we're past the knee of the curve in terms of graphics. The same can be said for resolution, in most cases.

Occasionally there are these tipping points. It used to be that 640x480 was what everyone doing 3D acceleration was aiming for, and that quickly became 800x600, 1024x768 and so on. Within a fairly short span of time, we were at 1600x1200. You should expect 4K to be the last real frontier on resolution for a while, so once 4K becomes a mainstream notion and 4K performance targets are met, developers will get back to pushing more polygons and throwing more shaders at things.

4K is a good thing for gaming overall, in my opinion. When you can push more pixels into the same-sized display, a lot of quantization issues start becoming much less obvious. For the level of visual fidelity we have today, quantization limits end up actually being a pretty big problem.

Higher resolution means you don't need AA.
Greater pixel density means AA is less important, but you still want as much of it as you can get. And you want the good stuff, too: stuff that hits transparency aliasing, shader aliasing and you still want temporal filtering.

People do not even have mass adoption of 1080p, much less 4K. Hell, broadcasters only send out a 720p signal and STILL charge an HD premium fee. 4K will be stagnant for a LONG time.
Gaming is really going to drive the ship this time, not broadcast media. You have to base your outlook on what happens on the gaming side, not on the traditional media side. It's not hard to sell 4K to gamers, but selling it to Netflix subscribers is a difficult task.
 
The hardware just isn't there at the moment for 4K, not only on the graphics side but on the display side. Yes, there are 4K monitors, but how many of them aren't TN panels, support G-Sync, have 10+ bit color and support 60 Hz+ in both portrait and landscape?
Non-4K monitors that support G-Sync are too expensive anyway. I can't justify their price for a glorified V-Sync implementation, and if you aim for 60+ fps you don't need it anyway.
Frankly, I don't see the reason for all the hate for TN panels; it seems more like a fashion trend to me. I have always owned TN monitors and never had the slightest inclination to switch to anything else, especially for gaming's sake.

And the hardware is there; granted, you need at least two high-end GPUs, but it's definitely available.
 
Gaming is really going to drive the ship this time, not broadcast media. You have to base your outlook on what happens on the gaming side, not on the traditional media side. It's not hard to sell 4K to gamers, but selling it to Netflix subscribers is a difficult task.
Movie makers are starting to go 4K though, probably so that their stuff will still look good 10 years later. But yeah, for most consumers the extra cost is just not practical. Watch Gone Girl in 4K and the benefit over HD is not super noticeable, at least not in most theater setups.
 
...So in my opinion we have been stagnating for decades. Something needs to happen.

I agree to a certain degree. It has by and large been refinements of existing ideas and technologies, not a whole lot of rethinking of the way real-time graphics are constructed and displayed. The latter comes with a whole lot of caveats and restrictions that a commercial environment, rightfully so, has a very hard time accepting.

Well, I don't think consoles are 100% dominating here, but they have a very large impact since a lot of developers work around the consoles and then port to PC. Why? Probably so they know their starting point is something consoles will be able to handle rather than having to potentially rewrite a lot more of the game just to get it to a point where the consoles can run it.

Makes sense, thanks for explaining.

There is no Geforce 256, Geforce 2 GTS, Radeon 8500, Radeon 9800, Geforce 8800, etc. anymore. The increases in performance those products (and some others) delivered were great AND we were getting these new products on a more regular basis than what we get today.

You're listing part of my video card journey there! :D And good point, we were fairly spoiled by the performance increases of that era.

Most of my friends, including gamers, can barely (if at all) tell the difference between composite video and component (or even HDMI at a decent resolution). This is going to limit the number of people who take advantage of this, and will mean that most games will not focus on it too heavily. Or at least that's what I think.

It makes me weep on the inside. My biggest fear is that the masses at some point completely forget what quality is and hence end up reversing progress, bringing us to a point where things are worse than before. Look at the music industry and the film industry: masterpieces are farther between than ever before, IMHO.


The hardware just isn't there at the moment for 4K, not only on the graphics side but on the display side. Yes, there are 4K monitors, but how many of them aren't TN panels, support G-Sync, have 10+ bit color and support 60 Hz+ in both portrait and landscape?

Great point and also deeply depressing. The UN and WHO need to put a ban on TN panels naow! In all seriousness, I had not thought that my 2005 PVA panel would still be in use on my desk as a secondary monitor in 2014. I would have wagered my salary on the opposite scenario in 2008; glad I didn't :D.

Assuming a 4K mainstream reality (which won't happen for several years), you can probably expect some stagnation in terms of graphical fidelity. I personally don't consider that a problem — we're past the knee of the curve in terms of graphics. The same can be said for resolution, in most cases.

Several years you say, interesting. Thought it was more imminent. I keep forgetting how slow the masses are to adopt these technologies (why do they want a new phone every 6 months for selfies, but not better displays and videocards!).

I am not sure I agree that we are past the knee. Well, in one respect surely, but look at Crysis 3 maxed out; it's not really that impressive. Sure, we have come a long way, but compare real-time graphics with pre-rendered CGI for film from the last few years and it speaks volumes about the length of the journey that real-time is on and how we are still at the very dawn of it.

You should expect 4K to be the last real frontier on resolution for a while, so once 4K becomes a mainstream notion and 4K performance targets are met, developers will get back to pushing more polygons and throwing more shaders at things.

I am elated to hear this. 4K is comparable to 35mm film resolution, which is why filmmakers are so happy to finally get there. You prove my point here that developments in visual fidelity will pick up again once 4K performance targets are met. I am just slightly more displeased with any setbacks or stagnation along the way, since I feel that things are moving too slowly in the realm of photoreal, real-time graphics. I refuse to die before we hit a better approximation of reality with real-time technology - I will be rocking my "Oculus"-style implants at the home for the elderly, and fidelity (and my eyes) had better be good then!


4K is a good thing for gaming overall, in my opinion. When you can push more pixels into the same-sized display, a lot of quantization issues start becoming much less obvious. For the level of visual fidelity we have today, quantization limits end up actually being a pretty big problem.

Well yes, at the relevant time I certainly agree. I just think they are getting their priorities wrong by pushing 4K before pushing better shaders, polycounts, lighting etc. I am going to cherish 4K once it gets here, but I wish the road were different, leaving us with a wider scope of awesome real-time technologies implemented to enjoy at 4K, not just a 4K version of existing visuals (which I realize it won't be exclusively).


Greater pixel density means AA is less important, but you still want as much of it as you can get. And you want the good stuff, too: stuff that hits transparency aliasing, shader aliasing and you still want temporal filtering.

Indeed. We do want that, yes. 4K is no magic fix for shoddy transparency handling and bad filtering.

Gaming is really going to drive the ship this time, not broadcast media. You have to base your outlook on what happens on the gaming side, not on the traditional media side. It's not hard to sell 4K to gamers, but selling it to Netflix subscribers is a difficult task.

We need more PC gamers. :D

Frankly, I don't see the reason for all the hate for TN panels; it seems more like a fashion trend to me. I have always owned TN monitors and never had the slightest inclination to switch to anything else, especially for gaming's sake.

The vibrancy and depth of color simply isn't there. This is no big deal for some; surely we have different fascinations. Color is hugely critical to me: it adds immersion and mood in gaming and is obviously essential in digital arts processing.

Movie makers are starting to go 4K though, probably so that their stuff will still look good 10 years later. But yeah, for most consumers the extra cost is just not practical. Watch Gone Girl in 4K and the benefit over HD is not super noticeable, at least not in most theater setups.

I hope that was just a bad cinema. The difference should be immense. It should feel more like going from full HD to 35mm film.
 
The vibrancy and depth of color simply isn't there. This is no big deal for some; surely we have different fascinations. Color is hugely critical to me: it adds immersion and mood in gaming and is obviously essential in digital arts processing.

There are serious differences between monitors with TN panels. You can't just say TN = bad. My monitor has good colors, as good as any IPS I use. I don't notice the difference; maybe I would if they were placed right next to each other, but I stand up from my IPS monitor at work, go home and sit in front of my TN screen, and it doesn't feel inferior. In fact, if I didn't check the specs I wouldn't be able to tell the panel types apart just by looking at them.

I'm not suggesting no one could see the difference. But I bet many people want IPS panels just because of the all-out bashing of TN panels everywhere.

Of course, if you do work on it where precise color reproduction is a must, that's a whole different case. But I thought we were talking gaming here.
 
There are serious differences between monitors with TN panels. You can't just say TN = bad. My monitor has good colors, as good as any IPS I use. I don't notice the difference; maybe I would if they were placed right next to each other, but I stand up from my IPS monitor at work, go home and sit in front of my TN screen, and it doesn't feel inferior. In fact, if I didn't check the specs I wouldn't be able to tell the panel types apart just by looking at them.

I'm not suggesting no one could see the difference. But I bet many people want IPS panels just because of the all-out bashing of TN panels everywhere.

Of course, if you do work on it where precise color reproduction is a must, that's a whole different case. But I thought we were talking gaming here.

Fair enough, not all TN panels are the same. I am no expert on TN panels at all. I just noticed the difference between the ones I have seen (quite a lot and also recently as friends have picked up fast gaming monitors) and the PVA and various IPS that I own.

I am sure enough people want things, such as IPS, because others want them. I try to make my own informed decision, and with monitors I need to actually see them in action first (which is getting more and more difficult nowadays).

We certainly are talking gaming, indirectly, as real-time graphics mostly exist in gaming. I agree with your point; I just added to it and thought I had made that clear, sorry.
 
Gaming is really going to drive the ship this time, not broadcast media. You have to base your outlook on what happens on the gaming side, not on the traditional media side. It's not hard to sell 4K to gamers, but selling it to Netflix subscribers is a difficult task.

No it isn't. Price is going to drive the ship as always, as well as delivery of content.

If 4k remains associated with thousands in hardware it will remain a niche market that will be tough for any game studio to focus on. It is already hard to justify the extra development needed to satisfy the PC gamers now, and most of them are 1080P.

Again, how people think 4K is going anywhere when we still have a major portion of the market barely at HD levels, content providers who still consider HD a premium service worthy of extra charges, and hardware that costs a buttload, is beyond me.

Not to mention the infrastructure that the ISPs say can't handle current Netflix streaming (this is horseshit and we know it, but they still scream about it).
 
I think 4K is already penetrating the market much faster than HD ever did. Ten years ago no one was using HD in Europe apart from a few hobbyists, and the whole of the EU had ONE single HD channel, which showed a 15-minute demo loop most of the time.
 
No it isn't. Price is going to drive the ship as always, as well as delivery of content. Again, how people think 4K is going anywhere when we still have a major portion of the market barely at HD levels, content providers who still consider HD a premium service worthy of extra charges, and hardware that costs a buttload, is beyond me. Not to mention the infrastructure that the ISPs say can't handle current Netflix streaming.
Oops! You didn't read my post. Not sure there's anything I can do about that other than to suggest you read it again.
 
Well yes at the relevant time I certainly agree. I just think they are getting their priorities wrong by pushing 4K before pushing better shaders, polycounts, lighting etc.
I could go either way on it, really. I'm happy with the current level of visual fidelity — maybe another click of Moore's law would be nice — and generally happy with typical 21"-24" 1080p displays (at least in terms of resolution, refresh rates and general image quality). More than what we have in both areas today is essentially icing on the cake.

I do make that distinction between photorealism and more Pixar-level quality, and my thinking is that the former is not necessarily all that wonderful compared to the latter. We have good approximations of Pixar-level quality today, and I think that's a pretty reasonable place for graphics to rest for a while while displays (and, notably, game interaction and physics) do some catching up.
 
There are no 120 or 144 Hz 4K monitors; no one wants to play Counter-Strike on a 4K 30 Hz monitor.
 
I think the biggest 2 factors here are going to be:
1) A lot of games are developed with consoles in mind and those often can't even handle 1080p
2) 4k is a good bit pricier still

Pushing 4 times the pixels does not require 4 times the GPU, fortunately. It probably requires more like 2 times the GPU (including memory bandwidth).

It is certainly true that as the graphical assets of games increase in quality, more GPU will be required, even without an increase of resolution. This may indeed mean that you can play some games in 4k but others only in 2k/1080p.

I think GPU improvements are going to keep happening for a while, but unfortunately slower. Nvidia used to release 2 GPUs (or "rounds" of GPUs, anyway) a year. But as nodes get smaller, R&D gets harder and more expensive, hence why we've had 28nm GPUs for several generations now. Unfortunately it's only going to get worse as more and more people start finding integrated graphics to be good enough. This means Nvidia/AMD will need to do more research with less money.

So in short, improvements in GPUs and in graphical assets will keep coming, but don't expect progress to speed up anytime soon. Hopefully 20nm (or lower) will allow for some good improvements in GPUs next year, and it probably will, but they will probably stick on that node (be it 20nm, 18nm or 16nm) for several years again.

Consoles aren't really the problem you think they are.

The vast majority of desktops are on 1080p monitors with barely enough GPU power to play mid-range games there. The vast majority of game-capable laptops are 1080p as well. Resolutions higher than 1080p aren't really used for gaming. Those resolutions are for content creators... i.e. video, animation, CAD/CAM, photo, also known as "systems that are not used for gaming".

So even on computers, ultra-high resolutions are a very small minority. And of that small minority, the vast majority are used for professional applications on machines that will never play a video game, by people who do not play PC games. A tiny fraction of those displays are purchased by PC hobbyists who always buy the most expensive thing. But that's such a small portion of the market that it would be economic malpractice of the highest order to focus games on it; it's a pure waste of money, and anybody who does this should be fired on the spot for being an idiot.

3K, 4K, 5K will kick off when that's the bog-standard resolution on laptops, which make up the majority of the market, and when the bog-standard monitor that ships with the el-cheapo 400-buck Acer/Dell desktop combo deal has that resolution.

It will happen, it's just a long time out.
 
4K is in stagnation and it will be for a while. If you can't get 60 fps at 4K, it is usually not worth playing a game at that resolution. At high settings this means the most expensive video card in SLI, about a $1000 video card investment. At the same time, it is very bandwidth-consuming for cable companies to offer their channels in 4K or let people stream 4K content from Netflix, YouTube, Hulu etc., and our media storage, as far as Blu-rays, hard drives etc. go, is not big enough to handle proper 4K content.

Moral of the story: We did not need 4K, we needed 1080p and 1600p to get better, like 30-32 inch 1600p glossy 120 Hz monitors with deep blacks, not 30 Hz 4K TVs built in the basement of a guy in Taiwan with washed-out colors and crappy plastics.
 
Speak for yourself please. I have 4K and wouldn't settle for anything less after. And I couldn't care less about cable companies' nuisances. Solve it if you want my business.

But it was the same with HD a few years ago, even techie people said "DVD resolution is good enough, who needs HD, GTFO"

I say GTFO to those who want to settle for less.
 
Consoles aren't really the problem you think they are.

The vast majority of desktops are on 1080p monitors with barely enough GPU power to play mid-range games there. The vast majority of game-capable laptops are 1080p as well. Resolutions higher than 1080p aren't really used for gaming. Those resolutions are for content creators... i.e. video, animation, CAD/CAM, photo, also known as "systems that are not used for gaming".

So even on computers, ultra-high resolutions are a very small minority. And of that small minority, the vast majority are used for professional applications on machines that will never play a video game, by people who do not play PC games. A tiny fraction of those displays are purchased by PC hobbyists who always buy the most expensive thing. But that's such a small portion of the market that it would be economic malpractice of the highest order to focus games on it; it's a pure waste of money, and anybody who does this should be fired on the spot for being an idiot.

3K, 4K, 5K will kick off when that's the bog-standard resolution on laptops, which make up the majority of the market, and when the bog-standard monitor that ships with the el-cheapo 400-buck Acer/Dell desktop combo deal has that resolution.

It will happen, it's just a long time out.

For a long time, PC games pushed the hardware on the market at the time of their release. Those were exciting times for graphics hardware, as there were always games waiting to utilize a new generation of GPUs whenever one was released.

High-end hardware was a niche even a decade ago. That did not stop us from getting games that required this high-end hardware for their highest quality settings. However, once consoles became the main development platform for most video games, we stopped getting these games.
 
It looks like people are shifting the debate in this thread from real-time graphics stagnation towards display tech now. Please read the original post; though interesting, the display debate is not really relevant to the thread. There are plenty of display threads around :)

Also, we are not talking about the stagnation of 4K but of real-time graphics development.
 
The answer is, it depends entirely on the developer. Some will push for shinier models and textures, some for higher resolution. Trying to push both simultaneously will result in a compromise, no question.
As for lower power consumption, absolutely, or else PCs will have to start doubling as space heaters.
 
I could go either way on it, really. I'm happy with the current level of visual fidelity — maybe another click of Moore's law would be nice — and generally happy with typical 21"-24" 1080p displays (at least in terms of resolution, refresh rates and general image quality). More than what we have in both areas today is essentially icing on the cake.

I do make that distinction between photorealism and more Pixar-level quality, and my thinking is that the former is not necessarily all that wonderful compared to the latter. We have good approximations of Pixar-level quality today, and I think that's a pretty reasonable place for graphics to rest for a while while displays (and, notably, game interaction and physics) do some catching up.

I am of a completely different opinion here. Your icing is my default. I am seriously unhappy with the current level of visual fidelity in real time. Sure, it shows well on a monitor, but in terms of complexity I find the current level lacking, to say the least. I am shocked at the lack of progress every time I buy a new AAA title. I have been gaming since the 80s.

Your distinction is very important to keep in mind. And I agree that photorealism is not necessarily all that wonderful, since reality can look very unkind. Now pass those capabilities through the minds and hands of visionaries and everyone will be stunned. I can't wait for it to happen. Just because you have photoreal capability does not mean that you have to create the photoreal, but it does allow for far more complex and immersive imaginary worlds.

However, I think you are being way too generous when stating that we have good approximations of Pixar-level quality in real-time graphics these days. In my mind, current real-time and Pixar do not even occupy the same realm. Let me show you a few screen grabs from a 14-year-old Pixar movie (as we know, an eternity in CGI).

[attached screen grabs: 7516_2_large.jpg, 7516_6_large.jpg]
Sure we have an "approximation", but only with a very generous definition of the word approximation. My point is that 14 years down the line, we have still not caught up with this. Not even close. Sure many things in there are possible, and thank goodness for that. But we don't have the poly counts (incredibly important), we don't have the lighting (incredibly important), we don't have the post processing, DoF, AA etc, we don't have the fur in this case! Now take a screen grab from a recent Pixar movie and your approximation really begins to suffer. Keep in mind that Pixar has a "CGI look style" that purposefully looks backwards or cartoony.

It is fascinating to me that some people are so content with the current level of real-time graphics development, and somewhat unwilling to recognize the impact that the slew of restrictions and limitations has on the gaming experience, when in fact we are still at the very beginning of the real-time evolution, lacking the most basic technologies needed to properly convey the wide scope of expression and facsimile necessary to tell better stories, create deeper gaming interaction and fully immerse the player.

And I do recognize that for certain games visual fidelity is beside the point entirely; I see the value in Minecraft and old arcade games myself. But for the epic RPG adventures, sim games, FPS games etc., I think most people would like the environments and characters to be convincing, rich and complex.

I totally agree that we need improvements in game interaction and physics. More devs need to think about how to apply them better and create new game mechanics. Physics is a great thing in gaming. Now let's have dynamic geometry locally on a per-object basis; then we can have dynamic soft-body collisions later.

The answer is, it depends entirely on the developer. Some will push for shinier models and textures, some for higher resolution. Trying to push both simultaneously will result in a compromise, no question.

Thanks for your input. Yes, it has to result in a compromise, sadly.

I cherish the devs who decide to push the visual envelope beyond resolution. But they can't do this indefinitely or effectively without the industry adopting new standards and technologies. By the way, textures are there in terms of fidelity; admittedly the filtering of them is not entirely there yet, including transparencies.
 
It looks like people are shifting the debate in this thread from real-time graphics stagnation towards display tech now. Please read the original post; though interesting, the display debate is not really relevant to the thread. There are plenty of display threads around :)

Also, we are not talking about the stagnation of 4K but of real-time graphics development.

It's hard to do when in reality the 4K push is mostly fueled by greedy corporations that want to keep using the same crappy LCD factories they have used for years instead of investing in new technologies that would improve picture quality and motion. Same as the 3D fad before it.

And on the software side, you can bet that when consoles switch to 4K there will be a regression in asset quality, as it won't be possible to get good enough frame rates at 4K.
 
It's hard to do when in reality the 4K push is mostly fueled by greedy corporations that want to keep using the same crappy LCD factories they have used for years instead of investing in new technologies that would improve picture quality and motion. Same as the 3D fad before it.

And on the software side, you can bet that when consoles switch to 4K there will be a regression in asset quality, as it won't be possible to get good enough frame rates at 4K.

I don't think software will switch to 4K on consoles in the current generation. And on PC the choice is the user's, not the developer's.

I agree 3D was a farce, but 4K at least has real value to me.
 
There are no 120 or 144 Hz 4K monitors; no one wants to play Counter-Strike on a 4K 30 Hz monitor.

No one wants to play anything at 30 Hz; thankfully 4K monitors are 60 Hz.

Players who want 120 Hz or more are a very insignificant minority, as are those who want 4K, but fighting against 4K is pointless. You can still play at 120 Hz and any resolution even if 4K becomes commonplace.

I don't understand this hostility towards 4K; when 1024x768 became possible around the turn of the century, no one wanted to thwart it.
 
Speak for yourself please. I have 4K and wouldn't settle for anything less after. And I couldn't care less about cable companies' nuisances. Solve it if you want my business.

But it was the same with HD a few years ago, even techie people said "DVD resolution is good enough, who needs HD, GTFO"

I say GTFO to those who want to settle for less.

Says the gentleman who has spent thousands on his 4K TV and has had enough of re-watching the 4K demo content Blu-ray that came with it :)
 
And on the software side, you can bet that when consoles switch to 4K there will be a regression in asset quality, as it won't be possible to get good enough frame rates at 4K.

I sincerely hope you are wrong, but I don't think you are. It is of great concern for progression. We could really hit a plateau, and it could be long and excruciating. Given where we are with GPU process nodes and the expense of going smaller, as someone mentioned earlier, 4K is a scary proposition on consoles. I am sure that the regression in assets you fear is a likely consequence of this unfortunate combination (added resolution and not a whole lot of added GPU power). The masses might not even notice the regression or experience this plateau, but for enthusiasts it is going to be a long and dry number of years. Man, I hope we are wrong and that PC game devs demand otherwise.

Basically, I was just hoping for an intelligent progression in real-time technology, and it is beginning to look like very odd prioritization, in terms of roll-out order, by the industries involved. I would have liked to see an emergence of new real-time technologies that also led to new types of games, or at least new types of gameplay. It is not all about getting a prettier environment for me. Physics was great in terms of breaking down the static dimension, but we really haven't seen any great games relying mostly on physics for core mechanics; Portal is close, perhaps. That said, I always enjoy it when it's there. I know that physics is not directly related to fidelity, but I would argue that any added complexity, used correctly, adds to the sensed fidelity.

Holy pants I am ranting. All out of fear of dying before real time graphics is totally through the roof awesome :D
 
My point is that 14 years down the line, we have still not caught up with this. Not even close. Sure many things in there are possible, and thank goodness for that. But we don't have the poly counts (incredibly important), we don't have the lighting (incredibly important), we don't have the post processing, DoF, AA etc, we don't have the fur in this case!
It's kind of interesting that you chose Monsters, Inc. as an example. Pixar's RenderMan was only recently (for 2013's Monsters University) overhauled to support global illumination. We had games with global illumination two years prior. Obviously, RenderMan gets you much closer to the ideal Cornell box than Enlighten does, but that's not really the point I'm trying to make here. The point is that it's not cut and dried.

Keep in mind that back in 2001, it took Pixar 12 hours to render a single frame with Sulley in it. With a render farm. I have no idea how big that render farm was back then, but it was certainly big. Even with the absolute breakneck pace of GPU evolution, the fact that a single chip today (presumably) can't render the same frames in 16 milliseconds shouldn't be too surprising, and certainly shouldn't be upsetting. In what other industry is a 2,700,000x improvement in anything realized within 14 years?

(Just as an aside: for Monsters University, the aggregate frame time was 29 hours)

When in fact we are still at the very beginning of the real-time evolution, lacking the most basic technologies needed to properly convey the wide scope of expression and facsimile necessary to tell better stories, create deeper gaming interaction and fully immerse the player.
Rendering isn't really the long pole in the tent as far as those things are concerned.
 
This is the dawn of an interesting era, as media resolutions have finally exceeded typical PC graphics resolutions. Remember that the first PC monitors were actually TVs; then, when that wasn't enough, we rolled out special monitors for just that purpose.

Now, it's done an about-face. This is actually a good thing, though. Once this tech can be miniaturized into a 30" screen, we can finally use a TV for media production work (provided it meets a few more specific standards and such). What this means is fewer resolution modes and no more outrunning TV resolutions. The benefit from this is that video card makers can now focus on a few resolutions, and so can game designers.

Fewer options are good when it comes to writing games and drivers, as there is less chance for bugs. I suspect we'll win out in the long run, but it'll be kinda ugly at first as we play catch-up to run games at 4K resolution.

Until then, 1080 will hold me over.
 
Keep in mind that back in 2001, it took Pixar 12 hours to render a single frame with Sulley in it. With a render farm. I have no idea how big that render farm was back then, but it was certainly big. Even with the absolute breakneck pace of GPU evolution, the fact that a single chip today (presumably) can't render the same frames in 16 milliseconds shouldn't be too surprising, and certainly shouldn't be upsetting. In what other industry is a 2,700,000x improvement in anything realized within 14 years?

Did some digging. In 2001 they upgraded their farm to a cluster of 250 Sun boxes with eight UltraSPARC III processors each, at a 750 MHz clock. So a total of 2,000 CPUs. Each of those was capable of about 1.5 GFLOPS, for a combined total of 2000 * 1.5 = 3 TFLOPS.

A single Titan Z can push 2.7 TFLOPS in double precision. That's impressive.
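For the record, here is that arithmetic spelled out as a quick sketch (the farm size and per-CPU figure are the ones quoted above; the 16 ms frame time is just an assumed 60 fps budget):

# Back-of-the-envelope check of the numbers quoted in this thread.

# 2001 Pixar render farm, as described above
boxes = 250                # Sun servers
cpus_per_box = 8           # UltraSPARC III CPUs per box
gflops_per_cpu = 1.5       # peak GFLOPS per CPU at 750 MHz (figure quoted above)

farm_tflops = boxes * cpus_per_box * gflops_per_cpu / 1000
print(f"Farm peak: {farm_tflops:.1f} TFLOPS")        # ~3.0 TFLOPS

# Offline frame time vs. a real-time budget
offline_frame_ms = 12 * 60 * 60 * 1000   # 12 hours per frame, in milliseconds
realtime_frame_ms = 16                   # ~60 fps target (assumption)

speedup = offline_frame_ms / realtime_frame_ms
print(f"Speedup needed: {speedup:,.0f}x")            # 2,700,000x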
 
No one wants to play anything at 30 Hz; thankfully 4K monitors are 60 Hz.

Players who want 120 Hz or more are a very insignificant minority, as are those who want 4K, but fighting against 4K is pointless. You can still play at 120 Hz and any resolution even if 4K becomes commonplace.

I don't understand this hostility towards 4K; when 1024x768 became possible around the turn of the century, no one wanted to thwart it.


Players who want 120 Hz or more are an insignificant minority? Tell that to the people trying to keep ROG Swift monitors in stock. Tell that to all the people who own overclockable 1440p Korean monitors. Have you ever heard of competitive FPSes? What are you talking about?

And 4K? Hell, my phone is 1080p on a 5" screen. There are even 1440p PHONES. 1080p on a 23" screen needs a lot of AA to look marginally decent, at which point things look blurry to me. 1440p on a 27" screen still needs a little AA, but it's better. 4K is the logical next step in panel technology, and we need to keep the pressure on manufacturers. We need it, we want it. It's time for AA to be a thing of the past.
 
Players who want 120 Hz or more are an insignificant minority? Tell that to the people trying to keep ROG Swift monitors in stock. Tell that to all the people who own overclockable 1440p Korean monitors. Have you ever heard of competitive FPSes? What are you talking about?
And now you've just unfolded that minority into subgroups that are even smaller and overlap with each other. What's your point?
And 4K? Hell, my phone is 1080p on a 5" screen. There are even 1440p PHONES. 1080p on a 23" screen needs a lot of AA to look marginally decent, at which point things look blurry to me. 1440p on a 27" screen still needs a little AA, but it's better. 4K is the logical next step in panel technology, and we need to keep the pressure on manufacturers. We need it, we want it. It's time for AA to be a thing of the past.
Since I was arguing for 4K, not against it, I don't even get why you address this at me.
 