Will real-time graphics development stagnate temporarily as 4K becomes the mainstream gaming reality? #Skip to bottom for short version
We are entering a time when video cards will suddenly have to push four times as many pixels as in the 1080p days. Does this mean that GPU power will not, to the same extent, be available to handle new GPU-cycle-dependent bells and whistles? And that we are therefore going to see a slowdown in new real-time technology implementations that require more GPU power, meaning fewer improvements to IQ beyond resolution?
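Quick sanity check on the "four times as many pixels" figure, using the standard 1080p and 4K UHD dimensions:

```python
# Pixel counts of the two resolutions discussed above
w1080, h1080 = 1920, 1080   # 1080p ("Full HD")
w4k, h4k = 3840, 2160       # 4K UHD

ratio = (w4k * h4k) / (w1080 * h1080)
print(ratio)  # -> 4.0, i.e. exactly four times as many pixels
```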
Surely there are many new implementations that improve the execution of already existing technologies (hurray). And I am aware of DX12, but not of the extent to which it is being adopted by devs yet.
Personally, I would hate for the resolution race to slow the progress toward more photoreal, real-time graphics, not only in terms of 3D capability but also in terms of better post-processing (take DoF, which so far looks horrific). Should I worry, or will these aspects of visual computing not affect each other as negatively as I fear, despite the fact that GPU power is finite?
Having to choose between the two, I would rather see a focus on improvements in real-time 3D (such as shaders and lighting) and post-processing capability (DoF, AA, CC, etc.) than an endless increase in resolution alone. I am aware that resolution is straightforward, while realistic real-time shader technology or decent DoF is not.
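The trade-off I'm worried about can be sketched with a toy calculation: at a fixed per-frame GPU budget, quadrupling the pixel count quarters whatever is left per pixel for shading, DoF, and other effects. The budget number below is purely illustrative, not a measurement of any real card:

```python
# Hypothetical per-pixel budget at a fixed per-frame GPU throughput.
# 100 million "work units" per frame is an invented, illustrative figure.
def work_per_pixel(frame_budget, width, height):
    return frame_budget / (width * height)

budget = 100_000_000
at_1080p = work_per_pixel(budget, 1920, 1080)
at_4k = work_per_pixel(budget, 3840, 2160)
print(at_1080p / at_4k)  # -> 4.0: each 4K pixel gets a quarter of the budget
```

The same logic is why a resolution bump and new per-pixel effects compete directly for the same finite GPU power.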
#BOTTOM
Should I be an optimist and think that we are getting all three below:
- 4x the pixel count of 1080p
- Significantly lower power draw alongside more GPU power
- General advancements in real-time graphics, resulting in improved visual fidelity
PS: I know that many here already game at these resolutions and beyond; that is beside the point of this discussion.