First, go watch Linus Tech Tips' video on testing a 10320x1440 surround rig using 3x R9 290x.
Watched it?
Now, 10320x1440 is 14,860,800 pixels. 4K (UHD, really) - 3840x2160 - is 8,294,400 pixels. So, that surround load is nearly 2x the raw pixel output load of 4K.
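If you want to sanity-check the arithmetic yourself, here's a quick back-of-envelope sketch (just the raw multiplication, nothing from the video itself):

```python
# Rough check of the pixel counts quoted above.
surround = 10320 * 1440           # LTT's triple-screen surround resolution
uhd      = 3840 * 2160            # "4K" UHD

print(f"surround: {surround:,} pixels")    # 14,860,800
print(f"UHD:      {uhd:,} pixels")         #  8,294,400
print(f"ratio:    {surround / uhd:.2f}x")  #  ~1.79x, i.e. nearly double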
Linus specifically chose the R9 290X due to the prevailing wisdom that 4K required a lot of RAM; they were the fastest cards available with 8GB of RAM per card. In testing, only Shadow of Mordor with its ultra textures exceeded 4GB of GPU RAM.
With only one case of high RAM usage (and that using abnormally large textures, and even then only exceeding 4GB by a small amount) at an output resolution nearly double that of 4K, it can safely be said that 4K does not "require" more than 4GB of GPU RAM. Certainly not for current games, and likely not for future games unless very large texture resolutions become both common and worthwhile (which would require further testing with Shadow of Mordor at very high resolutions under different RAM constraints).
Further testing would be needed even to evaluate what performance penalty, if any, that single case of Shadow of Mordor exceeding 4GB actually incurs, given the game was already constrained by raw GPU power at that resolution.
But, I hear you cry, consoles use 8GB!
Yes, but consoles also use that RAM pool for everything, and need to share not only RAM capacity between CPU and GPU but also RAM access bandwidth. A PC not only has at least another 8GB of system RAM to work with for non-GPU-specific tasks; the PCI-E bus can also pass data from system RAM into GPU RAM far faster than a console can load data into its RAM from HDD or disc, alleviating that potential bottleneck.
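For a rough sense of scale (the figures below are assumed ballpark numbers, not measurements from the video): PCIe 3.0 x16 tops out around 16 GB/s, while a typical HDD manages on the order of 100-150 MB/s sequential.

```python
# Order-of-magnitude comparison of the two asset-streaming paths.
# Figures are rough assumptions, not benchmarks.
pcie3_x16_gbs = 15.75   # PCIe 3.0 x16 theoretical throughput, GB/s
hdd_gbs       = 0.12    # ~120 MB/s sequential read from a typical HDD, GB/s

print(f"PCIe 3.0 x16 vs HDD: roughly {pcie3_x16_gbs / hdd_gbs:.0f}x faster")
# ~130x: a PC can re-stream assets from system RAM into VRAM far faster
# than a console can pull fresh data off its drive into shared memory.
```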
tl;dr 4GB of RAM per GPU is sufficient for 4K in current games. It is unlikely that more than 4GB will be necessary for the near future at the very least.
If somebody has a test that shows a noticeable improvement going from 4GB to 8GB at 4K (or any other resolution) I'd love to see it. A direct comparison of an R9 290X 4GB and an R9 290X 8GB would be perfect to eliminate other variables.