
How exactly do you propose to implement higher resolution textures without impacting GPU load?

That's more data to move across the PCIe bus...
Longer loading times and/or more stuttering as you wait for that extra data to transfer from system RAM...
Larger files for the GPU to decompress (more waiting, more GPU load)...
More raw data clogging up the memory bus on the card itself...
More pixels the GPU has to crunch when those textures make up part of a surface shader (most surfaces in most modern games are this)...
More pixels to scale and transform...
More pixels that Anisotropic Filtering has to chew on (if you up all textures from 4k to 16k you WILL notice AF taking a massive bite out of performance)

More-more-more. You can't simply install obscenely-high-resolution textures, fill up 6GB of video RAM with them, and expect performance to stay the same...
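
To put rough numbers behind that list, here's a quick back-of-the-envelope sketch (plain Python; the ~8 GB/s effective PCIe figure, the 4-bytes-per-texel uncompressed and 1-byte-per-texel block-compressed formats, and the ~1.33x mip overhead are all my own assumptions, purely for illustration):

```python
# Back-of-the-envelope: VRAM footprint and PCIe transfer time for square textures.
# Assumptions (not measurements): 4 bytes/texel uncompressed RGBA8, ~1 byte/texel
# for a BC7/DXT-style compressed format, ~1.33x extra for a full mip chain, and
# ~8 GB/s of effective host-to-GPU bandwidth. Figures are illustrative only.

MIP_OVERHEAD = 4 / 3          # a full mip chain adds roughly 1/3 on top of the base level
PCIE_GBPS = 8.0               # assumed effective PCIe bandwidth, GB/s

def footprint_mb(size, bytes_per_texel):
    """Approximate size in MB of one square texture including its mip chain."""
    return size * size * bytes_per_texel * MIP_OVERHEAD / 2**20

for size in (2048, 4096, 8192, 16384):
    raw = footprint_mb(size, 4)       # uncompressed RGBA8
    bc = footprint_mb(size, 1)        # block-compressed (e.g. BC7)
    upload_ms = bc / 1024 / PCIE_GBPS * 1000
    print(f"{size:>5}x{size:<5}  raw {raw:8.1f} MB   compressed {bc:7.1f} MB   "
          f"~{upload_ms:5.1f} ms to upload one compressed texture")
```

Even block-compressed, a single 16k texture with its mip chain runs to a few hundred MB, and pushing just one of them across the bus costs tens of milliseconds. Multiply that by a scene's worth of materials and the "more-more-more" adds up fast.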

Did you not hear about the benefits of DirectX 11.2?

http://www.youtube.com/watch?v=-I5TEpAnuEg

Please stop worrying about GPU memory. Unless MS is lying about how this works (or its performance sucks), this is going to make the GPU memory argument moot.

Edit: more technical video here - http://www.dsogaming.com/news/directx-11-2-revealed-available-only-on-windows-8-1-xbox-one/
 
Did you not hear about the benefits of DirectX 11.2?
Did you bother reading my post?

The additions made in DX 11.2 do not nullify everything in my list... not even close.

Tiled resources will come in handy, no doubt... but they're not a cure-all.
 
Did you bother reading my post?

The additions made in DX 11.2 do not nullify everything in my list... not even close.

Tiled resources will come in handy, no doubt... but they're not a cure-all.

Did you watch the extended video? They designed this from the ground up to use up to 16k x 16k textures (the Graphine demo has 64k x 64k textures) and much higher quality shadow maps with less than 1% of the current resource cost with no pop-in. The Mars demo uses 3GB of textures but needs only 16MB of GPU memory at 1080p. It will be a cure-all for a long period of time if it works as advertised.
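
For what it's worth, that 16MB figure is about what the arithmetic says it should be: with tiled resources the resident working set is driven by how many pixels are on screen, not by how big the source textures are. A rough sketch (the 64KB tile size matches the D3D11.2 tiled-resources spec; the bytes-per-texel and the 2x fudge factor are my own assumptions):

```python
# Rough working-set estimate for virtual texturing / tiled resources at 1080p.
# Assumptions: ~1 texel sampled per screen pixel at the correct mip level,
# 4 bytes/texel, and a ~2x fudge factor covering tile borders, partially-used
# 64 KB tiles, and coarser mip levels kept resident. Illustrative only.

width, height = 1920, 1080
bytes_per_texel = 4
fudge = 2.0                                   # borders / partial tiles / extra mips

ideal = width * height * bytes_per_texel      # texels you actually shade this frame
working_set_mb = ideal * fudge / 2**20
print(f"~{working_set_mb:.0f} MB resident, regardless of total texture data")
```

Which is why the resident footprint barely changes whether the source data is 3GB or 30GB.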
 
Did you watch the extended video? They designed this from the ground up to use up to 16k x 16k textures (the Graphine demo has 64k x 64k textures) and much higher quality shadow maps with less than 1% of the current resource cost with no pop-in.
Yes, which is great in that one specific circumstance. I've used tiled resources before, and I know that there are some big caveats that Microsoft has failed to address in those videos.

Using tiled resources, especially when video RAM is limited, leads to huge increases in bus bandwidth utilization. All that data has to make it to the graphics card somehow in order for you to view it, after all. And if the graphics card doesn't have the space to cache those resources, then they might be making the trip multiple times.

Rather than keeping all 3GB resident, they're swapping the relevant portions in and out of video RAM as necessary. This lowers video RAM usage but VERY quickly moves the bottleneck to overall bandwidth.
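
To make that concrete, here's a rough sketch of the sustained streaming it takes just to keep the resident tile set current as the view changes (64KB tiles per the D3D11.2 spec; the tiles-per-frame churn rates are made-up illustrative numbers, not measurements):

```python
# Rough streaming-bandwidth estimate for tiled resources when the view changes.
# Assumptions: 64 KB tiles (the D3D11.2 tile size), 60 fps, and illustrative churn
# rates of tiles that become newly visible each frame. Real numbers depend on the
# scene, camera speed, and how much the card can keep cached.

TILE_BYTES = 64 * 1024
FPS = 60

for tiles_per_frame in (50, 200, 1000):
    mb_per_s = tiles_per_frame * TILE_BYTES * FPS / 2**20
    print(f"{tiles_per_frame:>5} new tiles/frame -> ~{mb_per_s:7.1f} MB/s of sustained streaming")
```

Gentle movement is cheap, but a fast pan or a scene change that invalidates most of the working set shifts the bottleneck squarely onto the bus and whatever is feeding it from disk.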

The Mars demo uses 3GB of textures but needs only 16MB of GPU memory at 1080p. It will be a cure-all for a long period of time if it works as advertised.
Except that's hardly the whole story, as I've explained above. There is no such thing as a free lunch...
 
How exactly do you propose to implement higher resolution textures without impacting GPU load?

That's more data to move across the PCIe bus...
Longer loading times and/or more stuttering as you wait for that extra data to transfer from system RAM...
Larger files for the GPU to decompress (more waiting, more GPU load)...
More raw data clogging up the memory bus on the card itself...
More pixels the GPU has to crunch when those textures make up part of a surface shader (most surfaces in most modern games are this)...
More pixels to scale and transform...
More pixels that Anisotropic Filtering has to chew on (if you up all textures from 4k to 16k you WILL notice AF taking a massive bite out of performance)

More-more-more. You can't simply install obscenely-high-resolution textures, fill up 6GB of video RAM with them, and expect performance to stay the same...
I hate to break it to you, but going with higher-resolution textures has basically zero impact on performance if you have enough VRAM, just like I said. If you doubt me, take Crysis 3 and try it on Low, Medium, High, or Very High textures and you will see the performance is basically exactly the same, not even changing a full frame per second from Low to Very High.
 
I hate to break it to you, but going with higher-resolution textures has basically zero impact on performance if you have enough VRAM, just like I said.
Not true, and you even quoted why...

As for Crysis 3, not the greatest example. The quality difference between "Low" and "Very High" isn't actually that big to start with. Is that FPS figure taken while standing still, or is it the average of a time-demo? You also haven't taken loading times into account at all. Is resource-streaming compensating and reducing the quality of distant textures while the rest of the system catches up?

Also, I was talking about textures on the order of 16k resolution. The majority of the textures in Crysis 3 (at Very High quality) are still only 2k resolution, a whole 64 times smaller in pixel count than the scenario I had laid out. Using Crysis 3 as an example is orders of magnitude off the mark.
 
Not true, and you even quoted why...

As for Crysis 3, not the greatest example. The quality difference between "Low" and "Very High" isn't actually that big to start with. Is that FPS figure taken while standing still, or is it the average of a time-demo? You also haven't taken loading times into account at all. Is resource-streaming compensating and reducing the quality of distant textures while the rest of the system catches up?
I spent hours testing that and there was no difference at all in performance. There have been other games too where increasing the texture resolution had zero impact on performance. Please show me a game where it impacts performance.
 
I spent hours testing that and there was no difference at all in performance. There have been other games too where increasing the texture resolution had zero impact on performance. Please show me a game where it impacts performance.
See my edit above, Crysis 3's "very high" texture quality is far below the bar of what's being discussed.

And I had already mentioned a game, earlier in this very thread, where texture resolution alone can easily sap enough performance that the game is below playable framerates long before you run out of video RAM: Skyrim.

Granted, Skyrim doesn't come with textures high-res enough to be worth discussing either, but there are mods that add a good number of 8k textures to the game. These can be a MASSIVE performance sucker, especially if an object or surface is in-view that overlays multiple such textures and does shader-based processing on the obscenely high-resolution texture map.

I assure you, doing parallax mapping and/or displacement mapping requires GPU horsepower, and calculating at single-pixel precision becomes increasingly more GPU-intensive as texture resolution increases. And these aren't the only processes that become more intense as you increase texture resolution.
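
As a rough illustration of how many texture fetches a single shaded pixel can rack up with parallax occlusion mapping plus anisotropic filtering (the step counts, tap counts, and map counts below are typical ballpark values I've picked for the sketch, not measurements from Skyrim or any particular mod):

```python
# Ballpark texture-fetch count per shaded pixel for parallax occlusion mapping.
# Assumptions: 16-32 ray-march steps sampling the heightmap, up to 16x anisotropic
# taps on the final colour/normal/specular fetches, and 2 extra maps beyond the
# diffuse. None of these numbers come from Skyrim or a specific mod.

def fetches_per_pixel(pom_steps, aniso_taps, extra_maps=2):
    height_samples = pom_steps                 # one heightmap tap per march step
    surface_samples = (1 + extra_maps) * aniso_taps
    return height_samples + surface_samples

for steps, taps in ((16, 4), (32, 8), (32, 16)):
    print(f"{steps} POM steps, {taps}x AF -> ~{fetches_per_pixel(steps, taps)} fetches/pixel")
```

Every one of those fetches now lands in a texture that's 16 to 64 times larger in texel count, which wrecks cache hit rates and drives up memory traffic, and that's before the displacement and filtering math itself.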
 