Can setting a lower resolution actually HURT performance???

euskalzabe

[H]ard|Gawd
Joined
May 9, 2009
Messages
1,478
Here's the situation:

- I have a Ryzen 3 3200U laptop with Vega 3 graphics (I know, bottom barrel, but I couldn't find any laptop that checked all my boxes, and I'm making do with this until the next round of products).
- The phenomenon in question happens exclusively with Destiny 2.
- If I set the resolution to 1280x720 and then 25% resolution scale (blockiness galore!), the game becomes a stuttery mess.
- If I set the resolution to 1600x900 and then 25% resolution scale, the game performs better and the stutter is gone.

I'm beyond puzzled; I've never seen anything like this, where giving a weak CPU/GPU combo more work to do results in better performance. Could it be that setting the resolution "too low" is somehow tripping something in Destiny's rendering that interferes with performance? It's the only thing I can think of. At 900p 25% I can get a mostly stable capped 30fps, so I have no "issue" to solve, but it'd be nice to get a stable capped 50fps at 720p 25% if I can fix that stutter.
 
The lower the resolution, the more work the CPU has to do compared to the GPU. You're tipping the performance scales the other way. You want to give the GPU most of the rendering work in games so the CPU can focus on what it does best. If you're struggling to increase your FPS, it may be the simple fact that the GPU is already at its limit.
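To put rough numbers on it, here's a toy model (the millisecond figures below are invented for illustration, not measured on a 3200U): a frame can't finish faster than the slower of the two sides, so shrinking the GPU's share of the work eventually leaves the CPU as the limit.

# Toy frame-time model: a frame can't complete faster than the slower
# of CPU prep work and GPU render work. Numbers are hypothetical.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 20.0                      # per-frame CPU cost: input, AI, physics, draw submission
print(fps(cpu_ms, gpu_ms=40.0))    # higher res: GPU-bound, 25.0 fps, paced by the GPU
print(fps(cpu_ms, gpu_ms=10.0))    # lower res: GPU idles, CPU-bound, capped at 50.0 fps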

How does it run at 1280x720 with 100% resolution scale?

You may want to try the settings in this video. Vega 3 running Destiny 2 at 1280x720 and 85% resolution scale. Still struggles a bit, but looks okay for the most part.

 
The lower the resolution, the more work the CPU has to do compared to the GPU. You're tipping the performance scales the other way. You want to give the GPU most of the rendering work in games so the CPU can focus on what it does best. [snip] How does it run at 1280x720 with 100% resolution scale?

Oh, that's really interesting; I never thought of it that way. May I ask why the CPU would be doing more work? If all you do is lower the resolution, shouldn't that alleviate work for both the CPU and the GPU? I don't see why asking the GPU to do less keeps the CPU busier, especially when all scaling is set to be done on the GPU. Any pointers would be really appreciated. I find this specific situation fascinating, as I've never encountered it before - then again, I've never bought a laptop this cheap, so I'm getting quite an education on making things work :)

I'll check 720p at 85% and 100% later today; I'm currently making bread dough! :)
 
May I ask why the CPU would be doing more work? If all you do is lower the resolution, shouldn't that alleviate work for both the CPU and the GPU? [snip]
The CPU can't keep up with how fast the GPU is generating frames. The CPU is a low-throughput, low-latency device, while the GPU is the exact opposite. The GPU renders and outputs a frame, but the CPU still has to send it the data for the next frame to be rendered. The CPU is primarily responsible for things like input capture, AI, and physics calculations, which are needed to properly update the game engine's state for rendering. What's happening is that the GPU ends up waiting for the CPU to send it more data, and that stall is what you see as stutter.
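Here's a minimal sketch of that stall (all timings invented): when the GPU is fast, every spike in the CPU's per-frame prep shows up directly as a long frame, which is exactly the uneven cadence you feel as stutter.

# CPU-bound stutter: the GPU finishes quickly and then sits waiting
# for the CPU to feed it the next frame's data. Timings are made up.
import random

random.seed(1)
gpu_ms = 8.0                                 # fast GPU at a very low resolution
frame_times = []
for _ in range(10):
    cpu_ms = 20.0 + random.uniform(0, 15)    # AI/physics/draw-call cost varies per frame
    frame_times.append(max(cpu_ms, gpu_ms))  # GPU stalls until the CPU catches up

print([round(t, 1) for t in frame_times])    # uneven 20-35 ms frames = visible stutter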
 
The CPU can't keep up with how fast the GPU is generating frames.

This is quite interesting. Of course, the higher the framerate, the more commands the CPU issues. I never thought that at such low quality levels, even going to 45 fps would strain this weak CPU more than it should. I set it to 720p, 50% resolution scale, with a 30 fps cap, and I'm now getting very stable performance - actually better than lowering the resolution percentage and running at 60 fps. However weak this CPU/GPU combo is, it's clear that the CPU is the weaker of the two overall.
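One way to see why the cap helps (hypothetical numbers, just the arithmetic): the frame cap sets the time budget the CPU gets per frame.

# Per-frame CPU budget at different frame caps.
for cap in (60, 45, 30):
    print(f"{cap} fps cap -> {1000.0 / cap:.1f} ms per frame for the CPU")
# 60 fps leaves 16.7 ms, 30 fps leaves 33.3 ms - double the headroom,
# so CPU work that misses the budget at 45-60 fps can hold a steady 30.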

I wish I had bought a Ryzen 5 3500U, but everything I found had either a garbage TN panel or a reflective finish IPS, and an Acer Aspire 5, at a lowly $300, was the only thing I could find with a matte 15" IPS panel. It's not ideal, but it'll serve as a barebones game-station during vacation and trips for a few months. Now I know to embrace 30fps on this machine to not tax the CPU too much (although, I'll admit, Doom runs at 50fps on custom res 1280x540 on mostly low settings and it's quite wonderful for such a feeble machine).
 
I set it to 720p, 50% resolution scale, with a 30 fps cap, and I'm now getting very stable performance - actually better than lowering the resolution percentage and running at 60 fps. [snip]
Image quality settings don't affect all of the data the CPU needs to process. Oftentimes the CPU is doing just as much work with graphics set to ultra low at 640x360 as it does at ultra high and 3840x2160, because most of what it's responsible for exists outside the rendering pipeline. Things that are directly related to the rendering pipeline include the number of NPCs active at once and things of that nature. So, for example, you can effectively decrease the load on the GPU by a factor of 36 while the load on the CPU decreases only by a factor of 2. Remember also that a GPU is handling thousands of hardware processing threads at once while a CPU is only handling 4 to 16, so you can see how the video card will quickly outrun the CPU as you give it less work. Just throwing out some numbers to give you a better idea.
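The 36 in that example is just the pixel-count ratio between those two resolutions (the factor of 2 for the CPU is an illustrative guess, as stated above):

# GPU shading work scales with pixel count; most CPU work doesn't.
hi = 3840 * 2160     # 8,294,400 pixels
lo = 640 * 360       #   230,400 pixels
print(hi / lo)       # 36.0 -> GPU pixel work drops ~36x between the two
# If the CPU's per-frame work only halves, the GPU outruns it long
# before you ever reach 640x360.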
 
It's funny how I'm so used to buying higher-performance parts, setting everything to high at 2560x1080 and forgetting about anything else, that for the first time I'm actually forced to squeeze performance out of the hardware, and it's led to unexpected discoveries. I'm very much enjoying this low-level escapade, frankly - it certainly beats buying a clunky, thick, 7-pound gaming laptop.

This has been very enlightening. I'm looking forward to finding the sweet spot in whatever games I play while traveling over the next 12 months. I feel like I'm back in the 90s with my first computer builds and the first pre-GPU video cards. Thanks for your help, Armenius!
 
I think you could also get a situation where your frame count is actually higher (assuming the CPU is up to the task) but the output is more sporadic, and hence stuttery, as you mentioned. The average frame rate will be higher, but all over the place once you factor in min/max, so it looks worse. I would definitely force VSync and the highest amount of buffering available (double/triple) in the renderer when using a lower resolution, as this would mitigate the appearance of lower performance. Outside of this case, exactly what everyone else said above.
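A quick illustration of that min/max point (frame times invented): two runs with the same average fps can feel completely different.

# Same average frame rate, very different pacing.
steady = [25.0] * 8            # constant 25 ms frames
spiky = [10.0, 40.0] * 4       # alternating fast/slow frames
for name, times in (("steady", steady), ("spiky", spiky)):
    avg_fps = 1000.0 * len(times) / sum(times)
    worst = 1000.0 / max(times)
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {worst:.0f} fps")
# Both average 40 fps, but the spiky run dips to 25 fps every other
# frame - high average, ugly min/max, looks worse.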
 
I think you could also get a situation where your frame count is actually higher (assuming the CPU is up to the task) but the output is more sporadic, and hence stuttery, as you mentioned. [snip]
This is a laptop, so I assume the refresh rate is 60 Hz. When the framerate is less than the refresh rate and frames are being delivered at an uneven rate, you get judder. You get tearing when the opposite happens. Stutter is purely a performance issue that no type of sync, variable or fixed, will solve.
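To illustrate the judder case (delivery times invented): with sync on a 60 Hz panel, a frame is shown on the next ~16.7 ms refresh tick after it's ready, so uneven delivery means some frames persist for two ticks while others get one.

# Uneven sub-60fps delivery on a 60 Hz display = judder.
refresh = 1000.0 / 60                           # ~16.7 ms per refresh tick
ready = [0.0, 30.0, 45.0, 80.0, 95.0]           # when each frame finishes (ms)
ticks = [int(t // refresh) + 1 for t in ready]  # tick each frame first appears on
print(ticks)                                    # [1, 2, 3, 5, 6]: frame 3 lingers two ticks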
 
This is a laptop, so I assume the refresh rate is 60 Hz. [snip]

True. I also missed the laptop thing for some reason.
 