Question on FPS vs Multiple Displays

1Wolf
I've got kind of a weird question here, and I wasn't even sure where to post it: whether it's a GPU question, a display question, or an audio question. So I'll try here, and hopefully someone smarter than me (which is probably most of you anyway) will know the answer. If not, maybe I've got it in the wrong subforum.

Ok, here goes....

My primary display is a 38" ultra-wide and I'm running Display Port out of my 3090 to that display.

For audio I run a 5.1 setup: HDMI out of the GPU into my Denon receiver, and then from the receiver to a 27" 1080p sidecar monitor. Regardless of whether I even have a 2nd monitor attached, Windows sees the Denon receiver as a display and sends a display signal down the HDMI cable.
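(Side note for anyone curious: you can see exactly what Windows thinks is attached with a few lines of Python. This is just a sketch using the standard library's ctypes bindings to the Win32 EnumDisplayDevices call; the inner loop returns the monitor-level name, which is where a receiver like the Denon typically shows up.)

Code:
# Sketch (Windows, Python standard library only): list what Windows
# sees as attached displays. The inner EnumDisplayDevicesW call walks
# the monitor level, which is where an AVR usually shows its name.
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [("cb", wintypes.DWORD),
                ("DeviceName", ctypes.c_wchar * 32),
                ("DeviceString", ctypes.c_wchar * 128),
                ("StateFlags", wintypes.DWORD),
                ("DeviceID", ctypes.c_wchar * 128),
                ("DeviceKey", ctypes.c_wchar * 128)]

user32 = ctypes.windll.user32
adapter = DISPLAY_DEVICEW()
adapter.cb = ctypes.sizeof(DISPLAY_DEVICEW)
i = 0
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(adapter), 0):
    monitor = DISPLAY_DEVICEW()
    monitor.cb = ctypes.sizeof(DISPLAY_DEVICEW)
    j = 0
    while user32.EnumDisplayDevicesW(adapter.DeviceName, j,
                                     ctypes.byref(monitor), 0):
        print(adapter.DeviceName, "->", monitor.DeviceString)
        j += 1
        monitor.cb = ctypes.sizeof(DISPLAY_DEVICEW)
    i += 1
    adapter.cb = ctypes.sizeof(DISPLAY_DEVICEW)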

So my question is...

If you are running games, will there be any performance difference in FPS depending on what kind of monitor is hooked up as the sidecar in the example above? If I upgrade that sidecar from a 27" 1080p to a 27" 1440p, will it rob any of my gaming horsepower? Or conversely, if I simply unplug the HDMI cable from the 1080p monitor that's there now, will it improve performance? Please note, I'm not talking about running a game across both displays in a multi-monitor setup. I'm only talking about running the game on my primary display; if I even bother to turn the 2nd monitor on, it would just be sitting at the Windows desktop.

I'm just confused and having a hard time understanding, because in my mind the "signal" is going down that HDMI cable whether it's used or not, so I'm trying to figure out whether the resolution of whatever monitor is at that end (if any) will affect performance. Sometimes I have a hard time wrapping my mind around this audio-visual signal stuff ;)

The reason I ask all of this is that I had torn my entire gaming/work desk and area apart and removed my old gaming computer to make room for the one I just built. My old primary display was a very nice 27" 1440p monitor, and my old sidecar was a pretty old and beat-up 1080p monitor. I just bought a new 38" primary display, and I'm trying to decide whether to shift my previous 1440p monitor over to sidecar duty, but I didn't want to rob any FPS from my gaming experience by stepping up from 1080p to 1440p on the sidecar. I mostly use the sidecar for work anyway, so it's not worth giving up gaming performance for, but if it won't cost me anything I'd MUCH rather keep using my nice 1440p monitor.
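(For what it's worth, one way to answer this empirically would be to log GPU load with the sidecar attached and again with it unplugged while sitting in a game. A minimal sketch, assuming an NVIDIA card since nvidia-smi ships with the driver:)

Code:
# Sketch: sample GPU load once a second for 30 seconds. Run a game on
# the primary display, try it with and without the sidecar connected,
# and compare the numbers. The query fields are standard nvidia-smi
# fields.
import subprocess
import time

for _ in range(30):
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader"],
        capture_output=True, text=True)
    print(out.stdout.strip())
    time.sleep(1)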

Thanks!
 
Basic desktop use is so negligible in terms of GPU utilization that I wouldn't be concerned about it. I probably would avoid doing something like running a video on the side monitor while gaming, though.

I think you're going to find that your receiver doesn't want to output 1440p. It might work in passthrough mode, but then you lose the overlay; if that doesn't work, it will want to upscale to 4K or downscale to 1080p.
 

Thank you Ebernanut. You were 100% right. I'm so glad you chimed in when you did, or I might have spent the better part of the day farting around with this thing trying to figure out why it won't work. Exactly as you said, the 1440p monitor couldn't see the HDMI signal from the receiver. I'm not sure how to get to passthrough mode or even exactly what that means, but I went through my Denon AVR-1613 manual and searched for the term "passthrough" and didn't find anything. I've got a feeling you're right and it just won't work.

Just out of curiosity, in layman's terms, why is it that the receiver can send a video signal to a 1080p monitor but not a 1440p one? Is the resolution just too high for a receiver as old as mine?
 
If it's like my Marantz (sister brand), you should be able to get a signal to it; it will just be 1080p (limited to 60 Hz) or 4K. If I input a 1440p signal to my receiver, it outputs to my 1440p monitor as 4K, which the monitor can't display natively but can scale and display, so it shows up as an available resolution. I can manually set it to output 1080p, but instead I just input 1080p so nothing is scaled. I was mistaken on the terms: for Denon/Marantz, "passthrough" has to do with passing a signal with the receiver off. Turn off video conversion (Setup -> Video -> Output Settings on mine) to enable passthrough. The IP scaler needs to be on to change the output resolution; the resolution settings are greyed out until the scaler is enabled.
 
Thanks Ebernanut :) Sorry to say that I don't quite follow. Perhaps I should try to describe it better...

I set up the following test, since it's a simpler setup than all the "sidecar" stuff above. Here it is:

I have a PC with its video card connected to the input of the receiver via HDMI, and the output of the receiver connected to a 1080p monitor via HDMI. This works fine: when I set the monitor's input to "HDMI", it shows my Windows desktop and I can hear my audio just fine.

Now, if I swap that 1080p monitor out and put my 1440p monitor in its place, with all other connections the same, and set the monitor's input to "HDMI", it shows nothing. Just a black screen flashing "No HDMI Signal". It's not getting any sort of signal.

I'm trying to understand why this is. The signal is getting TO the receiver just fine, and with a 1080p monitor I can see the desktop, but with everything else the same and a 1440p monitor instead, the monitor doesn't get a signal at all.
 
What resolution do you have set for the HDMI output that feeds the receiver? Also, have you tried going into the receiver's settings menu and looking for the video output settings I mentioned?

My guess is that it's seeing a signal that's higher res than 1080p and scaling it up to a 4K signal your monitor can't handle; you should be able to check this by forcing a 1080p output in Windows or on the receiver. You could also try to find the video conversion setting (or an equivalent) and turn it off; you won't have any on-screen menus or overlay, but you might get a 1440p signal. If that doesn't work, you'll probably need to run it at 1080p and lose at least some of the benefit of the 1440p monitor over the 1080p one.
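(For reference, forcing 1080p on just that output can also be scripted from the Windows side. A hedged sketch using the Win32 ChangeDisplaySettingsEx API via ctypes; "\\.\DISPLAY2" is a placeholder for whichever device name actually feeds the receiver:)

Code:
# Hedged sketch (Windows, standard library only): force the output
# feeding the receiver to 1920x1080@60 so the AVR never sees a higher
# resolution. The device name below is a placeholder -- enumerate your
# displays first to find which one is actually the receiver.
import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
DEVICE = "\\\\.\\DISPLAY2"   # placeholder: the output going to the AVR
CDS_UPDATEREGISTRY = 1       # make the change persistent

dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
i = 0
while user32.EnumDisplaySettingsW(DEVICE, i, ctypes.byref(dm)):
    # dmFields comes back pre-populated, so a mode returned by the
    # enumeration can be re-applied as-is. Some drivers report 59 Hz
    # instead of 60, so match loosely if this finds nothing.
    if (dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency) == (1920, 1080, 60):
        res = user32.ChangeDisplaySettingsExW(
            DEVICE, ctypes.byref(dm), None, CDS_UPDATEREGISTRY, None)
        print("ChangeDisplaySettingsExW:", res)  # 0 == DISP_CHANGE_SUCCESSFUL
        break
    i += 1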

It could also be something besides resolution that the monitor doesn't like, such as a 10-bit signal sent to a monitor that only supports 8-bit. With HDMI, if the source sends a signal the monitor can't handle, it acts as if there's no signal at all. With your setup, you ultimately need to make sure the receiver is outputting a signal the monitor can handle, but Windows can complicate that some.
 
Actually, you can disregard much of that last post. I got curious and looked at the manual for your receiver, and I don't see any setting related to changing video output, so unless I'm missing something you're limited to monkeying around with it from the Windows side and are unlikely to get 1440p working.

The other two possible causes I forgot to mention are HDCP issues, or an HDMI cable that doesn't have the bandwidth for the signal it's trying to send (which might still work at a lower res); unfortunately, only the latter is really fixable.
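(The bandwidth part is easy to sanity-check with back-of-envelope arithmetic. A rough sketch, using approximate total-pixel counts including blanking:)

Code:
# Back-of-envelope HDMI bandwidth check. Totals include blanking;
# the 1440p figures are approximate CVT reduced-blanking numbers.
def video_gbps(h_total, v_total, refresh_hz, bits_per_pixel=24):
    pixel_clock = h_total * v_total * refresh_hz   # Hz
    return pixel_clock * bits_per_pixel / 1e9      # Gbps of video data

print("1080p60:", round(video_gbps(2200, 1125, 60), 2), "Gbps")  # ~3.56
print("1440p60:", round(video_gbps(2720, 1481, 60), 2), "Gbps")  # ~5.80

# HDMI 1.3/1.4 gear tops out at a 340 MHz pixel clock (~8.16 Gbps of
# video data), but a lot of older equipment and cabling was only built
# for 165 MHz (~3.96 Gbps) -- enough for 1080p60, not for 1440p60.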
 
Receivers (except newer ones) expect a very limited set of output types, I think. They don't know what 1440p is.
 

When you've got it in that no-signal state, is audio still working? What resolution does Windows think it's outputting? What resolution does your receiver think it's receiving? (My HDMI receivers will show that on the front display if you push the right buttons.)

If your receiver doesn't get audio in this mode, it probably can't handle the HDMI data at the rate needed for 1440p. It will generally strip the audio and repeat the video content to the output, so if it doesn't understand the data rate, that'll break both audio and video.

If audio works, try forcing modes; it's not like the CRT days where you could fry a monitor with bad sync rates, and it's worth seeing if you can get something working without heroic effort. An ideal receiver will take the monitor's EDID, add the audio modes it likes, and maybe remove video modes at data rates it can't handle. Your receiver may be trying to do that but failing for some reason, so forcing the video mode could help. You might possibly need to edit your EDID (or combine the EDID read directly from the monitor with the receiver's EDID as seen with the 1080p monitor) and force the driver to use that; that sounds like fun for the right person, but most people aren't that person. Anyway, try forcing modes first; there's often somewhere in the driver control panel where you can do that easily.
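(For anyone who wants to look at the EDIDs without extra tools: Windows caches each display's EDID in the registry, and a short read-only sketch can dump them. Actual overrides are better left to a tool like CRU; this just shows what each device claims to be.)

Code:
# Read-only sketch: dump every EDID Windows has cached, so you can see
# what each device (monitor or receiver) claims to support. The first
# 8 bytes of a valid EDID are the fixed header 00ffffffffffff00.
import winreg

ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def walk_edids():
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):
            model = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, model) as mkey:
                for j in range(winreg.QueryInfoKey(mkey)[0]):
                    inst = winreg.EnumKey(mkey, j)
                    try:
                        with winreg.OpenKey(
                                mkey, inst + r"\Device Parameters") as params:
                            edid, _ = winreg.QueryValueEx(params, "EDID")
                            yield model, inst, edid
                    except OSError:
                        pass  # instance with no cached EDID

for model, inst, edid in walk_edids():
    print(model, inst, len(edid), "bytes, header:", edid[:8].hex())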
 

Wow, that was very kind of you to look up the manual, Ebernanut. Thank you!

To simplify matters, I decided to try using my laptop instead of a desktop PC/graphics card. Here is the setup:

1) Laptop (Windows desktop resolution 1920x1200) to AVR via HDMI, then AVR to ASUS VG278H 1080p monitor via HDMI. This works fine: I get BOTH audio through the AVR speakers and video on the monitor, and if I press the "Setup" button on the AVR remote, the AVR overlay displays on the monitor.

2) Laptop (Windows desktop resolution 1920x1200) to AVR via HDMI, then AVR to ASUS PG279Q 1440p monitor via HDMI. This gives NO video signal to the monitor; it says "No HDMI Signal". However, I DO get audio through my AVR speakers, and if I press the "Setup" button on the AVR remote, the AVR overlay actually DOES display on the monitor. It won't display the video source signal, but it does display the overlay.

So these ended up being the same results as with either desktop PC I tried.

I checked through my AVR manual and couldn't find anything at all that would let me mess around with that signal.

I also tried several different HDMI cables. All are high-end, high-speed, 4K-capable, HDR-rated, fancy braided, well-proven cables, etc.

I'm guessing, based on the posts in this thread, that either the receiver just doesn't understand how to "talk" to a 2560x1440 monitor, or one of the monitor's features is making it difficult to talk to. It's a "gaming" monitor capable of 144 Hz and G-Sync; not sure if that matters, but one way or the other, the 1440p monitor can't seem to understand the signal from the AVR.

I'm thinking you're right that the receiver just doesn't know what 1440p is, and this just isn't going to work.

Thanks for hopping in, toast0. Yup, the audio does indeed still work. If you peek at my response above to Ebernanut, you can see the last test setup I used. With regards to the Windows resolution...

The laptop test in the above setup was 1920x1200. I've also tried one desktop PC set to 2560x1440 and one set to 1920x1080. All three machines produced the same result: audio and video both work when sending to the 1080p monitor, but ONLY audio works when sending to the 1440p monitor. I don't see any sort of resolution displayed on the AVR itself when pressing the "Info" button on the remote, so I'm thinking this AVR doesn't have that feature you mentioned.

I'm thinking that the AVR just doesn't know how to talk to that 1440p monitor.

As far as forcing modes goes, I'm not quite sure what that means. I certainly wouldn't know what an EDID is, how to edit one, or how to mess with drivers. Even without knowing those things, though, I'm guessing that if I've tried 3 different machines and gotten the same result each time, it's just going to turn out that the monitor and that AVR are not going to get along.

It's looking like I'll need to stick with the 1080p for my sidecar and just pack up that PG279Q in case I need a nice monitor in the future or find another use for it around the house.
 
I'm currently running a 3090 with active DVI-D adapters. All three adapters go to Acer 3D 1080p monitors and do 120 Hz / 120 FPS running NVIDIA Surround. I also have an ancient Dell monitor on an HDMI-to-DVI adapter for the gauges that's not even close to 120 Hz.

So in my experience, one potato in your setup doesn't affect the rest. At some point I'll replace the three 3D monitors with one of the fancy ultrawide gaming models, but I am happy with my setup at the moment.
 
*With NVIDIA, that is. I have encountered issues with the 6800 XT and a potato monitor several times, and similar with a 5700 XT.
 
Yes, that's a qualifier... I haven't had an AMD card since, I think, the X800 for the "new Doom" back in the AGP days, lol.
 
What sort of issues? I haven't had any issues with my 5600 XT, and IME multi-monitor setups are one of the few areas where AMD tends to do better than Nvidia.
 
Micro-stutter and freezing every 2-3 seconds. This was with one monitor at 144 Hz and one at 120, or one at 120 and one at 60. Shut off the slow monitor and performance is stable; two screens at 120 or 144 also work fine. Very distracting, and it was generally worst in high-FPS games.
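(If anyone wants to quantify stutter like this rather than eyeball it, PresentMon can log per-frame times to CSV, and a few lines of Python will summarize them. The sketch below assumes the classic MsBetweenPresents column name from PresentMon's CSV output:)

Code:
# Summarize a PresentMon CSV: average frame time, worst 1%, and a
# rough stutter count (frames taking more than twice the median).
# Assumes PresentMon's "MsBetweenPresents" column.
import csv
import statistics

times = []
with open("presentmon_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        times.append(float(row["MsBetweenPresents"]))

times.sort()
median = statistics.median(times)
p99 = times[int(len(times) * 0.99)]
stutters = sum(t > 2 * median for t in times)
print(f"avg {statistics.fmean(times):.2f} ms, p99 {p99:.2f} ms, "
      f"{stutters} frames > 2x median")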
 
Interesting, thanks for the reply. I haven't had that issue, and I have one monitor at 120 and another at 60. That isn't to say you're wrong, just that I haven't experienced it.
 
I noticed it first in Frostpunk, then Halo: The Master Chief Collection, and some in Subnautica too. I haven't tested recently; all my monitors are currently set to 120 Hz. I will be adding back my ProArt screen soon, though, at 60 Hz.
 