Running 2 graphics cards?

Ron FTL

[H]ard|Gawd
Joined: Jan 17, 2008
Messages: 1,203
My PC is old, running an i7-4790K on an Asus Z97-A board with 20GB of RAM.
I've been running an RTX 3070 and a dual-monitor setup for about a month, and I've noticed that watching streams on my second monitor while gaming definitely sacrifices some frames. A Twitch stream alone has the 3070 at about 20-30% utilization in GPU-Z.
Now, I have a GTX 970 that has been collecting dust. Can I install it to handle the video stream processing and give the RTX 3070 all its power for games? I don't know a lot about dual graphics cards.

I know I need a new processor and have been on the hunt for a 5600X, but for the time being, will adding my old card let me game and play streams without sacrificing any of the 3070's power?
 
Since both will use the same drivers, it shouldn't be too much trouble.

You also have the iGPU option (make sure it's activated in the BIOS) as a last resort.
 
No luck. I enabled the iGPU and multi-monitor settings in the BIOS, but Windows would never detect my second monitor.
I ended up putting in the GTX 970 with my second monitor connected to it. Dual monitors worked fine, but when watching a 1080p60 stream, the 3070 still performed all the video decoding and the GTX 970 sat at 0% utilization despite the monitor being connected to it.

Oh well, it may not be possible for whatever reason. I'm sure I'm not getting full bandwidth on this 3070 yet, because this old mobo only gives me x8 PCIe 3.0 lanes when I'm using two PCIe cards.
I might see better performance when I can snag a new mobo and Ryzen CPU.

Thanks for your suggestion though.
 

It has been proven time and time again that there is no difference between running your GPU with x8 lanes vs. x16 lanes. I tested my RTX 3090 at PCIe Gen 4.0 x16 and PCIe Gen 3.0 x8 and found no difference in performance whatsoever. You are fine with your RTX 3070.
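For anyone curious about the raw numbers behind that, the usable link bandwidth is easy to estimate. A rough sketch (real-world throughput is a bit lower after protocol overhead):

```python
# Approximate per-direction PCIe link bandwidth.
# Gen1/2 use 8b/10b encoding; Gen3 and Gen4 use 128b/130b.
GT_PER_LANE = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0}   # gigatransfers/s per lane
ENCODING = {1: 8 / 10, 2: 8 / 10, 3: 128 / 130, 4: 128 / 130}

def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s (per direction)."""
    return GT_PER_LANE[gen] * ENCODING[gen] * lanes / 8

print(f"Gen3 x8 : {pcie_bandwidth_gbps(3, 8):.1f} GB/s")   # ~7.9 GB/s
print(f"Gen3 x16: {pcie_bandwidth_gbps(3, 16):.1f} GB/s")  # ~15.8 GB/s
print(f"Gen4 x16: {pcie_bandwidth_gbps(4, 16):.1f} GB/s")  # ~31.5 GB/s
```

So Gen3 x8 still leaves roughly 8 GB/s per direction, which games rarely saturate.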
 

It should be possible; you may need to manually tell the Nvidia Control Panel which card to use for which program.
 
On my 3070, I had problems when one monitor was on DisplayPort and the other was on HDMI: lots of glitches, stutters, and screen disconnects. When I switched to DisplayPort for both monitors, the issues cleared up.
 
Something is wrong with your settings. I use a discrete card and an Intel IGP together all the time.
 
Well, at 1080p it's normally not an issue at all, as the frame rates are usually so high as to not matter. At 4K, the difference is negligible even dropping to x8 PCIe 3.0. 4K is what I run and where I did all my testing.
I agree, use case determines whether it really matters, and I really don't see that many use cases where it does. Is there a difference? Yes. Does it matter? Rarely, I would say.
 
If you have an updated Windows 10, then in the advanced display settings, next to where you enable hardware-accelerated GPU scheduling, you can set individual programs to use a different GPU.
You might also want to disable hardware acceleration entirely. Maybe... I'm not sure with this GPU and whatever quality of streams you watch, but something like 1080p shouldn't be too taxing on your CPU.
I have never used hardware acceleration in browsers, even when I was on an i5 3570K, and everything worked fine. Of course, with an 8c/16t CPU it makes more sense.
The advantage of this is that browsers don't allocate your GPU memory and don't execute on the GPU.

Also, if anything, I would say it's better to investigate the iGPU option, as the GTX 970 can hardly be considered power efficient, and as you noted it cuts your bandwidth. Not that it matters much, but wasting performance to get performance is not a great solution ;)
The iGPU should just work, if you have the drivers installed at least, and if the monitor you connect isn't too crazy high resolution/refresh rate.
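For reference, Windows stores those per-app GPU choices in a per-user registry key, so they can also be set programmatically. A rough sketch in Python (the Chrome path is just an example, and the actual write is skipped off Windows):

```python
import sys

# Windows keeps per-app GPU choices under this per-user key; each value is
# named after the executable's full path, with data like "GpuPreference=2;".
KEY_PATH = r"Software\Microsoft\DirectX\UserGpuPreferences"
PREFERENCES = {0: "unspecified", 1: "power saving", 2: "high performance"}

def gpu_preference_value(preference: int) -> str:
    """Build the registry value data for a given preference (0, 1, or 2)."""
    if preference not in PREFERENCES:
        raise ValueError(f"preference must be one of {sorted(PREFERENCES)}")
    return f"GpuPreference={preference};"

def set_app_gpu_preference(exe_path: str, preference: int) -> None:
    """Write the preference for one executable (no-op off Windows)."""
    data = gpu_preference_value(preference)
    if sys.platform != "win32":
        print(f"Would set {exe_path!r} -> {data}")
        return
    import winreg
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, data)

# Example: pin Chrome (path is an assumption) to the power-saving GPU.
set_app_gpu_preference(
    r"C:\Program Files\Google\Chrome\Application\chrome.exe", 1)
```

"Power saving" maps to the iGPU on most systems with a discrete card; the app has to be restarted for the change to take effect.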
 
Don't read anything into GPU utilization when just watching video. The GPU automatically downclocks at idle, which results in massively inflated utilization percentages. Your "20-30%" is actually more like 1-2% when the GPU is fully loaded.
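To put rough numbers on that: utilization is reported relative to the current clock, so a busy percentage at an idle clock is a small fraction of the GPU's real capacity. A sketch with assumed clocks (a 3070 idles around 210 MHz and boosts past 1700 MHz):

```python
def normalized_utilization(reported_pct: float, current_mhz: float,
                           max_boost_mhz: float) -> float:
    """Rescale a utilization % reported at a low clock to the full-clock scale."""
    return reported_pct * current_mhz / max_boost_mhz

# 25% busy at a ~210 MHz idle clock vs. a ~1725 MHz boost clock
print(f"{normalized_utilization(25, 210, 1725):.1f}%")  # ~3.0%
```

In other words, 20-30% at idle clocks works out to only a few percent of what the card can actually do.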

As for what's causing performance issues, I sincerely doubt it's the GPU. What you're most likely seeing is your CPU getting choked juggling more tasks than it can handle, resulting in stutter. You can see how much your CPU bottlenecks a roughly-equivalent RTX 2080 Ti here:

In fact, I'd expect that adding another GPU to offload video would result in worse performance issues, because the CPU would have to spend time juggling more hardware.


Long story short, keep CPU shopping.

 
I'm also curious about the 20GB of RAM. Are you using 4 DIMM slots or 3 to get that?
 
On my 3070, I had problems when one monitor was on DisplayPort and the other was on HDMI: lots of glitches, stutters, and screen disconnects. When I switched to DisplayPort for both monitors, the issues cleared up.
I'm using DisplayPort for both monitors.
I will look into finding the HW acceleration setting and give that a shot.
It's an Asus XG279Q 1440p/170Hz and an Asus VG248QE 1080p/144Hz.
It's just odd, because when I select "Detect" in the Windows display settings, it doesn't find the monitor, no matter if I use DisplayPort, HDMI, or DVI. I'm doing something wrong.
Well, I notice maybe a 3-5 fps drop in certain games while I run a stream. If I alt-tab and close the stream, I get those frames back. I don't lose frames in all games, though.
My CPU and GPU never seem to max out at 100% utilization. But you are most likely right: probably something other than the GPU is causing the performance loss. I'm sure it's this 2014 hardware that just can't keep up...

I might just deal with it as-is until I can find an AMD Zen 3 chip.
I'm also curious about the 20GB of RAM. Are you using 4 DIMM slots or 3 to get that?
4 slots
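For the curious, the standard module sizes that can add up to 20GB across four slots are easy to enumerate. A throwaway sketch:

```python
from itertools import combinations_with_replacement

STICK_SIZES_GB = [1, 2, 4, 8, 16]  # common DDR3 module sizes

def four_slot_combos(target_gb: int) -> list[tuple[int, ...]]:
    """All multisets of four standard sticks summing to target_gb."""
    return [c for c in combinations_with_replacement(STICK_SIZES_GB, 4)
            if sum(c) == target_gb]

print(four_slot_combos(20))  # → [(1, 1, 2, 16), (2, 2, 8, 8), (4, 4, 4, 8)]
```

So 2+2+8+8 or 4+4+4+8 are the likely configurations here, which also means the kit is probably running in a mixed (flex) dual-channel mode.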
 
Okay, figured it all out. Thanks for all the suggestions.
I was missing the Intel HD 4600 drivers; once they were installed, everything worked as it should, and I could assign Chrome's graphics processing to the integrated GPU instead of the 3070 for the 2nd monitor.

Unfortunately, watching a 1080p60 Twitch stream alone on the integrated graphics puts my CPU at ~50% utilization, and while playing Battlefront 2 at 1440p with ultra graphics, it tanked my fps from 100+ down to 30-40. The CPU was maxed out at 100%. I exited the stream and fps went back up to 100+. It's my processor that's choking.
Seems like I just don't have the CPU horsepower to use onboard graphics for the stream + GPU for gaming.
 
I will look into finding the HW acceleration setting and give that a shot.
I wouldn't bother even trying this.

Presuming that your CPU is the bottleneck -- which I am -- dumping more load onto it would just leave you in a world of hurt.
Your CPU will never hit 100% utilization, because hyperthreading is never 100% efficient. And you will never hit 100% GPU utilization, because your CPU is absolutely choking it.
 
Thanks, you were right. Found a Zen 3 Ryzen 5, and man, it's made a huge difference. In some games I'm getting +80 fps with the same graphics card.
Now I can play a stream at full quality and game with no impact to my fps. This thing is sweet! Finally seeing what this 3070 can really do.
 