Add graphics card to my PC (non-gamer)?

fatryan

I'll preface this by saying that I know next to nothing about graphics cards, so I may not even be able to understand some of your responses if they're too technical.

The rig in question is the one in my sig with an i7-8700. I built this (with the help of my brother in law) for the sole purpose of being a Blue Iris "PC-NVR". All components were designed around the hardware requirements of Blue Iris, which is optimized to run off Intel QSV. Supposedly, even using a high end graphics card would be less efficient than using QSV in the program. Therefore, I never had a need for a dedicated graphics card.

Fast-forward to today and I'm currently vetting alternative options to Blue Iris, as I've found the program to be quite problematic and the developer to be a complete ass. Some alternative software is also optimized to run on QSV but still has support for dedicated graphics. And other programs don't really specify. Every one I've seen supporting dedicated graphics cards has mentioned Nvidia graphics cards, with some specifying CUDA cores (whatever that means). My surveillance cameras' native encoding is H.264 or H.265, though I'd want to keep the option open to add other cameras which might use different codecs.

In addition, this machine has also turned into a Plex server, so it regularly handles transcoding jobs. I don't know if it's Plex or the CPU, but sometimes the transcoding really struggles even when only one video is being transcoded.

With this pandemic, I've been using the machine to work from home. I use an AOC 32" 4k monitor on DP, and I added a very old Dell monitor that's connected via a DVI-D-to-HDMI adapter. That uses up all the available ports on my motherboard.

I'm trying to figure out if I would benefit from adding a dedicated graphics card. I'd like to be able to push another 4k monitor, or maybe a couple of 1440p monitors to extend my workstation. And I'd like optimal video processing, be it in the surveillance software or in Plex. On very rare occasions, I do some light video processing in Movie Studio or similar software. I do not game at all.

What are your thoughts on adding a card? And if I should, what are some recommendations?
 
Nvidia would be the way to go for video encoding. Turing has fantastic encoding performance; specifically, the 2060 KO offers a ton for the $.

That being said, it's software dependent, but if the software states it supports CUDA you're good to go.
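
If you want to see what a given Nvidia card actually reports for CUDA support, the compute capability is queryable through NVML. A minimal sketch, assuming the nvidia-ml-py package (imported as pynvml) and an Nvidia driver are installed; 6.x and up is the Pascal generation and newer:

```python
# Print the first Nvidia GPU's name and CUDA compute capability via NVML.
# Assumes the nvidia-ml-py package (imported as pynvml) and an Nvidia driver
# are installed; compute capability 6.x corresponds to the Pascal generation.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):          # older bindings return bytes
        name = name.decode()
    major, minor = pynvml.nvmlDeviceGetCudaComputeCapability(handle)
    print(f"{name}: compute capability {major}.{minor}")
finally:
    pynvml.nvmlShutdown()
```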
 
GPU support is getting much more prevalent, but you do need the software to support it in order for it to make sense. The CPU you have has a small integrated GPU that does do video transcoding, so if that's all the software supports you will see limited benefit (i.e. from Blue Iris).
Plex supports Nvidia GPUs, which are faster and better at transcoding, and most browsers can now use the extra GPU horsepower to speed things up, as does Windows 10 in general.
Biggest bang for the buck will be choosing a software platform that effectively uses an Nvidia GPU and then picking a recommended card from the manufacturer.
 
Thanks. That video was like a foreign language to me lol. I'll have to look into this card more.

Unfortunately I haven't decided on a replacement surveillance program yet. I was going to try Milestone XProtect, but I can't even get it to recognize my cameras. I'm trying DW Spectrum after that. I guess I should make a list of the required specs for each.
 
The 2060 KO is a salvaged 2080-class die. As a result, the EVGA 2060 KO has 2060-class gaming performance but better encoding performance than the regular 2060.
 
Well seeing as how I don't game, I don't really care much about gaming performance. I'm just concerned about the 24/7/365 processing of my surveillance cameras' video and Plex transcoding. Also would ideally like a card with multiple HDMI and multiple DP outputs.

Can the Intel iGPU be used simultaneously with a dedicated card? Some people using Blue Iris said adding a card made their iGPU unavailable, but I do not know if that's typical.
 
If you have Intel Quick Sync, you should be able to just turn on hardware transcoding in Plex. Depending on what you're streaming, it should handle more than one stream. Otherwise, Nvidia does pretty well for encoding too; AMD is a bit lower quality (image-wise). Most software that supports hardware transcoding/encoding will support Intel Quick Sync, so I suggest testing it out along with enabling hardware transcoding in Plex and seeing if you even need to spend any $.
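
If you want to sanity-check Quick Sync outside of Plex first, one rough way is to run a short hardware transcode with ffmpeg and see how fast it goes. A minimal sketch, assuming an ffmpeg build with QSV support is on the PATH; the input filename is just a placeholder for one of your own files:

```python
# Rough Quick Sync sanity check: decode an HEVC file and re-encode it to H.264,
# both on the Intel iGPU, and report the wall-clock time.
# Assumes an ffmpeg build with QSV support is on the PATH; "sample_4k_hevc.mkv"
# is a placeholder for one of your own files.
import subprocess
import time

cmd = [
    "ffmpeg", "-y",
    "-hwaccel", "qsv",           # hardware decode via Quick Sync
    "-c:v", "hevc_qsv",          # HEVC decoder on the iGPU
    "-i", "sample_4k_hevc.mkv",
    "-c:v", "h264_qsv",          # H.264 encoder on the iGPU
    "-b:v", "8M",
    "-an",                       # skip audio for the test
    "qsv_test.mp4",
]

start = time.time()
result = subprocess.run(cmd, capture_output=True, text=True)
print(f"exit code {result.returncode}, took {time.time() - start:.1f}s")
print(result.stderr[-500:])      # ffmpeg prints its speed/summary to stderr
```

If the iGPU is really doing the work, the video engines on Task Manager's GPU page should light up while it runs and the reported speed should be well above 1x.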
 
Forgot to mention: last I checked there was a way to re-enable the onboard iGPU if you do put in a discrete GPU, but it had to be plugged into a monitor (or a fake-monitor adapter) and enabled in the BIOS. I would look that up before making the decision, as I'm just going by memory (and it may have changed since).
 
I'm pretty sure hardware transcoding is on. When it's transcoding, the fans all start spinning up and CPU usage jumps up. Isn't that hardware transcoding? I have it set to never transcode, but sometimes Plex just does it anyway. And it can barely handle one transcode sometimes.
 
I will just assume the iGPU won't work if I buy a new dedicated card.
 
I just looked up the system requirements for a few surveillance programs. I know some may go into more detail in the user manual, but I didn't feel like digging through all that now to find out. These are the bulleted specs from the websites.

Blue Iris
Minimum: Intel QSV
Recommended: Nvidia graphics "adapter" (same as card?) for efficient screen display (sooo just for live viewing?). I know it also states somewhere in the documentation that the GPU should have CUDA cores.

Milestone Xprotect
Minimum: Intel QSV (standard), Intel Iris Pro P580 (700+ cam systems)
Recommended (for HW accel.): Nvidia graphics card supported with GPU capability version 6.x (Pascal) or newer

DW Spectrum
Minimum: Intel QSV
Recommended (for HW accel.): Nvidia GeForce GTX 480. They specifically state that the external card should be used at the same time as QSV on the same exact monitor (2 inputs). They also state this feature is experimental.

ContaCam
No graphics specs listed at all. Other computer specs seem pretty absurd.

So this brings me to the following conclusions and questions. Correct me if I'm wrong here.
1. Intel QSV seems to be the standard for this type of software.
2. Graphics cards seem to be used only for hardware acceleration or powering displays
3. I don't understand some of the differences in verbiage with these various card specs
4. Is having a dedicated GPU and iGPU on one monitor unusual? Does it require a special type of monitor?
 
If all you want is pure encoding, the 1650 Super is the highest-quality encoding engine you can get at $160.

You only need to worry about something like that 2060 KO if you're looking to encode multiple streams - for a single stream, it should be more than enough.
 
No, hardware transcoding means it uses the GPU, which should not spike your CPU that high (depending on the CPU anyway, it does still require some CPU resources while transcoding on the GPU). What kind of transcodes is Plex doing? 4k H.265 to something else? Bitrate/resolution? I have an old Dell server from 2011 that can easily do a couple of 1080p transcodes (and it doesn't do hardware/GPU transcoding, just CPU transcoding). There is a checkbox to "Use hardware acceleration when available" in the Plex settings on the web interface; make sure it's checked.
The version of QSV may matter for what it can decode/encode, as older chips had less support. Otherwise QSV on Plex should easily handle multiple streams.

Edit: the 8700 should support 10-bit, so you should be good on the QSV version, so possibly something is not set right if it's stressing your rig.
 
I do need multiple streams. For just my cameras alone, I have two 4k cams and two 1080p cams. All are currently recording/displaying their main stream at full resolution, 20fps, H.264, mid-low bitrate. For remote access purposes, I may need a 720p substream too. Though I use TinyCam for remote live view, because it can pull the main streams just fine. If I want camera/system control via the software's app, it's possible I might need to use the substream. I know the gDMSS app for viewing the Dahua cams at my rental property can't handle the main streams remotely. It needs substreams.

This all is on top of whatever transcoding might be needed for Plex at any given time. My wife is working from home every day now, and she always watches movies during the day while working. My surveillance system runs 24/7, so it will always be handling both tasks at once.
 
Well I believe the iGPU is kicking up as well. I just recall the CPU ramping up at the same time. It's probably because I've been so paranoid about resource usage with the CPU, because Blue Iris is such a resource hog. Always keeping my eye on CPU usage.

As far as what is being transcoded, when, and why...I do not know most of those answers. My movie library is predominantly UHD, some H.264, some H.265. I have a few 8-bit and 10-bit HDR titles, but I've since stopped procuring them, because only our TVs support HDR so the video is dark as hell on my computer, phones, tablets, etc. I try to shoot for H.265 for efficiency and storage reasons, but that's not always an option. For no apparent rhyme or reason, sometimes one movie will play fine while another freezes up or instantly starts transcoding. This is on identically spec'd videos too, from what I can tell. An example of this is the Hunger Games series. I got all the movies at once, and they're all H.265 4k, yet for some reason the first two freeze when watching and the third (usually) plays fine...though it has frozen before too. When they freeze, it's the video that freezes. Audio continues playing. Doesn't matter what device I watch on, same result.
 
That's odd, it should have little issue doing a single transcode like that. I fear putting in a new GPU may not fix the issue if something odd is going on. Maybe there is some way to find out what's going on. Have you checked the Plex logs for anything? Are you running Windows or Linux? Latest Plex version? Latest drivers for the iGPU? Double-check that setting I mentioned earlier as well. Maybe watch CPU usage and GPU usage while transcoding to see what it's doing? There is also a way to see the transcode on one of the web config pages, to see if it's direct play or not and how it's transcoding (sometimes it can be as simple as the audio not being supported or having to change container format from MKV to MP4, etc.).
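
On the "see the transcode on one of the web config pages" part: the dashboard shows it, and the same session info is also exposed over the server's local HTTP API if you'd rather script it. A rough sketch, assuming the server is local on the default port and PLEX_TOKEN is a valid X-Plex-Token; attribute names can vary a bit between server versions:

```python
# List current Plex sessions and whether each one is direct playing or transcoding.
# Assumes the server runs locally on the default port and PLEX_TOKEN holds a valid
# X-Plex-Token; attribute names may differ slightly between Plex server versions.
import xml.etree.ElementTree as ET
import urllib.request

PLEX_TOKEN = "your-token-here"  # placeholder
url = f"http://127.0.0.1:32400/status/sessions?X-Plex-Token={PLEX_TOKEN}"

with urllib.request.urlopen(url) as resp:
    root = ET.fromstring(resp.read())

for video in root.findall("Video"):
    title = video.get("title")
    ts = video.find("TranscodeSession")
    if ts is None:
        print(f"{title}: direct play")
    else:
        print(f"{title}: transcoding "
              f"(video={ts.get('videoDecision')}, audio={ts.get('audioDecision')})")
```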
 
Yeah, we regularly get delays in audio too... Even on lower quality videos like our 1080p Office collection. Audio delays can be fixed or improved by pausing/resuming a lot of times, but not always. I assumed it was in the file itself.

Haven't checked logs because I didn't think there was an actual problem. I just pulled them, but honestly I have no idea what any of it means.

I'm running Windows 10 Pro. Plex is set to update automatically, but it usually doesn't work and I need to do it manually. Regardless, I've updated dozens of times with no difference in the issues. iGPU as well as other drivers are up to date. Even updated the board's firmware in the last couple weeks. I have Plex set to play at original quality.

Trying to post some screenshots, but I'm getting an error from the forum.
 
I mean, CPU barely hits 20%, so it appears it's not stressing your system much. If you were closer I'd let you borrow a GPU to test before you spent money. Lots of people building Plex servers purposefully get the Intel chips with an iGPU to be able to run multiple transcodes without hitting the CPU too hard. I am not an expert on troubleshooting Plex issues, but it appears to me there is something besides your processor/iGPU causing those issues. I don't feel like I can help much more; as I said, not a Plex troubleshooting expert by any means.

PS. I'm not too sure about the rest of your use cases for the camera software, but if you don't game I don't think an add-in GPU is going to net you much benefit in most apps (I'm sure there are probably some exceptions, so if you find an app, do some research).
 
I should have added some clarification to those screenshots. That Friends episode that was streaming was not transcoding, that's just what my wife was watching at the time I took the screenshot. I'll have to find one of the problematic files and see what it jumps up to then.

I know you said you're not an expert with Plex, but it was my understanding that the iGPU was limited to a single transcode at a time. Not that we're regularly transcoding 2+ videos at once, but it could happen. When it is transcoding, it kicks up quite a bit. And as I mentioned before, Plex just automatically transcodes some videos despite me setting it to only play at original quality. I have no idea why it does that, but it's always been that way since I set up the Plex server like 9 months ago.

I also don't want to take away capacity from my surveillance system. I'm not sure how close I am to that, but surveillance is top priority. It seems as though most surveillance systems are designed to run on the iGPU with QSV, so maybe it's best to try to keep that system on Intel graphics. But then anything I could take off the iGPU in favor of a dedicated card would lighten the load on the iGPU. Specifically when it comes to multi-monitor support, people have stated that the iGPU cannot push dual 4k displays, let alone 3+ displays. If this is the case, then I guess my only option would be a dedicated card. Though if I cannot simultaneously use dedicated graphics with onboard, my hands might be tied.

I keep seeing people online mention that officially as of Windows 10, it's possible to assign specific software and apps to either a dedicated card or onboard in the windows settings. This seems to be at odds with what people here have said about the use of iGPU with dedicated GPU. Can someone clear this up for me?
 
No, GPU transcoding isn't normally limited to 1 transcode... Well, Nvidia does have a limit (I think it's 2 sessions for consumer products, but still multiple streams), but it's artificial, just enforced in the drivers, and there are some instructions online to get rid of it. I don't know for sure if Intel has a limit, but I doubt it's only one. AMD did not have a limit on their cards (haven't looked recently, but I don't remember them ever adding one), so you can transcode as many as the card can handle, although they typically don't have the same quality. Found a nice chart for Nvidia and Plex transcoding.
https://www.elpamsoft.com/?p=Plex-Hardware-Transcoding
A 1660 can handle up to five 4k-to-1080p transcode streams, as can a 1060 6GB. A 4GB 1050 Ti can handle three streams. Obviously higher bitrates or 4k-to-4k would stress it more. Lower resolutions and bitrates would give you more streams.

If you have other reasons for needing a GPU, like driving multiple screens, most any recent GPU that you'd want for transcoding should handle that just fine. For sure I wouldn't get less than 4GB; you may even want to aim for 6GB/8GB depending on your budget and how much you feel you'll be doing at one time.
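
Also, if you do end up with an Nvidia card and want to see how many NVENC sessions are actually open while Plex is busy, newer drivers expose that through nvidia-smi. A small sketch, assuming nvidia-smi is on the PATH and the driver is recent enough to report the encoder.stats fields:

```python
# Poll nvidia-smi once per second and print the active NVENC session count and FPS.
# Assumes an Nvidia driver recent enough to expose the encoder.stats.* query
# fields; stop with Ctrl+C.
import subprocess
import time

QUERY = "encoder.stats.sessionCount,encoder.stats.averageFps"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    print(out.stdout.strip())   # e.g. "2, 187"
    time.sleep(1)
```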
 
I was only talking about the iGPU being limited to 1 transcode. I thought someone said that.

The only reason I want multiple screens is for work, which is via RDC anyway, so I'm limited in that respect. I'm not trying to watch 2-3 4k HDR movies at once or play any computer games or anything. But I need the physical ports on the motherboard or graphics card to be able to drive the monitors. I also need the power, as I understand the iGPU can only handle one 4k monitor. I'm currently running a much lower resolution Dell monitor via a DVI-D-to-HDMI adapter in addition to my main 32" 4k display. Both are being pushed from the motherboard ports (4k on DP and Dell on HDMI). It seems to work OK, but the 4k monitor always takes a few seconds to "wake up", which is super annoying with multiple monitors because Windows then moves all the desktop icons over to the second screen :-/

A few people here recommended the 2060 KO, but I don't see the KO specifically in several comparison charts, including the one you listed. It's my understanding that the KO is unique and superior to the other 2060s on the market. Is this correct? I'm also seeing other KO variants, which I don't know the difference between (KO, KO Ultra, KO Gaming, KO Gaming Ultra, etc.). The KO also has similar ports to my MSI board - only 1 DP and 1 HDMI. So assuming I got that card, I'd be in the same spot with multiple monitors...that is, unless I can still use my board's DP and HDMI ports. I don't know how that all works, like if the board's ports will connect to a dedicated card or if they only connect to the iGPU. But if I'm limited to just using one or the other, my multi-monitor situation hasn't really improved with that card.

This will all ultimately come down to whether or not I can still use the iGPU with QSV when a dedicated card is present on the board. Because in all likelihood, most any surveillance software I pick will be designed to at least prefer QSV (if not require it). So if adding a dedicated card completely disables the iGPU, then I probably have to forgo getting one for boosting Plex transcoding and the multi-monitor issue.
 
You should look up using the iGPU with a discrete GPU; there used to be ways of enabling both, but I don't have one to test for you. Found the 2060 KO for $300 that has 3 ports (DVI, HDMI, DP)
https://www.newegg.com/evga-geforce...&ranSiteID=8BacdVP0GFs-n6h.2.CXtw4qzYqNNLH0JQ
Seems a bit light on outputs, as you mentioned. The reference design from Nvidia had more outputs, so they do exist. The 2060 KO is somewhere between the regular 2060 and 2060 Super. All the other terms after KO are just marketing for that particular brand. Depending on the monitors you have, this may work or may not. Also, I would read up on enabling the onboard GPU alongside the discrete one; it may allow you to keep something plugged in as long as you don't do any heavy graphics on that monitor. Not sure how that would work, maybe someone else can chime in?
 
Been trying to look up using the iGPU and GPU simultaneously, but I'm not finding a ton on it. Maybe it's just because I don't know anything about this stuff, so I don't know what I'm looking at ha ha. I definitely have found some tutorials to "get it to work", implying that normally it wouldn't work. I was kind of hoping for confirmation on this. Being as this is a surveillance PC, reliability is very important. I really can't have this thing giving me any issues or having a bunch of downtime. That's part of the reason for buying an 8700 chip (as opposed to the K), not going crazy with memory OC, etc. I also have like 6 fans in the case, which is likely way overkill for a stock CPU and no dedicated graphics.
 
Last I remember, it was mostly just a matter of requiring a monitor to be plugged into both GPUs and enabling it in the BIOS. Without a monitor plugged into the iGPU it would automatically be disabled. They make a plug that simulates a monitor being plugged in, so you could trick the system into not disabling it without needing a real monitor plugged in. Again, it'd be great to be able to test it out. If you get it working it would still be just as stable; it shouldn't have any effect on that. Better would be to find a GPU that does everything you need so no messing around is needed. I know my AMD card has 3 DPs and 1 HDMI... I'm sure you can find Nvidia cards with similar outputs. I know the RTX 2060 can handle up to 4 simultaneous outputs, they just don't always put the connectors on to save money.
 
Seems like kind of a ridiculous requirement to have to have a monitor on each card, but that wouldn't be an issue for me if it got both the GPU and iGPU running at the same time. The bigger question is whether or not I can control which programs use which GPU. If I do have control of this, I could then dedicate the iGPU to surveillance and use the dedicated GPU for anything else. Maybe a stupid question, but does having one monitor per card/board affect what gets displayed on screen? Like if only my secondary screen is connected to the iGPU, can I then only view surveillance live feeds or recordings on that monitor? I recall one of the tutorials for this GPU/iGPU configuration saying that BOTH GPUs must connect to the same monitor. So is that a requirement in order to use both to their fullest extent or something? If so, that's an assload of wires when hooking up multiple monitors!
 
I honestly don't know, I haven't run into that situation. I do have a 7th gen Intel Pentium with an iGPU that has a discrete GPU installed, so I could probably do a little testing. It's a different series CPU and it's an AMD GPU (RX 560), so I don't know if it'd be close enough or not. It's my daughter's desktop, so it's mostly just a few Steam games and Minecraft installed (she's 6), so not sure how much I could really test either.
 
I can pull the GPU out of an old Dell Precision 390 I got from work in 2013. The thing was ancient even for 2013, but it looks to have an Nvidia Quadro FX 3500 in it, which has 2x DVI-I capable of 3840 x 2400 at 24 Hz. It could at least push my second monitor, because that's the monitor that came with the Dell. Probably couldn't do much with 256MB of GDDR3 SDRAM and a 450MHz GPU clock speed though.
 
Can the Intel iGPU be used simultaneously with a dedicated card? Some people using Blue Iris said adding a card made their iGPU unavailable, but I do not know if that's typical.
Yes on using both. In the BIOS on most motherboards you decide which is the primary video, the integrated iGPU or PCIe; if you choose PCIe (the add-in card) you can then select whether you want the iGPU disabled or not, and use both.
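
And on steering individual programs to one GPU or the other: the Windows 10 Graphics settings page (Settings > System > Display > Graphics settings) lets you set a per-app preference once both adapters are active. As far as I can tell it just stores per-exe entries in the registry; a rough sketch, assuming the key is HKCU\Software\Microsoft\DirectX\UserGpuPreferences (worth double-checking on your build) and the Plex Transcoder path below is only a placeholder:

```python
# Read and set Windows 10 per-application GPU preferences.
# Assumes Windows stores them as string values named after the exe path under
# HKCU\Software\Microsoft\DirectX\UserGpuPreferences, where "GpuPreference=1;"
# means power-saving (iGPU) and "GpuPreference=2;" means high-performance (dGPU).
import winreg

KEY_PATH = r"Software\Microsoft\DirectX\UserGpuPreferences"
# Placeholder exe path; substitute whatever program you want to pin to a GPU.
app = r"C:\Program Files (x86)\Plex\Plex Media Server\Plex Transcoder.exe"

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
    winreg.SetValueEx(key, app, 0, winreg.REG_SZ, "GpuPreference=2;")

# Print everything currently assigned, to confirm it took.
with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
    i = 0
    while True:
        try:
            name, value, _ = winreg.EnumValue(key, i)
        except OSError:
            break
        print(name, "->", value)
        i += 1
```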
I'm pretty sure hardware transcoding is on. When it's transcoding, the fans all start spinning up and CPU usage jumps up. Isn't that hardware transcoding?
That actually sounds like software decoding pushing the CPU utilization way up, which is way less efficient. Here's an easy question: do you have the paid Plex Pass subscription? If you don't, it's not doing hardware transcoding; that's a paid-only feature. No pay, no HW transcoding. Core Temp is a handy little program you can load up; it tells you the utilization of the CPU cores (and their temps). Next time everything spools up you can get a visual on what's happening, or just see what your normal load looks like with the camera software only.
 
I do have Plex Pass, the Plex Pass + Tidal HiFi bundle actually.

So I went into the BIOS this morning, because my system's been bogged down for no apparent reason over the last couple of days. It keeps ramping up to 100% CPU for several minutes at a time, and that's with Plex not running at all and only one camera recording. The only thing I've changed recently was installing all the Milestone XProtect software for a trial last week. I've since uninstalled it, deleted temp files, and cleaned the registry, but it was still acting up. When this occurs, "system interrupts" is what's using all the CPU, according to Task Manager. Not even sure what system interrupts does...

Anyway, I went into the BIOS and I noticed the graphics adapter was changed back to PEG somehow. I definitely had it on IGD. I don't even know what the BIOS does when PEG is selected, because I have nothing in my PCIe slots...definitely no dedicated GPU in there. So does it just default to IGD anyway? It must, if my iGPU is doing all the work, right? For the record, multi-monitor was also set to disabled, so it must have just been ignoring these settings and using the iGPU. I changed it back to IGD and rebooted, and so far everything seems about the same. No CPU ramp-up yet after changing the settings in the BIOS.

On another note, I decided to play the cards and bought a 2060 KO from EVGA's website last night. So I'll be trying that dual dGPU/iGPU configuration in about a week or so. And if it doesn't work to my liking, I guess I'll try to return the card or see how much I can get for it used.
 
Well, please keep us updated, it really seems something odd is going on in your system with those random spikes.
 
Yeah, I dunno. I've rebuilt Windows like three times since September. I think if anything's wrong, it's gotta be hardware related. Or at worst a setting, though I haven't changed much from stock. I am on a slightly older build of Windows though (1903), as the newer ones had [even worse] resource usage. Haven't looked into the latest yet to see if they've addressed these problems.

I did a test on Plex transcoding tonight, and it seemed to be doing OK actually. I tried transcoding two 4k 10-bit movies simultaneously - one in my computer's browser, the other on my phone. I inadvertently picked a movie for my phone that didn't require a video transcode (only audio). But the movie playing in Firefox on my PC needed transcoding from 4k 10-bit H.265 to 1080p. I also had my one camera running and recording as normal. CPU hovered around 30-35% in Plex, as did the iGPU in Task Manager. The movies played fine without skipping or artifacts, though the initial start and skipping ahead took several seconds to buffer every time.

 
Barely skimmed through and posting like an a-hole here:

Blue Iris is optimized for Intel Quick Sync; even on various forums and partner forums they say Nvidia cards burn some power. You'd possibly need a Quadro.

Plex is a different animal; you need a software mod OR a Quadro to run more than 1 or 2 transcode/encode streams.

EDIT: that's why I moved to Windows 8.1 for some testing! It's pretty light, at least at idle, and it doesn't have the telemetry stuff, but it's not easy to find the Windows 8.1 Industry Pro ISO.
 
Thanks. Yeah, I got all that already. I'm going to try using the iGPU and dGPU simultaneously, so I can keep Blue Iris using the iGPU with QSV. Though I'm looking for a replacement for Blue Iris anyway, so that may end up being a non-issue. I'm liking DW Spectrum so far.
 
Never heard of it. I wouldn't mind one that ran on Linux. It looks like there are a few options out now.
 
DW (Digital Watchdog) Spectrum actually seems to be better known than Blue Iris, though that's really with commercial setups. Others like Milestone are surely leading the industry, but DW isn't exactly a no-name brand. From my research on it, it seems to be far more versatile than BI. It runs on several platforms, including Linux, and it may actually support even more camera manufacturers than BI. I haven't gotten to play around with it yet, but the initial setup was basically effortless. It recognized my camera right away. All I had to do was enter the camera's username and password. Supposedly you can even configure the camera settings right in the program instead of having to log in to the camera settings in a browser like in BI. So we'll see how this goes once I get the time to play with it. The GPU should be here Wednesday, but I've been talking with a DW rep and he says Spectrum is designed to run on QSV and that GPUs are for more advanced setups with many cameras or ones that require encoding/decoding.
 
Why don't you just buy a dedicated NVR instead of faffing about with PC solutions? You'll end up paying a lot less and have something a lot more reliable. I used to service PC DVRs and I hated them, they always broke down or had some issue caused by software conflicts or Windows itself. I eventually converted all of my clients with PC DVRs over to dedicated DVR/NVR units from Hikvision, which were far more reliable and my phone isn't blowing up multiple times a week for inane software issues or someone turning the unit off.

Who made your cameras? It sounds like you have IP cameras, and most vendors these days follow the ONVIF standard, meaning they should work with any NVR that also supports ONVIF. H.265 is the new industry standard, but the camera itself needs to support it for it to work. Though most also still support H.264 and older analog formats. The picture quality isn't really that different, what you get with the newer standard is higher video compression and more recording time out of the existing storage space.

Digital Watchdog makes decent gear, so does Hikvision and Speco. Milestone is great for a PC based DVR, but you only get 8 cameras in their free tier and their licensing gets super expensive after that.
 
I briefly touched on it in one of my earlier posts above, but I do already have experience with NVRs. My first surveillance system, which is now running at my rental property, is a Dahua PoE NVR with 4 Dahua 2MP Starlight cams. It records 2 streams per channel, 24/7/365, encoded H.265. And as you said, yes, it is incredibly reliable. It's been running continuously for 3 years and I've never seen it go down.

The reasons for me not going the NVR route this time around were:
1) the UI on these NVRs is such garbage. It's a pain to figure out how to do anything, and tech support is non-existent.
2) You're locked into a specific IP cam manufacturer. I don't know where you heard ONVIF cams are basically interchangeable on these NVR devices; everything I've ever read says you need to match manufacturers or it won't recognize the cams.
3) I was under the impression that setting up my Blue Iris machine was going to be a pretty seamless experience, worthy of the added up front cost. This absolutely did not end up being the case.

My new cameras are again all Dahua IP cams. I got two more of the 2MP Starlights (a newer version than what's at my rental property) plus two 8MP 4k cams. I currently only have one of the 2MP cams connected to BI. I've been putting off installing the others while I troubleshoot all the issues I've been having. I have no plans to go beyond 4 cams any time in the next year or so, certainly not while still living in my current house.

I'm sure Milestone is great, but I do not feel like dealing with all the hassle of getting it to work. If I can't even get the initial setup correct, I know it's just going to be a pain in my ass the entire time. I definitely do not have the time to devote to learning a whole new software suite just to get my cameras configured. That's why I was hoping BI would be good for me. But instead I've just spent all my time troubleshooting BI and following the numerous "tutorials" that tell you to dumb down every setting just to get the program to work as it's designed to. DW is looking promising from my limited experience with it, but I'm a little reluctant to be overly optimistic just yet.
 
Dahua gear is generally pretty reliable, but I really don't like their cameras, they suck. I've done several side by side comparisons over the years of Dahua vs. Hikvision, Speco and a few other brands and Dahua is always worse in picture quality in lower light levels, especially at night when using IR.

1) You are indeed correct the UI on these lower tier units is trash, it's a constant headache. Hikvision had a big UI update a couple of years ago which completely scrambled everything in an attempt to modernize the look, which had function as an afterthought. I still think they offer the best value though, which is why I put up with them.

If you're looking for a good user experience and support, Axis is probably the best out there, but you pay for it. Axis cameras are very expensive, but the UI is amazing and they have really good support. I've only ever installed a few of them, but they were a breeze to configure compared to any other vendor, and the video quality is really good.

2) This is true if you get the low-end consumer trash they sell at retail stores, but not with better-quality units generally only available to installers. I have used Hikvision TurboHD DVRs with different brand cameras, because they're compatible with CVBS, CVI, TVI and AHD. Likewise with Hikvision NVRs, because they support any camera that has ONVIF support. The reason you hear that different brand cameras don't work on consumer gear is that the consumer space is flooded with shitty fly-by-night vendors who spec equipment from one of the big camera vendors like Hikvision, Dahua, Speco and a couple of others. They'll get cut-down equipment made for them with their label that usually has locked firmware on the DVR/NVR and sometimes the cameras, so that you can only use their gear.

One such example was Lorex, which used Hikvision gear for a long time and then switched everything over to Dahua. Nothing was compatible between them because they had a locked firmware on their NVRs. There was a way to hack the older NVRs to take an official Hikvision firmware and enable ONVIF support, but it was a long and complicated process.

I never advise my customers to buy the trash available at retail stores because of issues like that, and there's a high chance the company won't be around for more than a year. The companies that sell that crap cycle around all the time and you're basically guaranteed to not have support in the future. Unfortunately, they're never going to go away because they're so cheap and people are going to keep getting burned.

3) Yeah, that's how my customers got into the PC DVR mess. They make it sound like it's cheaper and easier, but it really isn't.

Milestone does have a decent learning curve. I would honestly get an 8-channel Hikvision, Speco or even DW NVR and call it a day. You can probably see I'm partial to Hikvision, but Speco and DW are also pretty good.
 