Been out of the GPU game for a while, need another HDMI/HTPC type card

Format _C:

I want to replace the GPU in two of my computers to support the 'new' Ultra HD resolutions of my Sony XBR-55X850C. I also assume I will have to connect it directly to an HDMI port on the TV, as my AVR is rather old (a 2011 Onkyo TX-NR609).

One of the computers is/was (the only original part left is the motherboard) an Asus CM6830 with a Pentium G620 and an HD 5450.

The second computer is the one in my sig, and it is the one I use the most on the TV, so if I only buy one card it would be for that computer.

I am currently using an Asus GeForce GT 530 (pulled from the Asus CM6830 above).

I do not use either PC to play any newer games (the newest game is from 2004); the rest are emulators for Sega and Nintendo. So I do not need a high-end card. I just want support for Ultra HD resolutions, which I believe needs HDCP 2.2?

So what is the new go-to HTPC/HDMI lower-powered HDCP 2.2 card that is popular today?

Thanks
 
Nvidia GTX 1050 and AMD RX 560 would be my picks for that scenario.
 
The greatest flexibility is the 1050 Ti.

Reason: the 1050 Ti meets all specs for 4k Netflix (the 1050 has too little VRAM, and AMD cards in general are SoL for 4k Netflix streaming), and has both HDMI 2.1 and HDCP 2.2 support; it also doesn't require any additional power cables.

Ultra HD resolution itself does not require HDCP; it's 4k media that actually needs it (UHD Blu-rays, 4k Netflix, etc.). For the uses you have mentioned, the resolution alone does not require HDCP, assuming your display doesn't require it either.
 
GT 1030. It has two high-resolution outputs, DisplayPort 1.4 and HDMI 2.0b. It's the perfect bus-powered HTPC card, whereas many RX 460 cards are not. It's also half the price of the GTX 1050 Ti.

This one is passive:

https://www.newegg.com/Product/Prod...14137140&cm_re=gt_1030-_-14-137-140-_-Product

It can play older games competently, though a little slower than the 750 Ti. That should still be enough for games from 2004, and it should be mountains faster than that GT 530, especially since most of those games won't support native 4k anyway.

Emulation at 4k might be an issue - are you talking Dolphin (newer platform emulation), or older consoles and MAME? The latter is vastly simpler to run.
 
I am talking about the older emulators for Nintendo (NES/SNES/N64) or Sega (Genesis/CD/32X)

Also, how do I get the TV to display resolutions past 1920 x 1080 (AKA 1080p)? The screen resolution slider only goes up to 1920 x 1080. I am using the GT 530 connected directly to an HDMI port on the Sony TV, and I have Windows 7.

I don't think I need HDCP for my purposes, as I just want a monitor with no commercial content (Netflix, UHD, or regular Blu-rays); I use my Xbox One S to play newer games and Blu-ray and DVD movies.
 
OK, I just thought of something: could it be that I also have another monitor connected to the DVI output on the card that only supports 1080p max (a BenQ GL2760H)?
 
It shouldn't matter. I have a 4k monitor and a 1440p monitor hooked up to the same video card and I can run both at native.

Try completely unplugging the BenQ monitor and see, but according to the specs for the GT 530, the card can't do 4k at all.
 
I am talking about the older emulators for Nintendo (NES/SNES/N64) or Sega (Genesis/CD/32X)

Also, how do I get the TV to display resolutions past 1920 x 1080 (AKA 1080p)? The screen resolution slider only goes up to 1920 x 1080. I am using the GT 530 connected directly to an HDMI port on the Sony TV, and I have Windows 7.

I don't think I need HDCP for my purposes, as I just want a monitor with no commercial content (Netflix, UHD, or regular Blu-rays); I use my Xbox One S to play newer games and Blu-ray and DVD movies.


Well, there are two wild guesses I can make:

1. (Less likely) The TV may not like 30Hz, as at best that GT 530 only has HDMI 1.4 (the max resolution it supports is 3840×2160 at 24/25/30 Hz). Although I would have thought 30Hz would be supported.

2. (Most likely) It's possible that the GT 530's HDMI port is below HDMI 1.4, which, if I understand the standards correctly, doesn't "officially" support resolutions above 1080p. (Can't find docs saying which HDMI version the 530 uses, but this seems to be the most likely culprit.)


So either way, you'll need a different GPU to drive your TV at its max resolution.
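To see why the HDMI version matters so much here, a quick back-of-the-envelope check (a rough sketch in Python; the numbers cover active pixels only and ignore blanking intervals, so real-world requirements are somewhat higher, and the ~80% payload factor is the 8b/10b TMDS encoding overhead):

```python
# Rough HDMI bandwidth sanity check (active pixels only; real signals
# add blanking intervals, so actual requirements are somewhat higher).

def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed video data rate in Gbit/s for the active pixels."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Effective payload after 8b/10b TMDS encoding: HDMI 1.4 carries
# 10.2 Gbit/s on the wire (~8.16 Gbit/s of video data), HDMI 2.0
# carries 18 Gbit/s (~14.4 Gbit/s of video data).
HDMI_1_4_EFFECTIVE = 10.2 * 0.8   # ~8.16 Gbit/s
HDMI_2_0_EFFECTIVE = 18.0 * 0.8   # ~14.4 Gbit/s

uhd_30 = data_rate_gbps(3840, 2160, 30)   # ~5.97 Gbit/s: fits HDMI 1.4
uhd_60 = data_rate_gbps(3840, 2160, 60)   # ~11.94 Gbit/s: needs HDMI 2.0

print(f"4k30 needs {uhd_30:.2f} Gbit/s, fits HDMI 1.4: {uhd_30 < HDMI_1_4_EFFECTIVE}")
print(f"4k60 needs {uhd_60:.2f} Gbit/s, fits HDMI 1.4: {uhd_60 < HDMI_1_4_EFFECTIVE}, "
      f"fits HDMI 2.0: {uhd_60 < HDMI_2_0_EFFECTIVE}")
```

That's why HDMI 1.4 tops out at 4k30 while 4k60 needs an HDMI 2.0 card, and why anything below 1.4 can't do 4k at all.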
 
The GT 1030 specifically can't do 4k Netflix; 4k Netflix requires cards with 3GB VRAM or greater, meaning the 1050 and the 1030 are both out of the question if 4k Netflix is a concern.

The OP has indeed stated he isn't interested in 4k Netflix, so I am not going to argue against the 1030 either; I'm just pointing out that not every H.265-capable GPU can decode 4k Netflix streams. Unless something has changed since Netflix/nVidia announced 4k Netflix coming to Pascal, I distinctly remember 3GB VRAM being THE requirement (which would have knocked out the cheapest, at the time, GTX 1050).

Oh and AMD GPUs are completely dead in the water with 4k Netflix, so is Ryzen, as of now.

Hence why I am of the opinion that there MAY be something behind the scenes at play, though it could very well be purely technical.
 
I'm pretty sure it's just Nvidia playing the money game. I think the VRAM requirement is pure bull, since the decode units are fixed-function. The 3GB VRAM requirement means you pay at least $140 for your card.

Really, the h.265 decoder on my GTX 960 2GB works JUST FINE on 4k material. It's just protected content that's locked down here.

They had to pay the Netflix piper for certification, so I guess you get to pay Nvidia. I'm sure the Volta parts will all support it.
 
For the record, when I say 4k Netflix, I mean SPECIFICALLY Netflix, excluding all other 4k content. I am fully aware that the 960/950 and newer GPUs have native h.265 decoding that can handle other mainstream 4k media; 4k Netflix is in an entirely different world of its own, and I have treated it as such ever since it actively required a Kaby Lake CPU and the Edge/Win 10 app, and gimped all other browsers to the point of not even supporting 1080p, let alone 4k. I personally find it complete bull as well.
 
Not sure how I missed it, but I forgot to check your monitor when you first asked about it. Unfortunately, Pascal no longer has analog outputs, so you will also need something like this:

https://www.amazon.com/Cable-Matter...6406&sr=8-2-spons&keywords=dvi-d+to+vga&psc=1

Note, this cable is NOT the same as your DVI to VGA adapter (e.g. this one: https://www.amazon.com/UGREEN-Femal...6406&sr=8-1-spons&keywords=dvi-d+to+vga&psc=1); a DVI-I plug does not fit into a DVI-D socket, as the former carries analog, the latter is digital-only, and the pin layouts are slightly different.
 
My monitor has a DVI input, which I will be using. I do not need an analog VGA output anymore (the monitor does have a VGA input, though).
 
My monitor has a DVI input, which I will be using. I do not need an analog VGA output anymore (the monitor does have a VGA input, though).

That should be okay. From what I have heard, the DVI ports on Maxwell onward are all rated to run at HDMI 2.0 speeds, so a simple passive adapter should be good enough for a second 4k@60 output. That way you have an upgrade path on that second monitor.
 
The greatest flexibility is the 1050 Ti.

Reason: the 1050 Ti meets all specs for 4k Netflix (the 1050 has too little VRAM, and AMD cards in general are SoL for 4k Netflix streaming), and has both HDMI 2.1 and HDCP 2.2 support; it also doesn't require any additional power cables.

Ultra HD resolution itself does not require HDCP; it's 4k media that actually needs it (UHD Blu-rays, 4k Netflix, etc.). For the uses you have mentioned, the resolution alone does not require HDCP, assuming your display doesn't require it either.

I think you mean HDMI 2.0b since HDMI 2.1 standard won't be around until Q1 2018.
 