Question about HDCP-

Noni

Hi there,

Quick Q-

I have a Viewsonic VX2025, and it doesn't have HDCP listed anywhere on the back. But my graphics card (7900GT) has a little HDCP logo on it. Does this mean I can't use HDCP?

Would a Dell 2007FPW with HDCP be better, since that monitor does have it listed on the back? Or would it be the same?

Thank you.
 
I honestly wouldn't worry about HDCP right now.

Personally, I'm gonna wait to buy HDCP crap until it's shoved down my throat.
 
"HDCP" is thrown around a lot but not defined most of the time. At least in terms of the big picture.

In terms of the tech, HDCP is just a handshake: every component in the chain has to complete a valid handshake for the HDCP "flag" to be set to true. So your monitor has to shake hands with your card before the OS sees everything as HDCP compliant.
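If it helps, here's a toy Python sketch of that chain idea (the names and checks are made up for illustration, this is nothing like the real HDCP protocol):

Code:
# every link in the chain has to shake hands, or the whole chain is non-compliant
def chain_is_hdcp_compliant(devices):
    return all(d["handshake_ok"] for d in devices)

chain = [
    {"name": "player software", "handshake_ok": True},
    {"name": "7900GT (HDCP capable)", "handshake_ok": True},
    {"name": "VX2025 (no HDCP)", "handshake_ok": False},
]

print(chain_is_hdcp_compliant(chain))  # False -- one non-HDCP link breaks the chain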

Now, Vista itself is not enforcing HDCP; it just provides the tools to read that HDCP flag. Windows Media Player, which comes with Vista, does enforce HDCP. All that "Blu-ray and HD DVD require 64-bit Vista" stuff is WMP bullshit.

If you don't have HDCP, Vista will not be hindered in any way, just WMP. If you want high-def media and don't have an all-HDCP-compliant setup, you can just use the HD DVD/Blu-ray version of PowerDVD. At least I think it's out now. Even if it's not, there will be plenty of other players that can handle HD DVD/Blu-ray content as soon as said content is readily available. I'm sure WinDVD will, Sonic, all the big names. Zoom Player and VLC ftw :D
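Going by that, the enforcement lives in the player, not in Vista. Another made-up Python sketch of the difference (hypothetical functions, not any real player's API):

Code:
# Vista just exposes the HDCP flag; each player decides what to do with it.
def wmp_play(chain_compliant):
    # WMP-style behavior: block protected HD playback without a full HDCP chain
    if not chain_compliant:
        raise RuntimeError("HDCP chain incomplete, HD playback blocked")
    return "playing"

def non_enforcing_player_play(chain_compliant):
    # a player that ignores the flag just plays anyway
    return "playing"

print(non_enforcing_player_play(False))  # "playing"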

And that's how it will be IF HDCP continues to be adopted.

So no worries. No HDCP = no WMP. Tons of people refuse to use WMP already anyway.
 
So what about DirectX 10 games?

I have heard that HDCP monitors are going to run at a much higher resolution for DirectX 10 games...

Or am I mistaken?

Because that's the way I see it, which is why I'm waiting for the actual release of Vista before I upgrade my video card/monitor.

I know HDCP on both the video card and the monitor will be required to mess around with HD DVD/Blu-ray stuff, but I have heard about needing it for DirectX 10 games as well...
 
HDCP has nothing to do with DirectX 10. HDCP is all about the CP, content protection. DirectX doesn't have anything to do with playing high-def media :)

How and why displays get HDCP is beyond me. The way HDCP usually gets described, it's a mechanism to ensure your hardware can display HD content at its full capacity, i.e. 1080p. But there are plenty of sub-1080p displays that are HDCP compliant. Then there's how HD DVD/Blu-ray content is handled by standalone players: it's 1080p on the disc, but I don't think there are any players that output 1080p yet. So making only 1080p displays HDCP compliant when you can't even use that would be pretty dumb, so that's probably one reason.

Basically, just because a display is HDCP compliant says nothing about its native res.
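In other words, HDCP support and native resolution are two separate line items on a spec sheet. A quick hypothetical Python example (the specs here are made up for illustration, don't take them as real product data):

Code:
displays = [
    {"model": "20in widescreen, no HDCP", "native_res": "1680x1050", "hdcp": False},
    {"model": "20in widescreen, HDCP",    "native_res": "1680x1050", "hdcp": True},
    {"model": "1080p HDTV",               "native_res": "1920x1080", "hdcp": True},
]

# the same native res can show up with or without HDCP -- the two don't imply each other
for d in displays:
    print(d["model"], d["native_res"], "HDCP" if d["hdcp"] else "no HDCP")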

But if you buy any expensive display now or in the future, chances are it has HDCP anyway...
 