ATI's advertised HDCP claims = false

jebo_4jc

http://www.dailytech.com/article.aspx?newsid=851

Wow, this isn't pretty. I didn't realize ATI was claiming that their cards are HDCP ready. How would this be possible without an HDMI port? I wonder if some sort of fancy DVI -> HDMI dongle could get this done?
I don't know how many people bought ATI cards because they thought they would be Vista-ready, but if I had, I would be super pissed.



 
The video card does NOT need an HDCP key. The video SOURCE is supposed to "query" the video DISPLAY and ask it for an HDCP decode key... which is NOTHING more than a number ID'ing the manufacturer. The video SOURCE then compares the number to a list of BLACKLISTED numbers and decides whether to allow display or not.

Perfect example is the Comcast HiDef digital DVR box. It supports HDCP, and if you plug a HiRes display into the DVI connector (it doesn't have HDMI connectors), the box will query the display and, if no HDCP answer is received, will simply display a warning message that the display is not HDCP, and that's it. Anyone with a Comcast DVR can test this with any DVI-equipped LCD monitor.

** WARNING: if you do the Comcast test, the box will flip out, and you will need to hit the "magic" button to get the setup menu to force the resolution BACK to 480i so the display goes back to working on your normal TV. **

The ATI and NV boards have the ability to "diddle" the same DVI signal lines to "query" the display for HDCP compliance. This simply needs to be implemented in the display driver software.

The VIDEO CARD itself doesn't NEED HDCP compliance, since it is NOT a display. You can't just stare at the DVI connector on the back and watch the HD video... duh.

The video card is the SOURCE of the signal, and to be HDCP compliant it must implement the HDCP handshaking with the DISPLAY and enforce the HDCP prohibitions.

The "blacklist" of unapproved numbers would then be updated over time as the enforcement arm of the HDCP authority adds new numbers.
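The blacklist half of this is real: in HDCP 1.x the source reads the display's 40-bit Key Selection Vector (KSV), checks that exactly 20 of its bits are set, and refuses full-quality output if the KSV appears on the System Renewability Message (SRM) revocation list. A rough Python sketch of that source-side gate (the KSV values here are made up for illustration, not real keys):

```python
REVOKED = {0x23DE5C42A7}  # hypothetical blacklisted KSV from an SRM update

def ksv_valid(ksv):
    # A legal HDCP KSV is a 40-bit value with exactly 20 ones
    return ksv < 2**40 and bin(ksv).count("1") == 20

def allow_output(display_ksv):
    # Source-side gate: structurally valid KSV that is not revoked
    return ksv_valid(display_ksv) and display_ksv not in REVOKED

good_ksv = 0xFFFFF << 10  # made up, but structurally valid: 20 ones, fits in 40 bits
print(allow_output(good_ksv))      # True
print(allow_output(0x23DE5C42A7))  # False: on the revocation list
```

On real hardware this exchange happens over the DVI/HDMI DDC (I2C) lines, which is exactly the wiring being discussed here.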


It's a really stupid, lame, nonsensical system, typical of people who shouldn't be involved in designing things actually trying to design something and creating YAA... Yet Another Abortion.

A simple black box can sit in the DVI cable between the VIDEO SOURCE and any display (DVI or VGA) and "answer" the HDCP query with a NON-blacklisted number, and voila, you have a HiRes display connected to your HDCP-compliant VIDEO SOURCE and no one is the wiser. This is the method any "black box" will use to make OLD HiDef TVs work with HDCP-compliant sources.

Yeah, like we are going to TOSS 2-3-4 MILLION older HiDef TVs because the HDCP thing came along... suuuure. And I am certainly going to TOSS my $400 Samsung non-HDCP 8ms LCD, toss it right out the window... stupid, stupid non-HDCP monitor.

Once again, fight the "content producer" lame-azz morons the right way... DO NOT BUY THEIR SHIT. If you DON'T BUY IT, they WON'T MAKE IT.

All Vista is going to do is IMPLEMENT the HDCP handshaking, something XP doesn't do. But the VIDEO CARD only needs to have the DVI port wired up to allow the HDCP handshaking to query the HDCP number from the DISPLAY. The VIDEO CARD itself need not do ANYTHING else.

If ATI and NV board makers didn't bother connecting the required DVI signal lines to the GPU so they can be diddled, then that's a shame, but I really find it hard to believe that is the case.

I find it EASY to believe it IS the case that they haven't bothered to add the HDCP querying functionality to their DRIVER SOFTWARE yet, since there simply is NO REASON yet.

End of Rant.
 
Were they advertising that the chips are ready or the cards are ready?

Secondly, I'd like to applaud both IHVs for not rushing to adopt this crap, since it's only going to cause problems and is likely a marketing gimmick. Just a way to make people buy new hardware when their older hardware works just fine.

If anyone is familiar with those DVI->VGA adapters, I could see a hack that works in a similar way coming out at some point. When the query gets sent, the adapter sends back some whack return code that will be accepted, without the expense of needing all of the hardware. Which will likely get replaced 3-4 years down the road, requiring you to upgrade your hardware yet again.
 
xX_Jack_Carver_Xx said:
The video card does NOT need an HDCP key. The video SOURCE is supposed to "query" the video DISPLAY and ask it for an HDCP decode key... which is NOTHING more than a number ID'ing the manufacturer. The video SOURCE then compares the number to a list of BLACKLISTED numbers and decides whether to allow display or not.....
I don't know anything about this stuff, but it seems to me that the source (the video card) needs to have the proper hardware to query the display for HDCP compliance.
 
jebo_4jc said:
I don't know anything about this stuff

MooseEdit: Let us not forget our manners, kids.

As of now, it's a whole unsorted, garbled-up mess. I don't think anyone knows all the facts yet, or whether certain cards are or are not HDCP capable.
 
xX_Jack_Carver_Xx said:
A simple black box can sit in the DVI cable between the VIDEO SOURCE and any display (DVI or VGA) and "answer" the HDCP query with a NON-blacklisted number, and voila, you have a HiRes display connected to your HDCP-compliant VIDEO SOURCE and no one is the wiser.
Sorry, but it doesn't work. The keys have to be requested from the HDCP licensing body, and the keys themselves are never exchanged directly between devices. Instead, AKE (Authentication Key Exchange) is used to send a 'mutually agreed' value.

So a "black box" can't "steal" keys. And if a valid, requested key is ever compromised, it simply gets added to the blacklist.
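The "mutually agreed value" described here is a Blom-style key agreement. In HDCP 1.x, each device carries 40 secret 56-bit keys issued by the licensing body, and each side computes the shared value Km by summing its own secret keys selected by the bits of the other side's KSV. Because all device keys derive from one symmetric master matrix, both sums come out equal, and no key ever crosses the wire. A toy Python illustration (random made-up matrix, not real HDCP keys):

```python
import random

random.seed(1)
MOD = 2**56  # HDCP 1.x private keys are 56-bit values

# Secret symmetric 40x40 matrix, known only to the licensing body (toy values)
S = [[0] * 40 for _ in range(40)]
for i in range(40):
    for j in range(i, 40):
        S[i][j] = S[j][i] = random.randrange(MOD)

def make_ksv():
    # A KSV is 40 bits with exactly 20 set; modeled here as a bit list
    bits = [1] * 20 + [0] * 20
    random.shuffle(bits)
    return bits

def private_keys(ksv):
    # A device's 40 private keys are the KSV-weighted rows of the secret matrix
    return [sum(S[i][j] * ksv[j] for j in range(40)) % MOD for i in range(40)]

def shared_km(my_keys, other_ksv):
    # Each side sums its OWN private keys selected by the OTHER side's KSV bits
    return sum(k for k, b in zip(my_keys, other_ksv) if b) % MOD

src_ksv, snk_ksv = make_ksv(), make_ksv()
src_keys, snk_keys = private_keys(src_ksv), private_keys(snk_ksv)

km_source = shared_km(src_keys, snk_ksv)
km_sink = shared_km(snk_keys, src_ksv)
assert km_source == km_sink  # both sides agree without ever transmitting a key
```

The symmetry of the matrix is what makes the two sums equal, so a box in the middle that only sees the KSVs learns nothing about Km.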
 
xX_Jack_Carver_Xx said:
The video card does NOT need an HDCP key. The video SOURCE is supposed to "query" the video DISPLAY and ask it for an HDCP decode key... which is NOTHING more than a number ID'ing the manufacturer. The video SOURCE then compares the number to a list of BLACKLISTED numbers and decides whether to allow display or not.

...

End of Rant.
Could not be said better. :)

I don't know anything about this stuff, but it seems to me that the source (the video card) needs to have the proper hardware to query the display for HDCP compliance

Read the guy I quoted earlier. He is completely right.
 
Russian said:
Could not be said better :) .

Read the guy I quoted earlier. He is completely right.
What part of the card performs this "decode key" checking? The GPU itself?
 
Russian said:
Read the guy I quoted earlier. He is completely right.
No, he's wrong on a few major points....see my post above for just one of them.
 
Unfortunately those little boxes won't be allowed (i.e. illegal, like cable descramblers), because they defeat the purpose of HDCP, which is an encrypted link from source to display. If you cut the encryption halfway there, then it's no different than an unencrypted signal. Oh, and yes, the video card needs the HDCP chip to do the handshaking, and no cards right now do; that's the whole point of this thread.
 
aarondavis121 said:
Unfortunately those little boxes won't be allowed (i.e. illegal, like cable descramblers), because they defeat the purpose of HDCP, which is an encrypted link from source to display. If you cut the encryption halfway there, then it's no different than an unencrypted signal. Oh, and yes, the video card needs the HDCP chip to do the handshaking, and no cards right now do; that's the whole point of this thread.
lol, they won't be allowed? When has that ever stopped anyone? I sure ain't gonna toss out my 2405.
 
Martyr said:
lol, they won't be allowed? When has that ever stopped anyone?
It stops them when they go, hat in hand, before the HDCP licensing committee to get their keys. And if it doesn't stop them at that point, their key(s) merely get added to the blacklist.

I sure ain't gonna toss out my 2405.
You can use non-compliant monitors with HDCP sources. You merely get a downgraded signal...which is still better than the 480i we've been watching all these years.
 
It'll stop them quick if they don't have the keys to handshake, and if they manage to get a key, all the HDCP authority has to do is blacklist it and the device will stop working.
 
You merely get a downgraded signal...which is still better than the 480i we've been watching all these years.

The downgraded signal is something between 480p and 720p, so decently higher than DVD quality.
 
It doesn't stop working, it just falls back to 1/4 resolution (i.e. 1920x1080 -> 960x540, slightly higher than DVD quality). If the chain of encryption is broken at any point, you can't get full quality from HDCP.
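That quarter-resolution fallback is just each axis halved, which is easy to sanity-check with a few lines of Python (the resolution figures are the ones quoted in this thread):

```python
def constrain(width, height):
    # "Constrained image": halve each axis, leaving 1/4 of the pixels
    return width // 2, height // 2

full = (1920, 1080)
down = constrain(*full)
dvd = (720, 480)

print(down)                                      # (960, 540)
print(down[0] * down[1] / (full[0] * full[1]))   # 0.25: a quarter of full HD
print(down[0] * down[1] / (dvd[0] * dvd[1]))     # 1.5: 50% more pixels than DVD
```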

This is unacceptable to me because I'm not going to buy a new HDTV or a new monitor. HDCP is begging to be cracked because the protection is so inconvenient* to legitimate users. And yes, it will be cracked. I don't mind CSS. It's transparent to my use, since I use commercial DVD playback software, and standalone DVD players work with virtually all TVs (coax-input TVs need an RF adapter, but even those are readily available).

Ars has a good overview of this: http://arstechnica.com/news.ars/post/20060214-6177.html

Video cards that support HDCP will have to be programmed with encryption keys while they are still in manufacturing. ATI confirmed to me that it will not be possible to patch or otherwise update cards without keys through software. Thus, any card already in the marketplace will never support HDCP, no matter what it says on the box.
And ATI is removing "HDCP ready" from card descriptions on their site.

* as in requiring new, expensive hardware purchases that give no quality improvement over current HDTVs or high-resolution monitors, just to support new encryption
 
The Ars article says nVidia has been claiming HDCP support also. I don't remember that.

The blame here should really fall on the HDCP governing body for not finalizing the standards yet. It seems the video card companies are ready to enable HDCP support, but since the standards aren't set, their hands are tied.
 
pxc said:
This is unacceptable to me because i'm not going to buy a new HDTV or new monitor.
If you don't want to buy new equipment, then you won't have any HDCP-protected content to watch anyway, so it's really no big deal. If you're willing to spring for a player or cable box, you can watch non-protected content at full res, or protected content at a lower resolution. Or you can buy a compliant monitor...big deal. New content requires new hardware to watch. Big deal. Upgrades always cost money.

Remember, it's not *your* content. It's the publisher's. If they want to set restrictions on you watching it, no matter how onerous, it's their right to do so. You likewise have the right to not buy it...or buy it and watch it at substantially higher than DVD quality, even if it's not "full res".
 
Here is an excellent article on how HDCP works and on the situation with the lack of graphics cards and monitors. It also goes into specifically what Nvidia and ATI are advertising in relation to HDCP. Which is that they are COMPLIANT, because they have an integrated 165 MHz TMDS transmitter and a DVI port. No card made to date has been HDCP READY, because no third-party manufacturer has added the HDCP chip needed to actually enable HDCP, but technically speaking any manufacturer could do so with any ATI/Nvidia card released in the last 2.5 years.

http://www.behardware.com/articles/603-3/hdcp-the-graphic-card-and-monitor-nightmare.html

Honestly, I think HD-DVD and Blu-ray are going to have a bitch of a time getting anywhere as long as they have all these restrictions on usable hardware. Especially since MS is trying to push to have such devices not work AT ALL with HD content if the entire system doesn't support it, instead of simply downgrading the output to 480p. If they do that, I just don't see why I would ever consider buying any of this hardware and media.
 
arentol said:
Honestly, I think HD-DVD and Blu-ray are going to have a bitch of a time getting anywhere as long as they have all these restrictions on usable hardware.
Remember that 99% of the market is going to just buy a standalone player and plug it into an HDTV. If that TV happens to not be HDCP compliant, and they get a downgraded signal, most people won't even notice it.

...instead of simply downgrading the output to 480p.
50% higher than 480p actually. 960x540.
 
We are just a few years from a massive TV revolution in which huge online libraries of digital content, including all brand-new shows, will be directly downloadable to your cable "DVR" box (this is already taking place to a limited degree). No more recording a broadcast with your TiVo; you will simply choose the shows you want to watch anytime you like after the "broadcast" time, and it will stream the whole thing while you start watching from the beginning. At first this will be 480p only, but before 2020 it will probably all be 720p, and by 2030 we should have the bandwidth for 1080p.

So WHY exactly do we need a new highly restrictive disk based technology when we are rapidly approaching getting the same thing online 24/7??? No wait, no store, no added cost, just simple monthly service fees and watch ANYTHING you want, from the original Three Stooges shows to the movie that just finished its run in theaters. HD-DVD/Blu-ray... gone. Netflix... gone as we know it. (They are already actively preparing for the transition to an online downloadable service for all their titles.)
 
masher said:
Remember that 99% of the market is going to just buy a standalone player and plug it into an HDTV. If that TV happens to not be HDCP compliant, and they get a downgraded signal, most people won't even notice it.

50% higher than 480p actually. 960x540.

Yes, but remember that is totally irrelevant if MS gets their way. They currently want it to either just give you a black screen if you don't have an HDCP TV, or to display a security warning and then go black. I don't know for certain whether this is PC-only, but I am guessing it applies to all devices, since the 960x540 signal would be a simple unencrypted digital signal even from an HD-DVD player, and therefore if the output were plugged into a recording device, the movie could be pirated in 960x540 format quite easily.
 
jebo_4jc said:
The Ars article says nVidia has been claiming HDCP support also. I don't remember that.
They have, sort of. The NV40 was announced claiming HDCP support... and it's true. nvidia makes chips, not boards, and the chip supports HDCP (which is also true for HDCP-capable ATI chips). The Forceware 75 release notes also mention HDCP support.

But you're also right. I can't find any card reviews or spec pages that claim HDCP support for nvidia-based cards. It looks like nvidia card manufacturers were a lot more careful about it.
 
arentol said:
So WHY exactly do we need a new highly restrictive disk based technology when we are rapidly approaching getting the same thing online 24/7???
First of all, HDCP isn't "disk based". And it (or some variant) will still be needed to protect content downloaded directly over the cable.

arentol said:
Yes, but remember that is totally irrelevant if MS gets their way.
This is quite frankly wrong. There are some hardware manufacturers who didn't want to support the ICT flag and downgrade the signal, but that's primarily due to laziness and/or cost issues on their part. Most of the movie studios (and AFAIK Microsoft) support the idea.
 
masher said:
If you don't want to buy new equipment, then you won't have any HDCP-protected content to watch anyway, so its really no big deal. If you're willing to spring for a player or cable-box, you can watch non-protected content at full res, or protected content at a lower resolution.
No one thinks you wave an HD DVD at a TV and it plays. :rolleyes: A $200-$300 standalone HD-DVD player is nothing compared to replacing a big-screen HDTV. An HDCP-infected HDTV to replace my already-working 51" 1080i HDTV would cost over $2200.

BTW, standalone players are not going to output 960x540 to regular TVs. Most TVs can't display arbitrary vertical resolutions. Degraded output will be displayed at 480i or 480p (960x480 vs. 720x480 for regular DVD... not a big difference) for non-HDTVs. Maybe in players that support analog HDTV or unencrypted DVI HDTV, the degraded 960x540 stream will be doubled in both directions to output blurry 1920x1080, or upsampled to 1280x720. Oh boy! Pay for a 1920x1080 movie and a 1920x1080 player and get at most 960x540 resolution on your 1920x1080 non-HDCP HDTV. That's the deal of a lifetime!
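To put those comparisons in numbers, here are the pixel counts of the formats being argued about, relative to plain DVD (figures taken from the posts in this thread):

```python
# Pixel counts for the output options a non-HDCP set might actually see
formats = {
    "DVD (720x480)":           720 * 480,
    "degraded 480p (960x480)": 960 * 480,
    "constrained (960x540)":   960 * 540,
    "full HD (1920x1080)":     1920 * 1080,
}
dvd = formats["DVD (720x480)"]
for name, pixels in formats.items():
    print(f"{name}: {pixels / dvd:.2f}x DVD")
```

The degraded 960x480 output really is only a third more pixels than DVD, while full HD is six times DVD, which is the gap the poster is complaining about losing.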
 
pxc said:
No one thinks you wave a HD DVD at a TV and it plays.
Some people here are apparently under that impression :)

A $200-$300 standalone HD-DVD player is nothing compared to replacing a big screen HDTV.
Ok, first of all, HD-DVD players are likely to start out at around $500. Blu-ray might be as much as twice that. That's for low-end players...high-end gear is going to be triple that.

Second of all, if you own an HDTV *now*, then you bought it to watch OTA HD broadcasts, which you will STILL be able to continue doing. And if you're not satisfied with HD-DVDs that are "only" 50% higher res than normal DVDs, you have every right to not buy a new player. Just as content publishers have every right to make NEW content that no longer works with your old TV.

So who exactly are you trying to blame? You got exactly what you paid for. Expecting old technology to be future proof in a rapidly changing environment is silly. And your early-adopter "penalty" isn't very harsh either. Your TV will still let you view everything you can today...for now and forever.

A HDCP infected HDTV to replace my already working 51" 1080i HDTV
Your TV ALREADY can't display Blu-ray discs at full res anyway. My 62" DLP TV is likely going to have to be replaced. So what? It's only 720p, and by the end of this year, when I'm ready to upgrade, I'll have gotten my use out of it. I have my eye on a new 72" anyway...it will give me an excuse to upgrade :p
 
masher said:
First of all, HDCP isn't "disk based". And it (or some variant) will still be needed to protect content downloaded directly over the cable.


This is quite frankly wrong. There are some hardware manufacturers who didn't want to support the ICT flag and downgrade the signal, but that's primarily due to laziness and/or cost issues on their part. Most of the movie studios (and AFAIK Microsoft) support the idea.

"In contrast, DVI without HDCP is definitely not liked by content owners, because it provides a pristine digital interface that can be captured cleanly. When playing premium content such as HD-DVD and Blu-Ray DVD, PVP-OPM will be required to turn off or constrict the quality of unprotected DVI. As a result, a regular DVI monitor will either get slightly fuzzy or go black, with a polite message explaining that it doesn’t meet security requirements."

Retrieved 2/17/06 from: download.microsoft.com/download/5/D/6/5D6EAF2B-7DDF-476B-93DC-7CF0072878E6/output_protect.doc

THAT is an MS document on what will happen without an HDCP monitor in Vista. Maybe it won't be the case on standalone players; like I said before, I don't know. But it is MS's position on what their systems are planned to do.
 
arentol said:
"When playing premium content such as HD-DVD and Blu-Ray DVD, PVP-OPM will be required to turn off or constrict the quality of unprotected DVI....
There is a VAST difference between your first statement of "if Microsoft gets their way" and the above reality of what they are being required to do by the HDCP licensing committee.

If you want to display HD-DVD or Blu Ray, you have to follow their rules. No choice.
 
PRIME1 said:
Infected? Were you this upset several years ago when you found out that DVD's were copy protected?
Nope. CSS is fine and transparent, like I mentioned above. I assumed people read the threads they comment on. :p I'll repeat: it inconveniences few people for simple playback. Licensed DVD playback software is available for almost all platforms, although it's unwelcome in the Linux community. Standalone DVD players interface with standard TVs and show full quality for the medium.

Why do I complain about HDCP? Because it takes a standard and cripples it. Since when has an "upgrade" that degrades quality been good? HDCP is an infection of standard HDTV. HDCP is not a requirement for HD playback. It's an extra layer, on top of AACS encryption, added to prevent copying of the decoded video stream. Because you know all the kiddies are making 1920x1080 DVI and component video copies of HD movies. :rolleyes:
 
Guys, it can get very confusing mixing up the different aspects of HDCP.

All a video card HAS to do is query the display to assure it is compliant, and inform the client software... say WMP, or QuickTime, or PowerDVD... that the display is compliant.

Now, SINCE the original understanding, the HDCP body has changed the game. Originally the COMPUTER could/would be the machinery to do the decryption and pass the result out the DVI connector using the video card.

After life under DVD Decrypter, the HDCP body decided, hmmm, having the computer do the work... prolly not a good idea, someone will CRACK it. (Like, duh.)

So they want to turn NV/ATI into the copyright police; they want the DECRYPTION to be done on the video card in hardware so there is nothing to HACK. I guess that's cool for THEM, but for us it's a mind-numbingly STOOOOPID idea. We are going to go spend ANOTHER $400+ on a video card with built-in HDCP encryption to watch your shit... hey, FK U content providers, instead, NEW deal... YOU buy ME the nice $400 video card and include it with your $20 DVD, and THEN I'll think about it.


So ATI/NV built in what was needed to be HDCP compliant previously, but the requirements have changed. They built in TMDS DVI so the video card could query the display for HDCP compliance, and marched off thinking, OK, we implemented what we need for HDCP.

Annnnt, WRONG ANSWER. Now the video card must include the decryption hardware, because the HDCP body and content providers don't TRUST the PC itself and the software drivers running on it to do the work... too CRACKABLE.

Well, Pound Sand. Me no buy your crap, me no watch your crap, see ya.

Oh, by the way, what are you gonna do when people CRACK the Linux-based Comcast digital cable boxes and CRACK out the HDCP protection and use non-HDCP displays... gonna CRY like little beeeatches? They didn't make this demand on the Comcast boxes thinking they are sealed set-top things no one would hack... heh, tell it to the Xbox / Linksys routers / TiVos / and the good old DVD player :D

So don't confuse the issue of what NV/ATI "thought" was gonna get HDCP compliance done, vs. what it turns out to require now that the rules have changed.

Meanwhile, the ENORMOUS computing power of your PC won't be bothered to do decryption duty on Blu-ray/HD-DVD tasks.

Oh, stoooopid, foolish HDCP regulators, someone will just CRACK your shit and make a program that "VIRTUALLY" implements all the requirements while running as a program on an ordinary PC, and utterly SNARFLE your waste-of-electrons efforts.
 
You nailed it on the head, pxc.

HDCP serves no real purpose except to make everyone's life more difficult, and it's only around because the "content providers" (who are already making a ton of money with copyable DVDs around) want it to be.
 
pxc said:
Nope. CSS is fine and transparant, like I mentioned above...It inconveniences few people for simple playback: licensed DVD playback software is available almost all platforms...
You're new, eh? Anyone who dealt with CSS in the early days of DVD knows better. There were real and serious problems... some early DVDs would not work on some players... and there was NO software decoding (and hence ZERO ability to watch through your PC) in those days. The situation changed several months after DVDs were out, when the DVD Copy Control Association finally relented. CSS is no different in this respect than HDCP. I'll repeat: no different. Problems existed... and they were worked out.

Similar issues existed with Macrovision, CGMS, and pretty much all the older copy protection standards. Some of them still exist today... I can't even watch a Macrovision-recorded VHS tape on my HTPC, because my video input card detects it and drops the signal. That's with a computer, video card, and monitor all bought long AFTER that tape was recorded. And it STILL doesn't work. And you're crying because the TV you bought before any HDCP content even existed... wasn't FORWARD compatible with it? Truly amazing...


HDCP is not a requirement for HD playback.
So you think CSS is a "requirement" for standard-def playback? Don't be silly. And HDCP is not required for HD playback either... it's a requirement to watch protected content. If you don't like the game, don't play it... you can still watch OTA HDTV.

Freedom. A wonderful thing, isn't it?
 
aarondavis121 said:
You merely get a downgraded signal...which is still better than the 480i we've been watching all these years.

The downgraded signal is something between 480p and 720p, so decently higher than DVD quality.
Per another post, you will only get the downgrade on ANALOG connectors (VGA, component); if you use DVI, you're screwed.
 
xX_Jack_Carver_Xx said:
Well, Pound Sand. Me no buy your crap, me no watch your crap, see ya.
I've never heard this entire issue summed up so perfectly in two sentences.

Isn't it amazing: as the entertainment industry's asscheeks clamp tighter and tighter, the quality of their product grows weaker and weaker. Very few videos in my collection of hundreds were produced after 1990, and I stopped going to movie theaters years ago because I got tired of wasting my money 95% of the time.
 
Nah, what's going to happen is, two years after this stuff all comes out, it's all going to go away, as people manage to get keys out of devices, the keys get added to the "blacklists," and half the displays/players in the hands of consumers no longer work.
 
Elios said:
Nah, what's going to happen is, two years after this stuff all comes out, it's all going to go away, as people manage to get keys out of devices, the keys get added to the "blacklists," and half the displays/players in the hands of consumers no longer work.
Well, there was talk of each manufacturer getting a key pool, large enough to allow them to assign unique keys to each production run, or even uniquely to each device. I don't know if this made it into the final spec or not; anyone else know?
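If per-device keys from such a pool were used, revocation could be exactly that fine-grained: blacklisting one leaked key would disable only that unit, not every device of that model. A toy sketch of the idea (the pool and KSV values are made up for illustration):

```python
# Hypothetical: each unit off the production line gets its own key from the
# manufacturer's pool, so revocation can target a single device.
pool = [0x100 + i for i in range(5)]
devices = {f"unit_{i}": ksv for i, ksv in enumerate(pool)}

revoked = set()
revoked.add(devices["unit_2"])  # this one unit's key leaks and gets blacklisted

working = sorted(name for name, ksv in devices.items() if ksv not in revoked)
print(working)  # only unit_2 stops authenticating; its siblings are untouched
```

With one key per product line instead, that single `revoked.add(...)` would kill every unit at once, which is the scenario Elios is worried about.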
 
Extremetech said:
Even ATI Technologies, which manufactures a line of HDCP-ready Radeon chips and cards, said that the "business model" was simply not there to support the inclusion of the additional chip.

"ATI's newest Radeon graphics are HDCP ready -- meaning that add-in-board partners can enable the HDCP security keys during the manufacturing process if they want to," John Swinimer, a spokesman for the company, said. "ATI has chosen not to include any HDCP compliant chips on its graphics cards, including All-In-Wonder X1900, because all the end-to-end elements are not yet in place to implement a fully compliant HDCP PC. The business model is not yet there."

"I don't have a date available," Swinimer said, when asked for clarification. "But it will occur when all the other elements from the other companies -- content providers, OS, applications, monitors -- et cetera, are ready."

Here is the Article

I know I am a little late to the party here, but now maybe people all over the net will shut the fuck up about this.
ATI never lied; they just don't want to waste money, just like every other chip maker or board maker on the planet.

Personally, we as consumers need to make it clear we don't want this fucking bullshit.
I intend to vote with my wallet and not buy this crap.
 
It was in the GPU Technology section, and the feature was there. I don't remember ATI claiming their actual boards had HDCP chips on them.
 
ATI had "HDCP Ready" listed in their official 9700 Pro specs for over three years.

ATI removed the reference last week. Their action speaks for itself, doesn't it? What else was "HDCP Ready" supposed to mean to the average consumer?
 