Hardware video decoding: who cares?

spine

Like the title says. As each new gen of cards comes along we see more improvements in video hardware acceleration. Great.

I've not had to worry about a lack of CPU video decoding power since upgrading to a PIII 650MHz.

Hardware video overlay is standard and is all that's necessary. QuickTime on Windows doesn't use it, which is why HD shittime-encoded movies may play slowly on decent systems, but that's not a CPU issue; it's Apple being bastards.

So yeah, does anyone actually care about all this hardware video acceleration? Does anyone actually have a system so bad that they need to buy an expensive 9800GTX just to be able to play video properly? Is that scenario even possible?!
 
It's for people who watch higher-quality video.
Without it, some of my 1080p videos struggle to maintain framerate in fast action scenes.
 
Actually, I do care whether a video card has it or not, because I do more than just gaming. I wouldn't buy a high-end card for gaming alone; it would be for games and high-def movies, and then my monitor res.

I'd rather have my CPU doing something else instead of putting its load into a 1080p video or movie. I didn't really have problems; I just prefer it if the card supports the tech.
 
I was under the impression some of the low end 8-series cards had this technology as well.

Personally, on my Opteron 2.7 GHz + 7800 GTX, I still have problems with stuff like 1080p x264-encoded movies. I could also see this being a help for people who want CPU-intensive stuff going in the background while they watch a movie; something heavy running in the background could very well make a movie unwatchable on many systems. Just an idea of why someone might want to offload video decoding.
 
I took hardware acceleration for granted until I ran on a PCI RageIIC while RMAing video cards; it definitely makes a difference.
 
The low-end 8-series cards do offer this. You don't need an expensive card to get it. And as for the OP's question: *I* care. And you apparently cared enough to start a damn thread about it.
 
Yeaaah...I don't give a shit. If I were to even watch a Blu-ray movie, I wouldn't really care if my CPU was at 10% util or 80%.
 
I'd care, except for the fact that I can't find an application that actually uses the 8600GTS HD hardware decoding for H264.
 
Sounds like some people need to get their head out of their asses about things other than old school .mov files. :( I wonder what kinds of video the OP watches on his/her computer.
 
Hardware video overlay is standard and is all that's necessary. QuickTime on Windows doesn't use it, which is why HD shittime-encoded movies may play slowly on decent systems, but that's not a CPU issue; it's Apple being bastards.

So yeah, does anyone actually care about all this hardware video acceleration? Does anyone actually have a system so bad that they need to buy an expensive 9800GTX just to be able to play video properly? Is that scenario even possible?!

Uh. No. QuickTime uses DirectDraw (VMR7), which is better than overlay and is the default renderer in Windows XP. Overlay is pretty bottom-of-the-barrel now and has been for years. QuickTime's issue with HD is that it's software-decoded. It's standard MPEG-4, and the files play just fine on lower-end computers with hardware acceleration.

With that said, let's go back over the different rendering modes, since you still think it's all overlay. A modern PC uses the following for the majority of its rendering: overlay (rarely), VMR7, and VMR9. Vista (and XP machines with .NET 3.5) can use EVR. There are different modes within VMR/EVR as well, but I won't go into them here. Basically, every mode has its own strengths and weaknesses in acceleration and video quality, and all of them offer various levels of hardware acceleration. A sketch of how a player picks one follows below.
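To make the renderer point concrete, here's a minimal C++/DirectShow sketch (my own illustration, not anything from QuickTime) that creates the VMR9 filter explicitly and adds it to a filter graph, which is the same pattern a player uses to opt out of plain overlay. The clip filename is a placeholder.

```cpp
// Minimal sketch, assuming a Windows SDK build environment:
// link with strmiids.lib and ole32.lib. Error handling is trimmed.
#include <dshow.h>
#include <cstdio>

int main()
{
    CoInitialize(nullptr);

    // Build an empty filter graph.
    IGraphBuilder* graph = nullptr;
    if (FAILED(CoCreateInstance(CLSID_FilterGraph, nullptr, CLSCTX_INPROC_SERVER,
                                IID_IGraphBuilder, (void**)&graph)))
        return 1;

    // Create the Video Mixing Renderer 9 explicitly and add it to the graph,
    // so that rendering a file later prefers VMR9 over the legacy overlay path.
    IBaseFilter* vmr9 = nullptr;
    HRESULT hr = CoCreateInstance(CLSID_VideoMixingRenderer9, nullptr,
                                  CLSCTX_INPROC_SERVER, IID_IBaseFilter,
                                  (void**)&vmr9);
    if (SUCCEEDED(hr)) {
        graph->AddFilter(vmr9, L"VMR9");
        std::printf("VMR9 in the graph; RenderFile() will use it.\n");
        // graph->RenderFile(L"clip.mp4", nullptr);  // placeholder filename
        vmr9->Release();
    } else {
        std::printf("VMR9 unavailable, hr=0x%08lx\n", (unsigned long)hr);
    }

    graph->Release();
    CoUninitialize();
    return 0;
}
```

Swap in CLSID_VideoMixingRenderer for VMR7, or CLSID_EnhancedVideoRenderer for EVR on Vista, and the rest of the graph-building code stays the same.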
 
I'd care, except for the fact that I can't find an application that actually uses the 8600GTS HD hardware decoding for H264.
Cyberlink's H.264 decoder. You can configure it in Media Player Classic, or use PowerDVD Ultra.
 
I've got two HTPCs, both with an LG combo HD drive. Each has just enough CPU on its own (E4500 + 9600GT), but hardware acceleration keeps the computer free to do more than one thing, and it keeps things cooler as well.

It's nice to have a video card that handles just about all VIDEO that's thrown at it.
 
I'd care, except for the fact that I can't find an application that actually uses the 8600GTS HD hardware decoding for H264.

As long as the player supports VMR9 it should be good. The codec is also important: you need something like PowerDVD 7.3 installed as well so the program can use its decoder. Core still isn't hardware-accelerated, and neither is ffdshow.

There are a few other issues that can prevent it from working: having a file with improper profile information (this can be easily fixed), or a file that was actually encoded incorrectly. Yes, it's playable, but it's unsuitable for hardware acceleration.
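Since the profile bytes sit at a fixed position in the SPS, you can eyeball them without a full parser. Here's a minimal sketch of my own (not from the linked threads; the filename is a placeholder) that scans a raw Annex-B H.264 elementary stream and prints profile_idc/level_idc, the fields that re-flagging tools patch:

```cpp
// Minimal sketch: scan a raw Annex-B H.264 elementary stream for the first
// SPS NAL (type 7) and print profile_idc / level_idc.
#include <cstdio>
#include <vector>

int main(int argc, char** argv)
{
    const char* path = (argc > 1) ? argv[1] : "clip.h264";  // placeholder
    FILE* f = std::fopen(path, "rb");
    if (!f) { std::perror(path); return 1; }

    std::vector<unsigned char> buf(1 << 20);   // the SPS sits near the start
    size_t n = std::fread(buf.data(), 1, buf.size(), f);
    std::fclose(f);

    for (size_t i = 0; i + 6 < n; ++i) {
        // 00 00 01 start code; low 5 bits of the next byte = nal_unit_type.
        if (buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 1 &&
            (buf[i + 3] & 0x1F) == 7) {
            // The first three SPS payload bytes are fixed-position fields:
            // profile_idc, constraint/reserved flags, level_idc.
            std::printf("profile_idc=%u level_idc=%u\n",
                        (unsigned)buf[i + 4], (unsigned)buf[i + 6]);
            return 0;
        }
    }
    std::printf("no SPS found in the first %zu bytes\n", n);
    return 0;
}
```

If level_idc comes back higher than what your card's decoder accepts (e.g. 51 on hardware capped at level 4.1), that's the "improper profile information" case.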

A good read on the subject: http://www.avsforum.com/avs-vb/showthread.php?t=972503

http://forum.doom9.org/showthread.php?t=132924
 
In the Video Acceleration Settings in WMP, should "overlays" be deselected under both the "Video Acceleration" and "DVD video" headings?
 
So you have to buy extra software just to use this acceleration anyway? lame.

My old 7800GS and opteron 146 handled every HD 1080p video fine when I tried it way back when. See these videos as examples:
http://www.microsoft.com/windows/windowsmedia/musicandvideo/hdvideo/contentshowcase.aspx

And yeah, QuickTime, whatever the technical details, is not using a PC 'properly' to play videos, so users playing an HD QuickTime vid and finding it running slowly are going to be falsely led to believe that their PC is not up to snuff, when it is.

Anyway, my point is simply that in this day and age, with quadcores rapidly becoming the norm, all this hardware video decoding just isn't necessary. Just like a dedicated physics processor isn't necessary.


I don't know why graphics companies are so compelled to introduce all these extra hardware video features. Maybe it's just because they can? Maybe the hardware can be exploited with clever coding to do stuff it wasn't originally designed for?
Kinda like how Nvidia got SSAA going on their entire GeForce line just after 3dfx came out with a true hardware solution. Nvidia can certainly pull their finger out when they need to. :p
 
Probably because graphics cards with stream processors are very well suited to performing tasks on streaming video.
 
My AMD Sempron LE-1150 and ATI Radeon HD 3450 (both very cheap items) play unencrypted 720p videos with only about 10% CPU usage. I use Media Player Classic - Home Cinema. It has built-in HW acceleration without needing any external codec (at least for my card). I have the AVIVO drivers installed, but I don't know if they're needed.
 
Agreeeeeeeed. I believe I'm the twentieth person to say it, but I'll say it anyways. Hardware decoding is awesome for playing all those 1080p rips.
 
What's the best software to play Blu-ray, HD DVD, and regular DVD movies with the best possible picture? And is my stock Q6600 quad-core CPU enough to support full-performance Blu-ray and HD DVD playback at best picture quality?
I've got the 9800GX2 video card.

Thanks.
 
WHAT!!!? I cannot watch anything if it's in SD... hardware decoding improvements are a must.
 
We use PowerDVD HD. With HD DVD movies it's very noticeable when you don't use hardware decoding. Depending on the codec the BD or HD DVD uses, your CPU can really go nuts.

We imaged all our BD and HD DVD movies and run them from the hard drive. I think another important factor is being HDCP-compliant as well as having the hardware acceleration. Without HDCP, you're really not playing anything with hardware acceleration in hi-def. :)
 
This is a bit of a bonus for me, but I'm gonna use it well when/if I get a nice large HDTV of some sort, besides using this HD offloading for PC use.
 
I found it to be incredibly important. A quick lesson: take a big fat MPEG-2 at 10 Mbit and a small frame size (if my vid were 1920x1080, it would proportionately need 30+ Mbit to equal my 704x480).
Anyway, in some coincidental testing after blowing out yet another ATI card (it physically popped a chip), my encoding to H.264 went from 12 fps (still slow, but I did have a 50-million-transistor 9550SE, a "little" card) all the way down to 2.5 fps.
There's also realtime deinterlacing that needs to happen, and the vid is so dense that it really needs a separate place to run, leaving the CPU to work the actual software holding the vid. I learned this years ago, in 1998, with a Rage LT Pro that claimed to decode MPEG-2 (DVD was brand new), and it really worked: I could watch a DVD and browse the net with 64MB of RAM and a P2 350 at 10%. It's a falsity to believe you don't need hardware decoding for intensely standardized vids (even vids we make and slap into a WMV9 profile follow a standard that gets decode support). The CPU handles the front end of what we actually do with the vid, like moving it around on screen and running craptime player for the MP4, etc. etc. etc.
An interesting near-future result for another H.264 benchmark will be how a 390-million-transistor AGP 2600 compares to the 50-million-transistor card I just blew... that should answer any question here about needing decoding or not. :)
This question is common, and I asked it just once, a long time ago....
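For anyone checking that proportionality claim, here's a trivial sketch (my own arithmetic, assuming bitrate scales linearly with pixel count, which is rough but fine for ballparking):

```cpp
// Back-of-the-envelope check: scale a 704x480 bitrate to the 1920x1080 rate
// with the same bits per pixel (linear scaling is a simplifying assumption).
#include <cstdio>

int main()
{
    const double src_mbit = 10.0;              // the post's MPEG-2 bitrate
    const double src_px   = 704.0 * 480.0;     //   337,920 pixels per frame
    const double hd_px    = 1920.0 * 1080.0;   // 2,073,600 pixels per frame

    std::printf("1080p equivalent: %.1f Mbit/s\n", src_mbit * hd_px / src_px);
    return 0;  // prints ~61.4 Mbit/s, inside the "30+ Mbit" ballpark
}
```

It comes out around 61 Mbit/s, comfortably inside the "30+ Mbit" territory the post describes.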
 