TO ALL GF6800 AGP owners

EnderW said:
But the 6600s don't have it either.
WMV9 and WMVHD are the same de/compression codec, different bitrates.

And I can tell that my 6600GT has WMV9/HD acceleration from my low CPU utilization (35%-45%) on a pretty modest CPU (P4 3GHz) while playing back WMV HD 1080 videos.

http://www.nvidia.com/object/feature_on-chip-video.html

Another important factor is that the GeForce 6 Series GPUs are completely programmable and can handle formats such as WMV9 and MPEG-4. The NVIDIA motion compensation engine can provide decompression acceleration for a variety of video formats including WMV9, MPEG-4, H.264, and DiVX. As with motion compensation for MPEG-2, the NVIDIA video engine can perform most of the computation-intensive work, leaving the easiest work to the CPU.

I guess nvidia missed a page. It's already December, the card has been out for 6-7 months, and there's still no good explanation of the state of the feature described above. Class action time. :p
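
For anyone who wants to check their own numbers instead of eyeballing Task Manager, here is a minimal sampling sketch (it assumes Python with the third-party psutil package installed; start it, then play the clip):

    import psutil  # third-party package: pip install psutil

    # Sample total CPU usage once per second for 30 seconds
    # while the WMV HD clip is playing.
    samples = [psutil.cpu_percent(interval=1) for _ in range(30)]
    print("avg %.1f%%  peak %.1f%%" % (sum(samples) / len(samples), max(samples)))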
 
pxc said:
WMV9 and WMVHD are the same de/compression codec, different bitrates.

And I can tell that my 6600GT has WMV9/HD acceleration from my low CPU utilization (35%-45%) on a pretty modest CPU (P4 3GHz) while playing back WMV HD 1080 videos.

http://www.nvidia.com/object/feature_on-chip-video.html

I guess nvidia missed a page. It's already December, the card has been out for 6-7 months, and there's still no good explanation of the state of the feature described above. Class action time. :p

Thank you pxc, finally someone who understands what's going on.
As you can all see, pxc has very low CPU usage with his 6600GT compared to our 6800s. Obviously our 6800s are not accelerating the decode. And yes, WMV9 and WMV9HD are the same codec with different bitrates and different resolutions.
As for a class action, we really need someone who is willing to start one. I know everyone's been saying it, but no one has had the guts to actually bring one. Sign me up if anyone really decides to file the lawsuit.
 
bonkrowave said:
Yeah, I'm feeling the same way you are. I know the first time I loaded up a WMV HD video and saw Task Manager indicate 80-90% CPU usage, with spikes so bad that the video skipped, I was pretty angry.

Nvidia, you're not gonna like me when I'm angry

/turn_green

Uhh, your system doesn't even meet the minimum requirements for WMV HD. Microsoft's site recommends at least a 3GHz processor to play WMV HD movies with minimal skipping. Were you expecting that minimum recommendation to cover your 2500+?
 
Minimum config ASSUMES no hardware decode.

Minimum spec for MPEG-2/DVD decoding is AFAIK a P3-500 or 600, somewhere around there.

I'm fairly certain I can do it with a P2-266 IF I have a HARDWARE DECODER (such as a Hollywood+).

I think at the time Microsoft wrote that spec, hardware decoders for WMV9HD didn't exist.

Rob
 
Well, I DL'd the Step Into Liquid 1080p clip just to see what the hubbub was all about. Dayum! I've missed a lot in the last few years. That was awesome.

Anyway, it drew about 85% peak CPU off a 3500+ with a BFG 6800 Ultra OC and whatever drivers were on the CD that came with it. Do I care? Not really. I bought the card to game with, I never even considered HD CD watching as an option. But I'm glad I read this thread, I DL'd some more of the clips off the Windows site. They are amazing.

And I went ahead and got an x800xt pe that I'll swap in soon to try out; I'll see how much CPU power it needs.
 
pxc said:
WMV9 and WMVHD are the same de/compression codec, different bitrates.

And I can tell that my 6600GT has WMV9/HD acceleration from my low CPU utilization (35%-45%) on a pretty modest CPU (P4 3GHz) while playing back WMV HD 1080 videos.

http://www.nvidia.com/object/feature_on-chip-video.html

Do you have HyperThreading turned on? Turn it off if you haven't already and try playing the video again. HyperThreading can cut a large chunk off of the reported CPU usage, making you think you have acceleration in your chip. It also means that all HT-capable P4 systems do a lot better with HD WMV than Athlon 64 based boxen.
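
To spell out why: Task Manager reports usage averaged across all logical processors, so a single decode thread pegging one logical CPU on an HT P4 (two logical processors) reads as roughly 50%, not 100%. A per-CPU breakdown makes that obvious (a sketch with the same third-party psutil assumption as above):

    import psutil  # third-party package: pip install psutil

    # Per-logical-CPU usage over one second, sampled during playback.
    per_cpu = psutil.cpu_percent(interval=1, percpu=True)
    print("per logical CPU:", per_cpu)  # e.g. [96.0, 7.0] on an HT P4
    print("averaged total: %.1f%%" % (sum(per_cpu) / len(per_cpu)))  # reads as ~50%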
 
finalgt said:
Uhh, your system doesn't even meet the minimum requirements for WMV HD. Microsoft's site recommends at least a 3GHz processor to play WMV HD movies with minimal skipping. Were you expecting that minimum recommendation to cover your 2500+?

The minimum system requirement in this case assumes you have no hardware decoding. Try to understand the situation before blasting out such responses in the future.
 
Robstar said:
Please post CPU usage when you put the x800xt in.

Rob


I have an x800 Pro and I just ran the Step Into Liquid movie (the 1080 one) and I got 99% CPU usage the whole time. Before the x800 Pro I had a 6800 GT. I ran the movie on that too and it had the same CPU usage, but it was way more laggy than my x800 Pro (which skipped a little but was still playable).
 
pandora's box said:
I have an x800 Pro and I just ran the Step Into Liquid movie (the 1080 one) and I got 99% CPU usage the whole time. Before the x800 Pro I had a 6800 GT. I ran the movie on that too and it had the same CPU usage, but it was way more laggy than my x800 Pro (which skipped a little but was still playable).

Well, given that the 6800 supposedly has a WMV HD decoder "to take the load off of the CPU," those numbers sure seem to suggest it's not working ... which we know it is not.

The thing I find sketchy is that when all this hubbub started, Nvidia felt the need to add a little footnote to the WMV HD feature page saying this feature will be implemented in a future driver release. Now I know damn well that was not there before I bought the card. It is pretty clear that Nvidia falsely claimed to have the decoding ability on the card .... and did not follow through.

If I bought a car with air conditioning to cool me off on those hot days, and the first day I try it, it doesn't work, I would be hella pissed. The dealership would be under obligation to fix this, as the car was advertised as having working air conditioning.

Now, I bought the car for transportation, not to sit in and cool myself off. Much the same way, people bought the 6800 for games but thought the WMV HD decoding was a feature they would use, and may have, like myself, used this feature to choose Nvidia over ATI. It was the perceived value of features such as WMV HD that made me choose the 6800 over the x800.

I fail to see why this does not apply in this situation.
 
And I'm not stickin' up for nVidia when I say the WMV HD wasn't why I bought the card. I honestly had no idea such a thing even existed when I bought the card. But I do agree that if they say my card can do something, and later I find out it can't (even if it doesn't apply to my original purchase intent), it still sucks, as it would lower the perceived value of my card on the resale market. Not that I plan to sell it. And this WMV HD issue wasn't why I bought an x800xt pe, either. I just wanted to try one out.
 
Auger said:
Well, I DL'd the Step Into Liquid 1080p clip just to see what the hubbub was all about. Dayum! I've missed a lot in the last few years. That was awesome.

Anyway, it drew about 85% peak CPU off a 3500+ with a BFG 6800 Ultra OC and whatever drivers were on the CD that came with it. Do I care? Not really. I bought the card to game with, I never even considered HD CD watching as an option. But I'm glad I read this thread, I DL'd some more of the clips off the Windows site. They are amazing.

And I went ahead and got an x800xt pe that I'll swap in soon to try out; I'll see how much CPU power it needs.


Wow, I am THE KILLER of THREADS!

Anyway, I just installed the x800xt pe. A few observations:

The "Step Into Liquid" CPU usage rate of 85% max was with the Fast Writes disabled on the BFG 6800 Ultra OC. When I turned them back on, it went down to about 65% max.

Now with the x800xt pe, the CPU usage is basically identical to the 6800 U/OC. This is still with Fast Writes enabled; I'm still fiddling around with this card. I will post CPU usage with Fast Writes off later, just to be fair.

But I haven't heard anybody say that ATI claimed hardware video decode or whatever we are griping about here. All I can say is that with Fast Writes enabled, there is basically no difference in CPU usage between these 2 cards. Also, go to ATI's homepage > Get In The Game > CryTek > CryTek Demo (the 168MB DL...). That thing pegs BOTH cards at 100% CPU. I was surprised; I had already established that it would peg the nVidia 6800 U/OC, but I was very surprised to see it peg ATI's own top-of-the-line card (unless I've missed some tweaks... I'm no expert here).

Now I have a complete operating system loaded on a WD 74GB Raptor with the nVidia BFG 6800 Ultra OC solution, and an identical drive loaded with the ATI x800xt pe solution. All I have to do is swap one SATA cable and the video card to A/B between the two.

So I'll try to answer any A/B questions within reason... I'm not gonna swap cards 20 times a night.

And I have no idea how to take a screenshot. If anybody would care to tell me how that's done, I'm all ears (or eyes, I guess...)

I will say first impressions: the nVidia boots really clean, while the ATI has a few stutters/refreshes during boot, like tearing. But both run really well, really clean. It's nice to know that with over a grand tied up in video cards and supporting HDDs I can find some joy (my last GPU was a Matrox G400Max, so I'm rather stoked).
 
Thanks for a real test. Almost everyone here is just arguing because they can, without proving whether this is true or even asking for numbers. I read this whole thread a few days ago and learned absolutely nothing beyond what I learned on the first page.

To take a screenshot, press Print Screen, load up MS Paint, and press Ctrl+V.
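
If you would rather script the capture, here's a sketch assuming Python with the third-party PIL package (note that overlay-rendered video often captures as a solid rectangle, so the playing clip itself may not show up):

    from PIL import ImageGrab  # third-party package; grab() is Windows-only in PIL

    img = ImageGrab.grab()       # capture the full screen
    img.save("screenshot.png")   # save it out as a PNG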
 
Well, I have all of the 1080p WMV HD clips on a separate HDD, and I can say right now, after just watching a few of them (The Living Sea is magnificent!), the ATI just does not have the beauty of the nVidia. I'm not a shill for either, but the difference is great enough to be readily recognized: washed-out colors, even grainy, with definite pixels. I'm still gonna get HL2 to see how that all goes, but there is no contest in WMV HD quality IMHO.

And don't get me wrong... tonight, when I mounted the ATI for the first time and stuck the nVidia back in the bag, I firmly believed that the BFG was going in the bag for good. But after watching a couple of the vids and running 3DMark 01 (the ATI won by about 300 points...), the BFG is back in the rig.
 
Cool test, Auger....... I talked to my buddy last night and told him about this thread, and he said he ran the Step Into Liquid 1080 video and averaged only 30% CPU usage.... This is with the Dell 8400 3.2GHz with the X800SE. I told him that was impossible, but he assured me that it was true.

Mac
 
Staples said:
Thanks for a real test. Almost everyone here is just arguing because they can, without proving whether this is true or even asking for numbers. I read this whole thread a few days ago and learned absolutely nothing beyond what I learned on the first page.

So it's business as usual then... ATi r0x0r5... No wait nvidia does... !!!!!!s suck... Both companies are making great cards these days.

I am going to build my HTPC soon and plan to stick my 6800 in it and get a faster card (or cards :D) for my gaming rig. I have been a bit leery, thinking I might have to get a new card for the HTPC rig, but based on Auger's results, I think I'll be OK.

I understand the problem with not getting what you paid for, as advertised, but really: what's the practical difference between 35% and 85% CPU utilization, when decoding the video is likely all your rig will be doing at the time?

I guess to answer my own question, maybe those with lesser CPUs than the 3500+ who buy this card specifically because of hardware MPEG-4 support would really have something to bitch about. We're talking maybe 1 in 500 customers, though.

I do agree, though, that nvidia should make this right for those who want or need this functionality.
 
Well, here's the gist of things now, as I see it:

First of all, all of us who own a Geforce 6800 card (with the exceptions of the PCIe 6800NU and PCI 6800LE cards) are SOL. There are now several major reviews of the PureVideo technology (Anandtech, PC Perspective, and others) that make it clear that those of us with an NV40- or NV45-based card are out of luck when it comes to WMV9 hardware decoding, no matter what driver we download or video decoder we purchase, despite the promises nVidia has made, including one now in the Watchdog section of the January issue of Maximum PC. Furthermore, every site that has reviewed or given a blurb about the PureVideo technology has toned down its stance on the product, at most stating "There must be a number of unhappy 6800 owners out there" or "The technology has changed somewhat from initial press releases" and leaving it at that. This tells me that they are leaving us out in the cold, because none of them wants to risk the giant that is nVidia no longer sending them review samples. There is currently no one willing to listen to us here except each other.

AMD users are the hardest hit. All of the review sites so far have tested PureVideo with Pentium 4 HyperThreading processors, except for PC Perspective. The P4's HT technology compensates heavily for the 6800's weaknesses, letting users see far lower CPU usage even with PureVideo disabled and making the problem seem smaller than it is. PC Perspective uses an Athlon 64 FX-53, which, being a top-end CPU, also downplays the situation, and they do not use a Geforce 6800 series card for comparison. Owners of Athlon XP CPUs, low- to mid-range Athlon 64s, Intel Celerons, and P4s without HyperThreading will see far, far higher CPU utilization when playing 1080 WMV9. We were promised something, and there are still documents on nVidia's site (despite additional documents contradicting them) that state that the entire Geforce 6 Series has WMV9 decode acceleration. Anyone who bought the product based on those documents was falsely advertised to, and then strung along by promises of future driver releases. Unfortunately, I know no one in the legal profession, but perhaps someone here does who can step forward; if that doesn't happen, I believe the deception will be complete.

P.S. I have tried the 67.03 drivers and nVidia's 1.00.67 DVD Decoder myself. CPU utilization has not dropped, just as I expected.
 