TO ALL GF6800 AGP owners

mattman255 said:
"The GeForce 6800 models featured the 1st generation of our programmable
video technology which includes support for high-definition MPEG2 video
decode and standard definition MPEG2 encode, as well as advanced post
processing features such as motion adaptive deinterlacing, and inverse 3:2
pulldown.

The GeForce 6600 models have the same 1st generation programmable video
technology support as the GeForce 6800 models. However, the GeForce 6600
models also include hardware acceleration for high-definition Windows Media
Video (WMV) decode.

The 6800 won't do HD WMV. The card is incapable of doing it, and they're admitting it here. The second paragraph explicitly says that only the 6600s can do it.
 
starhawk said:
/legalspeak
that is conclusive and irrefutable proof of advertising fraud. It puts nvidia's wrongdoing beyond a shadow of a doubt and proves our case.

/normalspeak
that is proof of advertising fraud that is pretty much impossible to prove incorrect. there is no way in #@!! that nvidia would win a jury trial on this; they can't prove us wrong.

all we need is about 60 other people who agree and we can sue the $#!* out of them!

Edit: no, we do not need $$$, except for the retainer fee for the law firm, which will probably be around $1000. the way it works is that they will take a cut of the settlement $$$ if we win, or they will get nothing if we lose (which is impossible).


hi i will join you guys. count me in!
 
As a point of reference (non-6800, old gen video card), my XP-Mobile + Radeon 8500 gets 52-66% CPU utilization on Step Into Liquid.

Definitely not 100% by any means, though I'm certain my overclock helps smooth it out considerably.
 
Now that November is almost over (3 hours left), it's time to ask the question:
What is the status of the video processor on the GeForce 6800 series of cards??
Lots of rumored promises about a driver fix being released by the end of November have gone around, but November is almost over and I see nothing.
 
Jebus, this sure makes ATi look a bit more attractive after reading this thread. I find it interesting that some people are just brushing this aside - do I detect a blind sense of !!!!!!ism going around?

Luckily, living in NZ, I'd be pretty well protected from this under the Consumer Guarantees Act, which basically says that if any product is not fit to do what it is advertised as doing, I can legally get my money back :D
 
Give Nvidia a break. They put effort into their video cards and drivers. Just look at the performance-to-price ratio on the 6800; it's incredible. ATI didn't put in as much effort, as is obvious from looking at their X800 Pro. Did they put in the extra pipelines..??? NO!! Did Nvidia? Yes. And both sell for exactly the same amount. Oh NOOOO!! Some on-board video processing feature that was advertised does not work. OMg, any modern computer that can take advantage of a 6800 can run a DVD. Most people enthusiastic enough to buy a 6800-series video card in the first place have a strong computer, more than capable of decoding some movie.

And why would ANYone watch a movie on one screen and play a game on the other?????
lol. Humans can't HT like machines, you know.

Or just get a DVD player; they are cheap now.
 
{NcsO}ReichstaG said:
Omg, just give Nvidia a break. They put effort into their video cards and drivers. Just look at the performance-to-price ratio on the 6800; it's incredible. ATI didn't put in as much effort, as is obvious from looking at their X800 Pro. Did they put in the extra pipelines..??? NO!! Did Nvidia? Yes. And both sell for exactly the same amount. Oh NOOOO!! Some on-board video processing feature that was advertised does not work. OMg, any modern computer that can take advantage of a 6800 can run a DVD. Most people enthusiastic enough to buy a 6800-series video card in the first place have a strong computer, more than capable of decoding some movie.

And why would ANYone watch a movie on one screen and play a game on the other?????
lol. Humans can't HT like machines, you know.

Or just get a DVD player; they are cheap now.

Grow up.
 
I'm curious...is CPU usage far lower for HD WMV files when using an ATI Radeon 9500 or 9700 class card? Some have claimed that the GeForce 6800 is a step backward from these cards in video for this reason, but I don't know that these cards have hardware decode support for HD WMV formats.

I am bummed by this problem, but my card does seem to be okay with the MPEG-2/MPEG-4 files I have. I'd just like to know as I had a Radeon 9700 before my GF 6800.
 
{NcsO}ReichstaG said:
ambiguous. Why not put in some constructive criticism?

i can do that:

you are acting immature, sort of like a 12-year-old. grow up, get a life, or become an editorialist for a newspaper that nobody will ever read. and i am awfully close to putting you on my don't-trade-with list due to your immaturity...
 
Well, I guess there is no reason to argue with others on the forum, because they didn't screw us over - Nvidia did. It is almost a duty to make sure Nvidia knows that if they break a law, the public will not tolerate it. If we let them get away with it this time, they are going to expect to get off the hook next time as well, because they see nobody will stand in their way.

I guess the next thing that comes to my mind is this. About a year down the road (or however long it takes for the third Half-Life to come out), you are playing the third Half-Life. You have a 6800GT and everything is going smoothly; then a little LCD pops up out of nowhere and the G-Man shows up on the screen. Your frames go from 50fps to below ten. You notice that there is plenty of graphical goodness on your screen, but no more than there was before, so why did the game slow down? Wait - the movie on the LCD screen. You find out later that some programmer at VALVE decided to have the movies in the game played by a separate player so he can easily import more movies into the game, also making mods easier to create. You notice these are pretty high-quality movies, and the video processor on your nice expensive 6800GT isn't worth a dime or a nickel or even a penny.

It may not seem possible at this point in time, but we are not worried about the 6800s being unable to play the games of today; it's the games of tomorrow we are worried about, are we not?

Vogar34
 
i've got an nvidia in the comp that i'm using right now. it's not the best, but it does its job. i'm building a boxen (don't ask... it'll be at least a month before i can make any serious progress... that's when i'll make the thread about it) and it's almost certainly gonna have an nvidia card in it. the comp that i'm getting for xmas (i hope) however, is gonna have an x800 in it. it's not that i trust one company or the other... just that i know quality when i see it.

both ati and nvidia make very good graphics cards. it's just that they don't necessarily do what's needed.

pity voodoo doesn't make graphics cards anymore... they would probably beat the crap out of both ati and nvidia considering how good those were back in the day...
 
I have not seen an answer to Doodman's question: have recent drivers corrected the problem?

I have downloaded all the non-DRM 1080 movies from Microsoft's site: http://www.microsoft.com/windows/windowsmedia/content_provider/film/ContentShowcase.aspx

I played them on my work PC, a 2.4GHz P4. It runs at 100% CPU as expected due to the weak on-board video. I will try them on my home PC listed in my sig to check CPU loads. I know the 6800 should put less strain on the system, as it is a newer-generation card.

Again, have recent driver updates fixed (patched (covered up)) the problem?
 
starhawk said:
i've got an nvidia in the comp that i'm using right now. it's not the best, but it does its job. i'm building a boxen (don't ask... it'll be at least a month before i can make any serious progress... that's when i'll make the thread about it) and it's almost certainly gonna have an nvidia card in it. the comp that i'm getting for xmas (i hope) however, is gonna have an x800 in it. it's not that i trust one company or the other... just that i know quality when i see it.

both ati and nvidia make very good graphics cards. it's just that they don't necessarily do what's needed.

pity voodoo doesn't make graphics cards anymore... they would probably beat the crap out of both ati and nvidia considering how good those were back in the day...

3Dfx had its own problems, which people are quick to forget. Like never designing a card that did better than 1x AGP, and that includes all of their late stuff. Like texture limitations in some of their top-end cards (e.g., the Voodoo3) that hampered Quake 3 performance compared to the nVidia Riva TNT2 Ultra and the original GeForce. Like being limited to 16-bit color in games for a long time. They were not a panacea by any means, and these issues, along with some management issues, ultimately doomed them.

BTW Astronutty, the newest drivers have not fixed the problem. I'm running 66.81 at home I believe; others have tried the 67.xx betas, no luck. The drivers just released today are only to fix a GF6600 issue with SiS chipsets, so I'm sure they don't fix it either.
 
I was curious, after reading this thread, what codecs were affected. I recently got a BFG 6800 card and have been absolutely ecstatic. Even on my comparatively pokey 2500+ Barton running at 3200+ speeds, I can play D3, HL2 and Far Cry at very respectable resolutions and IQ - no mean feat. :) I do occasionally watch DVDs on my system while doing other things, though not terribly CPU-intensive tasks, and I haven't really noticed a slowdown so far. I'd like to try stressing my system to see if mine is one of the affected models.
 
LoneWolf said:
3Dfx had its own problems, which people are quick to forget. Like never designing a card that did better than 1x AGP, and that includes all of their late stuff. Like texture limitations in some of their top-end cards (e.g., the Voodoo3) that hampered Quake 3 performance compared to the nVidia Riva TNT2 Ultra and the original GeForce. Like being limited to 16-bit color in games for a long time. They were not a panacea by any means, and these issues, along with some management issues, ultimately doomed them.

:confused: isn't 3Dfx the company that makes those wildcat cards for workstations?
 
When I saw the new driver release, I was hoping it was the fix for the video processor; sadly, it is not. So November is over and there's no driver fix. I guess I should stop recommending that my friends buy Nvidia.
 
starhawk said:
:confused: isn't 3Dfx the company that makes those wildcat cards for workstations?

No. That's 3DLabs, makers of the Permedia series of graphics chips. 3Dfx made the first truly good 3D graphics accelerators on the market, the Voodoo and Voodoo2, among other products. They were truly revolutionary for their time (around late '96, early '97 if memory serves; I was the first kid on my block to have a Voodoo, the Orchid Righteous 3D), and they drove the market. Their first cards were 3D-only; you needed a 2D card like a Matrox Millennium for your everyday work. Their history runs longer than that; Anandtech has some good articles on them, as do other sites.

As for whoever asked about the codecs involved in this nVidia issue, it concerns HD .WMV files, which use the Windows Media 9 codec. Unless things change, this codec will be the backbone of the HD DVD standard coming out in the next year or two. The way to test this is to download a 1080-resolution .WMV file from the Windows Media site. Even with the system in my sig, CPU usage is 90-100% playing one of these back, and you can't multitask or it'll get mighty choppy.
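
If you'd rather log real numbers than eyeball Task Manager while the clip plays, something along these lines works as a rough sketch. It assumes Python and the third-party psutil package are installed on the test box (neither is mentioned anywhere in this thread), and the 60-second sampling window is arbitrary:

Code:
import psutil  # third-party package; assumed installed alongside Python on the test box

def log_cpu(duration_s=60, interval_s=1.0):
    """Sample system-wide CPU usage once per interval and print mean/peak at the end."""
    samples = []
    for _ in range(int(duration_s / interval_s)):
        pct = psutil.cpu_percent(interval=interval_s)  # averaged across logical CPUs
        samples.append(pct)
        print("CPU: %5.1f%%" % pct)
    print("mean %.1f%%  peak %.1f%%" % (sum(samples) / len(samples), max(samples)))

if __name__ == "__main__":
    # Start the 1080 .WMV playing in your media player first, then run this.
    log_cpu()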
 
doodman said:
When I saw the new driver release, I was hoping it was the fix for the video processor; sadly, it is not. So November is over and there's no driver fix. I guess I should stop recommending that my friends buy Nvidia.

Couple things..

One, a driver fix won't fix it. It is a hardware-related issue.

Two, how many of your friends will be doing another task while watching HD videos, which are meant to take up the whole screen...

And three, even if the video processor is borked, at 100% usage I don't get any lag.. I don't do much else when watching videos.. especially HD ones, if I watch them at all..
 
nullvector said:
Hmm, just ran some of the 1080p videos from Microsoft using WM9. Videos are here: http://www.microsoft.com/windows/windowsmedia/content_provider/film/ContentShowcase.aspx

Only used 20-25% CPU for 1080p; the 720p was about 10% CPU.
Current system:
Abit IC7-G Max3
P4 - 3.0C HT
1GB - Corsair XMS Extreme
BFG 6800 GT OC
Audigy2 ZS Platinum

Pic for proof is here: http://adam.loadedregister.com/pics/1080p.jpg It was 28% at that moment; it mostly stayed around 23%.

What videos (if you haven't, try SiL, as it's the one most refer to)? What drivers? And what OS.. looks like 2000.. hm..
 
I have read both sides of the argument, and I would like to say this to those who persistently defend nVidia despite its failure:

Please! I know that the HW decoder feature prolly affects nobody. But that does not mean nVidia should be forgiven for its fault. The decoder is an advertised feature, and by law, nVidia has the responsibility to fulfill what it promised.

If a person makes a mistake, and the mistake is inconsequential, it is only reasonable to just forgive him and "give him a break". After all, we are only human, and no human is perfect.

But we are dealing with a company here, not a person. We cannot forgive a corporation the way we forgive somebody. When a company fails to deliver what we paid for, it should be held accountable. In business, being "reasonable" gets you nowhere. What really matters is that both the company and its customers do everything according to the rules.

In America, we believe in fair business. This is why we have laws dictating what businesses are expected to do. This system fails when 1) the laws are broken and the guilty party gets away with it, 2) the laws are twisted and abused, or 3) no one cares about the laws anymore.

I really don't think we should condone this sort of misconduct by nVidia. Let's set aside our personal feelings about nVidia and its products. Whether nVidia makes powerful video cards is irrelevant to whether it should abide by the rules. In order to keep a fair business environment in this industry, everyone must live up to the standards.
 
[Camper]AWPerator said:
I have read both sides of the argument, and I would like to say this to those who persistently defend nVidia despite its failure:

Please! I know that the HW decoder feature prolly affects nobody. But that does not mean nVidia should be forgiven for its fault. The decoder is an advertised feature, and by law, nVidia has the responsibility to fulfill what it promised.

If a person makes a mistake, and the mistake is inconsequential, it is only reasonable to just forgive him and "give him a break". After all, we are only human, and no human is perfect.

But we are dealing with a company here, not a person. We cannot forgive a corporation the way we forgive somebody. When a company fails to deliver what we paid for, it should be held accountable. In business, being "reasonable" gets you nowhere. What really matters is that both the company and its customers do everything according to the rules.

In America, we believe in fair business. This is why we have laws dictating what businesses are expected to do. This system fails when 1) the laws are broken and the guilty party gets away with it, 2) the laws are twisted and abused, or 3) no one cares about the laws anymore.

I really don't think we should condone this sort of misconduct by nVidia. Let's set aside our personal feelings about nVidia and its products. Whether nVidia makes powerful video cards is irrelevant to whether it should abide by the rules. In order to keep a fair business environment in this industry, everyone must live up to the standards.

Yep, you're right. Nice summary.
 
DropTech said:
What videos (if you haven't, try SiL, as it's the one most refer to)? What drivers? And what OS.. looks like 2000.. hm..

It was the Dolphin 1080p vid, Windows XP Pro (visual themes were off). Drivers are the latest Forceware as of 11/30, not the new version that came out today, 12/1.
 
[Camper]AWPerator said:
I have read both sides of the argument, and I would like to say this to those who persistently defend nVidia despite its failure:

Please! I know that the HW decoder feature prolly affects nobody. But that does not mean nVidia should be forgiven for its fault. The decoder is an advertised feature, and by law, nVidia has the responsibility to fulfill what it promised.

If a person makes a mistake, and the mistake is inconsequential, it is only reasonable to just forgive him and "give him a break". After all, we are only human, and no human is perfect.

But we are dealing with a company here, not a person. We cannot forgive a corporation the way we forgive somebody. When a company fails to deliver what we paid for, it should be held accountable. In business, being "reasonable" gets you nowhere. What really matters is that both the company and its customers do everything according to the rules.

In America, we believe in fair business. This is why we have laws dictating what businesses are expected to do. This system fails when 1) the laws are broken and the guilty party gets away with it, 2) the laws are twisted and abused, or 3) no one cares about the laws anymore.

I really don't think we should condone this sort of misconduct by nVidia. Let's set aside our personal feelings about nVidia and its products. Whether nVidia makes powerful video cards is irrelevant to whether it should abide by the rules. In order to keep a fair business environment in this industry, everyone must live up to the standards.

Um.. if I remember correctly, they only advertised the actual modes that are supported. I'll dig up both my Ultra and GT boxes and see what they say.. but... so I don't think they technically did much wrong. Plus, the decoder was already fixed.
 
DropTech said:
Couple things..

One, a driver fix won't fix it. It is a hardware-related issue.

Two, how many of your friends will be doing another task while watching HD videos, which are meant to take up the whole screen...

And three, even if the video processor is borked, at 100% usage I don't get any lag.. I don't do much else when watching videos.. especially HD ones, if I watch them at all..

I probably forgot to say this, but where I work we have a whole studio for filming. It's not the movie or TV business; it's for internal use. We do lots of video-related stuff. I recommend these video cards to the users specifically for the video processor, but if that part of it does not work, I'm not recommending it. I'm currently telling them to get a 6600GT instead, because that appears to work. 3D is not important to these people. And no, we're not going to spend thousands of dollars on a dedicated encoder card.
 
doodman said:
I probably forgot to say this, but where I work we have a whole studio for filming. It's not the movie or TV business; it's for internal use. We do lots of video-related stuff. I recommend these video cards to the users specifically for the video processor, but if that part of it does not work, I'm not recommending it. I'm currently telling them to get a 6600GT instead, because that appears to work. 3D is not important to these people. And no, we're not going to spend thousands of dollars on a dedicated encoder card.

The 6600GT is the only card I would recommend for this kind of playback.. It beats every other card on the market in the same category (gaming..).
 
nullvector said:
Hmm, just ran some of the 1080p videos from Microsoft using WM9. Videos are here: http://www.microsoft.com/windows/windowsmedia/content_provider/film/ContentShowcase.aspx

Only used 20-25% CPU for 1080p; the 720p was about 10% CPU.
Current system:
Abit IC7-G Max3
P4 - 3.0C HT
1GB - Corsair XMS Extreme
BFG 6800 GT OC
Audigy2 ZS Platinum

Pic for proof is here: http://adam.loadedregister.com/pics/1080p.jpg It was 28% at that moment; it mostly stayed around 23%.

That is because you have a HyperThreading CPU. If you turn off HyperThreading, you'll see your usage spike to 90-100%. If you have a P4 non-HT, or a Celeron, or an AMD CPU, the 90-100% usage is very apparent.
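
Rough arithmetic behind that claim (my own sketch, not anything from nVidia or Microsoft; real readings will differ because the player uses more than one thread): the overall Task Manager figure is the average across logical processors, so a decode thread pegging one of two HyperThreading logical CPUs shows up as roughly half.

Code:
# Illustration only; the per-CPU loads below are made-up examples.
def overall_cpu(per_logical_cpu_loads):
    """Average the per-logical-CPU loads, which is how the overall figure is reported."""
    return sum(per_logical_cpu_loads) / len(per_logical_cpu_loads)

print(overall_cpu([100.0]))       # non-HT P4 / Celeron / Athlon: one CPU pegged -> 100.0
print(overall_cpu([100.0, 0.0]))  # HT P4: one logical CPU pegged, the other idle -> 50.0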
 
This is why I took my card back.
This is probably why the gaming performance while using a second monitor is complete crap.

Sorry, I'm no sucker. I work for my money, and for 300 dollars, I expect a piece of computer equipment to work perfectly.

When I code my programs at work, they are expected to do exactly what they are supposed to do. Any less than that and there is hell to pay.

I expect the same from any other company.
 
caitiff said:
I was curious, after reading this thread, what codecs were affected. I recently got a BFG 6800 card and have been absolutely ecstatic. Even on my comparatively pokey 2500+ Barton running at 3200+ speeds, I can play D3, HL2 and Far Cry at very respectable resolutions and IQ - no mean feat. :) I do occasionally watch DVDs on my system while doing other things, though not terribly CPU-intensive tasks, and I haven't really noticed a slowdown so far. I'd like to try stressing my system to see if mine is one of the affected models.
Try attaching a second monitor :rolleyes:
 
I'm running an Athlon 64 3000+ with a 9800 Pro right now.....I just upgraded to a 23" LCD and the 9800 Pro was lagging in CS: Source and Half-Life 2, so I just picked up a 6800GT and am installing it tonight. I have all the WMV-HD movies at home, so I'll see what CPU utilization I get before and after the install and will get back to the board with results shortly thereafter....

Mac
 
I wonder why everyone is so mum about this issue. No review sites have made any news of it in the past month. Did they all just give up? Or is there some secret they're hiding...? hmmmm
 
LoneWolf said:
That is because you have a HyperThreading CPU. If you turn off HyperThreading, you'll see your usage spike to 90-100%. If you have a P4 non-HT, or a Celeron, or an AMD CPU, the 90-100% usage is very apparent.
My Athlon 64 2.4GHz and 6800GT had 45-55% for the T2 1080 video.

edit: i posted this a couple of months ago in the other 6800 vp thread (65.76 drivers):
[attached screenshot: 6576vp.png]
 
The T2 video is not as CPU-intensive as the Step Into Liquid video. Try that one instead.
 
doodman said:
The T2 video is not as CPU-intensive as the Step Into Liquid video. Try that one instead.
I did at the time. See the older thread. No significant change.
 
Mav451 said:
As a point of reference (non-6800, old gen video card), my XP-Mobile + Radeon 8500 gets 52-66% CPU utilization on Step Into Liquid.

Definitely not 100% by any means, though I'm certain my overclock helps smooth it out considerably.

See specs in sig.

This system managed about 80% CPU usage playing 'Step Into Liquid'. Would it matter much that the source image is 1080p while my monitor's maximum vertical resolution is 1024? I'd think resampling the image down to the lower size would increase CPU usage?

I'm presuming we are talking about playing the video fullscreen, right?

Anyway, if the 6800s really are pegging at 100% CPU usage while playing this video, that IS a pretty substantial step backwards.

[attached screenshot: sil_cpu.gif]
 
LoneWolf said:
That is because you have a HyperThreading CPU. If you turn off HyperThreading, you'll see your usage spike to 90-100%. If you have a P4 non-HT, or a Celeron, or an AMD CPU, the 90-100% usage is very apparent.

So what *should* CPU utilization be for 1080p videos?

If I run 20-30% utilization for a 1080p video on a *supposedly* bad 6800GT, what should it be if it were "working correctly"?

There's some middle ground where CPU work (streaming the I/O of the file, sound processing, etc.) meets the optimization of the 6800 for the video. In other words, when they get this supposed problem fixed, we still won't be seeing 0% CPU utilization.

What do non-6800 cards see for CPU utilization on this video? I'd like to see a system with specs like mine, possibly with an ATI card, compare its CPU usage to mine, to see whether the 6800 is really the bottleneck or 20-30% is just the norm for this combination of CPU/memory/etc.

Microsoft recommends a 128MB video card and a 3.0+ GHz processor to play the 1080p videos. I would suspect anything under either of those specs wouldn't work. So, say you have under 3GHz but a 128MB 6800 - you might not achieve optimal results. From a pure requirements standpoint, 90% CPU usage seems to fit the model if you don't meet the video requirements.

Is there anyone with a 3GHz machine that doesn't have HyperThreading, who also has a 6800, AND still has 90-100% usage?

I got the same CPU usage (20-30%) with my old 9600 Pro, on the same videos, as I do with my 6800GT. Kinda weird, especially since everyone is saying the 6800 has a huge problem. Maybe I am hitting a middle point where the HT negates the acceleration of either card?
 
KingPariah777 said:
This is why I took my card back.
This is probably why the gaming performance while using a second monitor is complete crap.

Sorry, I'm no sucker. I work for my money, and for 300 dollars, I expect a piece of computer equipment to work perfectly.

When I code my programs at work, they are expected to do exactly what they are supposed to do. Any less than that and there is hell to pay.

I expect the same from any other company.

I agree with you.

I suggest that anyone who is not happy with the broken decoder return the card (while you still can).

To nVidia: "Look, you did a fabulous job making a high-performance gaming chip. Without a shadow of a doubt, the 3D accelerator meets or exceeds the spec. But as of today, some of the advertised functions are still incomplete. Why don't we just return the merchandise for now while you take the necessary time to finish your job? Once you have completed your work, we will probably buy the card back. Fair enough?"
 
[Camper]AWPerator said:
I agree with you.

I suggest that anyone who is not happy with the broken decoder return the card (while you still can).

To nVidia: "Look, you did a fabulous job making a high-performance gaming chip. Without a shadow of a doubt, the 3D accelerator meets or exceeds the spec. But as of today, some of the advertised functions are still incomplete. Why don't we just return the merchandise for now while you take the necessary time to finish your job? Once you have completed your work, we will probably buy the card back. Fair enough?"
Too late if you ask me. Ever since they released the FX series, I haven't trusted that company.
 
nullvector said:
So what *should* CPU utilization be for 1080p videos?

If I run 20-30% utilization for a 1080p video on a *supposedly* bad 6800GT, what should it be if it were "working correctly"?

There's some middle ground where CPU work (streaming the I/O of the file, sound processing, etc.) meets the optimization of the 6800 for the video. In other words, when they get this supposed problem fixed, we still won't be seeing 0% CPU utilization.

What do non-6800 cards see for CPU utilization on this video? I'd like to see a system with specs like mine, possibly with an ATI card, compare its CPU usage to mine, to see whether the 6800 is really the bottleneck or 20-30% is just the norm for this combination of CPU/memory/etc.

Microsoft recommends a 128MB video card and a 3.0+ GHz processor to play the 1080p videos. I would suspect anything under either of those specs wouldn't work. So, say you have under 3GHz but a 128MB 6800 - you might not achieve optimal results. From a pure requirements standpoint, 90% CPU usage seems to fit the model if you don't meet the video requirements.

Is there anyone with a 3GHz machine that doesn't have HyperThreading, who also has a 6800, AND still has 90-100% usage?

I got the same CPU usage (20-30%) with my old 9600 Pro, on the same videos, as I do with my 6800GT. Kinda weird, especially since everyone is saying the 6800 has a huge problem. Maybe I am hitting a middle point where the HT negates the acceleration of either card?

As I said, my machine (specs in sig) is getting 90% CPU usage running the 1080 "Step Into Liquid" with the Forceware 66.81 drivers; as you can see, my system is no slouch. The gist of the problem is simple, though some people misunderstand its nature. From all that I've read, nVidia's Video Processor on the 6800 has problems with the decode of HD .WMV files, which use the Windows Media 9 codec. MPEG-2/4 is fine (I get 30-50% CPU usage playing files of this type, which is acceptable). The 6600, released later, does not have these problems; you'll see that CPU usage is far lower when using a 6600 or 6600GT to play Step Into Liquid.

The crux of the issue is that nVidia claimed that the Video Processor they designed and put into the 6800 could encode and decode MPEG-2, MPEG-4, and .WMV files in hardware. However, most driver revisions do not enable the Video Processor at all. When it was found that the WMV portion did not work on the 6800, nVidia did not fix the problem in the 6800, but apparently made sure it worked in the 6600 class of cards. Since Microsoft's Windows Media 9 codec has been approved as the official video standard for the HD DVD format, this will affect a number of people when the format arrives on the market.

nVidia has not clarified the issue much for users. There are claims on their website that lead one to believe a driver revision could fix the problem, but at the same time, the problem is rooted in the hardware of the 6800 chip, so it's unclear (and quite possibly unlikely) that it can be fixed with drivers. One could guess that the flaw in the 6800 is significant enough that it might require a redesign of the chip to make things work; otherwise nVidia could have just released a new stepping of the 6800 GPU. Either way, the Video Processor is still disabled in the current Forceware drivers, a huge disappointment to a lot of users.

This is the kind of thing class-action lawsuits are made of, if enough people get upset. I'm torn myself; I love the speed my card has in games, yet I do use my computer for video, and knowing this, a GeForce 6600GT or a Radeon card of some sort may be a better long-term choice for me. I have the next several days to decide whether to take my video card back, though if there's a restocking fee, I might do better selling it on eBay should I decide I really need the video processor features.
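
For a rough sense of why software decode of these clips is so punishing compared to a DVD, here are my own back-of-the-envelope numbers (frame rates are assumptions, and codec complexity isn't captured at all; this only compares raw pixel rates):

Code:
# Back-of-the-envelope pixel-rate comparison; frame rates are assumptions.
def pixel_rate(width, height, fps):
    """Pixels a decoder must produce per second of video."""
    return width * height * fps

dvd     = pixel_rate(720, 480, 30)    # NTSC DVD: ~10.4 million pixels/s
hd_720  = pixel_rate(1280, 720, 24)   # 720 showcase clip: ~22.1 million pixels/s
hd_1080 = pixel_rate(1920, 1080, 24)  # 1080 showcase clip: ~49.8 million pixels/s

print("1080 is roughly %.1fx the pixel rate of a DVD" % (hd_1080 / dvd))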
 