TO ALL GF6800 AGP owners

gamer1drew said:
Damn, people, you hear rumors that something doesn't work, and you start claiming lawsuit? I have a 6800 GT, and don't really care about the video processor. My 6800 GT performs great now, and that's what's really important to me. I didn't buy it for the video processor, and I don't think many of you did. True, I guess it sucks if it won't work, but it's not the end of the world. Nvidia made a mistake; it happens. If you bought the card with the video processor as one of the sole reasons to get it, then yeah, RMA it. Otherwise give them a break, at least until we know for sure.

I own a 6800nu and I have no qualms about it either, but yeah, it still blows goat balls in dinosaur hell if it's true.
 
Hey, I'm sorta new to this issue, but I was wondering, how do you tell if the video processor works or not?

I got a 6800 GT from BFG recently. The first one I got couldn't even fucking run at 370 MHz (BTW even if you downclock the card to prove that the frequency is the problem, don't tell BFG; they consider downclocking to void your warranty!), so I got an RMA. This one works at 370, but just barely. It's the shittiest overclocker I've ever heard of. This whole video processor thing might be a great excuse to return this and get myself a decent card.
 
Clownboat said:
Hey, I'm sorta new to this issue, but I was wondering, how do you tell if the video processor works or not?

I got a 6800 GT from BFG recently. The first one I got couldn't even fucking run at 370 MHz (BTW even if you downclock the card to prove that the frequency is the problem, don't tell BFG; they consider downclocking to void your warranty!), so I got an RMA. This one works at 370, but just barely. It's the shittiest overclocker I've ever heard of. This whole video processor thing might be a great excuse to return this and get myself a decent card.

Go to the posted link and search for some of the other links that show HD vids to download.

However, I must add that returning your card because it doesn't do something it was not designed or promised to do is kinda uncool, and has nothing to do with this thread.
 
Actually, it was promised:

http://www.bfgtech.com/6800GTOC.html <-- look where it says "on-chip video processor"

And I do play videos and DVDs quite a bit actually, what with my computer doing double duty as a TV in my dorm. It's just that with an Athlon 64 3400+, it's hard to tell if it causes a performance drop or not.

Sorry to take the hardline approach (and I'm not necessarily talking about the person I'm responding to), but anyone who says this is not a valid reason to be upset with nVidia is a dick. It's called FALSE ADVERTISING, and there are laws against it.

Besides, I've had a lot of other problems with this card. These include the fact that there ISN'T A GODDAMN OFFICIAL DRIVER FOR IT LESS THAN FOUR MONTHS OLD. This is just an excuse (and a valid one) to return it.
 
OK guys, I just got off the phone with BFG tech support and they are telling me it is driver related, and they expect a fix mid-to-late November. The guy also said it was an Nvidia problem.

When I told him I am hearing it is hardware related, he told me that if that were the case, Nvidia would have some sort of replacement model.

KEEP THIS THREAD

If we are getting screwed here, at least we have some documentation.

As well... I play DVDs on this card all the time, as I have it hooked up to my TV with S-Video (just waiting to hook it to an HDTV with Y Pr Pb... which may not work well if this is right). If I paid $700 CDN for this card and that doesn't work, I am gonna be mighty pissed.
 
bonkrowave said:
OK guys, I just got off the phone with BFG tech support and they are telling me it is driver related, and they expect a fix mid-to-late November.

I hope that's true, but I'd rather hear it out of Nvidia's mouth, since they are the ones with the problem, and the ability to fix it.
 
At first I did not want to post anything, but then I got to thinking about what can be done if it's not a driver problem. Man, is this going to hit the fan big time.

"IF you feel that your card doesn't do what we told you, mail it to us AT YOUR EXPENSE, and within 4-6 weeks you will get your card back with the problem corrected. We will snail mail it back, 8-10 day delivery time, at OUR expense."

The posts then will be real FUN TO READ. Why do the hardware people care nothing for their names or reputations? Because 50% will kiss their arse no matter what they do.


sparks
 
I just ran the T2 DVD, and at my video card's settings, on the 65.76 drivers, I only had 32% CPU usage spikes... and that was only once in a while. I idled at ~20%. My brother's GT does the same thing in his Shuttle, but he has more CPU usage, spiking up into the 40s.

[EDIT] http://discuss.futuremark.com/forum...umber=4519125&page=&view=&sb=&o=&fpart=1&vc=1

Apparently, it's just disabled by the drivers, and future official driver releases will enable it. That explains why you don't get 0-40% usage, and are still getting high usage when watching WMV movies. On the test video mentioned in that thread I idled at about 50% usage, but would hit 80 every once in a while. I guess we'll just have to wait for Nvidia to enable it; at least it's not a design flaw, and whatever you get is still livable.

There are some minor fixes. Turning Fast Writes on works well, but your max overclock may drop (I haven't had to do this).

And people are getting surprisingly low CPU usage with the 65.73/65.76 drivers.
 
you can run a DVD movie with a 500MHz PIII in pure software mode

Oh sure, it will "run", but with the absolute minimum base DVD quality. What lower CPU utilization allows is more post-decompression processing, like smooth interpolation, which is important if your desktop resolution does not exactly match 720x480 (none do; it's usually 1024x768 or 1280x1024). The GPU can also do many pre-decompression visual passes, like Faroudja calculations.

I can definitely tell the difference between a video that is software decoded and one that is GPU hardware assisted/decoded with all the extra CPU post-processing added. It's much more prominent than the simple differences between a no-frills $50 set-top DVD player and a >$100 progressive set-top DVD player with Faroudja.

Same thing with MPEG-4. In DivX it's noticeable when you move the decoder configuration slider from the "automatic" setting (which tries to determine your GPU and CPU ability and adjust accordingly) to all minimum, or to all maximum with smooth playback and double buffering.

You can disable all six levels of post-processing in DivX, and it will greatly reduce your CPU load; it will also decrease visual quality somewhat, as it's just using the pure "software" decode. I'm pretty sure WMVs automatically start shutting off extra post-processing once CPU utilization hits a certain threshold (it's not forceable like DivX is).

For reference: the first time I noticed all six levels of DivX CPU post-processing enabled at 720x480 working without hitting 100 percent utilization was with a Radeon 8500 and a Pentium 4 2.0GHz...

Oh, and if anyone is a video purist: a 32-bit SSE-instruction DVD decoder is actually worse quality than a true 64-bit FP one. If you are doing DVD rips and have extra time, use the 64-bit FP decoder.
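(A toy sketch of what that "smooth interpolation" step actually does: upscaling each decoded frame to the desktop resolution with bilinear filtering. Pure illustration in Python, not anyone's actual decoder code; real players do this for every pixel of every frame, which is exactly the work the GPU is supposed to take over.)

```python
# Illustration only: bilinear upscaling of a decoded frame (e.g. 720x480)
# to the desktop resolution (e.g. 1024x768). Done per pixel, per frame,
# ~30 times a second -- that's where the CPU cost of "smooth" video comes from.

def bilinear_sample(frame, x, y):
    """Sample a grayscale frame at fractional coordinates (x, y)."""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(frame[0]) - 1)
    y1 = min(y0 + 1, len(frame) - 1)
    fx, fy = x - x0, y - y0
    top = frame[y0][x0] * (1 - fx) + frame[y0][x1] * fx
    bot = frame[y1][x0] * (1 - fx) + frame[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def upscale(frame, new_w, new_h):
    """Every output pixel is interpolated from the source frame."""
    old_h, old_w = len(frame), len(frame[0])
    return [[bilinear_sample(frame,
                             x * (old_w - 1) / (new_w - 1),
                             y * (old_h - 1) / (new_h - 1))
             for x in range(new_w)]
            for y in range(new_h)]

# A 2x2 "frame" upscaled to 3x3: the centre pixel averages all four sources.
small = [[0, 100],
         [100, 200]]
big = upscale(small, 3, 3)
```

Skip the interpolation (nearest-neighbour) and the CPU load drops, but you get the blocky "minimum base DVD quality" described above.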
 
At the moment, unconfirmed.

If it's true, it sucks. nVidia pulled a fast one, and consumers got cheated. However, it doesn't really affect me since I never bought the card for the codec processing ability anyway.
 
valkyre said:
At the moment, unconfirmed.

If it's true, it sucks. nVidia pulled a fast one, and consumers got cheated. However, it doesn't really affect me since I never bought the card for the codec processing ability anyway.

Ditto, I got the card for games. I watch DVDs on my PC and I have no complaints, and haven't noticed anything out of the ordinary. If they fix it in drivers, sweet... if it's hardware, I don't think I'll send mine in. It's just not a big enough problem to affect me.
 
SuX0rz said:
Ditto, I got the card for games. I watch DVDs on my PC and I have no complaints, and haven't noticed anything out of the ordinary. If they fix it in drivers, sweet... if it's hardware, I don't think I'll send mine in. It's just not a big enough problem to affect me.

If you're running your rig as it is in your sig, then it is a big enough problem to affect you... you'd be unnecessarily stressing your already-OC'd CPU every time you tried to do something simple like watching a DVD. Who knows how much something like that could shorten the lifespan of your CPU?

IMO, that's definitely something a gamer should be concerned with. I'm gonna run some tests of my own later today with HD WMVs, and see if mine's working or not.

*EDIT* And yes, apparently PCI-E 6800 owners are at risk as well, since the only difference between them and the AGP cards is the PCI-E bridge attachment... they apparently still have the exact same PCB.
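If anyone else wants to run the same check, the idea is just: start an HD WMV clip in your player, sample overall CPU usage for a while, and compare the average and peak. A rough Python sketch (the sampler function is a placeholder, not a real API -- psutil's `cpu_percent`, or parsing /proc/stat on Linux, would be real ways to get the numbers):

```python
import time

def sample_cpu(read_cpu_percent, seconds=30, interval=1.0):
    """Collect CPU-usage samples while the video plays.

    read_cpu_percent is whatever sampler your platform provides
    (hypothetical here): e.g. psutil.cpu_percent, or /proc/stat parsing.
    """
    samples = []
    end = time.time() + seconds
    while time.time() < end:
        samples.append(read_cpu_percent())
        time.sleep(interval)
    return samples

def summarize(samples):
    """Average and peak usage over the run."""
    return {"avg": sum(samples) / len(samples), "peak": max(samples)}

# With hardware decode working, reports in this thread suggest roughly
# 20-40% average; a broken VP pins the CPU near 100% on HD WMV clips.
result = summarize([35, 40, 80, 30, 25])
```

The absolute numbers depend on your CPU, so compare against the figures people with the same chip are posting, not against a fixed threshold.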
 
ZenOps said:
Oh sure, it will "run", but with the absolute minimum base DVD quality. What lower CPU utilization allows is more post-decompression processing, like smooth interpolation, which is important if your desktop resolution does not exactly match 720x480 (none do; it's usually 1024x768 or 1280x1024). The GPU can also do many pre-decompression visual passes, like Faroudja calculations.

I can definitely tell the difference between a video that is software decoded and one that is GPU hardware assisted/decoded with all the extra CPU post-processing added. It's much more prominent than the simple differences between a no-frills $50 set-top DVD player and a >$100 progressive set-top DVD player with Faroudja.

Same thing with MPEG-4. In DivX it's noticeable when you move the decoder configuration slider from the "automatic" setting (which tries to determine your GPU and CPU ability and adjust accordingly) to all minimum, or to all maximum with smooth playback and double buffering.

You can disable all six levels of post-processing in DivX, and it will greatly reduce your CPU load; it will also decrease visual quality somewhat, as it's just using the pure "software" decode. I'm pretty sure WMVs automatically start shutting off extra post-processing once CPU utilization hits a certain threshold (it's not forceable like DivX is).

For reference: the first time I noticed all six levels of DivX CPU post-processing enabled at 720x480 working without hitting 100 percent utilization was with a Radeon 8500 and a Pentium 4 2.0GHz...

Oh, and if anyone is a video purist: a 32-bit SSE-instruction DVD decoder is actually worse quality than a true 64-bit FP one. If you are doing DVD rips and have extra time, use the 64-bit FP decoder.


Is this why I see some kind of "lag" or unsmoothness sometimes when I view DivX? Before this card, I had a GF2 Ti.
 
Hitokiri Batohsai said:
Is this why I see some kind of "lag" or unsmoothness sometimes when I view DivX? Before this card, I had a GF2 Ti.

Yeah, I noticed this too, even before I had seen this thread or heard about this mess. My prior system was a P3 with a GeForce 2. Now: Athlon 64 3400+, 1GB RAM, 6800 GT. I noticed there was lag when playing DivX, though. "Hitching", even, at times. I'm not too bothered by this, though, as I got this card to play games.
 
Hitokiri Batohsai said:
Is this why I see some kind of "lag" or unsmoothness sometimes when I view DivX? Before this card, I had a GF2 Ti.

Actually, no, it shouldn't be. DivX encoding and decoding is fine on the NV40s; try a newer driver set.
 
DropTech said:
Actually, no, it shouldn't be. DivX encoding and decoding is fine on the NV40s; try a newer driver set.

NO, it is not. It does not work in WMV9/HD as advertised...
Yes, the new drivers will enable older MPEG codecs, but those are not CPU intensive compared to the new ones...

Please go check out the nV News thread... people are using the newer drivers AND still getting problems.
 
http://www.xbitlabs.com/articles/video/display/geforce6600gt-theory_6.html <-- Actual test, with newer/working drivers.

http://www.theinquirer.net/?article=19314 <-- has a copy of an email from Nvidia regarding the drivers/fix.

It seems as though it's just for WMV9.

Realistically, it's probably because the video card was designed/taped out before the final codecs were available for Windows Media Video 9, so things did not work exactly as planned. They fixed it for the 6600s and 6200s... and probably the new 6800 PCIe part based on NV41 (12-pipe-only core).
 
In theory though, accelerating MPEG-2/4 decodes should not be a problem for any DX9 programmable card. It should just be a matter of passing a 16x16-pixel DVD-type macroblock as the lower-resolution equivalent of a 256x256 game texture, and then using a specialized set of instructions to decode each block in each of the pipes. Not extremely difficult, but it does mean you can't watch a movie and play 3D games at the same time (not that big a deal, IMO).

If programmed properly, it could cause a speedup of X times based on how many pipes the card has (most DVD calcs are motion-vector based): 6 pipes means 6x more efficient than using a 2D fixed function, with the added bonus that it should be at higher precision and look better too. This seems to be what is happening with the new ForceWare release.

With a PCI-E card, MPEG-2/4 encodes should be easily accelerated by simply reversing the decode process and adding or enlarging an accumulation/synchronization buffer.
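A toy version of the macroblock idea in Python, just to make it concrete (illustration only -- a real decoder works on 16x16 luma blocks with sub-pixel motion vectors and IDCT residuals, and the whole point of the VP is running many of these blocks through the pipes in parallel):

```python
def motion_compensate(ref, bx, by, mvx, mvy, residual, block=4):
    """Reconstruct one block of the new frame: copy the motion-shifted
    block from the reference frame, then add the decoded residual.
    This is the core per-block step of MPEG-style inter-frame decoding."""
    out = []
    for y in range(block):
        row = []
        for x in range(block):
            pred = ref[by + y + mvy][bx + x + mvx]
            row.append(pred + residual[y][x])
        out.append(row)
    return out

# 8x8 reference frame where pixel value = 10*row + col, a (1, 1) motion
# vector and an all-zero residual: the block is just shifted by one pixel.
ref = [[10 * r + c for c in range(8)] for r in range(8)]
zero = [[0] * 4 for _ in range(4)]
blk = motion_compensate(ref, 0, 0, 1, 1, zero)
```

Since every block is independent given the reference frame, the blocks can in principle be farmed out across pipes exactly as described above.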
 
chrisf6969 said:
http://www.xbitlabs.com/articles/video/display/geforce6600gt-theory_6.html <-- Actual test, with newer/working drivers.

http://www.theinquirer.net/?article=19314 <-- has a copy of an email from Nvidia regarding the drivers/fix.

It seems as though it's just for WMV9.

Realistically, it's probably because the video card was designed/taped out before the final codecs were available for Windows Media Video 9, so things did not work exactly as planned. They fixed it for the 6600s and 6200s... and probably the new 6800 PCIe part based on NV41 (12-pipe-only core).

Hey, the cards tested in the top article are PCI-E, which do NOT have the problem mentioned in this thread; their VPs are working. It is an AGP-only problem, not addressed there.

FWIW, I don't plan on watching HDTV on my 6800GT; I plan on getting high framerates in games. My 3500+ will play any WM9/DivX file on CPU alone, so I personally don't care.

I understand why some are pissed, though.
 
Wayfare said:
Hey, the cards tested in the top article are PCI-E, which do NOT have the problem mentioned in this thread; their VPs are working. It is an AGP-only problem, not addressed there.

I thought the initial 6800 GTs and Ultras are just AGP cards with the bridge chip (same core), so they should operate the same, though the drivers may act differently and enable it for the PCIe cards only.
 
So according to Nvidia's response to The Inq, it works -- it will just take a driver revision to enable it. Psh... we'll see.

Personally, I don't care. If I watch HD WMVs it doesn't skip, even at 50-80% CPU usage, and I won't be doing much else if I'm watching full-screen HD-format playback... so I couldn't care less.

I hope they're right, though, in that it only takes a driver fix (and/or a DVD decoder... which should be free, but, like I said, we'll see).
 
DropTech said:
So according to Nvidia's response to The Inq, it works -- it will just take a driver revision to enable it. Psh... we'll see.
That's not what I get from that response:
The GeForce 6600 models have the same 1st generation programmable video technology support as the GeForce 6800 models. However, the GeForce 6600 models also include hardware acceleration for high-definition Windows Media Video (WMV) decode.
...meaning the 6800 does not have hardware acceleration for HD WMV? I don't think they were clear in that response.
 
S Z, you know why they are not being clear about this... they want to keep it a win-win situation. If they are able to fix it with a future driver release, they can claim it was there all along, just bad drivers... and if they can't, they can just say it was never on the 6800.

There are two things that smell like fish... and one of them is Nvidia.

Nvidia is currently in damage control... they are willing to say anything to calm the storm.
 
In order to utilize the programmable decode and advanced post processing features of the GeForce 6800 and 6600, end users need to download an updated ForceWare driver as well as the NVIDIA DVD decoder.

So what exactly is that supposed to mean, then?
 
bonkrowave said:
That's all BS; read the other forum on this subject.
I can see that:
(charts: hdtv_load.gif, mpeg4_load.gif, mpeg2_load.gif, fillrate_color_z.gif, fillrate_color_noz.gif)
 
Look guys, before we have a huge misunderstanding here, we need to clarify what the problem is.

Fact: Nvidia advertised the GeForce 6800 AGP series of cards as having WMV9 decode acceleration from the beginning.
Fact: Nvidia has replied to The Inq that WMV9 decode acceleration will be implemented in a future driver release, FOR THE 6600 SERIES.
Fact: When I play the "Step Into Liquid" WMV9-HD video on my Athlon XP 3000+ with a BFG 6800 GT AGP card running drivers 65.76 (same for 61.77 and 66.81), my CPU usage is at 100%, with frames dropped everywhere.

The complaint by the people that know what they're talking about:
Nvidia advertised WMV9 decode acceleration for the 6800 AGP series of cards (the PCI-E ones might be fixed?), and almost nobody has been able to get WMV9 decode acceleration with any of the current driver versions out there. Nvidia confirmed that the 6600 series of cards contains the acceleration, which will be enabled in future driver releases. HOWEVER, their statement regarding the 6600 having it implied that the 6800 does not. And I quote from The Inq:
The GeForce 6600 models have the same 1st generation programmable video technology support as the GeForce 6800 models. However, the GeForce 6600 models also include hardware acceleration for high-definition Windows Media Video (WMV) decode.

So all we 6800 AGP owners want is the advertised WMV decode acceleration, but Nvidia seems to be indicating that it will not have it.
 
UAESTAR edit: this was not meant to be addressed to UAESTAR; I'm an asshead for doing so.

You forgot to mention that the graphs and charts listed were not released until Nvidia had a chance to look at them and help the person who posted this story along. The article states Nvidia had some hand in the results. It also says the story was ready to go yesterday and was not released until today... because Nvidia had to look it over.

So Nvidia held these results back... I'm wondering why that was.

Sounds kinda..... made up.

And if these charts are correct, why does Nvidia also say the driver fix will not be ready until mid-to-late November?

doodman said:
So all we 6800 AGP owners want is the advertised WMV decode acceleration, but Nvidia seems to be indicating that it will not have it.

Doodman hit the nail on the head... everyone who has tried fixed drivers (which Nvidia says will not be available until mid-to-late November) has had no success... the only people with low CPU usage are people with very fast processors, in the 3GHz+ range... which would make sense. It's clear that a 3.4 sits at about 30-40% CPU usage, and on my 1.8 I'm seeing 80-90% usage. Well, this just screams that the CPU is doing all the work... and the vaporware "on-chip processing" is not working.

Oh, and the topmost graph, for HDTV (which would be WMV9), shows 100% CPU usage. That's some fine on-chip processing at work there.
 
Then again, what is this supposed to mean?

In order to utilize the programmable decode and advanced post processing features of the GeForce 6800 and 6600, end users need to download an updated ForceWare driver as well as the NVIDIA DVD decoder.

It says that in order to use them on BOTH sets of cards, a new driver set released by nVidia is required... why it must take one is unclear, but they say it will enable it on both series of cards.
 
bonkrowave said:
UAESTAR

You forgot to mention that the graphs and charts listed were not released until Nvidia had a chance to look at them and help the person who posted this story along. The article states Nvidia had some hand in the results. It also says the story was ready to go yesterday and was not released until today... because Nvidia had to look it over.

So Nvidia held these results back... I'm wondering why that was.

Sounds kinda..... made up.

And if these charts are correct, why does Nvidia also say the driver fix will not be ready until mid-to-late November?



Doodman hit the nail on the head... everyone who has tried fixed drivers (which Nvidia says will not be available until mid-to-late November) has had no success... the only people with low CPU usage are people with very fast processors, in the 3GHz+ range... which would make sense. It's clear that a 3.4 sits at about 30-40% CPU usage, and on my 1.8 I'm seeing 80-90% usage. Well, this just screams that the CPU is doing all the work... and the vaporware "on-chip processing" is not working.

I'm just like you; I have my fucking 6800GT GPU new in the box ($400+) to set up my new AMD sys. All the reviews out there did not say anything about this with the NV cards. We were misguided: all those tests and benchmarks, and nothing about these faulty 6800 cards.
 
doodman - It's strange that you are dropping frames in that video. I downloaded and played the 1080 version on my system (AMD64 3000+, 6800nu) and CPU utilization hovered around 70-80% with no dropped frames. I'm just curious as to why we see differences when we have essentially the same CPU.

doodman said:
Look guys, before we have a huge misunderstanding here, we need to clarify what the problem is.

Fact: Nvidia advertised the GeForce 6800 AGP series of cards as having WMV9 decode acceleration from the beginning.
Fact: Nvidia has replied to The Inq that WMV9 decode acceleration will be implemented in a future driver release, FOR THE 6600 SERIES.
Fact: When I play the "Step Into Liquid" WMV9-HD video on my Athlon XP 3000+ with a BFG 6800 GT AGP card running drivers 65.76 (same for 61.77 and 66.81), my CPU usage is at 100%, with frames dropped everywhere.

The complaint by the people that know what they're talking about:
Nvidia advertised WMV9 decode acceleration for the 6800 AGP series of cards (the PCI-E ones might be fixed?), and almost nobody has been able to get WMV9 decode acceleration with any of the current driver versions out there. Nvidia confirmed that the 6600 series of cards contains the acceleration, which will be enabled in future driver releases. HOWEVER, their statement regarding the 6600 having it implied that the 6800 does not. And I quote from The Inq:

So all we 6800 AGP owners want is the advertised WMV decode acceleration, but Nvidia seems to be indicating that it will not have it.
 
UAESTAR, I am very sorry... I confused you with 3dmarktymark because I am doing a million things at once... I apologize greatly. I am the ultimate asshead. Please disregard my stupid post to you. You have my permission to flame me beyond recognition... it was a stupid mistake on my part.
 
DropTech said:
Then again, what is this supposed to mean?



It says that in order to use them on BOTH sets of cards, a new driver set released by nVidia is required... why it must take one is unclear, but they say it will enable it on both series of cards.
Use what? It does not specifically say WMV decode acceleration, as it does earlier in the text where they say that the 6600 will have WMV decode acceleration.

UAEstar, about those charts: according to a guy over at nVnews, these were done with a 6800GT PCIe, not an AGP version. I don't know if it's true or not, but this is what he said he got back from X-bit:

We used the PCI Express version of the GeForce 6800 GT. There is a rumor that VPUs used in AGP versions of the GeForce 6800 Ultra and GT have a hardware flaw that makes the NVIDIA Video Processor inoperable.

I checked my GeForce 6800 Ultra, and I was unable to get video acceleration working on either ForceWare 65.76 or ForceWare 66.81. Perhaps the VP on these cards really is disabled.
 
S Z, if this is true we have another tall tale from Nvidia. If those results show 100% CPU usage on a PCIe card... then it is broken on the PCIe as well. It was stated earlier that this was only a problem with AGP versions.
 
It looks to me like JUST the WMV9 is BROKEN in the 6800 series cards. However, there are many other video compression technologies: MPEG-4, DivX, MPEG-2, etc...

None of the DVDs I've ever used use WMV9, and I would think that would be the most important thing: DVD playback.

ONE format doesn't work. At least not yet. But don't discount NV's driver team; they may be able to use the programmable shaders or video processor somehow or other.

Currently, in the supported formats, it looks like it's cutting 75% of the load off the CPU!! That's a HUGE difference (i.e., 60% down to 15%, or 100% down to 25%, or 40% down to 10%, etc.).

Give them a little time; I'm sure they will come up with something. It might not be the 75% difference of the other formats... but even if they can reduce it by 20%, that would be a nice improvement.

And technically, if they can do any type of hardware processing (even reducing the CPU load by 1%), don't think you're going to get a new card, because they will state that, technically, they are doing some sort of decoding in hardware. They never claimed a % performance, just hardware decode support.
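For what it's worth, the ~75% figure is consistent across those example pairs; relative reduction is just (before - after) / before:

```python
def load_reduction(before, after):
    """Fraction of the original CPU load removed by hardware assist."""
    return (before - after) / before

# The example pairs quoted above: each works out to exactly 0.75.
pairs = [(60, 15), (100, 25), (40, 10)]
reductions = [load_reduction(b, a) for b, a in pairs]
```

(The absolute savings differ per pair -- 45, 75, and 30 percentage points -- which is why quoting the relative figure makes the numbers look uniform.)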
 
Nvidia 6800 generation indeed has hardware video support up and running. It will work just fine if you download new drivers which are expected soon and will appear on its web site. You will need the Nvidia DVD decoder as well.

Read more: Inquirer
 