Why Is the 6800 Only Clocked at 400MHz?

Labrador

I must say that I am NOT too impressed with the GeForce 6800 Ultra review + benchmarks. I am being very open-minded about this card, but when the CEO of nVidia said back in March to expect a 2-3x performance increase over the 5950, are we really getting that?
ATI's 9800 XT in FarCry is pretty close, for a last-generation part; what will happen when the X800 XT competes against it?

Another thing I am confused about is the clock speeds. Is the 6800 Ultra only clocked at 400MHz? Isn't that lower than the current 5950? And the X800 Pro will be 500MHz, and the XT at 600MHz, though those are only rumors for now.
How come the 6800 isn't clocked higher? That would certainly improve performance.


I will only buy a new video card that gets at least a 100% performance increase over my 9800 Pro. I don't care which company makes it: nVidia, ATI, or Matrox. I only buy the best card for the dollar.

Opinions?


P.S. No offense, Brent, but I would prefer more benchmarks comparing the 6800 vs. the 9800 XT at the same settings; then we can really see the performance difference. Or is it too close, and would it make nVidia look bad?
 
I think it's GDDR3 the 6800 Ultra is using, so it's really 1.2 GHz effective. Or so I think.
 
Originally posted by Labrador


I will only buy a new video card that gets at least a 100% performance increase over my 9800 Pro

Probably going to have to wait a while for that.
 
Originally posted by zoltan
I think it's GDDR3 the 6800 Ultra is using, so it's really 1.2 GHz effective. Or so I think.

Labrador is referring to the core clock speed of 400 MHz; the memory is at 550 MHz, 1.1 GHz effective.
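For anyone confused by the "effective" figure: DDR-style memory transfers data on both edges of the clock, so the data rate is double the clock. A quick back-of-the-envelope sketch (the 256-bit bus width is the commonly quoted spec, so treat the bandwidth number as approximate):

[code]
# Why 550 MHz GDDR3 gets quoted as "1.1 GHz effective":
# DDR memory moves data on both clock edges.
mem_clock_mhz = 550
effective_mhz = mem_clock_mhz * 2          # 1100 MHz effective

bus_width_bits = 256                       # commonly quoted 6800U bus width
bandwidth_gbs = effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(f"{effective_mhz} MHz effective, ~{bandwidth_gbs:.1f} GB/s")
# -> 1100 MHz effective, ~35.2 GB/s
[/code]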
 
Ooh, nice questions, Labrador.

On the speed of the card, think of it like this: the 5800/5900 series of cards are like P4s. They have a deep pipeline, which needs to be run as fast as possible because nothing takes full advantage of the pipe. What happens is, when the NV30/NV35 gets its info, it has to haul ass down this deep, long pipe, and if any issue pops up, it's back to step one. So if the instruction stream were off-off-on-on-off-off it would be fast as hell, but since games are more like off-on-on-off-on-off-off, it takes the hit.

The NV40 is more like your Athlon chips: a shorter pipe, plus it can do parallel processing. Basically, each pipe can run different sets of instructions at the same time, whereas the NV30/35 couldn't and had to restart from the beginning.

Now, looking at the synthetic benches with everything off, yeah, it is 2 to 3x faster than the 5900/5950 in some areas (DX9), but in others it's not (OpenGL).
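To put some toy numbers on that pipeline analogy (the depths, clocks, and flush rates here are made up purely for illustration, not NV30/NV40 specs):

[code]
# Toy model: deeper pipes allow higher clocks, but a flush costs
# roughly the pipe depth in cycles.
def mops(clock_mhz, depth, flush_rate):
    cycles_per_op = 1 + depth * flush_rate   # avg cycles per instruction
    return clock_mhz / cycles_per_op

for rate in (0.02, 0.10):                    # predictable vs branchy work
    deep = mops(600, 20, rate)               # hypothetical deep/fast design
    short = mops(450, 10, rate)              # hypothetical short/slow design
    print(f"flush rate {rate:.0%}: deep pipe {deep:.0f} Mops/s, "
          f"short pipe {short:.0f} Mops/s")
# At a 2% flush rate the deep pipe wins (429 vs 375); at 10% it
# loses (200 vs 225) -- the off-on-on-off point made above.
[/code]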
 
Well, pretty much all games are CPU-bound to one level or another, so a 100% increase in many of them is simply impossible, except in situations with lots of GPU-intensive work (AA, AF, and resolution). If you want an app with almost no CPU dependence, 3DMark03 is your app, and in it the stock 6800 Ultra about doubles the stock 9800 XT.
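A crude way to see the CPU-bound effect (a sketch with made-up frame times): the slower of the CPU and GPU sets the frame rate, so doubling GPU speed does nothing once the CPU is the limit.

[code]
# Crude frame-time model: frame time = max(CPU time, GPU time).
def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

cpu_ms = 10                          # hypothetical CPU cost per frame
for gpu_ms in (20, 10, 5):           # old card, 2x card, 4x card
    print(f"GPU {gpu_ms:>2} ms -> {fps(cpu_ms, gpu_ms):.0f} FPS")
# FPS doubles from 50 to 100, then stays pinned at 100: that's
# why only the AA/AF/high-res tests show the full gap.
[/code]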
 
It's 400MHz because it's meant for mass production.

I'm also betting that they're leaving some headroom just in case they need to up things to compete with the R420. There was a time very recently when the very same NV40 ran 20% faster than it does now (475MHz); I'm pretty sure they can bump it back up there and call it a 6850U.
 
Originally posted by Labrador
I must say that I am NOT too impressed with the GeForce 6800 Ultra review + benchmarks. I am being very open-minded about this card, but when the CEO of nVidia said back in March to expect a 2-3x performance increase over the 5950, are we really getting that?

You won't see a 2-3x performance lead in the HardOCP article. Read the Anandtech article, where the GPU settings are all set equal; you'll see that the 6800 Ultra is 2x faster in some cases.
 
Well, it's a revision board with very fresh drivers, so I expect performance gains when it hits shelves.

In some benchmarks you see more than 2x, and I think I even saw a 3x gain in one test (and I'm not talking 3DMark03).

Notice a lot of the games were CPU-limited; I bet you would see some 2x, 3x gains if we had faster CPUs today.

I bet that when the card comes out, we should expect a lot better performance. Maybe not too much more, but it will help.
 
Originally posted by Labrador
I must say that I am NOT too impressed with the GeForce 6800 Ultra review + benchmarks. I am being very open-minded about this card, but when the CEO of nVidia said back in March to expect a 2-3x performance increase over the 5950, are we really getting that?
ATI's 9800 XT in FarCry is pretty close, for a last-generation part; what will happen when the X800 XT competes against it?

Another thing I am confused about is the clock speeds. Is the 6800 Ultra only clocked at 400MHz? Isn't that lower than the current 5950? And the X800 Pro will be 500MHz, and the XT at 600MHz, though those are only rumors for now.
How come the 6800 isn't clocked higher? That would certainly improve performance.


I will only buy a new video card that gets at least a 100% performance increase over my 9800 Pro. I don't care which company makes it: nVidia, ATI, or Matrox. I only buy the best card for the dollar.

Opinions?


P.S. No offense, Brent, but I would prefer more benchmarks comparing the 6800 vs. the 9800 XT at the same settings; then we can really see the performance difference. Or is it too close, and would it make nVidia look bad?

The 6800U seems to mostly be 2x a 9800 XT or a 5950U. In Pixel Shader 2.0 performance it is 3-4 times faster than the 5950, as ShaderMark shows. You're not impressed? WHATEVER.....
 
No need to get mad, chris, he just asked a simple question; he wasn't rude. Heh.

But seriously, you should look up some more reviews. In a lot of benchmarks you see either 50-100% performance gains or even more than 100%, or you see CPU-limited tests where the 6800 just hits the max the tested CPU allows (and a lot of them at 1600x1200), all on fresh drivers.

That's pretty impressive if you ask me.
 
Originally posted by Bad_Boy
No need to get mad, chris, he just asked a simple question; he wasn't rude. Heh.

But seriously, you should look up some more reviews. In a lot of benchmarks you see either 50-100% performance gains or even more than 100%, or you see CPU-limited tests where the 6800 just hits the max the tested CPU allows (and a lot of them at 1600x1200), all on fresh drivers.

That's pretty impressive if you ask me.

It's just amazing how many people "aren't impressed" by a card that doubles performance. I'd upgrade for 25%.
 
Originally posted by evilchris
It's just amazing how many people "aren't impressed" by a card that doubles performance. I'd upgrade for 25%.
The 6800 is 2x the performance of a 9800 XT when the CPU isn't bottlenecking. That's why it shows itself more at high resolutions and AA/AF settings. This is seriously a jump in performance on par with or better than the GF4 Ti4600 -> 9700 Pro jump a couple of years ago.
 
A lot of people are being misled by HardOCP's preview; go read any other review and you can clearly see how much faster the NV40 is.
 
A review sample card with drivers that are not mature isn't gonna run the way you think...however, so far I am very impressed...a 25% increase minimum is pretty damn snazzy on pretty much alpha drivers...now I'm more interested in seeing what BFG and others will do, and how well mature drivers will do for this thing...by the time the R420 hits, it should be a very good battle
 
You have to consider the NV40 is a 222 million transistor chip, i.e. it's big, chews a lot of power, and generates a lot of heat. It's also hard to get good yields on such a large chip. Remember, ATI is using a low-k 0.13 µm process, i.e. less heat and less power, and they will be able to clock higher than if they were using the standard 0.13 µm process.

Wait till the ATI cards come out. Everything looks good from a consumer point of view; the NV40 is a great card, we just have to see how it stands against its real competition. Plenty of time to wait anyway: don't expect any 6800 Ultras till the latter end of June, and by then everyone will have the clear picture everyone has been speculating about. I don't know what the fuss is about these reviews; it's just a preview. Can't tell shit right now anyway with only one side of the competition.
 
You know the ATI fans will not believe this, but I would be willing to bet that these first cards (the samples sent for review) are not going to be near as fast as what hits the stores in a few months. They will put better memory on the cards so they can be clocked higher. They will be adding different cooling, both bigger and smaller. I am sure someone will put one out that will not take up the extra PCI slot.
They also, like someone else said, probably have some headroom left in order to push it up more. Just look at any company: they do not put all their top stuff out right at the start. If it was me, I would send out the middle-of-the-road part first, then release cheaper (slower) cards and higher-priced (faster) cards later on. That way they make the whole deal last a lot longer. If they put everything out now, in a few months they would have to release a whole new batch in order to keep their names in people's minds.

Just think: they will be having Pro, XT, SE, and who knows what else come out later on. ATI will do the same, and no one can blame them for it. I would say in a few months both companies will release cards that are a LOT faster than the first ones out. It could even be 2x what we are seeing now, or more. These cards are about 2x faster than what was out, so who knows, maybe they have enough headroom built in that we are not seeing yet to really kick it up a bunch.
 
Seeing the PSU requirements, it may very well be to keep the PSU requirement under 550 W :rolleyes:

I wonder if ATI has the same trouble with power requirements?
 
The only reason the 9800 XT even comes close to the 6800 performance-wise in those low-resolution games is the CPU bottleneck.
 
On the original topic: at 400MHz its core speed is a little slower than some cards... however, it's got 16 (more complicated) pipelines. Therefore, it's like comparing a one-lane road with a speed limit of 65 mph to an 8-lane superhighway at 55 mph.

The wider superhighway will move A LOT more traffic.

The 6800U has 4x as many pipes as the 59xx and 2x as many as the 9800. In many tests it produces between 50% and 300% better results.

This card is going to need a 5GHz processor to really let it fly. When you see it reaching the same fps across 3 resolutions (and/or AA/AF settings), or all of the tested cards having similar max or min fps, you're probably hitting a CPU bottleneck.

Given a nice 4GHz overclocked rig, I bet a lot of the differences between it and the last-generation cards (9800 XT/5950U) would be larger.
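The highway analogy is just fillrate math: peak pixel fillrate is roughly pipelines times core clock. A quick sketch using the pipe counts quoted in this thread and the stock core clocks:

[code]
# Rough peak pixel fillrate: pipelines x core clock.
cards = {
    "GeForce 6800 Ultra": (16, 400),   # pipes, core MHz
    "Radeon 9800 XT":     (8,  412),
    "GeForce 5950 Ultra": (4,  475),
}
for name, (pipes, mhz) in cards.items():
    print(f"{name}: {pipes * mhz} Mpixels/s peak")
# 6800U ~6400, 9800XT ~3296, 5950U ~1900: the clock is lower,
# but the wider "highway" still doubles or triples the throughput.
[/code]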
 
Because 8xFSAA is broken [personally I think it's incomplete, since it is EXACTLY the same as the FX's 8xFSAA]. There's no way around it; that kind of performance nosedive is just not normal.

However, I must say that the Hexus review is complete asshit. Half those tests are CPU-limited at 1024x768 [both cards, in fact] and we don't get good 1600x1200 tests. Also, I have no respect left for the editor after reading the forum thread on the article. When someone asked him why he didn't run the tests at 1600x1200, he said his monitor didn't support the res, and he then went on to say he preferred his article that way, since he thinks most people will run the card at 8xAA/16xAF/1024x768. I don't know, maybe it's just my own little world, but I'd bet most people with an NV40 will be trying to run it at 1600x1200/4xAA/16xAF [which better reviews like Xbit Labs and Beyond3D have shown is possible with 60+ FPS in a good deal of games] and will drop the res only when forced to.
 
He actually said that? Ha, I don't read their forums; I just checked out the review when someone gave me the link. Still, I think it's pretty interesting, since SOME people do run more than 4xAA. I remember playing KOTOR at 1280x1024 with 6xAA a few months ago because my old monitor didn't support 1600x1200. Now that I have a better one, I run 1600x1200, of course ;) .
 
Something smells like horseshit with that Hexus review. If 8xFSAA is broken with the driver release they used, then so be it. I'm sure it will get fixed, because I have a hard time believing this gorilla of a card wouldn't be able to pull it off.
 
Are there any other reviews out there that compare the NV40 against an R360 with FSAA above 4xAA?
 
I would like to see some reviews at ultra resolutions like 1920x1200. I mean, that's what the 23" Apple Cinema Display is for, and the new LG monitor that just came out for only $1600 :eek: gets you a 23" widescreen + 16ms response time :)
Let's see how FarCry and UT2004 run at 1920, with 2xAA + 8xAF.
 
It does look like 8xAA is broken. Hopefully it will be fixed before it gets to consumers.

My question is, why would anyone want 8xAA? I always thought 4xAA was pretty good. Just personal preference, I guess.

Hexus did manage to get theirs up to 460MHz. Hexus also seems to have dug a little further and found the temperature monitor: the 6800 Ultra is set to shut off at 125 Celsius, but ran at a cool 44 Celsius in their tests.
 
I know once I get my hands on one of these I'll be using 2048x1536 a lot more :) I prefer 1600x1200 with 4x FSAA in games like EverQuest, but my 9700 Pro runs out of fillrate a lot and chugs in heavy spellcasting situations.

As far as 8xAA goes... I don't really see a situation where I would choose 1024x768 with 8x over 1600x1200 with 4x (the sample math below shows why), but someone playing an older game with no support for very high resolutions might be forced to. I am satisfied that nVidia has somewhat fixed 4x FSAA, but if the 8x FSAA performance hit proves to be permanent, I will ultimately have to say that nVidia's FSAA implementation is still inferior to the one ATI introduced in the 9700 series. Is 8x FSAA going to break this card? Most definitely not. However, if the R420 produces slightly faster performance in 4x mode and greater performance at 6x compared to nVidia's 8x, I can see a lot of fanboys beating their chests about how everyone should have waited, all over AA modes that will rarely be used.
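For what it's worth, here are the raw sample counts behind that preference (just arithmetic, not benchmark data):

[code]
# Samples per frame = width x height x AA samples.
def samples_m(width, height, aa):
    return width * height * aa / 1e6

print(f"1024x768 @ 8xAA:  {samples_m(1024, 768, 8):.1f}M samples")
print(f"1600x1200 @ 4xAA: {samples_m(1600, 1200, 4):.1f}M samples")
# ~6.3M vs ~7.7M: a similar sample budget, but the higher base
# resolution usually looks better on a monitor that supports it.
[/code]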

I recently put a GeForce 1 in my mother's computer and installed the latest Detonators. It's been a long time since I had any experience with nVidia drivers, and I have to say I was pretty impressed with all the extra features nVidia has added to their driver package. IMO, ATI needs to do a little more than add a few gimmick shaders that nobody's ever going to use for more than wow factor when showing off a system. I think ATI should work on providing digital vibrance controls, as well as the magnifying-lens tool that comes with nVidia's drivers.
 
The reason is that they compared 6xAA on the 9800 vs. 8xAA (mixed mode) on the 6800, and justified it by saying the quality was similar. It's not really a fair comparison.

I don't know why they didn't just do 1600x1200 with 4xAA/16xAF instead of ultra-high 6x & 8x AA at 1280x1024.

1600x1200 with 4xAA/16xAF should look better.

Maybe they were trying to cater to people with flat panels with max resolutions of 1280x1024???
 
Originally posted by chrisf6969
The reason is that they compared 6xAA on the 9800 vs. 8xAA (mixed mode) on the 6800, and justified it by saying the quality was similar. It's not really a fair comparison.

I don't know why they didn't just do 1600x1200 with 4xAA/16xAF instead of ultra-high 6x & 8x AA at 1280x1024.

1600x1200 with 4xAA/16xAF should look better.

Maybe they were trying to cater to people with flat panels with max resolutions of 1280x1024???

You do realise that LCDs outsell CRTs by a large margin these days, and on an LCD 1280x1024 is usually the max res anyway. So there is relevance for all areas of the market. Just because nVidia fared badly doesn't mean the review is biased; be thankful that at least you know the full picture of the card. It still is a very good card. Much better to know the problems now than to find them out when you purchase the card. They included 8xAA; reviewers have every right to test it. Look beyond reviews: they are guides, not gospel.
 
Originally posted by Mister.Auto
You do realise that LCDs outsell CRTs by a large margin these days.

No. 2004 will be the first year in which LCD sales outstrip CRT sales. The CRT is still very much alive and kicking.
 
Furthermore, nearly all of those LCDs are going to corporate environments, where a small footprint and low power consumption are paramount considerations. No gamers there.
 
Really sucks when people make shit up. Let's not write things as if they are facts. LCDs outstrip CRT sales by a large margin... you sure that's true?

On topic: let's not chastise nVidia for a broken 8x mode on a pre-release card. Who can play FarCry at 1600x1200 with 4xAA? Only the 6800 Ultra. It does 4x and 6x well. Maybe the Radeons look a bit better to some, but it's definitely down to splitting hairs, and we should all admit there really is no current comparison to this card. No one would take the 9800 XT for the AA over the 6800 Ultra's speed, so wait until the R420 before waving the ATI flags, please.
 
Originally posted by ZenOps
It does look like 8xaa is broken. Hopefully it will be fixed before it gets to consumers.

How is it broken? Just because it has a large performance hit doesn't mean it's broken; it just means it uses SSAA...
 
I don't think 8x is broken; it just costs a LOT of performance because of the sheer amount of extra work it takes to do it.

4xAA is very comparable quality-wise (both use a rotated grid, etc.), and the 6800 wipes the floor with the 9800 (as it should; it's a next-gen part).

The 6800's 6xAA is much quicker than the 9800's 6xAA, but not as good quality (different methods of AA?).

nVidia's 8xAA is slow, but it is doing a lot more work and actually looks better than 6xAA on the ATI.

Well, when the X800 Pro/XT comes out, then we will see how things match up.

I'm just wondering about the REAL clock speeds on the release products.

Since the 6800 samples given to game developers were all clocked at 475MHz, and supposedly the R420 will be clocked at 600MHz (rumors), I'm betting we may end up somewhere in the middle for both.

nVidia may tweak their silicon and release an A2/A3 revision of the 6800 Ultra, and the 6800 regular may run at 400MHz. The temps looked low enough that they may have the voltage turned down in the BIOS on purpose, to fool people into thinking it won't be as fast.
 
The problem with 8x is that it uses both multisampling and supersampling. Multisampling has a far smaller performance hit, which is why it is used for all the other modes. So in a way it is broken, because it should be 8x RGMSAA instead and have a smaller performance impact, but IIRC the NV40 architecture isn't designed to do multisampling at that high a level, so the mixed mode has to be used. If it were 8x multisampling, it would look almost as good and have a performance hit only slightly larger than 4x.
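A toy cost model of that difference (the weight is made up for illustration; the mixed mode is treated as roughly 2x supersampling on top of 4x multisampling):

[code]
# MSAA shades once per pixel and only multiplies coverage/Z samples;
# SSAA (and mixed modes) multiply the shading work too.
def relative_cost(shader_runs, sample_mult, shade_weight=0.7):
    # shade_weight: assumed fraction of frame time spent shading (made up)
    return shade_weight * shader_runs + (1 - shade_weight) * sample_mult

print(f"no AA:    {relative_cost(1, 1):.1f}x")  # 1.0x baseline
print(f"4x MSAA:  {relative_cost(1, 4):.1f}x")  # shade once, 4 samples
print(f"8x mixed: {relative_cost(2, 8):.1f}x")  # shade twice, 8 samples
# Doubling the shading work on top of the sample cost is the
# nosedive the reviews are seeing in the 8x mode.
[/code]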
 