Anyone else have a 6800LE?

For Nvidia it should be relatively easy to produce AGP 6600GTs using their HSI bridge. Looks like the bridged solution WAS better than native PCI-E support. At least for those of us still on AGP. :D
 
Disarray said:
If I am not mistaken, I saw this card at Best Buy this weekend.

Which card exactly? I'm trying to search Best Buy's site, but I only see the $400 GeForce cards.
 
pxc said:
the numbers came from this article and yes they are directly comparable: http://www2.hardocp.com/article.html?art=NjQyLDQ=

Comparison to HardOCP's official Doom 3 benches
=====================
1024x768 0/8
==========
5950U: 43.4
9800XT: 45.7
6800LE: 53.3

1024x768 4/8
==========
5950U: 32.1
9800XT: 26.9
6800LE: 39.2

1024x768 0/0 "Medium"
===========
5950U: 50.6
9800XT: 50.8
6800LE: 73.9


How are they directly comparable? [H] used Fraps as far as I know, while he used the timedemo. They are not comparable.

Quote from the B3D interview:

B3D: It appears that benchmarking demos (using the "timedemo" command) results in higher performance stats than actual gameplay. Can you explain why this is so, regarding the "timedemo" command? Is it because "timedemo" does not calculate AI and physics? What else? Also, if I run in a straight line in normal gameplay and have that entire run logged, the total framerate is higher than in a recorded demo of that same run. Why is this?

Carmack: Timedemo doesn't do any game logic.

Demos are always recorded at exactly 30Hz.

Edit: it seems [H] did use the timedemo. I thought they always used Fraps.
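
To make the timedemo-vs-gameplay distinction concrete, here's a toy Python sketch (my own illustration with made-up numbers, not id's code) of why a timedemo score doesn't reflect gameplay framerates:

# Toy illustration of Carmack's point: demos are recorded at exactly
# 30Hz, and timedemo replays every recorded frame as fast as the GPU
# can render it, skipping game logic (AI, physics) entirely.

DEMO_RECORD_HZ = 30  # one snapshot per 1/30s of original gameplay

def timedemo_score(demo_frames: int, render_seconds: float) -> float:
    """Reported FPS is frames rendered divided by wall-clock time."""
    return demo_frames / render_seconds

# A 60-second gameplay recording holds 60 * 30 = 1800 frames. If a card
# renders them all in 34 seconds, it scores ~53 fps, even though real
# gameplay (with AI and physics running) would be noticeably slower.
frames = 60 * DEMO_RECORD_HZ
print(f"{timedemo_score(frames, 34.0):.1f} fps")  # -> 52.9 fps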
 
The Batman said:
For Nvidia it should be relatively easy to produce AGP 6600GTs using their HSI bridge. Looks like the bridged solution WAS better than native PCI-E support. At least for those of us still on AGP. :D
The 6600GT is faster in 3DMark, but the Doom 3 scores were revised down from what was in the PDF leaked yesterday:

Doom 3
High Quality, 1024x768, 4xAA/8xAF: 45 fps
High Quality, 1600x1200, 0xAA/0xAF: 42 fps

That's a huge difference from the 56 fps in the PDF.

6800LE > 6600GT, just as I initially thought it would be. Bad leaked information. :mad:
 
Uh, if you'd bothered to click on the link pxc provided, you'd have noticed that [H] was using the timedemo as well.
 
fallguy said:
How are they directly comparable? [H] used Fraps as far as I know, while he used the timedemo. They are not comparable.

Quote from the B3D interview:
Let's see... did you click the link to the timedemo demo1 scores that [h] posted? Probably not. :p

From the first page of the article he posted the results from:
Today we are sharing with you framerate data that was collected at the id Software offices in Mesquite, Texas. Both ATI and NVIDIA were present for the testing and brought their latest driver sets. While an extensive amount of data was taken, what we want to focus on is the high end video cards that are currently making their way to market. That means we will be showing you frames per second rates taken using the DOOM 3 timedemo "demo1" that will be included in your boxed copy of DOOM 3. The version of the game used to test is the same version you will be loading onto your own computer.

It's silly to argue that the [h] results Robstar used were not comparable, since both he and the scores posted in the [h] article used timedemo demo1 benchmarks. Robstar didn't compare timedemo demo1 scores to Fraps scores. It was apples to apples, timedemo demo1 to timedemo demo1.

The overall relative performance didn't change when Kyle/Brent used Fraps instead of the timedemo, so it's a moot point anyway.
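
If you want to put rough numbers on that relative performance, here's a quick Python sketch (my own arithmetic on the demo1 table posted earlier, nothing official):

# Turn the demo1 FPS numbers from the comparison table into relative
# deltas. Values copied verbatim from the table; labels kept as posted.

scores = {
    "1024x768 0/8":          {"5950U": 43.4, "9800XT": 45.7, "6800LE": 53.3},
    "1024x768 4/8":          {"5950U": 32.1, "9800XT": 26.9, "6800LE": 39.2},
    "1024x768 0/0 'Medium'": {"5950U": 50.6, "9800XT": 50.8, "6800LE": 73.9},
}

for setting, cards in scores.items():
    best_other = max(fps for card, fps in cards.items() if card != "6800LE")
    delta = (cards["6800LE"] / best_other - 1) * 100
    print(f"{setting}: 6800LE is {delta:+.0f}% vs the next-best card")

# -> roughly +17%, +22%, and +45% ahead in those three tests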
 
Read my edit. I thought they used Fraps, as they always do when getting frames from a game. People make mistakes, as I just did. Do what you gotta do to make yourself feel big, though. It's a little hard to read the link when the front page won't load. Also, they may not have used the same driver options, so you can't be 100% sure. I think [H] uses the default driver config.
 
fallguy said:
Read my edit. I thought they used Fraps, as they always do when getting frames from a game. People make mistakes, as I just did. Do what you gotta do to make yourself feel big, though. It's a little hard to read the link when the front page won't load. Also, they may not have used the same driver options, so you can't be 100% sure. I think [H] uses the default driver config.

How can not clicking on a provided link be considered a 'mistake'? Strikes me more as LAZY than 'oops, I did it again'. :rolleyes:

You're just sweating buckets because this card could potentially end the dominance of the 9800 Pro at the $200 price point. It's pretty much the last piece of the pie that ATI has.

Oh and disregard this post...the Devil made me do it.
 
amheck said:
You sure there's no AGP? From http://www.nvnews.net/previews/geforce_6600_series/

"Initial product launch will be for the PCI Express architecture (with AGP versions to follow)."
Sounds like a bad idea. And oops... the 6800LE is faster in Doom3. nvidia revised the timedemo scores (was: 56fps in 1024x768 HQ 4xAA/8xAF and 1600x1200 HQ 0xAA/0xAF)... new scores: 45fps in 1024x768 HQ 4xAA/8xAF and 42fps in 1600x1200 HQ 0xAA/0xAF. That makes a lot more sense.
 
The Batman said:
How can not clicking on a provided link be considered a 'mistake'? Strikes me more as LAZY than 'oops, I did it again'. :rolleyes:

You're just sweating buckets because this card could potentially end the dominance of the 9800 Pro at the $200 price point. It's pretty much the last piece of the pie that ATI has.

Oh and disregard this post...the Devil made me do it.

It was a mistake because, BEFORE he posted the link, I mistakenly assumed they used Fraps, since they use it for everything. And as I said, the front page won't load for me. I thought they used Fraps for both of the Doom 3 articles. Maybe just the other one; I don't know because, as I said, it won't load.

You can quit assuming, though. Nice 'tude though, pretty mature. :eek:
 
BTW, would you mind terribly testing that thing in some other game? I'm thinking about getting one of these, but one thing worries me a bit. Remember, they explicitly said this card was MADE for Doom 3. nVidia and ATI have both been caught cheating so much it's sad. How about something like Far Cry?
 
Nazo> I have Far Cry available. I can test it in Far Cry if you tell me which/what kind of benchmark to run.

LET ME KNOW. I'm more than happy to help my [H]ard brothers out!

Rob
 
Just anything that you can compare results with. Right now HardOCP.com won't load for me, so I can't look around to see if they have a good official benchmark to compare against. I noticed that HardwareOC.hu has a benchmark program that people appear to be using for Far Cry, which you can find at this location:
http://www.hardwareoc.hu/index.php/p/news/cid/3/y/5851.html

I'm still trying to find a good comparison table of results from other cards, though. d-: Maybe someone else here with a good card wouldn't mind testing and giving us results.

EDIT: Not official by any means, and they used an older version, but maybe this is a start:
http://discuss.futuremark.com/forum...=techdisplayadapters&main=4113017&type=thread

In particular, I admit to being curious as to how this thing compares to the 9800 Pro more than anything else.

Be sure to back up your configuration files, just in case. The sound issue was with an older version of the benchmark program, but back up anyway in case they didn't fix it.
 
Lol, that's the same thing at a different location.

Now if you could find some kind of semi-official results, that would be sweet. Lol, all I can do is compare whatever he comes up with against those forum results.


BTW, be sure to test high quality at 1600×1200 too. Don't worry if your monitor can't display it; your card can. ^_^

EDIT: Lol, here's an ultra-low-end comparison for you. (Note: all settings at highest, control panel set to high quality. All this was done for the sake of comparison. I chose the HardwareOC Steam test as it seemed a good balance for the most part.)

-= provided by HardwareOC =-

The benchmark started at 8/13/2004 4:50:55 PM
Comment: absolute max details, 6x FSAA
Ultra quality, Direct3D renderer, level Steam (demo hocsteam.tmd), default pixel shader model, 6x antialiasing, 16x anisotropic filtering

Resolution   Score
640×480      24.60 FPS
800×600      17.62 FPS
1024×768     12.27 FPS
1280×1024     8.14 FPS

Lol, if you can't beat those scores, throw that card in the trash. ^_^ Of course, I was throwing the absolute maximum stuff at it, but you get the idea. I didn't bother with the 1600×1200 test because it would take forever and the results are predictable. Oh, and this was with the 1.1 version of Far Cry and the beta Catalyst 4.9 drivers placed in the Far Cry directory.
 
This is with the Far Cry 1.2 patch... I snagged it before it was recalled.

Enjoy :)

=================
6x/16x ULTRA
=================
The benchmark started at 8/14/2004 9:43:08 AM
Ultra quality, Direct3D renderer, level Steam (demo hocsteam.tmd), default pixel shader model, 6x antialiasing, 16x anisotropic filtering

Resolution   Run 1    Run 2    Average
640×480      44.16    47.80    45.97 FPS
800×600      35.55    37.06    36.30 FPS
1024×768     25.88    27.46    26.67 FPS
1280×1024    19.05    19.82    19.43 FPS
1600×1200    14.16    13.00    13.57 FPS

=================
0xAA/0xAF ULTRA
=================
The benchmark started at 8/14/2004 10:18:01 AM
Ultra quality, Direct3D renderer, level Steam (demo hocsteam.tmd), default pixel shader model, no antialiasing, 1x anisotropic filtering

Resolution   Run 1    Run 2    Average
640×480      52.57    52.70    52.63 FPS
800×600      49.98    51.07    50.52 FPS
1024×768     40.41    42.43    41.41 FPS
1280×1024    29.44    31.04    30.23 FPS
1600×1200    22.02    23.11    22.56 FPS

=========================
0xAA/0xAF MAX
=========================
The benchmark started at 8/14/2004 11:05:57 AM
Maximum quality, Direct3D renderer, level Steam (demo hocsteam.tmd), default pixel shader model, no antialiasing, 1x anisotropic filtering

Resolution   Run 1    Run 2    Average
640×480      59.45    58.79    59.11 FPS
800×600      59.02    58.42    58.72 FPS
1024×768     51.37    53.70    52.53 FPS
1280×1024    38.32    40.65    39.48 FPS
1600×1200    28.39    29.84    29.11 FPS

============================
0xAA/0xAF MAX PS 3.0
============================
The benchmark started at 8/14/2004 11:29:07 AM
Maximum quality, Direct3D renderer, level Steam (demo hocsteam.tmd), pixel shader model 3.0, no antialiasing, 1x anisotropic filtering

Resolution   Run 1    Run 2    Average
640×480      63.17    63.08    63.12 FPS
800×600      62.14    62.53    62.33 FPS
1024×768     54.75    57.24    55.99 FPS
1280×1024    39.93    42.47    41.20 FPS
1600×1200    29.94    31.33    30.63 FPS

======================
6x/16x ULTRA, SM 3.0
======================
Ultra quality, Direct3D renderer, level Steam (demo hocsteam.tmd), pixel shader model 3.0, 6x antialiasing, 16x anisotropic filtering

Resolution   Run 1    Run 2    Average
640×480      47.16    51.02    49.09 FPS
800×600      35.85    38.42    37.13 FPS
1024×768     27.40    28.81    28.10 FPS
1280×1024    19.55    21.14    20.34 FPS
1600×1200    14.53    13.40    13.96 FPS
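
Since people will want to compare these against other cards, here's a quick Python sketch (my own rough scaling check, using the 6x/16x ULTRA averages copied from above; the interpretation is just my guess, nothing official):

# If FPS were purely fill-rate-limited, fps * pixels would be roughly
# constant across resolutions. Average FPS values copied from the post.

results = {  # (width, height): average FPS, 6xAA/16xAF ULTRA run
    (640, 480): 45.97,
    (800, 600): 36.30,
    (1024, 768): 26.67,
    (1280, 1024): 19.43,
    (1600, 1200): 13.57,
}

for (w, h), fps in results.items():
    mpix_per_sec = w * h * fps / 1e6  # megapixels shaded per second
    print(f"{w}x{h}: {fps:5.2f} fps -> {mpix_per_sec:4.1f} Mpix/s")

# The Mpix/s figure climbs from ~14 at 640x480 to ~26 at 1600x1200,
# which hints the low resolutions are partly CPU-limited rather than
# the card running out of fill rate.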
 
You were mistaken about seeing that 6800LE at BB. The 6800LE is an OEM-only part, so they won't be sold in stores, only at places like Newegg and from manufacturers. You probably saw a similar card. And where can we get these online?
 
Those scores definitely aren't bad (don't let those low framerates fool you). The ultra-detail mode is designed to try to bring any card to its knees. With 16xAF and 6xFSAA factored into the equation, those scores are quite nice. I think it looks at least equal to the 9800 Pro according to that test; if someone wishes to test this for certain, I wouldn't mind. Heck, if you are running 1600x1200, you don't really need FSAA IMO. At a resolution like that, even my picky eyes probably wouldn't notice the aliasing so much. (I wouldn't know, since this piece of crap monitor doesn't do 1600.) Even 1280 is pretty darned good. This more or less alleviates my worries that nVidia was using Doom 3 to market their cards with the little tricks both companies have been caught at in the past.

The only thing I'm wondering about: in my thread about which card to get for $200, someone suggested that the 6600GT might have an AGP version in the $200 range, almost exactly the same price as this card in fact, and they believed the 6600GT would be better than the 6800LE. Would any of you know more about this than I do? I noticed you people talking about it earlier, but what about the actual capabilities of the core itself? You only discussed the memory. Also, what about overclocking? Would an overclocked 6800LE be stronger than an overclocked 6600GT? I know you probably can only guess at all this, but I'm a bit curious.
 
Nazo said:
Those scores definitely aren't bad (don't let those low framerates fool you). The ultra-detail mode is designed to try to bring any card to its knees. With 16xAF and 6xFSAA factored into the equation, those scores are quite nice. I think it looks at least equal to the 9800 Pro according to that test; if someone wishes to test this for certain, I wouldn't mind. Heck, if you are running 1600x1200, you don't really need FSAA IMO. At a resolution like that, even my picky eyes probably wouldn't notice the aliasing so much. (I wouldn't know, since this piece of crap monitor doesn't do 1600.) Even 1280 is pretty darned good. This more or less alleviates my worries that nVidia was using Doom 3 to market their cards with the little tricks both companies have been caught at in the past.

The only thing I'm wondering about: in my thread about which card to get for $200, someone suggested that the 6600GT might have an AGP version in the $200 range, almost exactly the same price as this card in fact, and they believed the 6600GT would be better than the 6800LE. Would any of you know more about this than I do? I noticed you people talking about it earlier, but what about the actual capabilities of the core itself? You only discussed the memory. Also, what about overclocking? Would an overclocked 6800LE be stronger than an overclocked 6600GT? I know you probably can only guess at all this, but I'm a bit curious.

Don't know the spec comparisons, but this is one big consideration: the 6800LE is NOT for retail sale, so good luck getting one (unless you get it from an OEM somehow). The 6600GT will be for sale (at less than $200). The 6600GT will first come out in a PCI Express version, with AGP following (probably a few weeks later): late September or early October, probably. So the difference in specs might not matter if you can't buy the 6800LE separately.
 
Except that people are saying Newegg will have it. Newegg does carry OEM parts, after all. My CPU was OEM, for example. At least, I'm pretty sure it was. d-:

EDIT: Yep. Probably OEM only, I'd bet.

EDIT2: I'm still kind of wondering how the two really compare. Yes, the memory sounds better, but surely even nVidia isn't dumb enough (if I sound insulting, I feel the same way about ATI, so sorry, that's just the way they are) to release two cards at the same price with one truly better than the other, without some catch. Especially since the 6800LE won't exactly be hard to get if it really does appear on Newegg. The number of people using Newegg for stuff like this is growing exponentially.
 
It's OEM only, but that doesn't mean it won't start to show up in some online shops.
 
Anyone know when these are supposed to be released to the OEM channel?

Rob
 
More information on the comparison front: I saw a comparison saying that the 6800LE has a 256-bit memory bus versus the 6600GT's 128-bit bus. The 6800LE has much lower core/memory clocks, but whether the 6600GT's higher clocks are truly better depends on what's behind them. (Plus the 6800LE might be pretty overclockable.) The same site said a Chinese site had a leaked 6600GT or something and showed 3DMark03 scores around 8K. Of course, that was PCI-E on a 3.2GHz P4, so I'm not entirely sure how it compares.

Are you sure you want to trade that card for a 9600 Pro, though? The 9600 Pro isn't going to be very good much longer; things like Far Cry already pretty much have to be quality-crippled on it. Even if the 6800LE isn't as good as the 6600GT, it's certainly better than a 9600 Pro. This card is surely worth at least as much as the 9600 Pro. I know you get the extra $150, though, so whatever.
 
$150 + a 9600 is a damned good deal. He could sell off that 9600 later for at least $100, buy himself a 6600GT, and pocket the rest.
 
Hey Robstar, did XFX tell you they were sending you a 6800LE? I just had an RMA'd FX5200 that XFX shipped out today, and I was hoping I might get really lucky and get a 6800LE. I'm just wondering because they still list that they carry the 5900, so there might be a chance I get a 6800LE for my 5200.

I can always dream, right?
 
Well, after like 5 weeks of waiting for a 5900XT, they called me and said they were having problems getting 5900XT parts, so they were gonna ship me a 6800LE instead.

I said "cool with me :) "

Rob
 
I think they're gonna release an AGP version of the 6600s about a month after the PCI-E version comes out.

I have a dead 5700 Ultra right here, hmmm... hope they don't have any more parts for those.
 
Beware if you're just trying to find a dead XT...

They _WILL_ ask you for your original receipt, which shows the date you purchased it ;)

Rob
 
The reason I asked if he was sure is that it's about to get relatively hard to sell a 9600 Pro, with Doom 3 out and Half-Life 2 coming (supposedly early next month), plus new games using their engines soon (HL2 already has at least two titles using its engine in development right now). Sure, the 9600 Pro can pull off Doom 3 with low quality settings, and presumably HL2 with similar settings, but there are cards in the $150 and $200 range that will easily do FAR better. Heck, a 9700 should do better. OK, it is almost certain you will eventually find a buyer, as there are probably still some poor saps using worse cards than this, but you might not really get $100 for it.

Personally, I don't mind waiting a bit, but it's driving me crazy that I'm going to have to wait months just to even have the CHANCE to get a 6600GT. I'm tempted to get a 6800LE out of pure impatience. I do wish I could find out for certain which is truly better, though; things aren't entirely clear in that area. Especially since the 6600GT tested was PCI-E (which I'd assume does a bit better than AGP) on a fast P4, meaning it wasn't at all CPU-limited.

Besides, that 6800LE got some nice results there when you factor in that ultra mode was designed to bring any card to its knees, and that it was set to 16xAF and 6xAA... In fact, the 6800 might have an advantage since, from what I understand, AF and AA are rather memory-dependent, and I would assume a 256-bit bus would do better than a 128-bit bus in that sort of area.
 
Robstar said:
Well, after like 5 weeks of waiting for a 5900XT, they called me and said they were having problems getting 5900XT parts, so they were gonna ship me a 6800LE instead.

I said "cool with me :) "

Rob


I am probably not getting one, then, because the dead card arrived yesterday and the new one shipped today. So it seems they would at least ask if a different card was OK.
 
Nazo said:
The reason I asked if he was sure is that it's about to get relatively hard to sell a 9600 Pro, with Doom 3 out and Half-Life 2 coming (supposedly early next month), plus new games using their engines soon (HL2 already has at least two titles using its engine in development right now). Sure, the 9600 Pro can pull off Doom 3 with low quality settings, and presumably HL2 with similar settings, but there are cards in the $150 and $200 range that will easily do FAR better. Heck, a 9700 should do better. OK, it is almost certain you will eventually find a buyer, as there are probably still some poor saps using worse cards than this, but you might not really get $100 for it.

Personally, I don't mind waiting a bit, but it's driving me crazy that I'm going to have to wait months just to even have the CHANCE to get a 6600GT. I'm tempted to get a 6800LE out of pure impatience. I do wish I could find out for certain which is truly better, though; things aren't entirely clear in that area. Especially since the 6600GT tested was PCI-E (which I'd assume does a bit better than AGP) on a fast P4, meaning it wasn't at all CPU-limited.

Besides, that 6800LE got some nice results there when you factor in that ultra mode was designed to bring any card to its knees, and that it was set to 16xAF and 6xAA... In fact, the 6800 might have an advantage since, from what I understand, AF and AA are rather memory-dependent, and I would assume a 256-bit bus would do better than a 128-bit bus in that sort of area.

Aye, don't believe the hype. A 6800LE should *theoretically* roll over a 6600GT. If the 6600GT had the same core clock speed, it would need 1400MHz memory to be equivalent to a 6800LE, and I don't see that happening.
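
A back-of-the-envelope check of that claim in Python (my own arithmetic, assuming ~700MHz effective memory on the 6800LE, which is not an official spec; peak bandwidth = bus width in bytes × effective clock):

# Peak memory bandwidth for the two buses being compared. The 700MHz
# effective figure for the 6800LE is my assumption, not a confirmed spec.

def bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9  # bytes/s -> GB/s

print(f"6800LE, 256-bit @ 700MHz : {bandwidth_gbs(256, 700):.1f} GB/s")
print(f"6600GT, 128-bit @ 1400MHz: {bandwidth_gbs(128, 1400):.1f} GB/s")
# Both come out to ~22.4 GB/s: halve the bus width and you need double
# the memory clock just to break even, which is the point above.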
 
Aren't both the 6600GT and 6800LE supposed to sell for around $200? Gotta figure they'd be close in performance if that's the case. Maybe the 6800LE will jump to $225 or something.
 
I've seen reports that the 6600GT could be over $200, in the area of $220 or so. Undeniably, it is kind of dumb to market two cards that are so close in many ways at the same price; theoretically the better one would be more expensive. In practice, that can differ: you can sometimes market on things like clock speed. And I would assume PCI-E is more expensive, so perhaps the AGP version would be cheaper.

Anyway, the 6600GT has higher core and memory clocks. However, I'm not entirely certain a 6800LE couldn't be overclocked pretty high as well. Plus, if the core and memory are just plain weaker, then the 6600GT can still be worse at higher speeds, theoretically (see Athlon versus Pentium 4). d-: I want to know which is better by the time the 6800LE is out, so I can go ahead and buy it if it's the right one. )-:

So it's very hard to find out just which one truly IS better. I'm inclined to think the 6800 might have certain advantages, though, especially when it comes to memory-intensive things: lower clock or no, a bus with twice the width might still do better. Then again, the 6600 series has a newer GPU (NV43, versus the 6800's NV40, if I recall my numbers correctly).

What I'm afraid of is that we just won't know until the 6800LE and 6600GT are both out on the shelves. Plus, just how much difference is there between AGP and PCI-E? Doesn't PCI-E have certain advantages beyond SLI? (BTW, I read that the plain 6600 will NOT support SLI. Just a little heads-up on that.)
 
Nazo said:
Yes, the memory sounds better, but surely even nVidia isn't dumb enough to release two cards at the same price with one truly better than the other, without some catch.
Weren't the FX5900XT and the FX5700U priced similarly, with the 5700U having higher core/RAM speeds yet the 5900XT outperforming it?
 