Even More Performance Increase with Doom 3/OpenGL for X1800?

Give it a few more driver releases and everyone will have forgotten the XT was so close in performance to the 7800GTX.
 
R1ckCa1n said:
Give it a few more driver releases and everyone will have forgotten the XT was so close in performance to the 7800GTX.
Hell the card is not even out yet :rolleyes:

Who's to say that by the time it does come out it won't have something else to compete with.
 
PRIME1 said:
Hell the card is not even out yet :rolleyes:

Who's to say that by the time it does come out it won't have something else to compete with.

Like the precious 7800GTX 512 that you're holding out for?
There is no competition, so with November 5th as a release date... sorry, NVIDIA.
 
Netrat33 said:
Like the precious 7800GTX 512 that you're holding out for?
There is no competition, so with November 5th as a release date... sorry, NVIDIA.
Well seeing as how the reviews never gave the XT a clear nod over the GTX and the fact that the GTX has been out for months WITH SLI......NVIDIA has nothing to be sorry about.
 
Considering this is out for the XL as well, and gives you the option of clocking the XL up to GTX speeds, I say ATI has a clear winner for generations to come as far as performance is concerned.
 
R1ckCa1n said:
Give it a few more driver releases and everyone will have forgotten the XT was so close in performance to the 7800GTX.

QFT! I'm really anxious to see how far they can push it with further tweaks. It took me a minute to get my jaw off the floor after seeing an ATI card leading in an OGL title and it keeps on getting better.

On a side note: does anyone know when ATI is going to migrate their FireGL line to R520 cores?
 
Trimlock said:
Considering this is out for the XL as well, and gives you the option of clocking the XL up to GTX speeds, I say ATI has a clear winner for generations to come as far as performance is concerned.


Where did this come from?

Both cards have their strengths and weaknesses. Both companies will be fixing their weaknesses and building on their strengths. The competition is close, and it will remain so for quite some time until someone falters, like ATi did this round and last with yields and delays, and nV did with the FX line. Just because ATi is marginally beating its competition while being 6 months late, clocked 190 MHz higher, and with 300 MHz more on the RAM, doesn't mean they have a better core. It's all relative to what you look at; that's why I say some things are good with ATi and some things are good with nV.
 
It just keeps getting better and better. Can't wait until they're available here in oz.
 
razor1 said:
Where did this come from?

Both cards have their strengths and weaknesses. Both companies will be fixing their weaknesses and building on their strengths. The competition is close, and it will remain so for quite some time until someone falters, like ATi did this round and last with yields and delays, and nV did with the FX line. Just because ATi is marginally beating its competition while being 6 months late, clocked 190 MHz higher, and with 300 MHz more on the RAM, doesn't mean they have a better core. It's all relative to what you look at; that's why I say some things are good with ATi and some things are good with nV.

You are comparing the lead in MHz of a 16-pipe card vs. a 24-pipe card; that way of thinking applies to both companies, and nowhere did I state any of this in a previous post. And yes, I do believe ATI has the better core right now. I'm not saying NVIDIA's offering is bad, but right now the R520 generates a lot more enthusiasm than the G70 does.

Being 6 months late has nothing to do with anything. If a company is having a problem with its cards randomly dying, then of course it is going to be late. That doesn't mean the core is bad; they decided to wait, fix the problem, and then get it out the door. It sucks that it happened, but now we have a solid product competing with another solid product. I don't see why people are mad at ATI for being late.

Both companies were plagued with supply problems last generation. I still don't see why this was brought up, seeing as both are doing exceptionally well at keeping supply in line with demand.

As for ATI having a clear winner as far as performance goes: it seems ATI has a system they rely on and keep putting out products with similar performance scaling. I'm not stating that ATI is going to be the king performer forever, just that they have a system they can rely on for a while.
 
This isn't news, the 8.183 driver simply includes the OpenGL hotfix improvements we've already seen here. Dave Baumann says so in his post......

Wavey Davey said:
The first review driver was 8.173 and the "Hotfix" ATI released is from the next version, 8.183
 
PRIME1 said:
Who's to say that by the time it does come out it won't have something else to compete with.

What, so now the Inquirer is a reliable source for release dates? Wait, you could have gotten it from gibbo with the remarkable synthetic benchmark scores :rolleyes:

I personally just feel it's wishful thinking, but say the 7800GTX 512 does come out: the 'reliable' Inq doesn't even think there will be a speed boost, just more vRAM...
 
Does anyone think NVIDIA isn't going to have a 512MB GTX ready for early November? Seriously, NVIDIA, the company that has always been such a fierce competitor, is just going to let ATI's XT take the performance crown that easily? Pffft, not very likely. IMO we're going to see GTX chips clocked around 500MHz and with 512MB of comparable memory that's going to help close the performance gap the XT has over standard GTX boards in high res. + AA testing.

In terms of overall functionality and performance, when was the last time the graphics market was this close? Great for competition, but apparently harder on the consumer when it comes to making a purchase decision (based on all the recent "which to buy" threads).
 
DAAAAAAMN. It's too bad there's nothing OpenGL that is worth caring about but maybe that'll change with Prey.
 
You are comparing the lead in MHz of a 16-pipe card vs. a 24-pipe card; that way of thinking applies to both companies, and nowhere did I state any of this in a previous post. And yes, I do believe ATI has the better core right now. I'm not saying NVIDIA's offering is bad, but right now the R520 generates a lot more enthusiasm than the G70 does.

No I'm not. nV is getting about the same, or close to the same, overall performance without dropping down a process and without low-k, and still with less heat and less power consumption. That's one of the strengths of the GF7's core. And vice versa: ATi's strength is clockability on a 16-pipe chip, but it produces more heat and uses more power. Then there's the fact that they already dropped down to .09, and they also have more transistors for a 16-pipe chip than nV has for a 24-pipe chip.

Being 6 months late has nothing to do with anything. If a company is having a problem with its cards randomly dying, then of course it is going to be late. That doesn't mean the core is bad; they decided to wait, fix the problem, and then get it out the door. It sucks that it happened, but now we have a solid product competing with another solid product. I don't see why people are mad at ATI for being late.

People aren't mad at ATi; ATi just lost all the market share they gained from the 9x00s.

Both companies were plagued with supply problems last generation. I still don't see why this was brought up, seeing as both are doing exceptionally well at keeping supply in line with demand.

nV only had a short period of supply issues, due to problems getting GDDR3 RAM at the proper speed.

As for ATI having a clear winner as far as performance goes: it seems ATI has a system they rely on and keep putting out products with similar performance scaling. I'm not stating that ATI is going to be the king performer forever, just that they have a system they can rely on for a while.

I say ATI has a clear winner for generations to come as far as performance is concerned.

It sure sounded like that to me

It's not really a system they have. They already knew the limitations of a traditional pipeline structure; that's why the structure was changed to account for increased dynamic branching performance. But games that use branching extensively enough for it to overtake the performance bottleneck of simple shaders are not going to be out for another generation, possibly even more, since Unreal Tournament 2007's engine will be the one to beat for the next 2 or 3 years.
 
razor1 said:
Then there's the fact that they already dropped down to .09, and they also have more transistors for a 16-pipe chip than nV has for a 24-pipe chip.

Well, for starters, generally speaking more transistors means more heat and more power consumption, so really it's no surprise that the X1800s generate more heat than, say, a 7800GTX. Also, other companies have already shown that the transition from 110nm to 90nm is a hard one (re: Intel and AMD), and as far as I know nVidia hasn't taken this step yet for retail video cards...
 
tornadotsunamilife said:
Well, for starters, generally speaking more transistors means more heat and more power consumption, so really it's no surprise that the X1800s generate more heat than, say, a 7800GTX. Also, other companies have already shown that the transition from 110nm to 90nm is a hard one (re: Intel and AMD), and as far as I know nVidia hasn't taken this step yet for retail video cards...

Using low-k should give ATi a 30% advantage in power consumption (which is also a 30% reduction in heat production), or alternatively 30% of headroom for higher clock speeds. ATi has 15% more transistors than nV's G70. And then there is the process drop, which should give them additional gains on top of the 30% advantages I mentioned.

It wasn't a real issue for AMD because they didn't go for high clock rates, where there would be more leakage, and that is where the .09-micron process has its problems. So that gave them time to fix the process issues, and now they can clock higher.

Both these points are in favor of nV's design at the moment.
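The heat and power tradeoff being argued here follows the standard first-order model for dynamic switching power, P ≈ C·V²·f. A back-of-the-envelope sketch (the capacitance and voltage figures below are made-up placeholders; only the clocks echo the ones mentioned in this thread):

```python
# Toy dynamic-power model: P = C * V^2 * f (arbitrary units).
# All figures are illustrative placeholders, not measured chip data.

def dynamic_power(cap, volts, freq_mhz):
    """First-order dynamic switching power, P ~ C * V^2 * f."""
    return cap * volts ** 2 * freq_mhz

# A chip with ~15% more switched capacitance (more transistors)
# running a ~45% higher clock draws considerably more power:
baseline = dynamic_power(cap=1.00, volts=1.3, freq_mhz=430)
hot_chip = dynamic_power(cap=1.15, volts=1.3, freq_mhz=625)

print(f"relative power increase: {hot_chip / baseline - 1:.0%}")  # ~67%
```

Under this toy model, the 16-pipe chip's transistor surplus plus its clock advantage costs roughly two-thirds more dynamic power, which is the shape of the heat argument above; real chips also leak, run different voltages, and gate idle logic.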
 
coz said:
This isn't news, the 8.183 driver simply includes the OpenGL hotfix improvements we've already seen here. Dave Baumann says so in his post......

It is news, since there is a performance increase going from .173+hotfix to .183, which includes the hotfix. This driver also increases performance without AA.

Some X1800 XL numbers:

Code:
Quake 4 HQ (Trilinear)   10x7   12x10   16x12 
X1800 XL 8-173-1         67     63      57
X1800 XL 8_183_gl_fix    76     71      65

Quake 4 HQ (4xAA 16xAF)  10x7   12x10   16x12
X1800 XL 8-173-1         58     49      38
X1800 XL 8-173-1+HotFix  59     52      43
X1800 XL 8_183_gl_fix    67     61      52

source: http://www.guru3d.com/newsitem.php?id=3208
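Working those gains out as percentages (a quick arithmetic sketch; the frame rates are copied from the 16x12 column of the AA/AF table above):

```python
# Quake 4 HQ (4xAA 16xAF) results at 16x12, from the table above.
results = {
    "8-173-1":          38,
    "8-173-1 + hotfix": 43,
    "8_183_gl_fix":     52,
}

baseline = results["8-173-1"]
for driver, fps in results.items():
    gain = (fps - baseline) / baseline
    print(f"{driver:<18} {fps} fps  ({gain:+.0%} vs 8-173-1)")
```

So at 16x12 with AA and AF, the hotfix alone was worth about +13%, and the .183 driver about +37% over the original .173 numbers.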
 
razor1 said:
Using low-k should give ATi a 30% advantage in power consumption (which is also a 30% reduction in heat production), or alternatively 30% of headroom for higher clock speeds. ATi has 15% more transistors than nV's G70. And then there is the process drop, which should give them additional gains on top of the 30% advantages I mentioned.

It wasn't a real issue for AMD because they didn't go for high clock rates, where there would be more leakage, and that is where the .09-micron process has its problems. So that gave them time to fix the process issues, and now they can clock higher.

Both these points are in favor of nV's design at the moment.
I don't think TSMC even offers low-K on processes below 130nm - that was one of the key reasons for the relative lack of clock potential on R430 (X800XL)
 
PRIME1 said:
Is ATI a reliable source for release dates.....nope.

Just because NVIDIA has their sh** together and they don't feel the need to paper launch all of their products, does not mean they don't have anything else coming out. It's not like they released the GTX and closed shop. Lately the trend has been for them to spoil ATI's party and so far they have been doing a pretty good job.
AtomicMoose, Dr Evil, fugu, p[H]ant0m, myself, and the Admins will take care of anything that violates the Rules. If you see a matter that requires our attention, hitting the "Report Bad Post" button will bring one of us down on it faster than a X1800XT rendering frames in 3DMark2001. I'm going to suggest to both sides in this argument that remarks addressed to other members are against the [thread=760666]Rules[/thread]. - DL

First off, how is NVIDIA spoiling ATI's party? Are they both still in business? Are their cards similar in performance?

I remember Kyle saying he was disappointed in both cards because one isn't SO much better than the other, so it's kinda weaksauce (both of the new-gen cards), and he's right in some regards.

And the point is, ATI is working on fixing the bugs in their cards, and so is NVIDIA...

You know, when the 5xxx series came out, everyone laughed at NVIDIA... they modded their BIOS to get better 3DMark03 scores, and the cards were just crappy.

Sometimes you get a good card, sometimes not so much...

Benchmarks don't always prove which card deserves which ranking...

I could put you (and anyone else) in a room with 2 computers and not tell you the specs, resolution, AA/AF, or what have you...

And not knowing which one is which, I'd let you game on both for as long as you'd like (no peeking at specs).

You couldn't tell the difference between a high-end Intel dual-core system and a high-end AMD dual-core system, where the AMD has the X1800XT and the Intel has the 7800GTX (single card or SLI/CrossFire).

You CAN'T tell me you'd be able to tell which is which... that's why benchmarks decide which, in some people's words, is "the better card".

Because one card beats another in one game means you should buy it?

Hell, I remember when the 6xxx series came out alongside Doom 3... people still bought ATI's cards and played Doom 3 (once it was able to play it, that is).

Form your own opinions...

soulsaver~
 
razor1 said:
Using low-k should give ATi a 30% advantage in power consumption (which is also a 30% reduction in heat production), or alternatively 30% of headroom for higher clock speeds. ATi has 15% more transistors than nV's G70. And then there is the process drop, which should give them additional gains on top of the 30% advantages I mentioned.

It wasn't a real issue for AMD because they didn't go for high clock rates, where there would be more leakage, and that is where the .09-micron process has its problems. So that gave them time to fix the process issues, and now they can clock higher.

Both these points are in favor of nV's design at the moment.

Are TSMC's 90nm processes low-k?
 
Trimlock said:
http://www.tsmc.com/download/english/a05_literature/90nm_Brochure.pdf (it's a PDF)

Yes, the 90nm process is offered with low-k; the 110nm was not.

I still don't see how ATI's core using low-k while NVIDIA's isn't gives NVIDIA an advantage; I'd say that's a point for ATI.


Low-k gives a big advantage for upping clocks, just like any better insulator does, but it roughly doubles the cost compared to the same process without it. So ATi is paying more but can only clock to 625 MHz for safe yields on .09 (most likely due to heat production and leakage). nV, on the other hand, hasn't stepped up to low-k yet and is still competitive; they too will see that 30% boost when they go with low-k or some other insulator. Let's see what the Ultra ships at, probably around 500 MHz. The next generation will use similar technology, just with more features, so 500 MHz + 30% gets it up to 650 MHz, and then you have the process drop on top. Of course, I'm sure nV will do something that increases transistor count for next gen, so clocks won't go much past that. This is speculation, but it's very much possible, because nV has to start using some type of insulator sooner or later (all manufacturers' processes at .09 micron and below).
 
razor1 said:
Low-k gives a big advantage for upping clocks, just like any better insulator does, but it roughly doubles the cost compared to the same process without it. So ATi is paying more but can only clock to 625 MHz for safe yields on .09 (most likely due to heat production and leakage). nV, on the other hand, hasn't stepped up to low-k yet and is still competitive; they too will see that 30% boost when they go with low-k or some other insulator. Let's see what the Ultra ships at, probably around 500 MHz. The next generation will use similar technology, just with more features, so 500 MHz + 30% gets it up to 650 MHz, and then you have the process drop on top. Of course, I'm sure nV will do something that increases transistor count for next gen, so clocks won't go much past that. This is speculation, but it's very much possible, because nV has to start using some type of insulator sooner or later (all manufacturers' processes at .09 micron and below).

This is very true, but I don't think nV will get as much of a clock boost as ATI has, because of the extra pipes the card has.

I also don't think they will waste their time and money redoing this gen's core for it.
 
skratch said:
This is very true, but I don't think nV will get as much of a clock boost as ATI has, because of the extra pipes the card has.

I also don't think they will waste their time and money redoing this gen's core for it.


The 30% boost is the smallest boost they will get from going to .09, even if they add 20% more transistors. IMHO.
 
The real performance increase is coming from the new architecture: in a lot of benches the X1800 XL performs better than a 7800 GT, which has virtually the same clock speeds and 4 more pixel pipes. Changing to the 90nm process will help them clock the 7800 higher, but tbh no one can tell what kind of performance increase they will get from higher clock speeds. The strange thing is, people always say which card is better based on game benches, but the X1800s perform better on media as well.
 
razor1 said:
Low-k gives a big advantage for upping clocks, just like any better insulator does, but it roughly doubles the cost compared to the same process without it. So ATi is paying more but can only clock to 625 MHz for safe yields on .09 (most likely due to heat production and leakage). nV, on the other hand, hasn't stepped up to low-k yet and is still competitive; they too will see that 30% boost when they go with low-k or some other insulator. Let's see what the Ultra ships at, probably around 500 MHz. The next generation will use similar technology, just with more features, so 500 MHz + 30% gets it up to 650 MHz, and then you have the process drop on top. Of course, I'm sure nV will do something that increases transistor count for next gen, so clocks won't go much past that. This is speculation, but it's very much possible, because nV has to start using some type of insulator sooner or later (all manufacturers' processes at .09 micron and below).


I was unaware of the additional costs of low-k; I'm glad we haven't seen it become a big factor in the price of the card to date.
 
douglite's edit said:
hitting the "Report Bad Post" button will bring one of us down on it faster than a 7800GTX rendering frames in Quake II.
First, LOL!
Second, this is the ATi forum. Let's try to keep the discussion to ATi, please? eh?
douglite's edit said:
hitting the "Report Bad Post" button will bring one of us down on it faster than a X850XTPE rendering frames in Tomb Raider 1
fixed. and one-upped :p - DL

Correct me if I'm wrong, but it appears this is basically the "hotfix" being wrapped into a Cat release, right? It seems to be a bit more mature though, since some of the improvements are spread to non-AA situations. I'm a bit confused; is this:
Code:
Quake 4 HQ (Trilinear)   10x7   12x10   16x12
X1800 XL 8-173-1         67     63      57
X1800 XL 8_183_gl_fix    76     71      65

Quake 4 HQ (4xAA 16xAF)  10x7   12x10   16x12
X1800 XL 8-173-1         58     49      38
X1800 XL 8-173-1+HotFix  59     52      43
X1800 XL 8_183_gl_fix    67     61      52
telling me that this new driver is showing improvements over the already improved numbers from the hotfix?
I lost track of what exactly "8_183 vs 8-173" and all means.
 
jebo_4jc said:
telling me that this new driver is showing improvements over the already improved numbers from the hotfix?
I lost track of what exactly "8_183 vs 8-173" and all means.

8.173 was the driver available from ATI when the hotfix was released, so the first tests of the hotfix were done on the .173 drivers. So yes, the new .183 drivers add more performance over .173+hotfix.

I hope Brent will be using the .183 drivers for his XT review.
 
Spank said:
8.173 was the driver available from ATI when the hotfix was released, so the first tests of the hotfix were done on the .173 drivers. So yes, the new .183 drivers add more performance over .173+hotfix.

I hope Brent will be using the .183 drivers for his XT review.

Exactly. The new drivers show 8.183 in the CCC.
 
Spank said:
8.173 was the driver available from ATI when the hotfix was released, so the first tests of the hotfix were done on the .173 drivers. So yes, the new .183 drivers add more performance over .173+hotfix.

I hope Brent will be using the .183 drivers for his XT review.

Yes.

They are the Catalyst 5.11 non-WHQLs.
 
tertsi said:


That's my post at Rage3D that the NVIDIA addict in the above link linked to. It's NOT a 16-bit color depth bug; the fire wouldn't render correctly with AI set to Advanced. AI set to Standard was fine. Shoot, judging by how clean and vibrant the colors on my X1800 look compared to my 7800 GT, it's like the X1800 has 64-bit color depth.
 
LawGiver said:
That's my post at Rage3D that the NVIDIA addict in the above link linked to. It's NOT a 16-bit color depth bug; the fire wouldn't render correctly with AI set to Advanced. AI set to Standard was fine. Shoot, judging by how clean and vibrant the colors on my X1800 look compared to my 7800 GT, it's like the X1800 has 64-bit color depth.

NVIDIA "addict", heheheh, nice complaint... I'm just pointing out that there are some Quake 4 quality issues on ATI cards with newer drivers (mostly with CatAI enabled). Actually, I meant that this "bug" is a 16-bit color depth bug, probably in the alpha textures... wonder why? Correct me if I am wrong, but didn't ATI say that CatAI Advanced doesn't affect image quality? :)

I did say that thread on R3D was a funny thread, nothing more, nothing less... The bigger issue is still the blurred textures in the FS review with the newer drivers.

BTW, Quake 4's timedemo mode is broken (a game issue): shadows and some effects (e.g. gun rails) don't work in timedemo mode on any card. So use Fraps for benchmarking.
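With timedemo out of the picture, averaging a Fraps-style frame log by hand is simple. A minimal sketch, assuming a log of cumulative per-frame timestamps in milliseconds (the exact columns Fraps writes may differ, so check the real file layout before relying on this):

```python
# Average FPS from a list of cumulative frame timestamps (milliseconds).
# The input format is an assumption, not Fraps' documented layout.

def average_fps(timestamps_ms):
    """Frames elapsed divided by seconds elapsed over the whole run."""
    frames = len(timestamps_ms) - 1
    seconds = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    return frames / seconds

# Example: 5 frames logged over 80 ms -> 4 intervals / 0.08 s = ~50 fps
demo = [0.0, 20.0, 40.0, 60.0, 80.0]
print(average_fps(demo))
```

Averaging over the whole run hides stutter, so a minimum FPS or a frametime percentile is worth reporting alongside it.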
 
I will admit that the ATI X1xxx line will be better than the NVIDIA 7x line, their top-of-the-line card at least; I highly doubt their mid-to-low range will put a dent in things, looking at all the early benchmarks out. However, NVIDIA can just sit back now, since they've gotten a huge grip on the market share. The damage was already done for the market year. Almost all major computer retailers ship their systems with an NVIDIA SLI configuration, and NVIDIA probably dominates when it comes to portable GPUs, laptops and so forth. So I see no real panic for them as of yet. I would have preferred that both companies stuck to releasing their new lines at near-identical times, but it looks like it will be a back-and-forth situation now. ATI will have their run, then NVIDIA will come with something bigger and better. This will happen until both get back to their usual ways.

-DarkLegacy
 
tertsi said:
NVIDIA "addict", heheheh, nice complaint... I'm just pointing out that there are some Quake 4 quality issues on ATI cards with newer drivers (mostly with CatAI enabled). Actually, I meant that this "bug" is a 16-bit color depth bug, probably in the alpha textures... wonder why? Correct me if I am wrong, but didn't ATI say that CatAI Advanced doesn't affect image quality? :)
You're wrong. :)

CatalystAI Advanced applies significant optimisations and uses extensive analysis of texture content to decide where such optimisations are appropriate and can be applied. The intent of the analysis is to try to avoid noticeable effects on image quality. We have stated in the past that no such algorithm is likely to be perfect and that we would like to be informed when anyone sees any significant image quality issues so that we can try to make improvements. If we reproduce this problem internally then we will take a look at it.
I did say that thread on R3D was a funny thread, nothing more - nothing less... bigger issue still are blurred textures on FS review with newer drivers.
I have no idea where any such blurred textures could be coming from, and our internal image quality checks have certainly shown no such issues; we'll definitely be taking a look at this to see if we can reproduce it. None of the changes made in the new drivers should cause them to produce worse image quality than the older ones.
 