id Software's Official DOOM3 Benchmarks

Yes, just remember that $900 for one game is stupid, and even though the 6800 GT has better results, that is NO reason at all to ditch your X800s. 30 FPS is fine for running a game; you can hardly even tell the difference between 30 FPS and above.


As for me, I'm gonna sit here with my new 9600XT and wait...

GAH!

I'm only 13

GAH!

I pity the fool who spends 900 dollars on one game
 
Mr.T said:
Yes, just remember that $900 for one game is stupid, and even though the 6800 GT has better results, that is NO reason at all to ditch your X800s. 30 FPS is fine for running a game; you can hardly even tell the difference between 30 FPS and above.


As for me, I'm gonna sit here with my new 9600XT and wait...

GAH!

I'm only 13

GAH!

I pity the fool who spends 900 dollars on one game

My little sister, who's only 6, can tell the difference between 30 FPS and above... :p
 
Mr.T said:
I pity the fool who spends 900 dollars on one game

Well, if they have the money and want to spend it, so be it; it's their cash. Remember, though, that Doom 3 itself is only one game: the Doom 3 engine will be used by a multitude of other companies to build games on in the future, just like the HL2 engine, so overall it affects more than a single game.
 
Yes, I hadn't thought of that at the time, but I still stand by 30 FPS being a reasonable minimum for games.
 
I won't disagree with that, but keep in mind that if a piece of hardware produces approximately 30 FPS running timedemos, it's possible that in actual gameplay, with larger scenes, more monsters, and additional players all running around firing weapons while explosions go off, the FPS can drop below that reasonable minimum.
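To make that concrete, here's a toy sketch of how a timedemo average near 30 FPS can hide dips well below it. The frame times below are invented numbers for illustration, not from any real benchmark run:

```python
# Toy numbers only, not from any real benchmark run.
frame_times_ms = [25, 26, 24, 25, 26, 60, 70, 25, 26, 25]  # two heavy frames mid-fight

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
worst_fps = 1000.0 / max(frame_times_ms)

print(f"average: {avg_fps:.1f} FPS")       # ~30.1 FPS, looks fine on paper
print(f"worst frame: {worst_fps:.1f} FPS") # ~14.3 FPS right when it matters
```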
 
Yes, but it's possible that once it's released, drivers will catch up to the game and keep that reasonable minimum of 30 FPS.


I don't see a lot of people running games at 1600x1200. I run Far Cry at 1024x768 with everything turned up, including AA, and everything runs perfectly, so I don't see how running D3 on high or medium will be too hard for my 9600XT, considering how good D3 looks anyway.
 
Hehe, I'm not going to jump too far one way or the other guessing which hardware will do what until Kyle gets that data posted. After that, most of the guesswork will be moot. :)
 
Heheh, yeah.

I heard Kyle will be benchmarking all the way down to the GeForce 3s. Do you know if this is true?

Then I can actually tell if I made a stupid 300-dollar decision.

(Aus dollars)
 
Mr.T said:
A warning: do not quite ditch your X800s yet. Doom 3 is one of a few OpenGL games around lately, and as it stands, nVidia beats ATi in OpenGL. Put the same cards to the test on a DX9 game such as Far Cry and have a look at the results then. I think everyone who is selling their X800s is making a big mistake, and in some cases it could cost you: instead of $500 for just the X800, you will end up paying $900 for just one game, a major OpenGL game at that. So think about it: is one game really worth 900 bucks?

One of the few OpenGL games? lol

Did you go to sleep and miss 2003? Remember the two games nominated for GOTY last year, Call of Duty and KOTOR? Both OpenGL.

And there are dozens of other very popular OpenGL games like City of Heroes, Neverwinter Nights, and the Hitman series.

The 6800s are faster in a lot of DX9 games as well. The X800XT PE definitely doesn't win them all, and it usually doesn't win to start with unless you have AF enabled. Have you seen Far Cry benches at 1600x1200 without AF enabled? The 6800U wins the majority of the time.

And when Doom 3 and HL2 come out, Far Cry will be the last game people are worried about performance in lol.

Mr.T said:
I pity the fool who spends 900 dollars on one game

Where are you pulling this $900 figure from? The X800XT PE is the only card selling for over $900. The 6800Us are selling in the $500 range, and the 6800GT is just $400.

You could make a bundle off an X800XT PE on eBay. Well over $500. And the X800Pro will go for at least $300. So where is this $900 coming from? lol
 
This $900 is coming from people who are selling their X800s to buy a 6800 GT.

And it was 'a', not 'the'.

Anyway, look at all the DirectX games coming out compared to OGL anyway.
 
Mr.T said:
Heheh, yeah.

I heard Kyle will be benchmarking all the way down to the GeForce 3s. Do you know if this is true?

Then I can actually tell if I made a stupid 300-dollar decision.

(Aus dollars)


This is what was posted in the thread in General Hardware; you might want to check it out.

GF3, Ti4600, 5800, 5900, 6800, 6800GT, 6800U are all coming this week, as well as 8500, 9600XT, 9800Pro, X800Pro, X800XT.
 
Mr.T said:
This $900 is coming from people who are selling their X800s to buy a 6800 GT.

And it was 'a', not 'the'.

Anyway, look at all the DirectX games coming out compared to OGL anyway.
Interesting math. Let's say someone spent $500 on an X800XT, sold it for at least $500 on eBay, then bought a 6800GT for $400.
500 - ~500 + 400 = 900?

Anyway, you like to use the word 'anyway' too much anyway.

I couldn't resist.
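Since we're nitpicking numbers anyway, here's the same math as a quick sketch. The prices are just this thread's guesses, not quotes from any store:

```python
# All prices are the thread's guesses, not quotes from any store.
x800xt_paid   = 500  # what you spent on the X800XT
x800xt_resale = 500  # roughly what it fetches on eBay right now
gt_price      = 400  # 6800GT street price

net_cost = x800xt_paid - x800xt_resale + gt_price
print(net_cost)  # 400, nowhere near 900
```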
 
Mr.T said:
This $900 is coming from people who are selling their X800s to buy a 6800 GT.

And it was 'a', not 'the'.

Anyway, look at all the DirectX games coming out compared to OGL anyway.

The bottom line, man...there's no excuse for ATi to have shitty OpenGL performance. I know that most games are D3D these days...but there are still some very important games in OpenGL, and ATi, who loves to preach about pure gaming performance, simply doesn't seem to care that some of the most important games out there are written in an API that they don't pay much attention to.

There's no excuse, ATi should have decent OpenGL drivers...I'm not going to make any excuses for them
 
Come on folks, quit nitpicking every little number. If you cannot understand the point he's trying to make about spending several hundred dollars on hardware, then quite honestly take a break from the computer.
 
No, there is an excuse for ATI not having good OpenGL support: the fact is, ATI was not a big player in the industry until a few years ago, and thus they had a VERY limited budget when they first coded their OpenGL drivers.

With that said, the OpenGL rewrite should have been recognized as a major issue long before the 9800 series came out... IMO.

~Adam
 
CIWS said:
Come on folks, quit nitpicking every little number. If you cannot understand the point he's trying to make about spending several hundred dollars on hardware, then quite honestly take a break from the computer.

Are you talking about Mr. T? lol

His reasoning is completely off the wall. $900 to switch from an X800 to a 6800? You'd come out with MORE money than you spent if you sold an X800XT right now lol.

You could probably sell a PE on eBay and get enough to buy a 6800U with $100 left over.
 
CleanSlate said:
No, there is an excuse for ATI not having good OpenGL support: the fact is, ATI was not a big player in the industry until a few years ago, and thus they had a VERY limited budget when they first coded their OpenGL drivers.

With that said, the OpenGL rewrite should have been recognized as a major issue long before the 9800 series came out... IMO.

~Adam

LOL

Not a major player until recently?

ATi has, at least, been trying to compete since day 1, going all the way back to the Rage series...I can't believe you just said that...they were competing wildly with the GeForce 3 via the 8500 when Quake 3 was THE benchmark...if anything...that's when they should have realized the importance of OpenGL...I maintain there is no excuse...2 years of poor OpenGL drivers for R300 speak for themselves...
 
burningrave101 said:
What I want to know is when we're supposed to see these OpenGL rewrites. DOOM 3 ships in less than 2 weeks.

What I thought was really odd was that they didn't even have a beta driver to run the initial benchmarks on...now, if they were trying to make any date, that would be the one to make...so I seriously doubt they'll be out before Doom 3...but again...it's not like Doom 3 is unplayable on the X800 cards...it just isn't as playable as it should be...
 
^eMpTy^ said:
What I thought was really odd was that they didn't even have a beta driver to run the initial benchmarks on...now, if they were trying to make any date, that would be the one to make...so I seriously doubt they'll be out before Doom 3...but again...it's not like Doom 3 is unplayable on the X800 cards...it just isn't as playable as it should be...

using an x800 card on doom3 just isn't 'the way it's meant to be played' ;)
 
ohgod said:
No, that's just someone's opinion of how it's supposed to be played.

Yes, but it's my opinion. And by the laws of most forums, I'm supposed to flame the hell out of you because you have in some way made a remark (good or bad) about me. Ahhh, interweb forum politics.
 
burningrave101 said:
Are you talking about Mr. T? lol

His reasoning is completely off the wall. $900 to switch from an X800 to a 6800? You'd come out with MORE money than you spent if you sold an X800XT right now lol.

You could probably sell a PE on eBay and get enough to buy a 6800U with $100 left over.

Really...? Did you ever consider there might be another possibility out there that would justify the $900? Like someone upgrading their system for Doom 3, or perhaps my particular case, where I have purchased both an XT PE and a 6800GT for two different machines for Doom 3, totaling $900 plus? Some of you guys are getting to the point of wanting to jump on every comment and every number and analyze it to death. Relax, enjoy the posts and comments, and quit trying to pick at every little thing to counter or argue. The guy's point was simple: he didn't believe spending several hundred dollars on a single game such as Doom 3 was justified. That's all.
 
Ardrid said:
So you wanna play at 20 FPS? Is that what you're telling us?
Well, not everyone is going to play at 1600x1200 with 4xAA 8xAF. As much as I want to replace my 9800 Pro with a 6800GT, pretending that X800 can't run Doom3 is silly.
 
ChiMan said:
Well, not everyone is going to play at 1600x1200 with 4xAA 8xAF. As much as I want to replace my 9800 Pro with a 6800GT, pretending that X800 can't run Doom3 is silly.

I agree. However, I want to see some IQ comparisons, as all we've been shown are statistics, which are good and all. But if you play a game at 20 FPS it's still enjoyable, and you hardly notice the stuttering until you have a program like FRAPS putting your FPS in the corner.
 
tornadotsunamilife said:
I agree. However, I want to see some IQ comparisons, as all we've been shown are statistics, which are good and all. But if you play a game at 20 FPS it's still enjoyable, and you hardly notice the stuttering until you have a program like FRAPS putting your FPS in the corner.
Well, everyone perceives things differently. But when it comes to a game like Doom 3, I wouldn't go as far as to say 20 FPS is enjoyable. :) *Maybe* for slow-paced stealth games like Thief 3 or Splinter Cell, but not for an action-packed game. (Yes, I'm aware the gameplay of D3 is slower than its predecessors.)

But the point is, stop acting like the X800 couldn't handle Doom 3. Yes, it's sub-par standing next to a 6800, but it's still a capable card. Even Carmack himself said that.
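For reference, the frame-time arithmetic behind that argument is just 1000 / FPS; no measured data involved, purely the conversion:

```python
# Frame time is just 1000 / FPS; the gaps are what you actually feel.
for fps in (20, 30, 60):
    print(f"{fps} FPS -> {1000.0 / fps:.1f} ms per frame")
# 20 FPS -> 50.0 ms, 30 FPS -> 33.3 ms, 60 FPS -> 16.7 ms
# A 50 ms gap between frames is hard to miss in a fast firefight.
```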
 
ChiMan said:
Well, everyone perceives things differently. But when it comes to a game like Doom 3, I wouldn't go as far as to say 20 FPS is enjoyable. :) *Maybe* for slow-paced stealth games like Thief 3 or Splinter Cell, but not for an action-packed game. (Yes, I'm aware the gameplay of D3 is slower than its predecessors.)

But the point is, stop acting like the X800 couldn't handle Doom 3. Yes, it's sub-par standing next to a 6800, but it's still a capable card. Even Carmack himself said that.

Lol, I'm sorry if you thought I was suggesting that the X800 could not handle Doom 3. I was being, as ^eMpTy^ put it, annoyingly obvious about nVidia's marketing ('the way it's meant to be played'); ATI are just as bad with 'get in the game', though. I'm sure the X800 will manage Doom 3 extremely well. On that note, however, recent benchmarks suggest this is a game where the 6800s are the clear leaders.
 
Sorry, I didn't make myself clear. I wasn't saying that you're bashing ATI; I was talking about people who weren't being objective.
 
tornadotsunamilife said:
Yes, but it's my opinion. And by the laws of most forums, I'm supposed to flame the hell out of you because you have in some way made a remark (good or bad) about me. Ahhh, interweb forum politics.
Lol, people can play Doom 3 on anything they want to.
 
What I don't get is why people are trying to downplay the DOOM 3 performance results and make a big freaking deal about D3D games like Far Cry and Half-Life 2.

The X800s and 6800s will both be able to play DOOM 3 just fine. I doubt the X800Pro will be able to run settings as high, but it will still be enjoyable at lower settings.

If the DOOM 3 benchmarks aren't important, then neither are the Far Cry and other D3D benchmarks. There isn't a D3D game out there that the 6800s can't play at high settings. The 6800U usually takes as many D3D wins as the X800XT PE as well; it just depends on the level of the game you test and what settings you're using as to which is a little faster than the other.

nVidia's drivers are still pretty much betas too. They've finally released the 61.76 WHQL drivers, but it will take a few releases before nVidia really tweaks the 6800s to their full potential.
 
burningrave101 said:
What I don't get is why people are trying to downplay the DOOM 3 performance results and make a big freaking deal about D3D games like Far Cry and Half-Life 2.

The X800s and 6800s will both be able to play DOOM 3 just fine. I doubt the X800Pro will be able to run settings as high, but it will still be enjoyable at lower settings.

If the DOOM 3 benchmarks aren't important, then neither are the Far Cry and other D3D benchmarks. There isn't a D3D game out there that the 6800s can't play at high settings. The 6800U usually takes as many D3D wins as the X800XT PE as well; it just depends on the level of the game you test and what settings you're using as to which is a little faster than the other.

nVidia's drivers are still pretty much betas too. They've finally released the 61.76 WHQL drivers, but it will take a few releases before nVidia really tweaks the 6800s to their full potential.

True. But wouldn't it be better to be fairly equal in D3D games and have great OGL performance, rather than just equal D3D performance?
 
Right...

So for Doom 3 we have the GeForce 6800s at present winning out over the X800s. That is clear.

We are also aware that nVidia have always had very well optimised OpenGL drivers for their cards...that is also clear. And with respect to ATI, their OpenGL drivers have always taken a back seat to D3D...

What comes to mind, and seems sensible, is this: maybe ATI have poor OpenGL support because, in truth, the majority of games utilise Direct3D rather than OpenGL, which makes it sensible to simply optimise performance for the majority and not so much for the relative minority. However, once OpenGL looks likely to become the majority, I think things will most certainly change in the case of ATI and the OpenGL support in their driver set.

The above is clear, as you have all indicated: the ATI cards win out against the nVidia ones in the non-OpenGL games, where ATI have concentrated their optimisations, while they have probably only 'sorted' the OpenGL drivers on an ad hoc basis (where serious OpenGL driver bugs became very apparent among their customer base), as opposed to making any real optimisations to OpenGL performance.

Doom 3 will serve as a wake-up call for the Catalyst team at ATI HQ to get things into perspective with respect to OpenGL in future games. I can vouch for the fact that I am currently testing an upcoming game which is itself based around OpenGL; my current card (9800 Pro 128) doesn't perform too well, which I feel is partly because the drivers available at present are not all that good for heavily OpenGL-based titles. (That said, given the game code is yet to be optimised, this could also contribute, though testers with GeForce cards show markedly better performance in the same game.)

Bottom line is: don't throw away your X800-based card for the sake of Doom 3. It's apparent you bought that card on its merits before now, and all it takes is some driver optimisation for OpenGL (which I feel is definitely coming; past and present evidence indicates as much) and I think things will become more level, or at the very least improve somewhat. That said, it depends on whether nVidia are able to push any further optimisations out of their already heavily optimised OpenGL drivers...who knows. New generation of cards, new sets of boundaries, in both cases.

I must state, though, that I am in no way an ATI fanboi or nVidia junkie; I wouldn't stick to a brand unless I knew its pros and cons, and I factor in all the info I can get. For now, given what is known and what I have deduced from all the information available, you can't go wrong with either, but given the choice, to be honest, I will stick with my pre-ordered X800XT...but that's me :)... Maybe I will sing a different tune in 6 months' time, when we see faster cards supporting SM3.x and whatnot, with a better base of games to take advantage of the goodies...who knows, that's another story altogether.

ohh and btw:
Quote:
"As DOOM 3 development winds to a close, my work has turned to development of the next generation rendering technology. The GeForce 6800 is my platform of choice due to its support of very long fragment programs, generalized floating point blending and filtering, and the extremely high performance."

John Carmack, President and Technical Director, id Software
End of Quote

The above quote is from nVidia's own site: http://www.nvidia.com/page/pg_20040412333689.html

Read what you like into it. Whether he's talking about the new graphics engine matters little, I'd say, since by the time that engine is ready and used in a released game, we will have passed through 2+ cycles of the graphics-card generation game.
 
Arminius said:
[...]

ohh and btw:
Quote:
"As DOOM 3 development winds to a close, my work has turned to development of the next generation rendering technology. The GeForce 6800 is my platform of choice due to its support of very long fragment programs, generalized floating point blending and filtering, and the extremely high performance."

John Carmack, President and Technical Director, id Software
End of Quote


This quote at the bottom is one of the big reasons I like nVidia...nobody ever won out in the graphics business by making a card that was only good enough for the present...if you're not making forward-looking hardware, you're going to pay for it, because the games of tomorrow are being developed on today's hardware.

Also...sure ATi clearly has spent a lot more time on D3D than OpenGL...but they should have had more than enough time to do both...Nvidia has good OpenGL drivers and good D3D drivers...not to mention Linux...basically ATi's driver team has gotten a lot of press for their quality D3D drivers...but now we see they were neglecting OpenGL...and that's just unsatisfactory in my opinion...
 
Very true, you're right; hence my indication to that effect in terms of ATI's neglect of the OpenGL drivers... But as I said, it was and is to a degree due to the market climate... Now they have to wake up, smell the Mountain Dew, and pick up the pace on those OpenGL drivers.

True, it seems the graphics chip giants nVidia and ATI should indeed lead the games market in terms of raising the bar, but this also occurs in reverse: a game comes along, pushes right up against the boundaries of the available top-end graphics cards, and the likes of nVidia and ATI have to sit up and take note, see where it is all going, and try to gear their next generation in the hopefully right direction to rise above it and then some. In either case they get it right, or not so right.

Fact is, it is not viable for a company to make a card that only satisfies the games available at present; you're right there. But to what degree do you put out a card whose full potential will only be realised 6 months later (or more), i.e. SM3.0, when the majority of games won't take it into full swing until about the same period? By which time you will have ATI and nVidia touting the next generation of cards plugging SM3.x support and probably better technologies (like SM4.0, if that is the case; that depends on Microsoft and the development of DirectX).

I personally think that ATI have made the right decision in holding back on SM3.0 support in hardware: they will see how the market plays out over the next 6+ months, watch how it impacts nVidia, and then, if all goes well, learn from nVidia's trials and errors and develop around that.
 
Arminius said:
I personally think that ATI have made the right decision in holding back on SM3.0 support in hardware: they will see how the market plays out over the next 6+ months, watch how it impacts nVidia, and then, if all goes well, learn from nVidia's trials and errors and develop around that.

ATI is losing sales by not supporting SM 3.0. You don't realize the number of people who will buy the 6800s simply because they support SM 3.0 and nVidia is marketing heavily on it. 90% or more of the people who buy video cards don't know a fraction of what we know.

ATI made a really smart move by working with Microsoft on PS 2.0 and supporting it fully for DX9. By not supporting SM 3.0 and FP32 precision, ATI has stepped back and allowed nVidia to re-establish their market hold.

I can think of very few reasons to buy an X800 over a 6800. The 6800s support the new features, they have better drivers (as shown in their OpenGL and Linux support), they are just as fast as the X800s in D3D, and they let you enable full trilinear filtering with no optimizations.

A large part of ATI's performance comes from trilinear and AF optimizations using 'brilinear' filtering. You can noticeably see how the X800s lose the majority of the time to the 6800s in D3D games when AF is not enabled.
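For anyone wondering what 'brilinear' actually does, here's a rough sketch of the idea: only blend between mip levels inside a narrow band around the transition, and sample a single mip everywhere else. The blend-band width and texel values below are made up for illustration, not taken from any driver:

```python
# Illustrative sketch only: how "brilinear" narrows trilinear's mip blend.
# The band width and mip values are invented, not measured from a driver.

def trilinear(mip_lo, mip_hi, frac):
    # Full trilinear: linear blend between adjacent mip levels.
    return mip_lo * (1.0 - frac) + mip_hi * frac

def brilinear(mip_lo, mip_hi, frac, band=0.25):
    # Only blend inside a narrow band around the mip transition;
    # elsewhere sample a single mip (cheap bilinear).
    lo_edge = 0.5 - band / 2.0
    hi_edge = 0.5 + band / 2.0
    if frac <= lo_edge:
        return mip_lo   # pure bilinear from the nearer mip
    if frac >= hi_edge:
        return mip_hi
    t = (frac - lo_edge) / (hi_edge - lo_edge)
    return mip_lo * (1.0 - t) + mip_hi * t

# Most texels skip the second mip fetch entirely, which is where the speed
# comes from -- and also the visible mip "band" some reviewers spotted.
for frac in (0.1, 0.4, 0.5, 0.6, 0.9):
    print(frac, trilinear(100, 200, frac), brilinear(100, 200, frac))
```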

And 3Dc is a joke. Take a look at these IQ comparisons of DXT5 and 3Dc: there is almost no difference, and you would never notice it in-game in a million years.

http://www.nvnews.net/vbulletin/showthread.php?t=30772&page=4

3Dc will bring almost nothing over the current standard DXT5. ATI tried to push DXT5 already and no one used it. 3Dc is just an extension of DXT5.

Check out the games ATI is getting 3Dc support in lol.

http://www.hardocp.com/image.html?image=MTA4MzAzNjk0NGJ3UHFVM2NZQWJfNl80X2wuanBn

HL2 is the only one worth mentioning. Far Cry, which is not on that list, will also support 3Dc.
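As for why the DXT5 and 3Dc screenshots look so close: both approaches store only the X and Y components of each normal (DXT5 swizzled into the alpha and green channels, 3Dc in its two block-compressed channels) and rebuild Z in the shader. A minimal sketch of that reconstruction, with arbitrary sample values rather than real game data:

```python
# Sketch of why 3Dc and swizzled-DXT5 normal maps end up looking alike:
# both store only X and Y of a unit normal and rebuild Z at sample time.

import math

def decode_byte(b):
    # Map a stored 0..255 channel value back to [-1, 1].
    return b / 127.5 - 1.0

def reconstruct_z(x, y):
    # The normal has unit length, so z = sqrt(1 - x^2 - y^2),
    # clamped to guard against compression error.
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))

# e.g. a stored texel of (200, 140) in the two compressed channels:
x, y = decode_byte(200), decode_byte(140)
print(x, y, reconstruct_z(x, y))
```

Since both formats keep the two stored channels at similar precision, the decoded normals differ only slightly, which matches the "almost no difference" comparison shots linked above.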
 