ATI on Doom3 benchmarks

wizzackr

Here you go. Well, hopefully they get their shit together and deliver a decent and (finally) completely overhauled OGL driver once Doom3 is released, or otherwise it's gonna be nVidia's market for some time to come... AND hopefully they do better than these half-precision lame excuses ;)
 
wizzackr said:
Here you go. Well, hopefully they get their shit together and deliver a decent and (finally) completely overhauled OGL driver once Doom3 is released, or otherwise it's gonna be nVidia's market for some time to come... AND hopefully they do better than these half-precision lame excuses ;)

The only thing I didn't like in that statement was how they kept mentioning partial precision shading. Damn, I think it's a good idea.
 
Yeah, I actually laughed when I read the PP thing; it is such a PR thing to say. Personally, I've never had a problem with ATI's OpenGL driver as far as IQ/speed goes - stability, on the other hand...

I don't understand why ATI even released a statement; the NV40 was designed to play Doom 3, so no shit it's going to run faster than on the ATI equivalent. I guess they saw a few too many "OMG 6800gt r0x0rz your b0x0rz in d00m 3" threads around and had to say something.
 
Yeah, gooder Doom3 drivers and better Linux drivers would be nice ;)

Right now their Linux drivers are ass - roughly 1/2 the performance of the Win32 drivers. YES, I know we are lucky to have ANY Linux drivers for ATI hardware, but there's no excuse for over a year's worth of poor performance.

Also, ATI RELEASE YOUR DRIVERS IN TAR.GZ FORMAT! Not everyone needs RPMs or can work with them!

Sorry to hijack your thread, but that really irks me...
 
Glad they see it, recognize they need to improve the OpenGL performance, and have committed to do so. All we get to do now is wait and see if they do in fact deliver on it. I certainly hope so because I own several of their cards. :)
 
I concur with Josh B. As long as NVIDIA supports Linux and ATI doesn't, I will never put an ATI card in my box. Even though I don't play games on Linux much, I like knowing I can. However, it is nice to see that Doom3 is finally forcing ATI to put out a better OpenGL driver.
 
Hopefully they can get the rewrite, or at least improvements, out before Doom3 goes retail; it will be interesting to see the benchmarks once the game ships :)
 
"Hi Richard - this is a non issue - Doom 3 isn't even available yet" :eek:

That's almost complete dismissal... at worst, for ATI, a business tactic; at best, confident truth.
 
I guess the whole Doom3 game totally caught them off guard. How were they supposed to know such an obscure title was going to come out?
 
Yes, better OpenGL drivers are a must, but these events are good; they keep companies in check. If ATI can get their X800 series out of this mess, I hope they remain competitive and drive prices down. The less I have to pay for a GT or X800 Pro, the better.
 
IMO there will never be better ATI OpenGL drivers because there's not much room for improvement. I agree that Doom3 was built for NV40 hardware and we will never see better performance on current ATI boards.
It's just too bad that a lot of new games will be coming out based on the Doom3 engine.
But on the other hand... we have HalfATI-Life 2 to look forward to :)
 
I'd love to see better OGL support... I don't know if I will be able to afford a 6800GT any time soon.
 
I doubt ATI won't improve its Doom3 performance - they will, before it's out, IMO. However, I expect the same from nVidia.
 
veLhi said:
It's just too bad that a lot of new games will be coming out based on the Doom3 engine.
I think it is too bad ATI doesn't have better OpenGL drivers.
 
What's really funny is that he claims this is a non-issue because Doom3 isn't out yet, implying that new drivers will improve performance - yet they provided the 4.8 drivers for the benchmarks, while nVidia provided WHQL drivers that are already out.
 
obs said:
I guess the whole Doom3 game totally caught them off guard. How were they supposed to know such an obscure title was going to come out?

lol, comedy
 
They're trying to figure out how to 'optimize' their OpenGL driver so it doesn't do any shadows.
 
I really don't like the way they responded. Seems to me they don't care how ATI cards perform in Doom3.
 
@trapine said:
Hopefully they can get the rewrite, or at least improvements, out before Doom3 goes retail; it will be interesting to see the benchmarks once the game ships :)

I wouldn't hold my breath for the rewrite... maybe improvements if we're lucky... but the rewrite? Hell, it's a week away and their next driver release was used in the benchmarks already... so unless those benchmarked drivers come out before the game, you will probably see Doom3 benchmarks get even worse for ATi...

If they didn't get a beta of the driver to the id benchmarks...there's no way in hell they'll have a final release ready by August 3rd.
 
"Hi Richard - this is a non issue - Doom 3 isn't even available yet, and we all know that some of our competitors use partial precision where possible. We expect to have updated drivers available in the coming weeks."

1. Doom 3 will be out in just over a week.
2. Other than the fact that they can't take advantage of it, why is it bad that PP is used if the dev wants to and it doesn't change the output?
3. The fruits of the long-rumored OpenGL rewrite won't come for quite a while.



UPDATE:"...And btw, let's not lose sight of the fact that ATI performance isn't relatively poor at all. I think Kyle himself said that even the X800pro delivered 'great' performance, and Carmack said in the HardOCP article that there's more to consider than just frame rate. The frame rate difference even today is so minor, it's impossible to tell without diagnostic tools - ie: the end user experience isn't affected. And with ATI you get full-precision enabled all the time - we don't do PP (on R3XX and R4XX) like some of our competitors. It's also important to note that most of today's games play faster on ATI hardware, and you can expect that to extend to other 'big title' games expected this summer. Chris"

1. If ATI's performance isn't "relatively poor at all", then the performance of the GFFX 59xx must have been damn near identical to the Radeon 9700/9800/etc.
2. Nowhere in the article did I find the X800 Pro paired with "great" performance. There is a sentence saying the 9800XT/5950 will deliver "great" performance.
3. I guess it depends on what you consider "diagnostic tools". I would bet most people could tell the difference between 36.3 fps (6800GT) and 21.5 fps (X800 Pro), and depending on the game, I can feel the difference between 56 fps (6800GT) and 36.8 fps (X800 Pro) - see the frame-time math after this list. Unless of course he considers vision/playing with your eyes open "diagnostic tools".
4. Again with the PP. If the dev uses it and it doesn't change the output, why not?
5. Many games are faster on the X800 XT PE vs the 6800U, but not most OpenGL games, and certainly not with an X800 Pro vs a 6800GT.
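Just to put point 3 in per-frame terms, a quick bit of arithmetic on the numbers quoted above (nothing measured here, just converting fps to milliseconds per frame):

def frame_ms(fps):
    # milliseconds spent on each frame at a given frame rate
    return 1000.0 / fps

# (nVidia fps, ATI fps) pairs taken from the figures quoted above
for nv, ati in [(36.3, 21.5), (56.0, 36.8)]:
    gap = frame_ms(ati) - frame_ms(nv)
    print(f"{nv} fps = {frame_ms(nv):.1f} ms/frame, "
          f"{ati} fps = {frame_ms(ati):.1f} ms/frame "
          f"(about {gap:.0f} ms more per frame)")

That works out to roughly 9-19 ms extra on every single frame, which is the sort of gap you notice in mouse response without any "diagnostic tools".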


Seems to me ATI has recently become more and more like the nVidia of a few years past.

1. Doing "funny things" with trilinear filtering (which is OK, since it doesn't really change the output) but not telling anyone and not allowing anyone to disable it (which is not OK - even nVidia seems to have learned this). See the little sketch after this list for what those "funny things" amount to.
2. Totally dismissing any advantage the competitor has, instead of just talking up your own strengths, taking the high road, and saying little to nothing about that competitor.
3. Enabling features in drivers only once new hardware is out, even when older hardware has been able to support them for over a year and a half (TAA, geometry instancing).
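For point 1, here's a toy sketch of the difference - this is only my guess at the general idea, not ATI's actual algorithm, which isn't public. Plain trilinear blends the two nearest mip levels across the whole fractional LOD; a "brilinear"-style optimization narrows that blend to a small band around the transition so most samples are cheap single-mip bilinear lookups:

def trilinear_weight(lod_frac):
    # classic trilinear: blend between mip N and mip N+1 over the full 0..1 range
    return lod_frac

def brilinear_weight(lod_frac, band=0.25):
    # hypothetical "brilinear": only blend inside a narrow band around the
    # mip transition, and snap to a single mip level everywhere else
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if lod_frac <= lo:
        return 0.0
    if lod_frac >= hi:
        return 1.0
    return (lod_frac - lo) / (hi - lo)

for f in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f, trilinear_weight(f), round(brilinear_weight(f), 2))

Most of the time the snapped result looks close enough that nobody notices - that's the "doesn't really change the output" part. The complaint is about doing it silently with no way to turn it off.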
 
the@ntipop said:
He kept referring to nVidia's partial precision, but didn't ATi get caught doing that as well?
I think he was just talking about nVidia's use of it in Doom 3.
 
It would have been better if he had said, "Our OpenGL drivers are not up to our high standards; we will be working hard to provide our customers with high-performance, stable OpenGL drivers. This is our goal, and we would like to see it happen before Doom3 releases," as opposed to, "Whatever man, Doom3 ain't out yet, who the hell cares? nVidia is using partial precision, man, and you don't see nVidia users crying. GET OVER IT. We'll come out with some new drivers - as id said in the past, 'when it's done' - so chill." *thought* Damn it, I thought these beta drivers would do the trick.
 
Doom 3 gamers will get an excellent experience across the whole range of RADEON cards, right from entry-level X300 to the high-end X800 XT Platinum Edition. Our driver and hardware teams will continue to analyze Doom 3 across the range of resolutions and RADEON gamers can expect to see even better performance in future. The X800 remains the fastest card for the majority of games played today and for games being launched in the near future.

Around the time of the 3DMark cheating/optimization controversy, I seem to remember quite a few people claiming ATI didn't optimize for individual games and only did things for performance if they had an effect on multiple games. Looks like that changed as soon as they lost a benchmark by a large margin.
 
ohgod said:
I think he was just talking about nVidia's use of it in Doom 3.

I believe the@ntipop was confusing partial precision with the whole brilinear/trylinear thing.
 
(shout out to iron chef!)

:D


Did Carmack or the id folks say that the 9800 Pro was THE recommended hardware for Doom 3 last year? Personally, I think it's a stretch to think we are going to spend $400-500 to run a $49.99 game (let alone all the money you have to sink in to get the rest of the rig up to snuff) -

"I spent 4000 dollars on my rig and I can run Doom3 at 1600x1200 with all the eye candy turned on!!"

"What else do you do with it?"

"Surf the internet, read email.."

"Do you need a 4000 rig to do that?"

"Shut up"
 
groebuck said:
(shout out to iron chef!)

:D


Did Carmack or the id folks say that the 9800 Pro was THE recommended hardware for Doom 3 last year? Personally, I think it's a stretch to think we are going to spend $400-500 to run a $49.99 game (let alone all the money you have to sink in to get the rest of the rig up to snuff) -

"I spent 4000 dollars on my rig and I can run Doom3 at 1600x1200 with all the eye candy turned on!!"

"What else do you do with it?"

"Surf the internet, read email.."

"Do you need a 4000 rig to do that?"

"Shut up"

LOL. Some truth there. But honestly, I bought my $400 card for more than just Doom 3. It is great to know that Doom 3, and games that will use its engine, will run very well on my system though. And it doesn't take $4000 to run Doom 3, and all the other top-end games, at 1600x1200 with all the eye candy turned on. My rig didn't cost anywhere near that, and it is plenty fast enough for any game I can throw at it. And I do in fact use it for a lot more than games - things like Photoshop and media encoding, as well as surfing the internet. :) So, while your point was funny and does apply (to a certain extent) to a lot of people, there are still some flaws with it. The bottom line will always be: if you want top-notch performance from your computer, you will have to pay for it.
 
Where did all the "Doom3 was built for NV40" talk come from? In the interview with John Carmack on Icons, he said the engine was based on the capabilities of the NV25, which is two generations old. Did I miss something?

Also, I think it's pretty weak that ATi's first argument is that the game hasn't been released.
 
Well, nVidia touts SM 3.0, but they tell developers to use 16-bit precision even though the SM 3.0 spec calls for 32-bit. Even at SM 2.0 and 2.0b, ATI uses 24-bit. Neither company really did a good job of implementing the full SM standard. A rough sketch of what the precision difference actually means is below.
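Purely as an illustration of what those bit counts mean (this is just numpy's float16/float32 standing in for FP16/FP32 shader math, not anything the drivers actually run), here's the gist of why partial precision is usually invisible for colour work but can bite elsewhere:

import numpy as np

# a typical colour value: FP16 and FP32 round to the same 8-bit framebuffer value
third16 = np.float16(1.0) / np.float16(3.0)
third32 = np.float32(1.0) / np.float32(3.0)
print(round(float(third16) * 255), round(float(third32) * 255))  # 85 85

# but FP16 only has a 10-bit mantissa: above 2048 its step size is 2.0, so large
# values (think texture coordinates or long chains of math) lose whole units
print(np.float16(2048) + np.float16(1))  # still 2048.0
print(np.float32(2048) + np.float32(1))  # 2049.0

Which is roughly why a developer can flag chunks of shader math as partial precision without the picture changing, and also why nobody would want 16-bit forced on everything.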
 
Dosomo said:
LOL Some truth there. But, honestly i bought my $400 card for more than just Doom 3. It is great to know that Doom 3, and games that will use its engine, will run very well on my system though. And it doesn't take $4000 to run Doom 3, and all the other top end games, at 1600 X 1200 with all the eye candy turned on. My rig didn't cost anywhere near that, and is plenty fast enough for any game i can throw at it. And i do in fact use it for alot more than games. Things like photoshop, and media encoding, as well as surfing the internet. :) So, while your point was funny and does apply (to a certain extent) to alot of people; there are still some flaws with it. The bottom line will always be: If you want top-notch performance from your computer, you will have to pay for it.

:D sez the person whose sig totals to about $1300 in parts (today's prices :lol)
 
What I don't get is that after the D3 benchmarks arrive, suddenly ATI cards suck because they perform very poorly. As if D3 is the only game on earth! The game isn't even out yet and all people can talk about is poor ATI performance in D3! Don't forget that the X800 still kicks ass in almost every other game. Far Cry is still faster on the X800, and nobody can deny that Far Cry is one of the "big games" of 2004. And then there is HL2, which was showcased on ATI hardware.

Performance isn't the only factor. Personally, I don't want a video card that requires two molex connectors in my box. In fact, I don't even like the one molex connector that the X800 uses.
 
Mr. D. said:
What I don't get is that after the D3 benchmarks arrive, suddenly ATI cards suck because they perform very poorly. As if D3 is the only game on earth! The game isn't even out yet and all people can talk about is poor ATI performance in D3! Don't forget that the X800 still kicks ass in almost every other game. Far Cry is still faster on the X800, and nobody can deny that Far Cry is one of the "big games" of 2004. And then there is HL2, which was showcased on ATI hardware.

Performance isn't the only factor. Personally, I don't want a video card that requires two molex connectors in my box. In fact, I don't even like the one molex connector that the X800 uses.


Where do you people get the idea that the X800 is in the lead in all other games? Damn, go through the non-biased benchmark sites again: the cards are evenly matched in all DX games - ATi wins some, nV wins some, even in Far Cry depending on the level. Then add in the fact that the GF 6 line has more features and a hell of a lot better OGL performance.

Performance isn't the only factor, you are right, but how many people today don't have a 350-watt power supply? It's pretty much standard now. An extra molex connector isn't going to kill anyone. If you don't want to use an extra molex, fine - the card can handle one just fine.
 
I also don't think the case is that "nVidia is using partial precision", but rather that "John Carmack CHOSE to use partial precision, in some cases, on nVidia cards".

The more I read about it, the more I think the X800 Pro is an FX5800 equivalent... a dud.
 
hordaktheman said:
I also don't think the case is that "nVidia is using partial precision", but rather that "John Carmack CHOSE to use partial precision, in some cases, on nVidia cards".

The more I read about it, the more I think the X800 Pro is an FX5800 equivalent... a dud.
hahahaha
 
Mr. D. said:
What I don't get is that after the D3 benchmarks arrive, suddenly ATI cards suck because they perform very poorly. As if D3 is the only game on earth! The game isn't even out yet and all people can talk about is poor ATI performance in D3! Don't forget that the X800 still kicks ass in almost every other game. Far Cry is still faster on the X800, and nobody can deny that Far Cry is one of the "big games" of 2004. And then there is HL2, which was showcased on ATI hardware.

Performance isn't the only factor. Personally, I don't want a video card that requires two molex connectors in my box. In fact, I don't even like the one molex connector that the X800 uses.

You're obviously biased towards ATi since you own an X800 :rolleyes:
Far Cry sucked ass. I played the game for 15 minutes, then uninstalled it. If you step away from the eye candy for ONE SECOND, you'll realize how CRAPPY the gameplay is. It's like a dumb blonde: nice to look at and nice to play with, but it has NO DEPTH. If all you want to do is stare at eye candy all day, then good for you, keep up the good work, but I'm sure at least 95% of all other gamers think GAMEPLAY is more important.

On another note, I'm also 90% sure that Doom3 DOES NOT USE OPENGL. I remember reading some Carmack interview months ago, and he said D3 uses a proprietary API. I don't remember if it was only for nVidia cards or for both, but if someone can dig up the interview, you'll see. If I'm wrong, then I take it back.
 
Isn't it all just bragging rights? If you buy an X800 Pro or a 6800GT, you're getting a damn fast card for 500 bucks. They both trounce my 9700 Pro, but hey, I'm keeping it - I paid $300 for it less than two years ago!!
 
HongKongPolice said:
You're obviously biased towards ATi since you own an X800 :rolleyes:
Far Cry sucked ass. I played the game for 15 minutes, then uninstalled it. If you step away from the eye candy for ONE SECOND, you'll realize how CRAPPY the gameplay is. It's like a dumb blonde: nice to look at and nice to play with, but it has NO DEPTH. If all you want to do is stare at eye candy all day, then good for you, keep up the good work, but I'm sure at least 95% of all other gamers think GAMEPLAY is more important.

On another note, I'm also 90% sure that Doom3 DOES NOT USE OPENGL. I remember reading some Carmack interview months ago, and he said D3 uses a proprietary API. I don't remember if it was only for nVidia cards or for both, but if someone can dig up the interview, you'll see. If I'm wrong, then I take it back.

I don't have an X800 and I enjoyed Far Cry. It's a really good game that has a lot of things going for it other than the eye candy. Also, unless you progressed through a good deal of the game in those 15 minutes, you really haven't played it at all.

With that said, Doom3 is an OpenGL game.
 
Mr. D. said:
What I don't get is that after the D3 benchmarks arrive, suddenly ATI cards suck because they perform very poorly. As if D3 is the only game on earth! The game isn't even out yet and all people can talk about is poor ATI performance in D3! Don't forget that the X800 still kicks ass in almost every other game. Far Cry is still faster on the X800, and nobody can deny that Far Cry is one of the "big games" of 2004. And then there is HL2, which was showcased on ATI hardware.

Performance isn't the only factor. Personally, I don't want a video card that requires two molex connectors in my box. In fact, I don't even like the one molex connector that the X800 uses.

Far Cry is a big title, but it's still not on the same pedestal that Doom3 and Half-Life 2 will sit on. There wasn't four years of pent-up desire, it wasn't part of a hugely recognizable, hugely popular franchise, and there don't seem to be a lot of (any?) games coming out based on its engine, etc. Plus, the frame-rate differences in Doom3 are amplified because the base framerate is so low to begin with.
 