id Software's Official DOOM3 Benchmarks

rancor said:
4.9? Weren't those the beta drivers used in the original benchmarks?

Yep, they were. There's no huge boost, and there won't be a huge boost in ATi's OpenGL drivers for a long time.

Doesn't look like it matters either... LOL.
 
Is there a Doom 3 timedemo thread around here somewhere? I haven't seen it lately...

BTW, 1280x1024 High Quality... 58.0 fps... damn, this card sucks, huh...
 
theelviscerator said:
Is there a Doom 3 timedemo thread around here somewhere? I haven't seen it lately...

BTW, 1280x1024 High Quality... 58.0 fps... damn, this card sucks, huh...


great... another fanboy. :rolleyes:
 
rancor said:
4.9? Weren't those the beta drivers used in the original benchmarks?

Yep, they were. There's no huge boost, and there won't be a huge boost in ATi's OpenGL drivers for a long time.


Ah yes, the usual forum member with a crystal ball that lets him see into the future. :rolleyes:
 
Chris_B said:
Ah yes, the usual forum member with a crystal ball that lets him see into the future. :rolleyes:

Crystal ball? Gander over to ATi's dev rel and talk to their driver engineers; it's not going to happen for a while. The engine I'm working on right now is purely OpenGL, and it ticks me off that ATi isn't going to do anything about OpenGL performance in the short run.
 
Well, obviously it takes time to rewrite an OpenGL driver. People seem to think it's just not going to happen at all, even though they started on this a month or so back.
 
That's true. It's going to take a good 6 months at least, and by then this generation of cards and Doom 3 will all be old news.
 
ATI has no excuse for their current situation. They knew Doom 3 was coming, and they knew it used OpenGL. They've had a year to get their shit straight and they flat-out blew it. Lest you think I'm an ATI basher, the last 3 video cards I've bought have all been ATI. If I were ATI's management, I'd tell my driver team to do whatever it took to get their OpenGL driver updated and released, and I don't mean beta drivers either.
 
CrimandEvil said:
They had three years, not one.
Well then, they REALLY have no excuse, and I don't feel a bit sorry if all this hurts their sales. Coupled with their slipshod "paper" launch of the X800 series, they haven't learned the lesson from Nvidia about stepping on their own wang.
 
Grinder66 said:
great... another fanboy. :rolleyes:


Listen, fanblade... if you had to put up with all the crap I have around here, people telling me how bad my X800 Pro was going to do... never mind, you don't have a clue...

Then, after all is said and done, I game at 1280x1024, high settings, 2x AA, and my fps counter stays pegged at 60 level to level...

kma.....
 
Kyle said in the HardOCP main page news items today that he was using the 4.9 Beta Catalysts.
 
I also saw he mentioned that they used the same drivers that were released yesterday. I think this is a later driver build than the ones used in the tests at id, since these 4.9 betas have a build date of 27/7 and they tested on 20/7.
 
qdemn7 said:
ATI has no excuse for their current situation. They knew Doom 3 was coming, and they knew it used OpenGL. They've had a year to get their shit straight and they flat-out blew it. Lest you think I'm an ATI basher, the last 3 video cards I've bought have all been ATI. If I were ATI's management, I'd tell my driver team to do whatever it took to get their OpenGL driver updated and released, and I don't mean beta drivers either.

They have NO excuse... if I were ATI's manager, there would be a sudden massive increase in programmers looking for jobs... along with anyone else that told me "Doom 3 shouldn't be too big."
 
xSyzygy666x said:
They have NO excuse... if I were ATI's manager, there would be a sudden massive increase in programmers looking for jobs... along with anyone else that told me "Doom 3 shouldn't be too big."
It could have been due to stupid managers (at least that's the usual in my experience).
 
theelviscerator said:
Is there a Doom 3 timedemo thread around here somewhere? I haven't seen it lately...

BTW, 1280x1024 High Quality... 58.0 fps... damn, this card sucks, huh...

Considering the GTs can do 16x12 Ultra 4xAA... yeah... it kinda does.
 
The Batman said:
Considering the GTs can do 16x12 Ultra 4xAA... yeah... it kinda does.

laffffffffffffffffffff

From what I am seeing, they can't run CoD at more than 30 fps half the time...
 
According to the timedemo on my above-average machine (not high-end), I get 36 fps at 1600x1200, Ultra Quality with 4xAA, with my GT @ 415/1170.
 
theelviscerator said:
Listen, fanblade... if you had to put up with all the crap I have around here, people telling me how bad my X800 Pro was going to do... never mind, you don't have a clue...

Then, after all is said and done, I game at 1280x1024, high settings, 2x AA, and my fps counter stays pegged at 60 level to level...

kma.....

Well, I'm very happy for you... really, I am.

Get a life. :rolleyes:

I have never put down your card; there are more important things in life than whether I'm getting 10 FPS more than you... I don't really care. It just seems to me that the fanboys (on both sides) who constantly put down the competitor's card are doing so to try to justify their purchase, good or bad. Your sig is a perfect example...

Quote [theelviscerator's sig] : "6800 reminds me of a fast car that craps out 10 laps into a 500 miler. They are buggy and break down easily if they run at all."


FFS. Get serious.
 
kcthebrewer said:
According to the timedemo on my above-average machine (not high-end), I get 36 fps at 1600x1200, Ultra Quality with 4xAA, with my GT @ 415/1170.

If you get that in the timedemo, I guarantee you will see single-digit framerates in game...

Not playable...
 
Well, I am playing the actual game at this setting and it's fully playable for me so far. But of course, anything 20 fps or higher is completely playable for me (I have no idea what framerates I am getting in game). It does seem to me that the demo run actually runs slower than the game.
 
theelviscerator said:
If you get that in the timedemo, I guarantee you will see single-digit framerates in game...

Not playable...


Err, what do timedemos have to do with the CPU? The CPU isn't really being pushed to the extent the graphics card is. That means in a real game test, the frame rates will drop below what he is getting now only if the CPU ends up being hammered harder than the graphics card.
 
rancor said:
Err, what do timedemos have to do with the CPU? The CPU isn't really being pushed to the extent the graphics card is. That means in a real game test, the frame rates will drop below what he is getting now only if the CPU ends up being hammered harder than the graphics card.

Timedemos aren't calculating AI on the fly... the demo has been recorded.
 
Right, so pretty much his frame rates won't drop when he plays the game, since the GPU is already the bottleneck.
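The bottleneck argument here can be put in numbers: in a simplified pipelined-frame model, the slower of the CPU and GPU stages sets the frame rate, so extra in-game CPU load (AI, physics) only costs fps once it overtakes the GPU stage. This is a back-of-the-envelope sketch of the reasoning, with hypothetical millisecond figures, not measured values or engine code:

```python
def frame_rate(cpu_ms, gpu_ms):
    """Simplified pipelined-frame model: the slower stage sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Timedemo: CPU barely loaded, GPU dominates (hypothetical numbers).
demo_fps = frame_rate(5.0, 28.0)    # GPU-bound, ~36 fps

# In-game: AI/physics raise the CPU cost, but fps stays the same
# until the CPU stage overtakes the GPU stage...
game_fps = frame_rate(20.0, 28.0)   # still GPU-bound, ~36 fps

# ...and only drops once the CPU becomes the bottleneck.
heavy_fps = frame_rate(40.0, 28.0)  # CPU-bound, 25 fps
```

As later posts in the thread point out, real levels can push the CPU much harder than the timedemo does, which is exactly the case where this model predicts the drop.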
 
rancor said:
Err, what do timedemos have to do with the CPU? The CPU isn't really being pushed to the extent the graphics card is. That means in a real game test, the frame rates will drop below what he is getting now only if the CPU ends up being hammered harder than the graphics card.


Actually, what I meant is that certain levels/areas push your system harder than the timedemo, MUCH harder...
 
I verified with Carmack that the demos in no way really imitate real-world gameplay. Neither physics nor AI is weighted on the CPU when running timedemos. Even in talks with id about the best way to do our guide a couple of months ago, it was decided that doing manual run-throughs in GOD MODE was a bad idea... but I don't remember why. Robert Duffy was the one who had that to say.

Anyway, we just bound a key to giveall and started playing...
 
Thx Kyle, that's what I had in mind; at least that's how I would have programmed a timedemo :D. It just makes sense as a pure graphics card benchmark.
 
rancor said:
Thx Kyle, that's what I had in mind; at least that's how I would have programmed a timedemo :D. It just makes sense as a pure graphics card benchmark.


OMG... dude, have you even played the game...
 
theelviscerator said:
OMG... dude, have you even played the game...


Yes, I do have the game, and I have programmed timedemo software before, too. It's a freakin' recording; the outputs are triggers. And you're telling me the developers of the game don't know what they're talking about? :rolleyes:

Ignorance is bliss till ya open your mouth.
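The mechanism rancor describes, a timedemo as a pre-recorded stream of frame states replayed with no AI or physics computed, can be sketched as a minimal loop. This is a hypothetical illustration of the concept, not id's actual code; `run_timedemo` and its arguments are made up for the example:

```python
import time

def run_timedemo(recorded_frames, render):
    """Replay pre-recorded frame states back-to-back and report average fps.

    No AI or physics runs here -- every entity position was captured
    when the demo was recorded, so the loop is render-bound."""
    start = time.perf_counter()
    for frame_state in recorded_frames:
        render(frame_state)  # only the renderer does real work per frame
    elapsed = time.perf_counter() - start
    return len(recorded_frames) / elapsed  # average fps over the run
```

Because the CPU-side game simulation is skipped, the result measures the graphics pipeline almost in isolation, which is why the thread calls it a "pure graphics card benchmark" and why Kyle's point stands: it need not match real gameplay framerates.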
 
rancor said:
Yes, I do have the game, and I have programmed timedemo software before, too. It's a freakin' recording; the outputs are triggers. And you're telling me the developers of the game don't know what they're talking about? :rolleyes:

Ignorance is bliss till ya open your mouth.


I agree completely with you on this one... I just usually record timedemos myself; I don't "program" them.
 
This thread is a waste of time, lol.

Who gives a crap whose card runs better? As long as your card can run your games well, who cares?

Saying ATI is about to go belly up because of Nvidia is like saying GM and Ford are going belly up because of Dodge.

It's like saying Navistar lied about their diesel engines when compared to Cummins. To each their own, but it's a waste of time to argue the merits of one versus the other when the software used to compare them is only two weeks out of the womb.

Hype or not.

Oh, and my sig runs the game very well; I feel sorry for you guys that ran out and dumped loads of cash on upgrades.
 
ciggy50 said:
This thread is a waste of time, lol.

Who gives a crap whose card runs better? As long as your card can run your games well, who cares?

Saying ATI is about to go belly up because of Nvidia is like saying GM and Ford are going belly up because of Dodge.

It's like saying Navistar lied about their diesel engines when compared to Cummins. To each their own, but it's a waste of time to argue the merits of one versus the other when the software used to compare them is only two weeks out of the womb.

Hype or not.

Oh, and my sig runs the game very well; I feel sorry for you guys that ran out and dumped loads of cash on upgrades.


Umm, not to be a hypocrite, but where or how do you get the benchmark for Doom 3?

LOL, funny how unbiased people can be when they own the inferior product (-; Kidding, I totally agree... upgrading for DOOM 3 is just plain stupid. If you can run the game close to 30 fps most of the time with decent settings, you should be content and wait for games you can actually play more than once (-:
 
It's been 2 years and $300 later, and this card STILL runs everything great.
Doom 3 runs at 35.7 fps, 1024x768 @ high settings, with no jerkiness at all.

I'm a happy camper :)
 