id Software's Official DOOM3 Benchmarks

SuperRob said:
How many times can you say the same thing, when people are AGREEING with you? Jesus Tapdancing Christ, we hear you, and WE AGREE WITH YOU. MAYBE if I type IMPORTANT words in ALL CAPS, you'll UNDERSTAND ME!

All we're saying is that OpenGL is something that can be improved. Can it be improved enough to take the lead? Probably not, but it will be improved. But if there is a hardware flaw, then that's something to take a little more seriously.

It isn't an ATI hardware "flaw", but hardware that is lacking. NVIDIA's UltraShadow II tech is awesome.
 
evilchris said:
ATI is not "stomping" NVIDIA in anything. Sorry. The Doom3 engine will be THE MOST LICENSED NEXT GEN 3D ENGINE. It isn't just ONE GAME. Ever hear of QUAKE? QUAKE 4 IS COMING, USING THE D3 ENGINE.

A MEASLY 20 fps? ROFL. 20 FPS when the average is 60 IS A LOT. AHAHAHAHA. You think when one card is getting 56 fps and one 36 fps that's MEASLY? ROFL!! Is ATI 20 fps ahead of NVIDIA in ANYTHING? NOPE! PWNED.


Fanboy? Look at your sig, ATI FANBOY.


I'm sorry, but the fact is I don't feel "Pwned" as you would say. I am definitely not going to enjoy the game any less than you will just because I have to use settings a bit lower than you. Yes, there are going to be some games based on the Doom3 engine, but when? And do you really think ATI is so stupid that they won't have much better OpenGL drivers by the time those games show up? I'm not saying anything against Nvidia, and if I had $400+ when I bought my X800 Pro I might have bought a 6800 GT, but the fact of the matter is I only paid $315 for my card and am very pleased with it, so please just let people enjoy the game even though some of us have to use a crappy ATI card to do it.
 
I have to say it is funny to hear people calling on ATI drivers to save the day. That is what all the NV fans did a couple of weeks back, and all the ATI fans gave them sh1t about it. At the same time all the NVers said FarCry is just one game so who cares. All the ATIers said I'm sure lots of devs will use the Cry engine. It is funny.
 
evilchris said:
It was a mess, wasn't it! LOL! I had a Viper v330. Even though the colors were all borked, and I got the drivers from Rivazone.com, I was stoked to be playing it in Default OpenGL mode!

I remember going through a hundred different drivers and hacks trying to get it to work. But eventually Nvidia released official drivers, and it was all sweet as hell from there on out.

I still had that card up until a few weeks ago.
 
^eMpTy^ said:
I'll bet you a coke ATi will get its ass kicked in Q4 too. It's called a reasonable assumption. You don't have to be a fortune teller to know that a game based on the same engine will have similar performance, you just need an unbiased brain.

If I interpret your comments correctly, you love Nvidia and will never purchase an ATI card, and you are saying that I am biased? I wait for real-world numbers on released games that don't come from the company that made the game. Once a company tells people what card to buy to play their game, I won't put trust in the numbers they provide. And I'd say they are telling people which card to buy with their endorsement of the Nvidia cards.

Will [H] be providing benchmarks with the hardware guide for D3?
 
^eMpTy^ said:
I remember going through a hundred different drivers and hacks trying to get it to work. But eventually Nvidia released official drivers, and it was all sweet as hell from there on out.

I was right there with ya man, old times =)
 
CrimandEvil said:
Now I just have to find a good GF3 or 4 card :) .... :( (too poor right now for a $300 card).

I have a PNY Ti500 gathering dust if you are interested. :D



As to the topic at hand.

HOLY MOMMA! I am stoked.
 
so ya i wonder how my 9700 will deal with this, all this talk about ati isn't the slightest bit worrisome considering nvidia has been given the opportunity to dev with id for much longer

it will be the same situation backwards when hl2 comes out, everybody knows that.

i predict one week after d3 comes out ati will release a new driver set that will restore order to the universe.
 
BoogerBomb said:
If I interpret your comments correctly, you love Nvidia and will never purchase an ATI card, and you are saying that I am biased? I wait for real-world numbers on released games that don't come from the company that made the game. Once a company tells people what card to buy to play their game, I won't put trust in the numbers they provide. And I'd say they are telling people which card to buy with their endorsement of the Nvidia cards.

Will [H] be providing benchmarks with the hardware guide for D3?

If you are questioning John Carmack's integrity, then, LOL. If you think "real world" will show ATI is faster, you have another thing coming.
 
Dyslexic said:
I have to say it is funny to hear people calling on ATI drivers to save the day. That is what all the NV fans did a couple of weeks back, and all the ATI fans gave them sh1t about it. At the same time all the NVers said FarCry is just one game so who cares. All the ATIers said I'm sure lots of devs will use the Cry engine. It is funny.

I don't understand this attitude. If Nvidia can drag itself out of the mud after their last generation of cards and come up with their great Forceware drivers, why would it be so impossible for ATI to improve their drivers? They aren't complete idiots, I don't think, even if they did make the mistake of not having a card in between the XT and Pro that would be competitively priced against the GT.
 
coffee33 said:
Come on ladies and gentlemen! Don't let a small thing like 20fps get in the way of
[H]ardness.
I don't care what a company does with framerate, but I bet you all a month's pay that I will be the first one to not only buy, but become a fanboy of, the first video card that will make my online porn videos longer! :D

20fps is MONUMENTAL when it's 50 fps vs 30 fps.
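To put numbers on why the same 20 fps matters more at the low end (my arithmetic, using the 50 vs 30 from above), compare frame times:

$$\frac{1000\ \text{ms}}{30\ \text{fps}} \approx 33.3\ \text{ms/frame} \qquad \text{vs.} \qquad \frac{1000\ \text{ms}}{50\ \text{fps}} = 20\ \text{ms/frame}$$

At 50 vs 30 the slower card spends two-thirds longer on every single frame; at 400 vs 380 the same "20 fps" would be barely a 5% difference.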
 
Speaking of truly shitty graphics cards ...

I owned one of the original Diamond Viper PCI cards, back in the day. I had a difficult time playing fantastic games like "Sam and Max Hit the Road", because of flickering colors all over the place, and even though they fixed some issues with drivers, it was a hardware flaw deep down. Eventually, they admitted the cards were fundamentally broken, and offered a whopping $20 discount on a new card. Never bought a Diamond product again after that.

This is one of the reasons why I'm more willing to cut a company slack over poor drivers, because those can always be fixed. Broken hardware is broken forever. I'm not saying ATI will be able to get their OpenGL up to snuff, but at least they could.
 
PoW said:
so ya i wonder how my 9700 will deal with this, all this talk about ati isn't the slightest bit worrisome considering nvidia has been given the opportunity to dev with id for much longer

it will be the same situation backwards when hl2 comes out, everybody knows that.

i predict one week after d3 comes out ati will release a new driver set that will restore order to the universe.

Funny, their OGL driver has sucked ass for FIVE YEARS and you think one week after D3 a good one will come out? When Call of Duty launched, all ATI users got was crashes to the desktop!
 
evilchris said:
ATI has *never* been able to catch up to NVIDIA in OpenGL. That's why 6800 GT's are FASTER than X800 XT PE's in QUAKE 3 ENGINE GAMES.

Wow like that means much :rolleyes: every OPENGL game I've ever played has worked fine on my radeon, I couldn't care less if a GT gets 400 FPS in Q3 while I get 300, I'd rather have slower OPENGL for a minority of games than a slow D3D for the majority.
 
BoogerBomb said:
If I interpret your comments correctly, you love Nvidia and will never purchase an ATI card, and you are saying that I am biased? I wait for real-world numbers on released games that don't come from the company that made the game. Once a company tells people what card to buy to play their game, I won't put trust in the numbers they provide. And I'd say they are telling people which card to buy with their endorsement of the Nvidia cards.

Will [H] be providing benchmarks with the hardware guide for D3?


face the facts...it is as simple as that

the benchmarks were shown today and they favor nvidia obviously

this isn't a CONSPIRACY for god's sake.....

the numbers speak for themselves as i've said before

i would love to see the response carmack would give a person like you who spoke so unintelligently about the subject

omg omg omg omg conspiracy...john is out to screw us...he cheats...puts in optimized code for only nv hardware
 
PoW said:
so ya i wonder how my 9700 will deal with this, all this talk about ati isn't the slightest bit worrisome considering nvidia has been given the opportunity to dev with id for much longer

it will be the same situation backwards when hl2 comes out, everybody knows that.

i predict one week after d3 comes out ati will release a new driver set that will restore order to the universe.

If they didn't already have it in the works, I hope they do now, and I say that because I am going to have my 9800Pro til at least next year. I just can't afford the $400 video cards of today. I just hate to see how much they will be next year, $600?
 
evilchris said:
A MEASLY 20 fps? ROFL. 20 FPS when the average is 60 IS A LOT. AHAHAHAHA. You think when one card is getting 56 fps and one 36 fps that's MEASLY? ROFL!! Is ATI 20 fps ahead of NVIDIA in ANYTHING? NOPE! PWNED.

really, i like the 6800gt too but i think you're getting too into the results, chris :rolleyes:.
 
KayossZero said:
Wow like that means much :rolleyes: every OPENGL game I've ever played has worked fine on my radeon, I could careless if a GT gets 400 FPS in Q3 while I get 300, I'd rather have slower OPEN GL for a minority of games then a slow D3D for the majority.

Man I felt the same way, COD @ 1600x1200 with 4xAA and 8xAF is like a brand new game. I would have to hand my 9800pro a tissue if I played at those settings. And what do you mean slow D3D? All of the new cards are pretty hard hitters in D3D, nvidia winning some and ati winning some. Usually the margin is within a handful of frames that you wouldn't ever notice.
 
cstafs said:
I don't understand this attitude. If Nvidia can drag itself out of the mud after their last generation of cards and come up with their great Forceware drivers, why would it be so impossible for ATI to improve their drivers? They aren't complete idiots, I don't think, even if they did make the mistake of not having a card in between the XT and Pro that would be competitively priced against the GT.

Well I can tell you why people have this attitude.

ATi has a looooooooooooooooooong history of sucky drivers. Going back 6 or 7 years. Nvidia, on the other hand, has a history of drastically improving the performance of their cards via driver updates with nearly every major new core they have ever made.

ATi has a looooooooooooooooooong history of shitty OpenGL support. And think of this: starting with the 9700 Pro, released 2 years ago, ATi has been working with the same core, and yet they still have not produced a decent OpenGL driver. TWO YEARS.

So I'm not saying that it's impossible for ATi to release a great OpenGL driver and fix things up in Doom 3, it's just seriously unlikely.
 
scientificTHEgreat said:
face the facts...it is as simple as that

the benchmarks were shown today and they favor nvidia obviously

this isn't a CONSPIRACY for god's sake.....

the numbers speak for themselves as i've said before

i would love to see the response carmack would give a person like you who spoke so unintelligently about the subject

omg omg omg omg conspiracy...john is out to screw us...he cheats...puts in optimized code for only nv hardware

So now I am immature and unintelligent for arguing my point the same way that all the other ATI and Nvidia people have been?
 
BoogerBomb said:
I don't see how it's an unreasonable request. Not everyone gets to play with all the newest stuff that very few people have, although I do understand that these weren't your benchmarks. Most seem to be done solely on the most expensive cards available. I'm sure there are a lot of people out there with 9800 Pros or 5700s that might like to see some benchmarks on D3. I can't see how these cards are considered outdated enough to not be included considering they still cost $200.

Am I the only one who doesn't have a $400 - $600 card?
I was being serious. In fact we already have them done. They will be FRAPped runthroughs on different systems. Two systems down with 11 different video cards so far.
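For anyone wondering what a FRAPped runthrough actually boils down to: FRAPS can log a timestamp for every frame, and the fps numbers are just arithmetic on those. A rough sketch of that arithmetic, assuming a plain list of cumulative millisecond timestamps starting at 0 on stdin (an approximation of the FRAPS frametimes log, not its exact format):

Code:
/* Rough sketch: average and minimum fps from a FRAPS-style frametime
 * log. Assumes one cumulative millisecond timestamp per line, starting
 * at 0 -- an approximation of FRAPS' frametimes output, not its exact
 * format. Reads from stdin. */
#include <stdio.h>

int main(void)
{
    double prev = 0.0, t, worst = 0.0;   /* worst = longest frame, in ms */
    long frames = 0;

    while (scanf("%lf", &t) == 1) {
        double dt = t - prev;            /* time this frame took */
        if (frames > 0 && dt > worst)
            worst = dt;
        prev = t;
        frames++;
    }
    if (frames > 1 && worst > 0.0) {
        printf("avg fps: %.1f\n", (frames - 1) * 1000.0 / prev);
        printf("min fps: %.1f\n", 1000.0 / worst);
    }
    return 0;
}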
 
This is my fav John Carmack quote:

but it should be noted that some of the ATI cards did show a performance drop when colored mip levels were enabled, implying some fudging of the texture filtering.

ATI IS CHEATING LOL.
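For anyone who hasn't seen it, the colored-mip test is simple: every mipmap level of a texture gets replaced with a different flat color, so you can see on screen exactly where the filtering blends land. If a driver normally takes a filtering shortcut but falls back to honest trilinear when the mips are colored, performance drops the moment the test is enabled, which is exactly what Carmack is describing. A minimal sketch of the general OpenGL technique, with my own function and buffer names rather than id's code (assumes an active GL context and a base texture size of 256 or less):

Code:
/* Sketch of the colored-mip-levels trick: fill each mipmap level of the
 * currently bound texture with a different flat color so the filtering
 * transitions become visible on screen. Not id's actual code. */
#include <GL/gl.h>
#include <string.h>

static const unsigned char level_colors[6][3] = {
    {255, 0, 0}, {0, 255, 0}, {0, 0, 255},
    {255, 255, 0}, {255, 0, 255}, {0, 255, 255},
};

void upload_colored_mips(int base_size)   /* base_size <= 256 assumed */
{
    static unsigned char buf[256 * 256 * 3];

    int level = 0;
    for (int size = base_size; size >= 1; size >>= 1, ++level) {
        const unsigned char *c = level_colors[level % 6];
        for (int i = 0; i < size * size; ++i)   /* flood the whole level */
            memcpy(&buf[i * 3], c, 3);
        glTexImage2D(GL_TEXTURE_2D, level, GL_RGB, size, size, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, buf);
    }
    /* request true trilinear so any mip-blend shortcut shows up */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);
}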
 
So I'm not saying that it's impossible for ATi to release a great OpenGL driver and fix things up in Doom 3, it's just seriously unlikely.

And I'll say that if there is any one game that is important enough to convince ATI to make OpenGL support a priority, it's Doom 3. It's just a matter of whether or not it's enough.
 
KayossZero said:
Wow like that means much :rolleyes: every OPENGL game I've ever played has worked fine on my radeon, I couldn't care less if a GT gets 400 FPS in Q3 while I get 300, I'd rather have slower OPENGL for a minority of games than a slow D3D for the majority.

NV doesn't have slow D3D, sorry.
 
Dyslexic said:
People are crazy, man. I am getting a 6800GT and am happy about it. I think and hope you will have a kickass time playing DOOM3 and lots of other games on your X800pro. It is a sweet card and so is the 6800GT. Why can't people just be happy with what they have, and be happy for other people that are lucky enough to have any of the high-end cards? Best of luck and good gaming. :D

Thanks, I will, and happy gaming with your 6800 GT!
 
boy, aren't we getting a little too worked up over one single game? It'll be quite some time before games using the D3 engine fully come out. Think about recent Q3-engine games: Call of Duty or Soldier of Fortune 2. Games using the D3 engine fully will probably take another 2 or 3 years to come out. By that time, we'll have Nvidia's uberultrasuper 9999 XGT and ATI's ZX8000. Who knows what will happen then? I agree that D3 will be an awesome game, but to me, it's just another game. If anyone here still has a sane mindset, they'll wait just a few more weeks until the dust settles before they make purchasing decisions. By then, reviews of the game will be out (giving the REAL truth about how D3 is), benchies (giving the REAL analysis on how D3 will run on different systems/cards), and driver optimizations (a chance for both companies to improve the numbers). As for me? I'll wait it out. If I can manage to pull medium settings at 1024 using my Athlon 1700+ > 2GHz, 512MB, and trusty 9500>9700, I'll be very happy. If I really have to watch the eye candy now, I'll go over to my buddy's house to play it, or wait till prices on XTs and GTs come down.
 
BoogerBomb said:
So now I am immature and unintelligent for arguing my point the same way that all the other ATI and Nvidia people have been?


sorry....how i feel about this subject is about how i feel about politics...pretty strong

i just see all these responses with half-assed remarks included in them

talking about omg omg nvidia cheats...oh it's [H]'s fault...oh carmack must have optimized the game's code for nvidia cards only
 
razor2050 said:
boy, aren't we getting a little too worked up over one single game? It'll be quite some time before games using the D3 engine fully come out. Think about recent Q3-engine games: Call of Duty or Soldier of Fortune 2. Games using the D3 engine fully will probably take another 2 or 3 years to come out. By that time, we'll have Nvidia's uberultrasuper 9999 XGT and ATI's ZX8000. Who knows what will happen then? I agree that D3 will be an awesome game, but to me, it's just another game. If anyone here still has a sane mindset, they'll wait just a few more weeks until the dust settles before they make purchasing decisions. By then, reviews of the game will be out (giving the REAL truth about how D3 is), benchies (giving the REAL analysis on how D3 will run on different systems/cards), and driver optimizations (a chance for both companies to improve the numbers). As for me? I'll wait it out. If I can manage to pull medium settings at 1024 using my Athlon 1700+ > 2GHz, 512MB, and trusty 9500>9700, I'll be very happy. If I really have to watch the eye candy now, I'll go over to my buddy's house to play it, or wait till prices on XTs and GTs come down.

Hell no we aren't getting TOO worked up, do you understand how long we have been waiting for this?
 
razor2050 said:
boy, aren't we getting a little too worked up over one single game? It'll be quite some time before games using the D3 engine fully come out. Think about recent Q3-engine games: Call of Duty or Soldier of Fortune 2. Games using the D3 engine fully will probably take another 2 or 3 years to come out. By that time, we'll have Nvidia's uberultrasuper 9999 XGT and ATI's ZX8000. Who knows what will happen then? I agree that D3 will be an awesome game, but to me, it's just another game. If anyone here still has a sane mindset, they'll wait just a few more weeks until the dust settles before they make purchasing decisions. By then, reviews of the game will be out (giving the REAL truth about how D3 is), benchies (giving the REAL analysis on how D3 will run on different systems/cards), and driver optimizations (a chance for both companies to improve the numbers). As for me? I'll wait it out. If I can manage to pull medium settings at 1024 using my Athlon 1700+ > 2GHz, 512MB, and trusty 9500>9700, I'll be very happy. If I really have to watch the eye candy now, I'll go over to my buddy's house to play it, or wait till prices on XTs and GTs come down.

Uh, yeah, I am sure John Carmack, the DEVELOPER OF THE D3 ENGINE, doesn't know how it REALLY PERFORMS ON EACH CARD. LOL!
 
creedAMD said:
You want to "high five" him, don't you?
I would, I would really love to, but the sudden fling of both my arms around him after the shake may freak him out and get me banned. :(
 
creedAMD said:
Hell no we aren't getting TOO worked up, do you understand how long we have been waiting for this?

TOO TRUE

I have been waiting for this game for sooooooo long. You little bastards, stop raining on my parade; if you can't get excited about this you either own an X800Pro or you're just a meathead....:)
 
^eMpTy^ said:
I think it's equally cute that after getting bitch slapped in the doom 3 benchmarks the ever hopeful ATi fanboys insist on downplaying a massive loss in the biggest game of the year and point to a new driver. After 2 years of optimizing the driver for this architecture, ATi will suddenly and miraculously be able to put out an OpenGL driver that doesn't suck ass.

I'm not saying it ain't gonna happen, I'm just saying take your lumps like a man. There's nothing you can say to counter the ass beating ATi took today, so why try? It just makes you sound desperate.


Last I checked, iD themselves said 30fps was a good framerate in this game, and wtf ever happened to the game being capped at a max of 60fps anyway? Carmack and Hollenshead both said the game was hardcapped to a max of 60fps.
 
coffee33 said:
I would, I would really love to, but the sudden fling of both my arms around him after the shake may freak him out and get me banned. :(

He only bans if you grab his ass. :p
 
KayossZero said:
You're one to talk about fanboys, you're the single most sickening one on here, what a disgrace :rolleyes:

It's clear NVIDIA is better in OGL and Doom 3 right now. Can that change? Maybe, maybe not, no one can say for sure 100%.

Very true, the only thing anyone can say for sure right now is that in the most foolproof, cheatproof, unbiased environment imaginable ATi got their asses handed to them in the biggest game of the year.

And that's all that can be said.
 
Chris_B said:
Last I checked, iD themselves said 30fps was a good framerate in this game, and wtf ever happened to the game being capped at a max of 60fps anyway? Carmack and Hollenshead both said the game was hardcapped to a max of 60fps.

I was wondering about that as well. Maybe their code didn't work?
 
BoogerBomb said:
I was wondering about that as well. Maybe their code didn't work?


Only thing I can think is they've got a version or a command that overrides this... kinda takes away from the whole "hard capped" bit.
 
BoogerBomb said:
I was wondering about that as well. Maybe their code didn't work?

i'm sure if they wanted to make a fps cap they would have. "oops.. ok i guess the fps isn't limited, oh well." i hope that's not what you were talking about :p
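For what it's worth, a 60fps "hard cap" is usually nothing more than a limiter in the engine's main loop, and a timedemo mode would just skip the wait so frames render back to back, which would explain benchmark numbers above 60 without the cap being "broken". A minimal sketch of that idea, my own loop and not id's code, assuming POSIX clock_gettime/nanosleep:

Code:
/* Minimal 60 Hz frame-limiter sketch. A timedemo-style benchmark would
 * pass capped = 0 and render flat out, so its fps can exceed 60. */
#include <time.h>

#define FRAME_NS (1000000000L / 60)   /* ~16.67 ms per frame */

static long long now_ns(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (long long)ts.tv_sec * 1000000000LL + ts.tv_nsec;
}

void run_frames(int capped, int total_frames)
{
    long long next = now_ns();
    for (int frame = 0; frame < total_frames; ++frame) {
        /* game_tick(); render_frame(); -- engine work would go here */
        next += FRAME_NS;
        long long wait = next - now_ns();
        if (capped && wait > 0) {
            struct timespec ts = { 0, (long)wait };
            nanosleep(&ts, NULL);     /* hold until the next 60 Hz slot */
        } else if (wait < 0) {
            next = now_ns();          /* fell behind; resync the schedule */
        }
    }
}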
 